Be careful what you wish for – the future of HFT


Weng Cheah, Managing Director of Xinfin, continues his discussion with prop and quant traders, looking at what the future holds for high frequency trading.

It was inevitable that the world of brokerage, in particular execution, would become faster and more automated. Competition drove the world of physics into brokerage and shaped many new services. However, the challenge today has shifted from achieving nanosecond execution to maintaining profitability, with volumes that halved in 2011 and continue to decline in 2012.

Even among traders there is clarity that the race to faster execution has all but ended. An Asia-based proprietary trader shared his thoughts on execution latency, saying "we are several fold past ridiculous," but, more importantly, adding that "speed is not where innovation needs to occur."

It’s not about the speed of execution…

Magazines, professional literature and business plans are all littered with thoughts and opinions about the future that vary only in how quickly, and how badly, they turn out to be wrong. However, it may be worthwhile to frame the discussion with the following observations:

·         The last three years have seen significant investment in low-latency services from technology vendors and some brokers, to the extent that a sub-millisecond average round trip is no longer an achievement; sub-microsecond is normal.
·         In a world where sub-microsecond is the norm, fewer participants are unhappy with this performance. It is rational to expect fewer resources dedicated to even faster gateway solutions.

·         Although it is difficult to foresee the total exclusion of research into hardware acceleration, the cold, harsh reality of the economics of this business will halt even the most technically promising research project.

·         Data has regained importance. However, it is interesting that there is a hierarchy of criticality, in which historical data volumes, in particular for new markets or contracts, will be more valuable than low-latency market data.

·         Excluding pre-trade risk controls is no longer a valid route to speeding an order to market. This regulatory arbitrage is no longer a selling point for brokers looking to differentiate their services.

·         Intraday risk assessment and management are the least developed links in the chain. This is true for the technology, but also for the organizational structure and methodologies employed.

·         Productivity and back-testing tools for quantitative analysts were specialized and usually built bespoke for a strategy. However, an increasing number of generalized frameworks and back-testing products are now available from software vendors.
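The generalized frameworks mentioned above all automate the same basic loop. As a minimal, purely illustrative sketch (the strategy, parameters and function names here are hypothetical, not drawn from any vendor product), a back-test replays historical prices through a decision rule and accumulates P&L:

```python
# Minimal back-test loop: a naive long/flat moving-average crossover on a
# price series. Everything here is a hypothetical illustration of the
# generic shape that vendor back-testing frameworks generalize.
from collections import deque

def backtest(prices, fast=5, slow=20):
    """Return the final P&L of a long/flat crossover strategy."""
    fast_win, slow_win = deque(maxlen=fast), deque(maxlen=slow)
    position, pnl, entry = 0, 0.0, 0.0
    for px in prices:
        fast_win.append(px)
        slow_win.append(px)
        if len(slow_win) < slow:      # wait until the slow window is full
            continue
        fast_ma = sum(fast_win) / len(fast_win)
        slow_ma = sum(slow_win) / len(slow_win)
        if position == 0 and fast_ma > slow_ma:
            position, entry = 1, px   # enter long on upward crossover
        elif position == 1 and fast_ma < slow_ma:
            pnl += px - entry         # exit long on downward crossover
            position = 0
    if position == 1:
        pnl += prices[-1] - entry     # mark out any open position
    return pnl

# Usage: a steadily trending series should yield a positive P&L.
trend = [100 + 0.5 * i for i in range(100)]
print(backtest(trend))  # → 40.0
```

A real framework adds what this sketch omits: transaction costs, slippage models, position limits and statistically sound evaluation across many strategies and datasets.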

The key takeaway from these observations is that it is not about speed, and never really has been about speed, but about the quality of the decision-making process. The world needs to forget about faster and start getting a lot more original in idea generation.

… It’s the rate at which you can manage the change of speed

Regardless of your role in today's modern markets, the development of higher-frequency trading has permanently changed the competitive landscape, and none has been more impacted than the intermediary. The broker has moved from chasing technical innovation to differentiate itself and capture clients to sitting on an abundance of capacity. It would not be unreasonable to suggest that brokers will not want (or in fact need) to commit further capital to execution product development.

Furthermore, differentiating an execution product becomes increasingly difficult as the relative performance gap between brokers narrows. The likely result is that price competition will continue and margins will narrow.

There was almost a peaceful acceptance in a conversation with a London-based quantitative fund manager that "the role of a broker in the future… it's looking doubtful." Going forward, it is worth pausing, pondering on that, and then adding "at least in this form".

We are all familiar with SEC Rule 15c3-5, which codifies the broker's obligation to provide independent controls on market access.
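The independent controls Rule 15c3-5 requires sit in front of every order. As a hedged sketch of the kind of check involved (the limit values, order shape and function names below are hypothetical, chosen only to illustrate the idea):

```python
# Illustrative pre-trade control: reject orders that breach simple size and
# notional limits before they reach the market. All limits and names here
# are hypothetical examples, not a statement of what any rule mandates.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str     # "buy" or "sell"
    qty: int
    price: float

MAX_ORDER_QTY = 10_000        # per-order size cap
MAX_NOTIONAL = 1_000_000.0    # aggregate credit/notional cap

def pre_trade_check(order, open_notional):
    """Return (accepted, reason) for an incoming order."""
    if order.qty <= 0 or order.price <= 0:
        return False, "malformed order"
    if order.qty > MAX_ORDER_QTY:
        return False, "order size limit breached"
    if open_notional + order.qty * order.price > MAX_NOTIONAL:
        return False, "credit/notional limit breached"
    return True, "accepted"

ok, reason = pre_trade_check(Order("XYZ", "buy", 500, 101.25), open_notional=0.0)
print(ok, reason)  # → True accepted
```

The point of the rule is that checks like these must be under the broker's independent control, in the order path, rather than switched off to shave latency.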

This kind of clarity, arriving after a period of immense technology investment and amid declining economics, will hopefully lead to some important questions:

At what level does it make sense to cease investing in your own execution gateways and use the technology provided by a broker?
If you had to account for the regulatory obligations and costs of operating as a broker/dealer or NCM, at what level would this become unattractive?

You would have to expect that the participants have reached a plateau in the provision of execution: one that favors brokers that can provide at least competitively fast gateways, strong operational controls, and intraday risk management supported by organizational structure and methodology.

Unlike the build-out of technology and infrastructure, the changes of today are organizational, and they will continue. The fact of the matter is that the current shape of the desks is not working. Having a large number of people in product development who are focused solely on speed is not helpful. People need to be focused on developing the right range of products: either improving the value of the broker's technology and the links between the elements in the system, or branching out into separate technology. People are also needed in trade risk management, and to manage the relationships between the elements of the trading chain.

“Speed is not where innovation needs to occur”

These thoughts were echoed in the words of a proprietary trader in the US, who said that “No matter how good your models, you cannot run them if you don’t understand them. There is a difference between people who create models and people who only operate machines. We know a bigger and badder computer doesn’t make the system any better.”

Aside from the technical challenges of managing very large volumes of data, and of ensuring the analysis is statistically representative, we must recognize that we still need to deal with the logistics of capturing financial data and opinion information. Moving data en masse will vary in difficulty: easiest is already-digitized information such as prices, spreads and volatility; more difficult is financial data encapsulated in reports, analysts' opinions, regulatory filings and various statistical submissions.

Protocols and standards-based communications will have a profound impact on future analysis. In particular, the rising adoption of XBRL for regulatory filings and some statistical returns may become a cornerstone for adding fundamental data.

The common thread in all the conversations is the need to know and study the markets. However, there is much more data, and derived data, in finance than ever before. Managing data has always been important in finding a signal, that "needle in the haystack." As data volumes rise, we find the rise of Big Data to find a "needle in a needle stack."
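The "needle in a needle stack" problem can be made concrete with a small sketch. Assuming, purely for illustration, that a signal is an observation standing far outside its recent history, a rolling z-score flags the candidates (the window and threshold here are arbitrary choices, not a recommendation):

```python
# Illustrative signal detection: flag points whose rolling z-score exceeds
# a threshold in a noisy series. Window and threshold are hypothetical.
import statistics

def zscore_flags(series, window=20, threshold=3.0):
    """Return indices whose value is > threshold std devs from the
    mean of the preceding `window` observations."""
    flags = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.fmean(hist)
        sd = statistics.stdev(hist)
        if sd > 0 and abs(series[i] - mu) / sd > threshold:
            flags.append(i)
    return flags

# Usage: a small periodic wobble with one injected outlier.
quiet = [100.0 + 0.01 * (i % 7) for i in range(60)]
quiet[45] = 103.0          # the "needle"
print(zscore_flags(quiet)) # → [45]
```

The hard part at scale is not this arithmetic but everything around it: which series to scan, how to keep the statistics honest across regimes, and how to store and replay the haystack at all.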

Faster isn’t more transparent

Regulators have amended and clarified rules regarding market access. However, they are yet to make advances in systematic tools that aid in the supervision of markets.

This is critical in re-establishing the credibility of markets and integrity of participants, and in so doing attracting greater participation.

Piece by piece, the rules and tools of twenty-first century trading are getting much closer to providing the efficient market that many academic papers in the past had assumed. In real life it is always disappointing to meet your heroes, and it is the same in trading: disappointing to meet a truly efficient market.