
Driving Latency Monitoring Across The Marketplace

With David Snowdon, Founder, CTO, Metamako
A hundred microseconds has become the target accuracy for the timestamps in trading systems' reports – after much debate, that is what the regulators have settled on. However, modern markets operate at a much faster pace, and for the regulators to be able to understand the ordering of events, those timestamps must be far more accurate. They want to be able to reconstruct what happened in the market with confidence, but the current requirement of 100 microseconds is too coarse to do so.
Without becoming too technical: a system using 10G Ethernet – the predominant communication standard in modern markets – transfers 64 bits of data every 6.4 nanoseconds, that is, every 6.4 billionths of a second. This is a fundamental part of the way the communication occurs: every 6.4 ns, a block of data – part of a message – is transferred. Within that 6.4 ns window, timing is much less significant. Effectively, two packets whose transmission begins during the same 6.4 ns period will appear simultaneous to the receiving system. It's a position which is arguable from the fundamentals of the way in which computers communicate.
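That 6.4 ns figure follows directly from the line rate; a minimal sketch in Python, assuming the nominal 10 Gb/s data rate and the 64-bit transfer blocks described above:

```python
# 10G Ethernet carries data at a nominal 10 gigabits per second,
# and data crosses the interface in 64-bit blocks.
LINE_RATE_BPS = 10e9   # bits per second
BLOCK_BITS = 64        # bits per transfer block

block_time_s = BLOCK_BITS / LINE_RATE_BPS
print(f"One 64-bit block every {block_time_s * 1e9:.1f} ns")  # 6.4 ns
```

In other words, 6.4 ns is the finest event-ordering granularity the wire itself can express on such a link.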
That’s about 10,000 times finer than MiFID II’s hundred-microsecond requirement. In a hundred microseconds you can easily get an entire order to the exchange, and on some exchanges it’s possible to execute an order, get the response back and place a second order – and both orders would be considered simultaneous under the regulations!
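The gap between the two figures is easy to check; a quick back-of-the-envelope in Python, using the 6.4 ns block time and the 100 microsecond requirement quoted above:

```python
# Compare the regulatory timestamp granularity with the finest
# ordering a 10G Ethernet link can physically express.
REQUIRED_ACCURACY_S = 100e-6   # MiFID II requirement: 100 microseconds
BLOCK_TIME_S = 6.4e-9          # one 64-bit block on 10G Ethernet

ratio = REQUIRED_ACCURACY_S / BLOCK_TIME_S
print(f"{ratio:,.0f} block times fit in 100 microseconds")
```

That works out to 15,625 block times – roughly 10,000×, or about four orders of magnitude between what the regulation demands and what the wire can distinguish.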
While the present requirements are not sufficient for the stated purpose, the required accuracy can be tightened over time. In some sense, this is only the beginning: as time goes on and firms become more comfortable with the technology, the nirvana of full traceability can be achieved. The regulation won’t reach that goal in the near term, but it is a huge step forward.
Competing with HFT
There’s been a lot of debate surrounding market makers and high frequency trading generally. To compete in the modern world firms need to have good information and analytics to base their optimisation decisions on. Can you imagine a floor trader relaxing with a coffee during a market crash? It might be comfortable, but it’s hardly the way to succeed. A firm without network visibility is a firm which can’t understand its own behaviour, since the boundary of the network is the definitive point at which the trade can no longer be affected.
Firms have a responsibility to their customers to get it right, and the requirement for accurate time reporting is a chance for those firms to build a solid understanding of what works and what doesn’t.
Behaviour versus cost
The technology is not the issue for the majority of firms. If a house is looking to be absolutely at the cutting edge and they’re only relying on latency to get an advantage, then, yes, they need to put a lot of money into the technology. But on a basic level, the technology that’s required to be competitive is not expensive.
The HFTs are the disruptors of the trading world. They are the Ubers, Googles and Facebooks. They are using technology to do things better than the slow firms. Their company culture is agile and they are technically aggressive. Their employees are highly motivated to perform – and more: they’re willing to take technical risks.
Driving the change
The current regulatory drive is bringing significant attention to the accuracy of network monitoring and timestamping, and if that drives firms down to microsecond or nanosecond accuracy – which is achievable without a massive investment – then it’s a very positive step. Forward-looking firms treat MiFID II not as a burden to be met to the letter of the regulation, but as an opportunity to push the technology internally and gain a number of other benefits from the data. These firms are going much further than what’s actually required to meet the specification.
More than that, firms are not just looking at Europe. They’re looking worldwide. Again, this is an opportunity to understand what’s going on and apply that knowledge to improve the process.
Future latency
For a given distance and medium, the speed of light fixes the propagation delay – that part cannot be improved. The debate, though, is about who hits the end of that fibre first, because that determines who gets to start trading. You can have a substantial response time from the exchange and still care about how consistent your own internal switches and trading systems are.
Once firms get their heads around the problem of basic latency monitoring and consistency, they will start to push further than the regulation requires. Without measurement, it’s impossible to implement effective improvements.
The regulators have done well to bootstrap the process, but if we want to understand who did what, when, and in what order, then this is just the beginning. We’ve got four orders of magnitude to go.