The Post Trade Debate Deserves to be Data Driven

By Phil Mackintosh,
SVP, Chief Economist, Nasdaq

In the past 20 years, markets globally have automated and modernized. Computers are now responsible for the majority of executions in stocks around the world, and the quantity of data available to analyze trading has exploded. There are now more than 50 venues for trading and over 35 million trades every day in the US equity market. FINRA recently highlighted that it had processed 135 billion records in a single day.

At the same time, transaction costs have fallen significantly. That makes squeezing the last basis points out of a trading desk increasingly complicated. We are arguably in the last mile of institutional TCA improvement, where routing, signaling and opportunity costs are material factors to measure. Unfortunately, regulatory solutions like the new Rule 606 and existing Rule 605 reports remain aggregated, making them inadequate for institutions to quantify these costs.

Institutions need granular access to their routing and trading data: timestamps in microseconds for order send, receipt and cancellation across all trading and IOI venues. Once they have that, harnessing the modern powers of "big data" will make it easier to find patterns and quantify costs. Only then can they improve strategies that reduce trading frictions.

Enhancing FIX to do this is a compelling idea. FIX is already widely used globally for sending order-level data back to institutions. A globally consistent file format would allow other experts to cost-effectively build analytic solutions. Most importantly, institutions would have access to all their data, giving them the power to analyze what they choose.

The economics of routing deserves to be data driven.
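To illustrate the point, once order-level events carry microsecond timestamps, quantifying routing behavior becomes a simple aggregation exercise. The sketch below is a minimal Python example with an entirely hypothetical event schema; the field names and event labels are assumptions for illustration, not a FIX standard or any venue's actual format. It shows how granular send, acknowledgment and fill records could be reduced to per-venue metrics such as acknowledgment latency and fill rate.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical order-event record. Field names ("order_id", "venue",
# "event", "ts_us") are illustrative assumptions, not a real schema.
@dataclass
class OrderEvent:
    order_id: str
    venue: str
    event: str   # "send", "ack", "fill", or "cancel"
    ts_us: int   # microseconds since midnight

def venue_metrics(events):
    """Compute per-venue median ack latency (microseconds) and fill rate."""
    sends = {}                               # order_id -> send timestamp
    latencies = defaultdict(list)            # venue -> ack latencies
    orders = defaultdict(set)                # venue -> order ids routed there
    filled = defaultdict(set)                # venue -> order ids filled there
    for e in events:
        orders[e.venue].add(e.order_id)
        if e.event == "send":
            sends[e.order_id] = e.ts_us
        elif e.event == "ack" and e.order_id in sends:
            latencies[e.venue].append(e.ts_us - sends[e.order_id])
        elif e.event == "fill":
            filled[e.venue].add(e.order_id)
    report = {}
    for venue, ids in orders.items():
        lats = sorted(latencies[venue])
        report[venue] = {
            "median_ack_us": lats[len(lats) // 2] if lats else None,
            "fill_rate": len(filled[venue]) / len(ids),
        }
    return report
```

With all of an institution's own event data in one consistent format, the same pattern extends to cancellation rates, markouts, or signaling measures; the hard part today is obtaining the granular data, not the analytics.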
