Towards A Consolidated Tape

As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank, examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets, and that we are not penalised by misguided efforts to force volume onto lit markets, abolish dark pools or curb volume-enhancing activity such as certain HFT strategies. In such a low-volume, low-commission environment I feel the stakes are especially high to get this right from the first day, and not let it drag on into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first was to employ the same model as it did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second was for a prescribed, not-for-profit entity to manage the CT, and the third was a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where Approved Publication Arrangements (APAs) collect and pass on the data to one or more Consolidated Tape Providers (CTPs). But if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far it hasn’t worked: incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR, after all?).
Industry Initiatives
However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues which contribute more than 50% of total turnover. Their solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices detail how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices work also brings together wide representation from across the industry, and has been concentrating on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
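That distinction matters in practice: a consolidator can only merge prints from different venues if every record carries the same harmonised fields. As a purely illustrative sketch (the field names and flag values below are my assumptions, not the FPL or COBA specification), a normalised post-trade record might look like this:

```python
# Illustrative sketch only: the field names and flag values are assumptions,
# not the FPL/COBA specification.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum
from typing import Tuple


class TradeFlag(Enum):
    LIT = "lit"            # on-book trade on a pre-trade transparent venue
    DARK = "dark"          # dark pool / reference-price waiver trade
    OTC = "otc"            # off-venue, bilaterally negotiated trade
    AUCTION = "auction"    # opening or closing auction print
    DEFERRED = "deferred"  # large-in-scale print published with a delay


@dataclass
class NormalisedTrade:
    isin: str                     # instrument identifier
    venue: str                    # MIC of the venue or APA publishing the print
    price: float
    quantity: int
    execution_time: datetime      # when the trade was actually executed
    reporting_time: datetime      # when the print hit the tape
    flags: Tuple[TradeFlag, ...]  # harmonised condition flags across venues

    @property
    def reporting_lag_seconds(self) -> float:
        """Deferred block prints show up here rather than distorting execution time."""
        return (self.reporting_time - self.execution_time).total_seconds()
```

With both timestamps carried separately, a delayed block print can be placed correctly in the tape without polluting intraday volume analysis.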
The COBA Project is spearheaded by two former exchange and MTF executives, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them, and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge), I worry it will ironically signal the failure of the free-market approach and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT, but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to try and come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, which would potentially remove the commercial benefits for exchanges and vendors.
Being a CTP will in itself be a tough business though, and I wonder if there’s such a thing as a commercially viable CTP proposition. Not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT towards taking direct feeds from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15-minute lag and get the data for free.

Pre-Trade BBO
The lack of a pre-trade BBO only lessens the commercial prospects, at least if we look at the US as an example, where SEC-mandated best execution requires referencing the NBBO and the business model links directly to the pre-trade snapshot. I would welcome a US-style model where exchanges provide their data free of charge to a (single) consolidated tape provider in exchange for a cut of the revenue based on use by final clients. In the US, the basic time and sales data is treated almost as a commodity and is non-proprietary (meaning it belongs to all exchanges, rather than a single exchange); exchanges are then paid out of the revenue from subscriptions to the consolidated tape according to an SEC-approved formula, and some of that revenue is paid back to customers. This practice of redistributing CT revenues back to users/clients has arguably helped propagate the rise of HFT though, and European politicians would surely not want to be seen to be encouraging this.
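To make the shape of that arrangement concrete, here is a toy pro-rata allocation; it is emphatically not the SEC’s actual formula (which also weighs quoting activity), and the 10% client rebate and the split by reported trade count are invented purely for illustration. Subscription revenue is pooled, a slice is rebated to clients, and the rest is split across contributing venues.

```python
# Toy illustration of a consolidated-tape revenue pool. The 10% client rebate
# and the pro-rata split by reported trade count are assumptions for the sake
# of the example, not the SEC-approved allocation formula.
def allocate_tape_revenue(subscription_revenue: float,
                          trades_reported: dict,
                          client_rebate_share: float = 0.10) -> dict:
    rebate_pool = subscription_revenue * client_rebate_share
    distributable = subscription_revenue - rebate_pool
    total_trades = sum(trades_reported.values())
    payouts = {venue: distributable * count / total_trades
               for venue, count in trades_reported.items()}
    payouts["client_rebate_pool"] = rebate_pool
    return payouts


# e.g. allocate_tape_revenue(1_000_000, {"ExchangeA": 600_000, "MTF_B": 400_000})
# -> {'ExchangeA': 540000.0, 'MTF_B': 360000.0, 'client_rebate_pool': 100000.0}
```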
Pre-trade data figures heavily in the US model, but it hasn’t so far been needed in Europe because we don’t have the same ‘trade-through’ rule here, but rather a more principles-driven model of ‘best execution’ that is more open to ad-hoc interpretation. So yes, the lack of pre-trade data might make a European tape less profitable, and whilst the US commercial model might be viable in Europe, I don’t think market forces alone will get us there. I would welcome the regulator coming in a bit stronger, just as they did in the US.
As a buy-side trader on a global execution desk, I would welcome proposals that used the US consolidated tape pricing model as a rough benchmark (whilst recognising the much more complex picture over here). I understand that a full set of data costs something like eight times less stateside.

Multi-vendor solution
My gut feeling is that the multiple-operator commercial approach might see the light of day, given recent industry work on data harmonisation and my assumption that the regulator will eventually set clear rules over data ownership and pricing. But with multiple providers comes the possibility that we will be penalised by high implementation costs: just as execution fragmentation required a lot of IT investment by mid-tier brokerages (SOR, etc.), I worry that a multiple-vendor consolidated tape solution will lead to fragmentation of data that will require large IT investment by mid-tier buy-sides (aggregating what, in my mind, should be a commoditised product, in the post-trade space at least). Secondly, I worry that with a purely commercial model we might see the same pitfalls as with execution fragmentation, i.e. an initial scramble to gain market share (with free data from the MTFs, some sort of rebate or other incentivisation), followed by a maturing market with lower rebates, data being charged for by all parties and 90% of the market controlled by two players. This boom and bust cycle might be good long-term for the industry, but in the short term it can end up penalising the buy-side.
If we step back a bit, I wonder whether a commercial model is fundamentally going to work for what will surely be a very difficult business: a potentially persistently low-volume environment, squeezed at both ends (exchanges keeping prices high, and the buy-side not confident enough in the data to pay a premium), with the data free after 15 minutes. I wonder if people will really be falling over themselves to become CTPs.
Whatever model we end up with, I would want to avoid the old model where you pay the exchange to report your trade, and then you pay them again to get the information back! This raises the question of who owns the rights to that data.
Who pays?
Who will pay? Who should pay? Who should be making money off the back of this data, and what margins are acceptable? This point might need a more heavy-handed approach from regulators to gently encourage incumbent exchanges to revise their pricing structure. The incumbent exchanges insist that their data is worth the premium, sometimes pointing to the crucial closing volume prints (essentially still a monopoly business). But there have been calls for regulators to allow MTFs to step in at the closing auction if the primary market cannot fulfil that duty, due to IT failures for example, and I would support this initiative. If this happened, the incumbent exchange would probably find it harder to justify its current pricing model.
On pricing and data ‘ownership’ I feel we need a full-on regulatory approach to ensure that prices don’t just come down marginally, but come down ‘enough’. No business is going to accept a hit to its top line or margins unless it is forced to. I believe the MEP Kay Swinburne has been vocal in doubting whether a purely principles-driven approach is going to be enough, and I support her in this.
The bottom line
I would like to see something that trickles down to benefit the real economy, or end users, and not just boost market share for exchanges, HFT firms and a (potentially) commercial consolidated tape enterprise, or feed any other mutually beneficial arrangements that do not allow for some sort of wealth creation across the industry (whether it be profit sharing or implicit cost savings).
Just as trading fragmentation ultimately helped lower explicit execution costs, I would hope that data fragmentation would do the same, and not just create new industries with little immediate benefit to long-term investors. Especially because market data costs are trickier to pass on to our final private banking clients, and ultimately we want our final clients to benefit from lower overall equity investing costs (execution, clearing and market data).
Something I would also like to see is more transparency on ETFs, which currently feel even further away from getting a consolidated tape, and for which reporting obligations are even laxer than for equities (for example, trades are sometimes double reported, once at creation or redemption of new units and then again if a secondary-market trade is concluded). So this is another area where I would want the regulator to help forge a complete view of trading activity.
Besides data pricing and ownership, a fundamental problem is data quality, including flag harmonisation. Here, I don’t believe we’ll hit a brick wall like we have on pricing, thanks to the industry initiatives mentioned earlier. Ultimately, high quality data (that doesn’t unfairly penalise any part of the investing community, whilst allowing for EC-wide rules on delayed reporting for large blocks) is something I think everyone is looking for. There’s nothing to be lost by having it.
In terms of data quality, I would be happy to see something that helped us form an accurate view of OTC trades. We still can’t accurately spot OTC trades in the market, so how are we supposed to judge whether they are meaningful and make informed trading decisions on the back of them? Will the new OTF classification kill or help OTC/dark pool trading? I welcome the proposal that APAs will harmonise OTC reporting, but I worry that the current high-level principles are too vague and could see us right back in the murky position we find ourselves in today (have you tried to track a mid-cap Swiss stock across lit, dark and OTC recently?), with multiple mechanisms, websites and flags and myriad ways to obfuscate prints that are important to the price-discovery process (it is sometimes very difficult for us to answer even the basic question, ‘how much traded and at what price?’).
So I would like to see very clear identification of whether a trade was lit, dark or OTC, and I would want there to be pan-European standards for delayed reporting. It would be unreasonable to expect all OTC trades to suddenly become on-exchange and reported instantly, but we need to get a handle on what has happened, with an acceptable delay.
Two things might break the price stalemate: MTFs starting to charge for data, and incumbent exchanges unbundling pre- and post-trade data. Chi-X has already started charging and is ostensibly promoting an industry-led commercial route, and incumbent exchanges have started unbundling (although I would like to see further unbundling, of say large cap versus mid-cap, and continuous prints versus auction prints, almost a pay-as-you-go model). The exchanges may well welcome a more prescriptive approach to unbundling recommendations. They are unlikely to do it on their own, and in any case it would be difficult to enforce price caps, but perhaps the regulator could set a ratio of pre-trade to post-trade data prices, and then a ratio of bundled to unbundled data prices.
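A minimal sketch of how such ratio caps could be checked follows, assuming the regulator published a maximum pre-trade to post-trade price ratio and a maximum bundle premium; the figures below are invented purely for illustration, and no such caps exist today.

```python
# Hypothetical ratio caps, invented purely for illustration.
MAX_PRE_TO_POST_RATIO = 3.0  # pre-trade data may cost at most 3x the post-trade feed
MAX_BUNDLE_PREMIUM = 1.0     # the bundle may not cost more than the sum of its parts


def complies_with_caps(pre_price: float, post_price: float, bundle_price: float) -> bool:
    """Check a venue's data fee schedule against the assumed regulator-set ratios."""
    pre_to_post_ok = pre_price <= MAX_PRE_TO_POST_RATIO * post_price
    bundle_ok = bundle_price <= MAX_BUNDLE_PREMIUM * (pre_price + post_price)
    return pre_to_post_ok and bundle_ok
```

The attraction of a ratio rather than an absolute cap is that it leaves headline pricing to the venue while constraining how the pieces are priced relative to each other.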
I appreciate that consistently loss-making businesses, like many of the MTFs, have to make money somewhere in order to stay in business, and that is in all of our interests. At some point, if they don’t turn into standalone going concerns, we’ll be right back stuck with a monopoly exchange offering. But instead of propping the business up with equity data fees, perhaps their future will be ensured by the move into the different asset classes that MiFID II and MiFIR will tackle.
Future improvements?
Radical thinking perhaps, but in order for the CT provider (or providers) to be paid a reasonable amount (and in turn ensure a reasonable income/fee structure across the industry and the survival of exchanges and MTFs), the regulator may need to rethink the 15-minute free-data model (how much demand is there for genuinely real-time post-trade data, after all?). Perhaps instead of just two pricing models (a premium one for no delay and a free one for a 15-minute delay), there could be three: a premium one for live data; an ‘economy premium’ one, where a very low nominal amount would be charged for data with a 5-minute delay, a fee low enough to attract a wider client base than currently (so lower margins, but compensated by more users); and a free model for data that is 30+ minutes old (perhaps an acceptable delay for most private bank users).
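A rough sketch of that three-tier idea follows; the delay thresholds come from the proposal above, while the fee levels themselves are placeholders rather than real prices.

```python
# Sketch of the proposed three-tier model. The delay thresholds follow the text;
# the fee levels are placeholders, not a real price list.
def monthly_fee(delay_minutes: float) -> float:
    if delay_minutes == 0:
        return 100.0  # premium tier: live data (placeholder fee)
    if delay_minutes <= 5:
        return 10.0   # 'economy premium': nominal fee aimed at a wider client base
    if delay_minutes >= 30:
        return 0.0    # free tier: data at least 30 minutes old
    raise ValueError("No tier is defined for delays between 5 and 30 minutes")
```

Whether the thinner margins on the middle tier would be offset by a genuinely wider user base is, of course, the open question for any would-be CTP.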
