Mizuho Securities’ Spyridon Mentzas discusses the status of the Japanese exchange merger and offers thoughts on how well the two systems will merge and the benefits investors can expect.
Compatibility The merger of the Tokyo Stock Exchange (TSE) and Osaka Securities Exchange (OSE) is not yet finalized, but it appears they will merge at the beginning of 2013, with the details yet to be specified. The first impression is that they have nearly identical trading rules with some minor differences, such as the OSE trading until 3:10 pm while the TSE closes at 3:00 pm. When the TSE decided to shorten the lunch break in November, the OSE did the same. When one of the exchanges (usually the TSE) changes the rules, the other moves in tandem: for example, changing the tick sizes. If the merger does go ahead, it is likely that they will use the TSE’s cash system, arrowhead, and the OSE’s J-GATE for derivatives. They will not run the old systems in parallel, which will achieve a reduction in cost because they will not have to maintain two sets of systems.
Further Industry Consolidation The ECNs in the US enjoyed technological superiority over the classic exchanges, where the NYSE’s latency was significantly slower than Arca’s. This would have been reason enough for the TSE to consider buying a PTS, but with arrowhead’s current latency of less than 2 milliseconds (and another upgrade in the next few months targeting less than a millisecond), simply buying a PTS would not give them a noticeable advantage because the TSE and OSE are on par with the PTSs. The reason the PTSs are increasing their market share is that, unlike in the US and Europe, where Reg NMS and MiFID require trading at the venue with the best price, in Japan the PTSs draw volume through decimal pricing and smaller tick sizes than the incumbents.
For example, Mizuho Financial Group might trade on the TSE at 105 yen bid, 106 yen offer. That one-yen spread is close to 100 basis points, or almost one percent, whereas the PTSs quote in 0.1 yen increments. This is a major incentive for investors to buy and sell on the PTSs, whose smaller increments reduce market impact and trading costs. From the beginning, the regulators have not been overly concerned with the PTSs deciding to price in decimals and use 0.1 yen ticks. It was always up to the PTSs to decide, and the TSE could do the same. If anything, I think the new exchange would rather reduce its tick sizes than pursue another merger.
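As a back-of-the-envelope check on this arithmetic (an illustrative sketch, not the author’s own calculation), the quoted spread can be expressed in basis points of the midpoint:

```python
# Quoted spread in basis points of the midpoint; prices are illustrative.
def spread_bps(bid: float, ask: float) -> float:
    mid = (bid + ask) / 2.0
    return (ask - bid) / mid * 10_000

# One-yen tick on the exchange: 105 bid / 106 offer.
print(f"Exchange spread: {spread_bps(105.0, 106.0):.1f} bps")  # ~94.8 bps

# A PTS quoting the same name in 0.1 yen increments.
print(f"PTS spread:      {spread_bps(105.4, 105.5):.1f} bps")  # ~9.5 bps
```

At a 0.1 yen tick, the same name can in principle quote a spread an order of magnitude tighter.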
However, not all participants would be happy to see new tick sizes: some of the proprietary houses, or small firms that trade with retail, would find it costly to alter their downstream systems to handle decimal prices.
This will also create a fragmentation of liquidity across price points. The bids and offers on the TSE are often thick, with something like 50 billion shares sitting on the bid side, so with 0.1 yen ticks the average order size might shrink to 3 million or even 1 million shares. Traders who want to buy a large lot will have to scroll up and down the book to work out how far they must move the price to absorb the available liquidity. For the traditional long-only traders, I think this means an increased scattering of liquidity. There is sufficient liquidity in the market at present, even for stocks trading at a low price; there are market makers trying to make 1% during the day. If smaller tick sizes are introduced, that liquidity will likely be scattered or disappear.
Daiwa Capital Markets’ David deGraw catalogs the movements of Japanese markets in 2011 and discusses the various approaches Japan could take with regard to dark pools and High Frequency Trading (HFT).
Volume and Liquidity in Japan Right now, contagion from Europe and turmoil in the United States have depressed equity transaction volumes across the globe. Once a recovery starts to gain steam, Asia will be the driver of growth and Japan will be a quality play. Given the perennial underweighting of Japan, I expect volumes in Japan will quickly surpass pre-crisis levels in such a scenario. With exchange volumes so low, non-traditional liquidity is playing an increasingly important role. We have seen transaction volumes in our non-displayed liquidity pool, as well as PTS volumes, continue to grow relative to exchange volumes. We are trying to bring the benefits of crossing to as many client types as possible, and our unique position as a principal domestic investment bank enables us to access semi- to non-professional liquidity sources, such as corporate and religious entities, educational endowments, quasi-public institutions, agricultural cooperatives, and retail investors.
Role of PTSs in Japan The role of PTSs has increased steadily since the start of this year, with their market share reaching as high as 7-8%. The success of the SBI Japannext and Chi-X Japan PTSs shows that the market is rewarding the innovation and efficiency created by increased openness and competition. Conversely, the closing of Kabu.com shows that a PTS’s revenue model may not be sustainable over an extended period of low trading volume. It is therefore critical for participants to carefully evaluate the viability of a venue so that the large upfront technology investments are not wasted.
The implementation of centralized clearing through the JSCC was critical in enabling the existing PTSs to rapidly and dramatically expand their share in 2011. Since August, however, growth has slowed somewhat along with the rest of the market. Having said that, there are still very good reasons to expect future growth in PTS market share. Both PTSs are working aggressively to on-board new participants, and Chi-X has recently announced the introduction of liquidity rebates in Japan. Chi-X has a successful record of growing market share in Europe with liquidity rebates, and such economic incentives are sure to be strong drivers of growth in Japan as well. In fact, they should open the door for a totally new class of venue fee arbitrageurs to trade Japanese equities. Furthermore, domestic institutions are expected to allow smart order routing to PTSs once regulations are amended to exempt PTSs from the 5% TOB rule.
In this article, Equiduct Trading’s Joint CEO Artur Fischer argues that in times of extreme structural and economic change, there is an even greater requirement for transparency. He believes that in an increasingly fragmented market, organisations risk effectively trading in the dark, with no clear consolidated view of market pricing, unless they get more help. Here he identifies the growing requirement for a new generation of virtual order book that can consolidate all the visible pre-trade information generated by significant relevant markets, effectively delivering transparency and providing firms with access to a single, unbiased source of pan-European equity price data.
The European equity markets have undergone a period of rapid and unprecedented change over the past two years. While some of these shifts have been driven mainly by the still-evolving global economic situation, leading to the disappearance or restructuring of some of the biggest names in finance, others have centred on newly introduced regulation, with the arrival of new types of execution venues and cross-border clearing venues among the most obvious and significant.
These changes have created some huge challenges (and, it should be said, equally huge opportunities) for market participants, whether they be large broker-dealers having to connect to all the new trading venues in search of liquidity, or a pension fund simply trying to understand what the “Best Execution” it has been promised actually means.
Each of the incumbent exchanges, the new Multilateral Trading Facilities (MTFs), and the growing number of dark pools or crossing networks offers an alternative USP for the execution of equity orders, and each operates with a slightly different business model, both pre- and post-trade. This has understandably stimulated competition for order flow liquidity, introduced alternatives in the post-trade space, and led to a major shake-up in fees. Not surprisingly, it has also irreversibly fragmented liquidity. However, this fragmentation is an evolving process; the picture is far from complete or even stable, and can be expected to go through several consolidation and subsequent fragmentation phases before the next “Big Bang”.
Opening up the European equity markets With new entrants into the execution space, Europe’s equity market is opening up to investors from across the world. FIX-compliant technology is enabling easier connectivity to the new venues and providing an opportunity for a wider range of firms to access venues over and above the incumbents. In Vol 2 Issue 8 (December 2008) of FIXGlobal, John Palazzo of Cheuvreux stated, “FIX affords every broker the ability to get into these markets at an unprecedented pace.” At Equiduct we certainly agree, but there are still some considerable challenges.
How, for example, do “sell-side” firms determine whether they should connect to these new venues? How do they then prioritise which to connect to? How do they choose where to actually send their order? Also, how do “buy-side” investors understand which venues their brokers should be connected to, if those brokers are to deliver the mythical Best Execution? What price should they be using to mark-to-market at the end of each day and for intraday position risk purposes?
At Equiduct, we hope to provide some of the answers to these important questions: to shed some light on the situation and show how to achieve best execution across the various available platforms with a range of analytical tools. Uniquely, the toolset includes a pan-European aggregated feed.
Ensuring execution on the most appropriate platform Firms across the trading spectrum, whether small or large, are increasingly using sophisticated smart order routing solutions and algorithmic trading systems to “slice” orders and determine how to distribute the pieces across the dark pools, MTFs and exchanges. However, for these systems, and indeed an individual trader, to predict the future effectively, it is important to understand the present and the past. Information providers such as Markit, or Fidessa with its Fragmentation Index, can confirm the common knowledge that liquidity fragmentation is a reality once a trade has been executed. What they cannot do is show how the market should have performed, by examining the pre-trade order and price information that was available at the time of the trade.
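To make the mechanics concrete, here is a deliberately naive sketch of the slicing step (hypothetical venue names and a simple proportional-split policy; production routers also weigh fees, latency and historical fill rates):

```python
# Naive slicing sketch: split a parent order across venues in proportion
# to the size displayed at the best offer on each. Figures are invented.
def slice_order(parent_qty: int, displayed: dict[str, int]) -> dict[str, int]:
    total = sum(displayed.values())
    slices = {v: parent_qty * size // total for v, size in displayed.items()}
    # Give any rounding remainder to the deepest venue.
    deepest = max(displayed, key=displayed.get)
    slices[deepest] += parent_qty - sum(slices.values())
    return slices

displayed_size = {"Primary": 12_000, "MTF-A": 7_000, "DarkPool-B": 4_000}
print(slice_order(10_000, displayed_size))
# {'Primary': 5218, 'MTF-A': 3043, 'DarkPool-B': 1739}
```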
At Equiduct we have been collating all visible pre-trade information (Level II data) for the top 700 shares across Belgium, France, Germany, the Netherlands and the UK from the major European venues (BATS, Chi-X, NYSE Euronext, London Stock Exchange, Nasdaq OMX, Turquoise, Xetra) since April 2008. Yes, a significant percentage of order flow has moved away from the incumbent exchanges, but what is not such common knowledge is that trades are still not always executed on the most appropriate platform. Indeed, our analysis shows that in April 2009 a significant proportion of trades executed on the incumbent exchanges should have been transacted on an alternative venue, and approximately 35% of executed trades are still not transacted on the best-price venue. Significant price improvement could have been achieved had this happened. (See Diagram 1.)
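The check underlying such analysis can be sketched simply (illustrative quotes and venue names, not Equiduct’s actual data or methodology): compare each executed trade against the consolidated best price available at that moment.

```python
# Was a buy executed at the best available price across venues?
# Quotes are a snapshot of each venue's best bid/ask at trade time (invented).
quotes = {
    "LSE":       {"bid": 10.02, "ask": 10.05},
    "Chi-X":     {"bid": 10.03, "ask": 10.04},
    "Turquoise": {"bid": 10.01, "ask": 10.05},
}

best_venue = min(quotes, key=lambda v: quotes[v]["ask"])
best_ask = quotes[best_venue]["ask"]

trade_price = 10.05   # a buy executed on the incumbent
if trade_price > best_ask:
    print(f"Missed {trade_price - best_ask:.2f} per share vs {best_venue}")
```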
The Oxford English Dictionary defines 'change' as the “move from one system or situation to another”. It is a word, Toby Corballis, CEO of Rapid Addition, believes, that all too easily describes the financial turmoil of the past six months. With a particular focus on EMEA, Corballis examines these challenges and the associated risks facing firms and software vendors across the financial marketplace over the coming 12 months.
Given the continuing volatility of global markets and the large-scale movement of people, data and systems, change looks set to be the order of the day. Or, in the words of Robert Zimmerman, 'The Times They Are a-Changin''. In the world of electronic trading, as in many others, 'movement begets change and change begets risk'. Over the coming year, the industry looks set to face a series of challenges, changes and risks.
Change #1: Consolidation of software
With apologies to any egg-sucking grannies: once upon a time, to attract order flow, the sell-side gave away various software packages to the buy-side. It was the same kind of logic applied by proponents of the ubiquitous loyalty card, and buy-side firms saved money, as they didn't have to pay for their software upfront. In reality, costs were partially hidden in sell-side fees and partially realised through the inconvenience of having to use a multitude of different solutions that did more or less the same thing but connected to different counterparties.
This may seem trivial, but it represents a seismic shift in the way systems make it to market. Money for these systems will no longer flow from the sell-side to the software vendors; the relationship is now owned by the buy-side. No one wants to pay for something that used to be free, so it is unsurprising that there are mutterings in the corridors. Expect those mutterings to reach a crescendo as many existing 'free' contracts expire, creating a drive to consolidate on as few solutions as possible. This consolidation carries a number of risks for all sides of the financial Rubik's Cube.
Buying software requires an understanding of the software procurement process, a process that is itself currently subject to change. Questions like “how long has the vendor been trading?” are likely to matter less than “who is the ultimate parent?” and “how solvent are they?”. Ownership is likely to say much about the chances of the software being able to support your requirements later on. Where safety in numbers was once seen as good (vendors with more clients had less chance of going under), it doesn't work so well if you're in a long queue of creditors to a failed business.
Knowing who the ultimate parent of a company is gives other insights. Do the company's principals really understand your industry? (They need to if you want some assurance that your system will change to keep up with the times.) Do they have a reputation for innovation? What is their commitment to research and development?
Change #2: Increased regulation as a by-product of political oratory
Politicians love to claim credit, deflect blame, and be perceived as tough guys. A quick scan of the local media is enough to see that the current political culture is very much one of blame. Fallout from political rhetoric tends to materialise in legislation, which almost always beats the drum for “greater transparency and accountability”. What is actually meant, of course, is greater auditability: the ability to recreate a moment in time so as to prove that the course of action taken was fair and in the best interest of the client. This is all good, and there are many ways to achieve it; however, it would be hard to argue that computer records are anything other than the most accessible of these.
Imagine trying to model all of the Credit Default Swap (CDS) contracts into which Lehman Brothers entered. One problem is that a CDS could be created in so many ways: by phone, instant messenger, and so on. Finding and recreating all of these would be a nightmare. That regulators have an appetite for making things more auditable is evident in recent speeches by Charlie McCreevy, the European Commissioner, who has been looking to introduce legislation to move more CDS trading on-exchange.
If more assets are traded on-exchange, that implies that more software connections are required to link venues with participants. That, in turn, implies more trading platforms and more capacity among participants to handle increased transaction volumes.
BMO Capital Markets’ Andrew Karsgaard outlines the new regulations regarding dark liquidity in Canada and how firms can use them to their advantage.
Canadian market participants are bracing themselves for another year of significant change. While exchanges themselves consolidate, liquidity continues to fragment across venues. Regulatory proposals on dark liquidity are being considered, and new entrants, both lit and dark, wait in the wings for the right moment to set up shop. In this constantly changing environment, the tools available to traders for accessing and analysing liquidity across markets have become critical to their success.
Regulators Looking at Dark Liquidity
In November 2010, the Canadian Securities Administrators (CSA) and the Investment Industry Regulatory Organization of Canada (IIROC) issued a Position Paper containing proposals on the subject of dark pools and dark liquidity. The proposals are summarised here, but let us focus briefly on how the debate around dark liquidity is evolving in Canada.
The regulators’ proposals were prefaced by the statement that “in order to facilitate the price discovery process, orders entered on a marketplace should generally be transparent to the public...” This seemingly innocuous, perhaps unarguable, assertion has in fact prompted considerable debate in the market structure blogosphere. Critics of the proposals argue that there is no evidence of damage to the price discovery process in markets where dark liquidity exists. They argue that transparency should not be an end in itself, as the true objective is best execution. Non-transparent ways of trading have existed forever because they provide an important way to minimise market impact when executing large orders.
Many go further, arguing that the insistence on transparency is actually damaging, as it has created a network of continuous, linked auction markets that are susceptible to gaming and therefore represent toxic pools of liquidity. By placing restrictions on dark liquidity, regulators are potentially forcing investors to participate in these pools. Canadian regulators have a history of pro-actively analysing and responding to market structure changes. Their rules concerning multiple markets were ready before the first lit ATS began operations, and they are the only regulators in the world who are active, direct members of FIX Protocol Limited, contributing to the creation and maintenance of standards in electronic trading. Generally speaking, they are engaged and well informed. In this case, however, in getting ahead of the game on dark liquidity, they risk throwing the baby out with the bathwater.
Our uniquely Canadian broker preferencing and on-exchange crossing systems, along with the TSX’s market-on-close facility, create hybrid forms of grey liquidity, where size is not exposed to the glare of the continuous market, but where price formation and discovery still occur. Broker preferencing takes internalisation, which is completely dark, and displays it, every trade, on a public venue. This contributes to price formation to a much greater degree than the broker-run dark pools in the US, which are not obliged to publish trades unless they reach a certain size. Dark liquidity is not the problem in Canada. We currently have a single dark pool, but we have some interesting methods of merging lit and dark liquidity that could act as models for other countries. Is it possible that these proposals focus on the symptoms rather than the disease?
New Entrants Waiting in the Wings
Awaiting the outcome of the consultation period and the final CSA/IIROC view on dark orders are a number of potential new entrants to the Canadian market, as well as a number of new facilities being offered by existing market operators. Alpha, a well-established ATS owned by a consortium of large dealers, submitted proposals to the regulators for its Intraspread order type (essentially a broker internalisation facility) in the second half of last year. Around the same time, TMX submitted an application for its own non-displayed order types, including a non-displayed midpoint order and a non-displayed limit order. MatchNow, currently Canada’s only electronic dark pool, proposed an addition to its existing pool: an “internalise only” order type.
The Vienna and Ljubljana Stock Exchanges’ comprehensive FIX upgrades reinforce the continuing global trend of reliance on FIX over alternative proprietary technologies. Annie Walsh, Chief Marketing Officer for CameronTec, examines the case for FIX.
2010 was a pivotal year that saw many local exchanges fending off new, unfamiliar competition in what, for many regions, have traditionally been non-competitive marketplaces. With competition continuing to intensify and once-cozy monopolies progressively being dismantled, the stakes have rarely been higher. If regional consolidation was 2010’s buzz, then 2011 will be characterized by structural reform, increased central clearing and the emergence of a host of new players ushered in by Dodd-Frank and its EU legislative equivalents.
The paradigm shift has been good for FIX. Looking back, it was the emerging new trading venues that first demonstrated considerable appetite for FIX. Their motivation was driven by an acknowledgement that FIX could provide ease of entry into markets and a competitive edge for attracting liquidity away from the traditional exchanges. Exchanges can no longer operate in isolation within segregated vertical markets. Their consolidation through mergers and the emergence of alternative trading venues have escalated the importance of technology around the trading lifecycle. The latest figures indicate just how much liquidity the new marketplaces are attracting: volumes for BATS, Chi-X, Pure Trading and Alpha ATS, to name a few, show significant levels of liquidity shared across a broad number of different venues.
Field-leveling regulations such as MiFID and Reg NMS have also put the spotlight on technology, with a growing focus on reducing latency that has implicitly changed the exchange business model. Competition between exchanges is about performance and cost, with the highest-performing and lowest-cost marketplaces attracting the most liquidity. Now, more than ever, the need for a more uniform API has become a critical consideration, and this is one area where investments in technology are being made. Exchanges today recognise the considerable benefits of a FIX-compliant exchange interface on a number of fronts.
The FIX Protocol is increasingly providing a level playing field for many market participants, while encouraging exchanges to differentiate across more value-added service areas, such as Straight Through Processing (STP), latency, trading platforms and strategies, corporate services and increased data offerings. FIX enables exchanges to take advantage of economies of scale and provide broader access, as well as to generate the additional revenues the business requires.
The Vienna and Ljubljana exchanges are two marketplaces within the CEE Stock Exchange Group (CEESEG) now using a cutting-edge FIX API for trading access, order routing and market data. CEESEG’s decision to offer this to members is a response to participant support for the protocol over any proprietary alternative. On the business side, leveraging FIX across CEESEG’s exchange members provides a more flexible and cost-efficient solution.
The appetite for the fastest possible interface will always be present, and FPL’s continuous, collaborative work with the exchange community, evidenced by a number of working groups, has resulted in improved latency, making FIX messages more suitable for high-speed trading. Through FIX 5.0, for example, exchange clients will find it easier to implement a more flexible, faster connection to the exchange. Exchanges are increasingly taking advantage of the latest advancements in FIX and using it to establish points-of-presence in major liquidity venues worldwide, thereby providing local connectivity for local customers, which in itself significantly reduces cross-regional connectivity costs.
The cost of defining a new protocol is considerable, and these costs continue for the lifetime of the product: every new software release must include additional regression tests based on very demanding performance tests, to ensure that the performance gained is real and lasting. The benefits are intrinsically about the additional revenue-bearing flow the exchange attracts, whether from competitors as a result of the speed offered or from new flow created by that speed. The cost calculation for customers is also important. The customer will be looking for measurable benefits and/or lower costs. If the binary interface uses FIX, the cost of developing and using the interface may be lower. If the data types are common with FIX, integration with existing OMS systems may be easier.
Pantor Engineering's Rolf Andersson examines the data behind the speed of FIX for low latency market data, settling the question of whether FIX is fast enough.
The FIX Protocol, as well as the software implementing it (aka “FIX engines”), has from time to time been said to offer performance too low for use where very high throughput and/or very low latency is required. The performance characteristics of the protocol and of engine implementations have been discussed in previous articles (Kevin Houston’s article “?” and John Cameron’s “Evolution of the FIX Engine”, Vol 2, Issue 7, September 2008). Granted, there are slow implementations in use, and the classic FIX tag=value syntax is too verbose for some use cases, e.g. market data.
Recent developments within FPL such as the release of the FAST Protocol and efforts within the FIX 5.0 usage sub-committee to support alternative recovery state models will enable FIX to be used in high throughput, low latency scenarios. This article reviews the various sources of latency and demonstrates that a FIX over FAST implementation can be used in place of a proprietary protocol to provide very high performance and low latency.
An overview of latency sources
There are a number of latency sources that contribute to the total latency for producing, transferring and consuming market data. The impact of different sources varies widely between implementations. The following sources will be discussed:
Message processing overhead – the encoding and decoding between the transfer format and the internal representation suitable for the processing required in a specific application, as well as safe-storing messages to support recovery of lost messages;
Communication processing overhead – the network processing associated with sending and receiving messages;
Scheduling delay – the delay in reacting to a request to send a message, or to a notification that a message has been received;
Transfer delay – the time elapsed between the start and the end of a packet containing one or more messages;
Propagation delay – a function of the physical distance between the two communicating parties and the speed of light in the medium used to communicate (see the worked example after this list).
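Of these, propagation delay is the easiest to quantify. A back-of-the-envelope sketch, assuming light travels at roughly 200,000 km/s in optical fibre (about two-thirds of its speed in a vacuum):

```python
# Rough one-way propagation delay over optical fibre.
SPEED_IN_FIBRE_KM_PER_S = 200_000  # assumed, ~2/3 of c

def propagation_delay_ms(distance_km: float) -> float:
    return distance_km / SPEED_IN_FIBRE_KM_PER_S * 1_000

for route_km in (10, 400, 6_000):  # metro, regional, transoceanic
    print(f"{route_km:>5} km: {propagation_delay_ms(route_km):.2f} ms one way")
# 10 km: 0.05 ms, 400 km: 2.00 ms, 6000 km: 30.00 ms
```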
These latency sources are to a large extent similar in behavior irrespective of the choice of external protocol, but there are differences as discussed below.
Message processing overhead
The overhead of processing a traditional FIX message is inflated by a number of factors (a concrete illustration follows this list):
Message content – FIX messages contain a host of information for the benefit of different communicating parties.
Message format – the FIX message format contains redundant information about the message structure. Text is used to represent all data.
Recovery semantics – the FIX recovery mechanism is based on message sequencing per session and a contract that a receiver can re-request messages.
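To make the verbosity point concrete, consider an illustrative market data entry in classic tag=value form. The field tags are standard FIX tags, but the symbol and values are invented, and '|' stands in for the SOH (0x01) delimiter:

```python
# An illustrative market data increment in tag=value form.
# 35=MsgType (X = incremental refresh), 268=NoMDEntries,
# 269=MDEntryType (0 = bid), 55=Symbol, 270=MDEntryPx, 271=MDEntrySize.
msg = "35=X|268=1|269=0|55=VOD.L|270=141.35|271=25000|"
print(len(msg), "bytes of text")  # 47 bytes, before session-layer fields

# The same economic content as fixed-width binary fields
# (1-byte side, 4-byte symbol id, 8-byte price, 4-byte size) is 17 bytes.
```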
Communication processing overhead
The communication overhead depends on the number and size of network packets. Each transferred packet incurs some processing both at the sending and receiving end. One or more messages are transferred in each network packet. Smaller messages mean that less data has to be copied and that more messages can be transferred in each network packet.
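FAST addresses exactly this overhead. At its core is a stop-bit integer encoding: seven data bits per byte, with the high bit set on the final byte, so small values and small deltas occupy only a byte or two. A minimal sketch of the idea, not a complete FAST codec:

```python
# FAST-style stop-bit encoding for unsigned integers: seven data bits
# per byte, high bit marks the last byte of the field.
def encode_stop_bit(value: int) -> bytes:
    chunks = [value & 0x7F]
    value >>= 7
    while value:
        chunks.append(value & 0x7F)
        value >>= 7
    chunks.reverse()
    chunks[-1] |= 0x80              # stop bit on the final byte
    return bytes(chunks)

print(encode_stop_bit(100).hex())     # 'e4'     (1 byte)
print(encode_stop_bit(25_000).hex())  # '0143a8' (3 bytes vs 5 text digits)
```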
Recent predictions from industry groups suggest US equity option quote volumes will nearly double over the next twelve months, severely straining the technology fabric that underpins the industry’s quoting, trading and risk management systems. We see particular vulnerabilities in processes and systems that require a direct, unthrottled options market data feed, such as smart routing engines, algorithmic trading engines and portfolio risk systems, as well as in the infrastructure that supports them.
Options market data from the Options Price Reporting Authority (OPRA) has been increasing steadily, at a current annual rate of about 40%. We currently see around 1.3 million messages per second, with a recent high-water mark of almost 1.5 million messages per second in December. The Financial Information Forum (FIF) is projecting growth to 1.8 million messages per second over the next twelve months. The projection does not take into account the ramp-up of new exchanges or any further expansion of the Penny Pilot, and we know that as these changes take full effect, growth will only accelerate. We expect that with the new symbology, clients will be able to execute against more strikes in all underliers. The above projections also do not take into account the added granularity in strikes, which we anticipate will lead to more products filling our screens, adding to the technology crunch.
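The projected figure is consistent with simply compounding today’s peak rate at the stated growth rate (a back-of-the-envelope check, not FIF’s model):

```python
# ~1.3M messages/second growing at ~40% a year.
current_peak = 1.3e6
print(f"{current_peak * 1.40:,.0f} messages/second in 12 months")  # 1,820,000
```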
Given the recent OCC symbology changes, new and existing exchanges will be fighting for order flow with new products and different business models. Some new models include hybrids of payment for order flow and maker/taker pricing for certain names. Some exchanges are trying to attract new business by introducing non-standard options that allow clients to trade options in new ways, such as binary options. More products on more exchanges will increase the need for better, more efficient technologies; the technical hurdles to navigating this business will likely only get higher as we move forward.
Firms’ needs vary from order routing through an EMS, some of which showcase advanced options analytics, to proprietary technology systems that require enormous technical horsepower to consume the ever-growing OPRA market data feed. These systems pipe through options algorithmic orders that spawn hundreds of child orders, or run advanced volatility quoting strategies.
As needs grow, it will become increasingly important for firms with trading needs to partner with vendors and broker-dealers who have developed specialties in these areas and who understand how to efficiently handle the sheer mass of messages now being sent, in terms of both orders and market data. The underlying technology has become so specialized that it is no longer a matter of throwing money at a software or hardware problem, but of finding the best combination of hardware and software.
It will also be important for all firms involved to smartly throttle the feed to processes that do not need every tick, to ensure that subsystems are not oversaturated. We expect that almost every existing process that consumes OPRA market data will need to be bolstered or re-engineered. There has also been a recent push towards publishing the depth of the options market, provided as direct feeds from the exchanges, to trading front ends and algorithmic engines. The thinking is that with the proliferation of pennies, the current OPRA feeds, which reflect only the top of book at each exchange, are less useful when trying to identify liquidity for larger block executions. Besides providing more clarity into the book, direct feeds also tend to be faster than the feeds through OPRA. Tools designed to obtain blocks in the electronic markets will become important when chasing institutional, larger block trades. There is also some thought that using depth of book to derive analytics will give customers clarity into where they may get filled.
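One common way to throttle such a feed is conflation: keep only the newest update per instrument and publish on a fixed timer, so subsystems that do not need every tick see a bounded message rate. A hypothetical sketch:

```python
import time

# Conflating throttle: retains only the latest quote per symbol and
# flushes a snapshot at a fixed interval; intermediate ticks are dropped.
class ConflatingThrottle:
    def __init__(self, interval_s: float = 0.25):
        self.interval_s = interval_s
        self.latest: dict[str, tuple] = {}
        self.next_flush = time.monotonic() + interval_s

    def on_update(self, symbol: str, quote: tuple) -> None:
        self.latest[symbol] = quote  # older ticks are overwritten

    def poll(self, publish) -> None:
        # Called from the feed loop; publishes at most once per interval.
        if time.monotonic() >= self.next_flush and self.latest:
            for symbol, quote in self.latest.items():
                publish(symbol, quote)
            self.latest.clear()
            self.next_flush = time.monotonic() + self.interval_s
```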
With wider use and availability of depth of book, we expect to see development in pre-trade execution analytics for clients who need more liquidity than is published at the top of book. Customers, in turn, can get a sense of the average price they are likely to achieve if they sweep the book. Over time, this should increase confidence in the likelihood of filling larger block orders electronically, which should in turn draw the chunkier flow that is important to this business. We expect that if market depth becomes important for execution, it will only multiply the resources needed to handle the complete options market data feed.
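That sweep analytic reduces to a depth-weighted average: walk the offered levels, best price first, until the target size is filled. A simplified sketch with invented levels:

```python
# Expected average fill price for sweeping `target` contracts through
# the offered side of a depth-of-book feed. Levels are illustrative.
def sweep_average_price(levels, target):
    """levels: list of (price, size), sorted best price first."""
    filled, cost = 0, 0.0
    for price, size in levels:
        take = min(size, target - filled)
        filled += take
        cost += take * price
        if filled == target:
            return cost / filled
    raise ValueError("insufficient displayed liquidity")

offers = [(2.05, 50), (2.06, 120), (2.08, 300)]
print(f"{sweep_average_price(offers, 200):.4f}")  # 2.0605
```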
The European equity trading landscape has undergone a period of rapid change since the implementation of MiFID (Markets in Financial Instruments Directive) in November 2007. The new regulation shook up the financial industry in an unprecedented way, mainly through the abolition of the concentration rule and its determination to make Best Execution the ultimate guarantee of protection for the investor. Although MiFID has not been successful in every respect, it certainly succeeded in bringing a competitive edge to European equity trading, resulting in a drop in trading costs for brokers. On the flipside, the market has become more fragmented, and the complexity and cost of providing best execution have emerged as important factors.
Breaking barriers to entry
As new trading venues have launched, the main challenge faced by market participants has been connecting to the new platforms. The use of FIX Protocol standards, making it easier and cheaper to connect to an exchange, was one of the key elements in the success of the Multilateral Trading Facilities (MTFs). The reality now is that stocks can be, and actually are, traded on several different venues: traditional exchanges, MTFs, Systematic Internalisers and dark pools. The main European indexes are now traded, on average, on more than five visible venues. Furthermore, to attract liquidity, the new players offer an innovative and simpler fee structure: no membership fees, no market data fees and attractive trading fees. Posting liquidity is rebated, while removing liquidity from MTFs remains less expensive than on traditional exchanges.
With reduced costs, new liquidity available and frequently tighter spreads, MTFs have all the assets needed to attract brokers. And it works: on the main European indexes, traditional exchanges have lost up to 44% of market share.
Of course, traditional exchanges launched their own MTFs and dark pools and lowered their fees to compete with alternative platforms, but by and large these have not enjoyed as much success as the new entrants.
Despite the drop in connection and membership costs, trading on alternative platforms generates new costs: the cost of physical connectivity, of course, but also unexpected costs arising from the increasing complexity of trading.
Increasing complexity of trading
To have a full picture of the market, brokers now need to connect to more venues, and the connectivity costs involved are one of the barriers to entry. Not to mention that the sustainability of these models is still to be demonstrated: brokers are reluctant to invest money in these solutions without knowing whether they will see any return on investment. In addition, even when connected to several venues, brokers have to set up efficient Smart Order Routing (SOR) systems to make the best of the opportunities brought by fragmentation.
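Because rebates and take fees now differ across venues, an efficient router must compare fee-adjusted prices rather than raw quotes. A minimal sketch with invented venues and fee rates:

```python
# Fee-adjusted venue selection for a marketable buy order: rank venues
# by quoted ask plus per-share take fee (negative fee = rebate).
venues = [
    {"name": "Primary", "ask": 10.05, "take_fee": 0.0030},
    {"name": "MTF-A",   "ask": 10.05, "take_fee": 0.0015},
    {"name": "MTF-B",   "ask": 10.06, "take_fee": -0.0005},  # inverted pricing
]

def effective_cost(venue):
    return venue["ask"] + venue["take_fee"]

best = min(venues, key=effective_cost)
print(best["name"], f"{effective_cost(best):.4f}")  # MTF-A 10.0515
```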
In this article, Nomura’s Ben Springett provides a brief overview of some of the key issues currently impacting European market structure, and shares his own thoughts on some of the changes likely to occur in Europe this year.
European market structure has been, is, and will continue to be in a state of change for the foreseeable future. Whilst European Commission regulation has been a significant catalyst for this, the industry itself is now looking to progress issues faster than the expected pace of regulatory change. As such, we are seeing increased interest in “self” regulation within the community, particularly in the areas of post-trade reporting and efforts to provide a consolidated tape. All market participants are active in this, but it is not unreasonable to assume that it will fall to the broker-dealers to drive any change, as they are typically the ones with the resources to invest in the process.
Market share amongst trading venues can be measured in many different ways, and people can be forgiven for choosing the one that paints their own venue in the best light. The accompanying two charts (Charts 1 and 2) show the steady decline of market share amongst the key primary exchanges, to the benefit of the MTF venues, although total volume levels remain significantly lower than in the pre-credit-crunch days. When comparing primary exchange volumes with MTFs, it is necessary to bear in mind that the primaries are only just starting to compete in each other’s markets, and as such the pan-European MTFs have had more blue-chip names with which to capture their market share. This is set to change in 2010: Euronext launched ARCA last year, Xetra has launched its International Market (XIM), and the London Stock Exchange (LSE) has just completed the acquisition of a majority (51%) stake in Turquoise.
MiFID did not mandate a market-wide consolidated tape, as opposed to the NBBO (National Best Bid and Offer) provided under Reg NMS, and the lack thereof is one of the key concerns raised by the buy-side in a range of forums. There is, however, no significant issue with data aggregation, which is offered by a number of key providers such as Bloomberg and Reuters, in addition to some strong fragmentation analysis products available to the market (Fidessa Fragulator, BATS Europe).
In a period when the cost base is under increasing pressure, attention has now been drawn to the inherent monopoly conditions that exist in market data (the LSE has sole distribution rights over LSE data, Deutsche Boerse over Deutsche Boerse data, and so on), and as the number of venues from which data is required increases, so will the scrutiny placed on the associated charges. Even in an environment with considerable focus on competition, competitive forces cannot work to reduce these fees, leaving regulation as the only option; this, again, was addressed under Reg NMS in the US.