CameronTec conducted a worldwide CameronFIX user survey last month to capture the latest industry opinion concerning the adoption and broader usage of the FIX Protocol; movement in FIX volumes since the financial crisis hit; and the greatest challenges currently facing FIX, its users and FPL itself. Annie Walsh, Chief Marketing Officer for CameronTec, sums up the findings while drawing on industry participants for further comment. Respondents to the survey represent both the buy- and sell-sides, across large and smaller firms, as well as exchanges and Independent Software Vendors (ISVs). In terms of geographic location, respondents cover every major financial precinct.
For many firms where technology is a key market differentiator on the trading desk, the need to maintain a competitive system with all the requisite functionality has, importantly, included FIX.
Over 75% of FIX users reported an increase in FIX volumes, with FIX usage considerably greater than before the financial crisis hit in 2008. FIX investment has remained stable, with 60% of firms reporting no downsizing of their FIX team over the same period.
While overall FIX usage is up worldwide, many firms did however rein in investment in new FIX infrastructure, translating to strong stability for existing FIX frameworks. Both the versions of FIX used (72% are still using FIX 4.2) and the age of their infrastructure (four years on average) indicate that firms preferred to stick with existing frameworks over anything new during the crisis, but they are now shifting gears as they look forward.
“Performance is always on the radar, but in harder times project prioritisation is more of a factor for long-established firms,” notes Max Colas, Chief Product Officer of CameronTec. “At the same time, the emergence of high-frequency trading has driven stronger demands on FIX engine performance, pushing the vendors to continuously deliver.”
Adds Colas: “Long-established firms have looked at recent innovations and performance upgrades in FIX engines with mixed appetites, but survey responses indicate the need for upgrades is brewing with many looking to the future and making plans with partners that will still be there tomorrow.”
There was an overwhelming consensus among respondents on the need to explore ways to reduce latency within FIX, although this enthusiasm was also met with some reservations over whether such efforts could, in reality, materialise into a tangible outcome.
“The major challenge for big financial houses, brokers and trading firms is to achieve ultra-low latency for inbound and outbound FIX messages in direct market access flow,” says Erika Bajer-Jurkovic at Deutsche Bank. “It is very important to accurately measure and present these latencies in real time and flag delays. An easy-to-follow, unambiguous latency-measurement standard on top of the FIX Protocol could introduce and shape new metrics for the business to consider, especially in the area of algorithmic trading.”
Commenting on buy-side firms driving FIX latency requirements, Jet Tek’s CEO Greg Orsini says, “While there has long been pressure from buy-side firms to continually shrink market data and transaction latency, there has been no standard way to measure and communicate total latency as well as all of the constituent point-to-point latencies. Recently, the FIX Interparty Latency Working Group has been tasked with defining FIX extensions to formalise how these discrete latencies can be presented. This will provide sell-side firms the opportunity to distinguish themselves from their peers by providing new information, which their customer can use to investigate network elements, make real-time routing decisions and eliminate slow liquidity providers.”
Pantor Engineering's Rolf Andersson examines the data behind the speed of FIX for low latency market data, settling the question of whether FIX is fast enough.
The FIX Protocol, as well as software implementing the protocol (aka “FIX engines”), has from time to time been said to perform too poorly for use where very high throughput and/or very low latency is required. Performance characteristics of the protocol, as well as of engine implementations, have been discussed in previous articles (Kevin Houston’s article “?” and John Cameron’s “Evolution of the FIX Engine”, Vol 2, Issue 7, September 2008). Granted, there are slow implementations in use, and the classic FIX tag=value syntax is too verbose for some use cases, e.g. market data.
Recent developments within FPL such as the release of the FAST Protocol and efforts within the FIX 5.0 usage sub-committee to support alternative recovery state models will enable FIX to be used in high throughput, low latency scenarios. This article reviews the various sources of latency and demonstrates that a FIX over FAST implementation can be used in place of a proprietary protocol to provide very high performance and low latency.
An overview of latency sources
There are a number of latency sources that contribute to the total latency for producing, transferring and consuming market data. The impact of different sources varies widely between implementations. The following sources will be discussed:
Message processing overhead – the encoding and decoding between transfer format and the internal representation suitable for the processing required in a specific application, as well as safe-storing messages to support recovery of lost messages;
Communication processing overhead – the network processing associated with sending and receiving messages;
Scheduling delay – the delay in reacting to a request to send a message, or to a notification that a message has been received;
Transfer delay – the time elapsed between the start and the end of a packet containing one or more messages;
Propagation delay – a function of the physical distance between two communicating parties and the speed of light in the medium used to communicate.
These latency sources are to a large extent similar in behavior irrespective of the choice of external protocol, but there are differences as discussed below.
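As a rough illustration of the last two sources, a back-of-the-envelope calculation can show how transfer and propagation delay scale. The figures below are illustrative assumptions, not measurements from any particular deployment:

```python
# Back-of-the-envelope estimates for transfer and propagation delay.
# Packet size, line rate and distance are illustrative assumptions.

def transfer_delay_us(packet_bytes: int, line_rate_bps: float) -> float:
    """Time between the first and last bit of a packet on the wire."""
    return packet_bytes * 8 / line_rate_bps * 1e6

def propagation_delay_us(distance_km: float, medium_factor: float = 0.67) -> float:
    """Delay from physical distance; light in fibre travels at roughly 2/3 of c."""
    c_km_per_s = 299_792.458
    return distance_km / (c_km_per_s * medium_factor) * 1e6

# A 500-byte packet on a 1 Gb/s link takes 4 microseconds to serialise...
print(transfer_delay_us(500, 1e9))          # -> 4.0
# ...while 100 km of fibre adds roughly 500 microseconds one way.
print(round(propagation_delay_us(100.0)))   # -> 498
```

The arithmetic makes the point that, past a certain distance, propagation delay dwarfs everything a protocol or engine implementation can influence.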
Message processing overhead
The overhead of processing a traditional FIX message is negatively affected by a number of aspects:
Message content – FIX messages contain a host of information for the benefit of different communicating parties.
Message format – the FIX message format contains redundant information about the message structure. Text is used to represent all data.
Recovery semantics – the FIX recovery mechanism is based on message sequencing per session and a contract that a receiver can re-request messages.
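The format overhead is easy to see in a minimal sketch of classic tag=value encoding and decoding. The tag numbers below follow the standard FIX field dictionary, but the message itself and the helper functions are invented for illustration; a real engine does considerably more (checksums, headers, validation):

```python
# Minimal sketch of classic FIX tag=value encoding/decoding, showing
# where message-processing overhead comes from: every field is text,
# tags are repeated in every message, and numeric values must be
# converted to and from strings on each hop.
SOH = "\x01"  # the FIX field delimiter

def encode(fields: list[tuple[int, str]]) -> str:
    return SOH.join(f"{tag}={value}" for tag, value in fields) + SOH

def decode(raw: str) -> dict[int, str]:
    pairs = (f.split("=", 1) for f in raw.split(SOH) if f)
    return {int(tag): value for tag, value in pairs}

# An invented (and heavily abbreviated) new-order message:
msg = encode([(35, "D"), (55, "IBM"), (54, "1"), (38, "100"), (44, "91.25")])
assert decode(msg)[55] == "IBM"
# The price 91.25 costs 9 bytes as "44=91.25" plus its delimiter,
# against 4-8 bytes as a raw binary number at an implicit position.
print(len(msg))  # -> 33
```

FAST attacks exactly this cost: templates remove the repeated structure, and field encoding operators remove the text representation.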
Communication processing overhead
The communication overhead depends on the number and size of network packets. Each transferred packet incurs some processing both at the sending and receiving end. One or more messages are transferred in each network packet. Smaller messages mean that less data has to be copied and that more messages can be transferred in each network packet.
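The batching arithmetic behind this point can be sketched in a few lines. The message sizes and the 1460-byte payload assumption (a typical Ethernet TCP payload) are illustrative, not measured:

```python
# Illustrative packet-count arithmetic: with per-packet processing cost
# roughly fixed, smaller messages let each network packet carry more of
# them, amortising that cost. All sizes here are assumptions.
import math

def packets_needed(n_messages: int, msg_bytes: int, payload_bytes: int = 1460) -> int:
    per_packet = max(1, payload_bytes // msg_bytes)
    return math.ceil(n_messages / per_packet)

# 10,000 classic tag=value FIX messages at ~200 bytes each:
print(packets_needed(10_000, 200))  # -> 1429
# The same flow compressed to ~40 bytes per message (e.g. via FAST):
print(packets_needed(10_000, 40))   # -> 278
```

Fewer packets means fewer send/receive operations and less data copied at both ends, which is where the communication overhead saving comes from.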
Mihai Bistriteanu of Citigroup Global Markets Japan focuses on the trading pattern changes post-arrowhead implementation as well as its many opportunities and challenges.
Without a doubt, the Tokyo Stock Exchange's (TSE) implementation of “arrowhead” in January 2010 was one of the most important events in the history of Japan’s equity trading, with much information and many articles circulating on the structural changes following the new system’s implementation. It is indeed amazing how arrowhead’s implementation changed the very core of Japan’s trading landscape, making a huge impact on latency, market volume, trade size, price, as well as tick size dynamics.
Primary exchange speed was a long-awaited feature in Japan. The single-digit millisecond turnaround on a non-co-located infrastructure, promised by the Tokyo Stock Exchange (TSE), is now finally happening in Japan, with a few interesting trends starting to develop as a result of the higher speed:
Strategies requiring low latency infrastructure can now be easily implemented across Japanese stocks.
There is an ease of integration of TSE flow within SOR (Smart Order Routing) systems and crossing engines.
One of the key drivers of arrowhead implementation is the aggressive growth of competition in Japan. The PTSs (Proprietary Trading Systems) and broker dark pools had the speed, and in some cases, price as competitive advantages. The liquidity available outside the primary exchange is still small compared to the ratios seen in Europe and the US.
A large percentage of the sophisticated buy-side investors refrained from using SOR technologies to avoid missing liquidity in the primary exchange because of the overall latency.
A standard matching system, with queue jumping functionality when posting liquidity, usually places multiple legs of the same order in multiple venues. When an order is matched in the proprietary crossing engine, the system sends cancellation requests to the other venues, and only when the acknowledgement is received is the cross executed.
The bottleneck for this technology used to be the speed of receiving the acknowledgement from the primary exchange once the cross was found. Similarly, SOR technologies send IOC (Immediate or Cancel) orders to multiple venues, including primary exchanges. When one of the venues is slow, it delays the entire system, which then misses rapid price changes.
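The gating effect described above is simple to model: the cross can only complete after cancel acknowledgements arrive from every venue holding a leg, so the slowest venue sets the pace. The venue names and round-trip times below are invented for illustration:

```python
# Sketch of the crossing-engine bottleneck: the cross completes only
# when the slowest venue's cancel acknowledgement arrives, so that
# venue gates the whole system. All timings are invented assumptions.
def cross_completion_ms(venue_cancel_ack_ms: dict[str, float]) -> float:
    """Time until the last cancel acknowledgement is received."""
    return max(venue_cancel_ack_ms.values())

# Before arrowhead: a slow primary exchange dominates the cross.
print(cross_completion_ms({"primary": 2000.0, "pts_a": 20.0, "pts_b": 15.0}))  # -> 2000.0
# After arrowhead: with a fast primary, the bottleneck disappears.
print(cross_completion_ms({"primary": 5.0, "pts_a": 20.0, "pts_b": 15.0}))     # -> 20.0
```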
The fact that the TSE infrastructure speed is now in-line with its competitors will have a positive effect in widening the use of liquidity aggregation tools. This also may eventually result in an increase in liquidity on the alternative liquidity pools. The new broker pools and PTSs have to be competitive on all parameters (speed, price and liquidity) to enter and be successful in the Japanese market now.
Market Volume and Median Trade Size
Following the implementation, the volume growth started gradually. At this very moment, as I am writing this article, the rate of growth is still very positive. We expect this to reach saturation over the next couple of months. The main drivers behind the growth are the new strategies that take advantage of the high speed environment, as well as the general increase in activity at the beginning of the year and the market recovery.
Ned Phillips, CEO at Chi-East, elucidates on how despite the lack of a regional trading framework, non-displayed venues are gaining momentum in delivering benefits to Asian investors.
Trading in Europe and North America is now faster and cheaper than ever before. Under the umbrella of regional regulations such as MiFID and Reg NMS, the proliferation of technology has revolutionized the ways trades are executed. An endless choice of non-displayed venues (NDVs, also commonly referred to as dark pools), lit pools and traditional exchanges have clamored to offer traders simplified connectivity, lower latency and better execution.
Although lacking the number of choices as their Western counterparts, Asian investors are not being left behind in the technological race to provide better trading execution and lower costs. Despite the disjointed nature of the Asian trading environment, and perceived barriers such as multiple regulatory and settlement systems, NDVs still allow Asian investors to aggressively pursue the same benefits as those enjoyed by their American and European counterparts.
Asia is an expensive place to trade, especially in comparison with the US and Europe. Large spreads have lowered liquidity in many parts of Asia, which has made life difficult for many investors, especially algorithmic and high frequency traders, by restricting them to more liquid and well-known stocks. For these investors, NDVs are alternate means for them to access both lower transaction costs and higher liquidity, allowing them to undertake more diverse trading strategies.
NDVs do this by helping the market achieve its real purpose – efficiency. Liquidity pools allow traders to reduce spreads by meeting each other halfway between a bid and an offer (mid-point pricing), rather than forcing one party to give in and meet the spread. This has enabled investors to seek best execution and capture significant savings with every trade.
In turn, lower spreads are also bringing liquidity back to lesser-traded stocks, thereby further increasing trading volumes and the overall strength of the market place.
Low latency and low impact transactions
NDVs also provide investors with a fast, and most importantly, anonymous pool of liquidity on which to trade large blocks of securities without risking price movements against them. This feature is particularly attractive to brokers, who are coming under increasing pressure from algorithmic and high-frequency traders to provide discreet, fast and low-cost ways of trading Asian securities.
Lower trading costs
Commission payments still make up a large part of trading costs in Asia, while trading costs charged by exchanges remain relatively high (see side-table). NDVs are already contributing to the reduction of these costs. A report by Greenwich shows commission payments in Asia ex-Japan fell sharply in 2009, as fund managers switched to electronic trading options.
Traditional exchanges have also been forced to respond to the price challenge imposed by alternative trading platforms. Already, venues such as Hong Kong Stock Exchange are reducing their fees for certain listed products. Despite this, Asia-based NDVs continue to offer trading fees which are lower than traditional trading venues, allowing brokers and fund managers to pass on further savings to investors.
It’s really happening now. The machine/black-box trading boom has arrived in Eastern Europe, says Andrea Ferancova, Partner and Director of Capital Markets, WOOD & Company.
After the successful adoption of electronic trading technology in developed markets, traders are now increasingly using algorithmic strategies to achieve their goals in more exotic markets. This trend is growing rapidly, with new players coming to the market every day in an attempt to grab some market share and make the best of the opportunities available. Thanks to technology and its active engagement by local players, it is now easy to connect and easy to trade in Eastern Europe.
So much has changed since that first electronic trade in Eastern Europe by one foreign institution! Coming to trading technology only recently (after the communist era) had its own advantage, as all the exchanges were founded on an electronic base, but there have nonetheless been multiple barriers to reaching the markets directly - local legislation, closed-end technology and protectionism, to name a few.
Just a few years ago, exchanges existed without any interface to allow connectivity from external systems, and orders had to be retyped manually. On those exchanges where some connectivity was possible, algorithmic processing was forbidden by law, and the frequent modification of order parameters was something that regulatory bodies viewed as market manipulation. Latency was measured in seconds, and the number of orders one could process was limited to between tens and hundreds a day.
Apart from the trading obstacles, until recently it was also very difficult or expensive to settle. With rules that vary from exchange to exchange, unconventional order types and behavior quite different from that of the developed markets, the region has earned its reputation as the “Wild East.”
It took some time and considerable effort to navigate a common and more reasonable course, but it has paid off in terms of market accessibility. Though there is lots of room for improvement, simple trading solutions can be found on most markets. Offering Direct Market Access (DMA) to Eastern Europe has been very challenging for us for many years, due to the non-existence of a system that would cover all or at least a majority of the markets.
Vendors were not keen on investing in developing products for closed markets due to extremely limited demand, and those who tried discovered it was difficult to sell such a product, hindered as they were by either a lack of documentation or language barriers.
Local vendors supported only local exchanges and offered only closed-end applications; no one had heard of FIX or any other connectivity option either. It was also difficult to find vendors delivering options at a reasonable price, which makes sense when you take into account the limited volume and money that could be generated on any one market. Since the markets were very difficult to reach from outside, it was hard to become a member locally, and almost impossible to become a remote member. With the market limited to only a handful of prospective customers, it simply did not seem reasonable for system vendors to offer a “bigger” solution.

So, to offer a particular market for trading, you had to either run a local solution and commission an expensive, tailor-made integration with the rest of your market infrastructure; get a buggy system from a renowned vendor (if any existed); or build it completely yourself. Local vendors had little motivation to create and offer solutions for regional markets apart from their own, while the big “global” vendors offered (and still offer) only incomplete “alpha state” solutions. Getting those to work proved almost impossible and required huge investment, especially as no single vendor offered solutions for many markets, making it necessary to integrate the systems of various vendors.
Aite Group’s Sang Lee argues that while naked sponsored access is causing concern among market participants, regulation alone cannot remove all systemic risk.
After years in obscurity, sponsored access emerged as a regulatory hot button issue in early 2009. More recently, it seems to have fallen off the regulatory radar screen, upstaged by high frequency trading, co-location and dark pools. Nevertheless, any future regulatory discussion regarding high frequency trading cannot take place without addressing the issues around sponsored access, and especially around the unfortunately named “naked” sponsored access.
One of our December 2009 reports titled ‘Land of Sponsored Access: Where the Naked Need Not Apply’, defines the sponsored access market, and provides estimates of sponsored access penetration of the U.S. equities market. This report also provides predictions on potential regulatory changes and possible impact on the overall evolution of the U.S. equities market. But, let’s start at the very beginning.
Sponsored access has many different meanings for market participants, and, while widely talked about, it is often misunderstood. The origin of sponsored access can be traced back to the practice of direct market access (DMA), in which a broker, who is a member of an exchange, provides its market participant identification (MPID) and exchange connectivity infrastructure to a customer interested in sending orders directly to the exchange. In this way, the broker has full control over the customer flow, including pre- and post-trade compliance and reporting. The DMA customer, in turn, gains direct access to major market centers. While firms opt to go through a sponsored access arrangement for many different reasons, reduction in latency is one of the main factors. Other, more basic reasons include additional revenue opportunities and hitting volume discounts.
There has been a lot of focus on the need for ongoing latency reduction to gain a competitive edge. Breaking down the key sponsored access infrastructure components, network connectivity typically accounts for a significant portion, with an average of 450 microseconds. Exchange gateways add another 85 microseconds, and the industry average for a typical set of pre-trade risk checks is approximately 125 microseconds, with individual risk checks averaging anywhere from five to ten microseconds.
Latency levels across the three often-used types of market access (traditional DMA service; co-located, filtered sponsored access; and unfiltered sponsored access) vary widely, leading to a potential competitive edge for those firms able to achieve an ultra-low latency trading infrastructure. For traditional DMA services, the industry average currently ranges from four to eight milliseconds. For co-located, filtered sponsored access, the latency level dips into microseconds, ranging from 550 to 750 microseconds. Unfiltered sponsored access, not surprisingly, has the lowest range of latency, at 250 to 350 microseconds.
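Composing the component figures quoted above into an end-to-end estimate is straightforward arithmetic, and the result lands inside the quoted range for filtered sponsored access. This is a sketch using only the article's industry averages; real deployments will differ:

```python
# Composing the article's quoted component averages (in microseconds)
# into a rough end-to-end figure for co-located, filtered sponsored access.
components_us = {
    "network_connectivity": 450,   # average quoted in the text
    "exchange_gateway": 85,        # average quoted in the text
    "pre_trade_risk_checks": 125,  # average quoted in the text
}
total = sum(components_us.values())
print(total)  # -> 660, inside the quoted 550-750 microsecond range
```

Dropping the risk-check component, as unfiltered access effectively does, takes the same sum to 535 microseconds, which is consistent with the lower 250-350 range quoted once faster connectivity is also assumed.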
Challenges of Sponsored Access
Of course, sponsored access also has specific risks and challenges for participating parties as well as for the market overall. These include:
Supporting non-filtered sponsored access can lead to sponsored participants taking unacceptable levels of risk, which can cause both great financial burden and reputational damage to the sponsoring broker.
In order to support non-filtered sponsored access, sponsoring brokers must develop strong risk management and due diligence teams capable of handling the credit and operational risk of sponsored participants.
Broker-to-broker sponsored access can lead to a situation in which the sponsoring broker loses track of the activities of the sponsored broker’s customer.
Providing filtered sponsored access often leads to a higher pricing point for sponsored participants, leading to favorable competitive conditions for those brokers offering unfiltered sponsored access.
While the likelihood is slim, there is a chance that a rogue sponsored participant could increase overall systemic risk.
Like a sprinter racing for a gold medal, a trader’s ability to take advantage of latency is influenced not only by their skill as a trader, but also by the tools at their disposal, including hardware and software architecture, network topology and physical distance from execution venues, writes BT Radianz Services’ Matthew Lempriere.
With the 2012 Olympics in sight, all eyes will turn to London (especially those of the BT team, as the official communications partner of the Olympics and Paralympics), where we will watch world class athletes participate in tough and demanding events where a split second can make the difference between winning and losing. But what we should remember is that the competitive edge that an athlete gains over their rivals does not just come down to the individual themselves, but the infrastructure that they have around them; such as kit, training conditions and diet. A combination of these things will ultimately come together and be the key to beating the competition and winning gold.
The same is true in the world of electronic trading. The innovative brains behind the most cutting-edge instruments or complex trade strategies on today’s trading floors would be incapable of making their concepts a reality without the modern technological infrastructure built around them. Where low latency was once an exclusive playground for arbitrage specialists and market makers, it has now become mainstream and is no longer confined to specialist trading systems, but is a requirement for all advanced trading strategies.
As well as coming into the mainstream, increasingly innovative trading strategies and their reliance on the latest, most advanced technology, such as algorithmic and quantitative trading, have pushed the issues of automation, latency and risk management technology further up the agenda. Latency is critical to taking advantage of swings in price and increases in order flow, and it is grabbing board-level attention, with senior latency-related positions being created within financial institutions to ensure sustained focus on trading desks’ latency capabilities and on exploiting low latency to remain competitive.

The rise of complex financial instruments has done much to propel reliance on technology amongst both buy-side and sell-side organisations, to such an extent that the traditional division between the front and back office, in terms of their relative importance to the trade process, has been almost eradicated. Complex instruments, evolving investment vehicles, regulation and increased investor sophistication have caused the gap between front and back office functions to narrow considerably. Technology, from the front office through to the back office, must move in tandem to enable the industry to sustain growth and innovation.
In Europe, the trading landscape has undoubtedly changed in recent years, with technological innovation and regulatory change – with initiatives such as the Markets in Financial Instruments Directive (MiFID) – providing the catalyst for a rise in alternative trading venues and off-exchange order books. These alternative trading venues have been responsible for growth in the number of trades, as large block orders may be sliced into smaller bundles of trades to obscure them from the market. In Asia, we have a variety of different markets, each with their own set of rules and regulations, so the technology must be able to react and interface with multiple venues. Both network and server technologies have had to provide the capacity for financial institutions to scale and take advantage of these increased volumes.
In response to this changing landscape the buy-side has needed to expand their use of algorithmic trading, direct market access and smart order routing technology to take advantage of liquidity fragmentation and alternative trading venues – the by-product of a post-MiFID world. The take-up of FIX standards by the buy- and sell-side in recent years has also stimulated the blossoming of electronic trading systems and algorithmic trading strategies.
What key features do successful exchanges share that encourage liquidity; how automated trading drives growth and why markets will attract incremental liquidity with the advent of global CSAs. Robert Barnes, Managing Director, Equities of UBS Investment Bank explains.
The execution arms race continues. The prize is order flow that concentrates to those most capable, particularly in navigating market structures.
Market structures comprise the rules and institutions that determine competition and the framework of interaction, including Exchange fees, which ultimately shape order execution strategies. The focus includes external factors that impact business and operating models, driving opportunities to grow revenues and reduce costs.
Exchanges rebuilding liquidity is a priority market-wide theme in 2010 in the context of competition, transparency, and investor choice at trading and clearing layers. From a User perspective, we wish to work in a spirit of partnership with Exchanges and Regulators to promote liquidity and new business, and we thank the Authorities as they provide a framework within which we can behave as entrepreneurs.
Macro trends include a rising number of trades, coincident with automated electronic trading. Regulation promotes competition, transparency and investor protection. This leads to a better result for clients via competitive execution policies. Competition, and thus fragmentation, makes the world more complex. Not all brokers, however, can keep up with the technological arms race. Direct Execution models of electronic trading are evolving to address this. Latency reduction is increasingly sought for competitive advantage.
There is increasing awareness of a positive dynamic involving non-displayed pools and high frequency trading. The key insights are that markets allowing discretionary non-displayed broker crossing processes and non-discretionary dark pools effectively speed net liquidity onto order books. The benefits are lower market impact, greater efficiency, and a better result for end investors.
These benefits multiply if statistical traders are active. When orderbook liquidity increases, so too does the proportion of trading opportunities; and these stimulate further orders to the orderbook from automated strategies. This incremental liquidity, aggressive and passive, narrows spreads.
The world’s markets are split into those that support and benefit from high levels of automation, and those with the opportunity to encourage more. Investors’ current focus includes global macro trends and emerging markets, which means that moving toward more consistent electronic access models will help markets take advantage of this burgeoning liquidity. A good start is to implement and enhance FIX specifications to offer advanced electronic flexibility. This adoption of standardisation can aid emerging markets in growing their scale of business.
One of the more “seismic” changes to Equity markets in recent years is the proliferation of commission unbundling and Commission Sharing Agreements, “CSAs”, or Client Commission Agreements,“CCAs”, in the USA. Initiated by UK regulators in 2006, this commission unbundling initiative spread across Europe (at the end of 2007) with the arrival of the Markets in Financial Instruments Directive, or “MiFID.” Global clients, preferring one consistent process world-wide, have led the demand for CSAs to become a market convention. With many CSAs established on a global basis, it can be easier than ever before for a newly automated market joining a broker’s network to attract liquidity.
High Frequency Trading (HFT) is creating waves the world over, and Asia is no exception. With the challenges HFT presents being highlighted by many, and the benefits it offers markets being stressed by many others, Ronald Gould, Chief Executive Officer, Asia Pacific, Chi-X Global, focuses on its likely prospects in the enticing markets of China.
Developments in market structure generally occur in conjunction with three things. First, such developments require a receptive regulatory environment, one that permits change and encourages innovation. The second requirement is trading venue technology, generally coupled with the existence of more than one trading venue. Trading technology availability is improving but it would be wrong to suggest that markets everywhere are equal in this respect. Finally, there needs to be a user environment that is supportive of market change and willing to help drive innovation. These ingredients are observable in varying measure across markets in Asia, some at the forefront of change and others warily fighting against it. In most places in Asia, the current status of market change is in limbo as a result of hesitance on the part of one party or another to begin the process. Our own experience indicates that Japan, Singapore and Australia are leaders in this change while others are either cautious or opposed.
If we survey the market structure scene in Asia, some places present a more ambiguous picture than others. One of the most intriguing and perplexing pictures is China, a market filled with potential, intriguing to investors, clearly interested in innovation and change but whose plans and objectives are often opaque to the outside world. Given China’s growing importance to investors, an effort to make the picture of change clearer must be a useful one. First, it is helpful to establish a baseline from which to start, a description of the situation today.
According to figures published at the end of 2009 by the World Federation of Exchanges (“WFE”), China is now the world’s second biggest equities market based on total market capitalization. Free float is much smaller, however, as both State and corporate holdings are still substantial. The market is heavily dominated by retail investors, of whom there are more than 50 million with active accounts and more arriving daily. Institutional investing is at an earlier stage, although mutual funds have grown rapidly and Exchange Traded Funds (ETFs) represent a major growth area as well. Because many of China’s largest companies were previously state owned, with shareholding still tightly controlled, turnover rates among institutional investors are very low. While a block trading facility exists in Shanghai, it is not as yet widely used and probably needs upgrading if it is to attract a greater audience among investors. The new generation trading system for the Shanghai Exchange was launched in December 2009, purchased from Deutsche Boerse and adapted over several years, with the help of Accenture, for rather different market conditions. Latency is not something that gets much scrutiny by investors in Shanghai, as it is not a characteristic as yet highly valued. The question we confront today is whether we are at an inflection point for change in China, a point at which the instinct for innovation begins to drive a greater openness. Let’s look at the evidence.
Kevin McPartland, Senior Analyst, TABB Group, explains how exchanges are being transformed today, to the point where the very definition of an “Exchange” is changing.
High frequency traders are not the only ones trying to get faster. The last few years have seen exchanges enter an arms race for speed that rivals the most sophisticated trading shops in the world. The focus on reducing latency and increasing bandwidth is so extreme that we are watching the definition of “Exchange” transform right before our eyes. Not because physical trading floors in city centers have been replaced with massive data centers in out-of-the-way industrial areas, but because the exchange business model has fundamentally changed from one that is transaction-based to one that is technology-driven. The reasons why are quite simple. Execution fees have been driven down by competition, largely brought on by field-leveling regulations (read: Reg NMS and MiFID) that enabled competition and in turn made technology the real differentiator. The exchanges are desperate to both retain and attract more liquidity, but with execution fees often below zero and market monopolies consigned to history, only a serious investment in technology will ensure survival through the next decade, and such investments are certainly being made.
The most serious technology investments have been made by the world’s largest equity exchanges. NYSE Euronext purchased Wombat and NYFIX, among others, to create the newly branded NYSE Technologies; NASDAQ merged with OMX to create an exchange technology provider with global reach; the London Stock Exchange (LSE) recently purchased MillenniumIT to rebuild its matching engine and serve as its technology arm; and Deutsche Boerse has long been a technology provider in its own right. Chi-X, the largest multilateral trading facility (MTF), has a separate technology arm in the form of Chi-Tech. Most recently, the Tokyo Stock Exchange (TSE) launched its long-awaited Arrowhead platform with hopes of entering the low latency trading world. CME Group, BATS, and numerous others have also invested heavily in ensuring they have the latest and greatest technology. Some are working to maintain their dominant market position, while others are continuing their quest to take liquidity from the incumbents.
Exchange differentiation under the new paradigm is not easy. Twenty years ago NYSE traded NYSE-listed securities and OTC securities were traded via NASDAQ; shares in UK-based companies were traded on the LSE, shares in German companies were traded on Deutsche Boerse, and so on. Now, especially for US and European equities, shares of anything can be traded virtually anywhere. I’m over-simplifying of course, but with globalization flattening the world of stock exchanges, regulations keeping everyone on a level playing field, and everyone measuring speed in microseconds, the only obvious differentiator left is the name of the venue. Even if we assume traders naturally migrate to where liquidity is deep and spreads are narrow, only through an understanding of exchange technology can one rationalize what causes that situation to occur.
It is a poorly kept secret that high frequency trading firms and proprietary trading desks at investment banks co-locate to shave off microseconds. This practice is at the heart of exchange-client connectivity. These high speed orders are generated within the servers of proprietary trading desks and hedge funds, and are sent via a high speed network into the exchange’s matching engine, all literally residing under one roof. This practice generates the majority of order volume in the US and increasingly in Europe.
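The co-located path described above, orders generated on a firm’s servers and crossed in the exchange’s matching engine under one roof, can be sketched as a toy model. The sketch below implements a minimal price-time-priority matching engine; the class name, method names, and matching rules are illustrative assumptions for this article, not any real exchange’s interface.

```python
# Toy sketch of the co-location flow described above: orders "arrive"
# at the matching engine as direct function calls, since strategy and
# engine sit in the same data center. Illustrative only.
import heapq
import itertools


class MatchingEngine:
    """Minimal price-time-priority matching engine (illustrative)."""

    def __init__(self):
        self._seq = itertools.count()  # sequence number = time priority
        self._bids = []                # max-heap via negated price
        self._asks = []                # min-heap by price
        self.trades = []               # list of (price, quantity) fills

    def submit(self, side, price, qty):
        book, opp = ((self._bids, self._asks) if side == "buy"
                     else (self._asks, self._bids))
        # Cross against resting orders on the opposite side while
        # the incoming price overlaps the best resting price.
        while qty > 0 and opp:
            key, seq, rest_price, rest_qty = opp[0]
            crosses = (price >= rest_price) if side == "buy" else (price <= rest_price)
            if not crosses:
                break
            fill = min(qty, rest_qty)
            self.trades.append((rest_price, fill))
            qty -= fill
            if fill == rest_qty:
                heapq.heappop(opp)
            else:
                # Partial fill: key and seq are unchanged, so replacing
                # the top element in place preserves the heap invariant.
                opp[0] = (key, seq, rest_price, rest_qty - fill)
        # Any residual quantity rests on the book at its price level.
        if qty > 0:
            key = -price if side == "buy" else price
            heapq.heappush(book, (key, next(self._seq), price, qty))


engine = MatchingEngine()
engine.submit("sell", 100.5, 200)   # resting ask
engine.submit("sell", 100.6, 100)   # resting ask, worse price
engine.submit("buy", 100.6, 250)    # incoming order sweeps both levels
print(engine.trades)                # fills at 100.5 first, then 100.6
```

The point of the toy is the architecture, not the matching logic: when the `submit` call and the heaps live in the same process, and by analogy the same data center, the dominant latency is the code path itself, which is why co-location and engine speed matter so much to the firms described above.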
Large agency orders from traditional buy-side sources are also important to an exchange’s success, but it is more often the job of the broker to ensure connectivity to the data center for their client flow. Simply put, the sell-side handles inter-data center connectivity and the exchanges handle intra-data center connectivity. TABB Group estimates that North American spending on market connectivity sits at just over $2 billion annually, with 70% of that number coming from the sell-side.