Feargal O’Sullivan and Jamie Hill of NYSE Technologies discuss OpenMAMA, the open source Middleware Agnostic Messaging API they hope will expedite innovation in services, reduce vendor lock-in and minimize implementation time and cost.
Solving a Problem
Choosing a market data vendor because of their API alone is not sound practice. Yet the industry has long struggled to come up with a standard way of accessing market data that would let clients select a vendor for any range of reasons other than the API that vendor happens to offer. Something that should be low on any decision-making tree has unfortunately tended to matter far more than it should. There are a number of consolidated market data vendors, including obvious names like Thomson Reuters and Bloomberg, and there is also a range of direct feed or ticker plant vendors, where instead of going to a consolidator, feeds are accessed directly from an individual exchange.
In selecting a vendor, users must write all their code to suit that vendor’s particular way of accessing the data. Changing to a different vendor means opening up the source code and altering everything to match how the new vendor wants the market data accessed. With a consolidated feed for broad international access and a direct feed for low latency algo trading in US equities, for example, many users have to write to two to four different APIs. This has been a significant problem for the industry, and with OpenMAMA we are trying to drive the industry towards a standard.
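To make this concrete, here is a minimal sketch in Java of what a middleware-agnostic layer looks like. It is illustrative only, not actual OpenMAMA code, and every name in it is an assumption: the point is that the application codes against one interface while vendor-specific bridges plug in behind it, so switching vendors becomes a configuration change rather than a rewrite.

// Illustrative sketch only -- not the real OpenMAMA API. It shows the idea
// behind a middleware-agnostic layer: the application depends on a single
// interface, and vendor-specific "bridges" are loaded behind it.

interface TickListener {
    // Callback the application implements once, regardless of vendor.
    void onTick(String symbol, double bid, double ask);
}

interface MarketDataBridge {
    // The one API the application codes against.
    void connect(String transportName);
    void subscribe(String symbol, TickListener listener);
}

// One bridge per vendor; swapping vendors means swapping this class
// (or a config entry that selects it), not rewriting the application.
class VendorABridge implements MarketDataBridge {
    public void connect(String transportName) { /* vendor A session setup */ }
    public void subscribe(String symbol, TickListener l) { /* map onto vendor A's API */ }
}

public class ConsumerApp {
    public static void main(String[] args) {
        // Bridge selection is a configuration decision, not a code change.
        MarketDataBridge bridge = new VendorABridge();
        bridge.connect("sub_transport");
        bridge.subscribe("IBM.N", (symbol, bid, ask) ->
                System.out.printf("%s %.2f/%.2f%n", symbol, bid, ask));
    }
}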
User Base
This API is an eight-year-old standard that was initially developed by NYSE Technologies as the Middleware Agnostic Messaging API (MAMA), and it is quite heavily deployed in the financial services industry; close to 200 clients already use this API in their custom applications, so today it has an established installed base. We have opened that up and made it a standard by taking the source code for the APIs these firms use today and providing it to The Linux Foundation, which will physically host the code as a neutral body.
During this process we worked with multiple parties that would not ordinarily use our API. Since the launch of OpenMAMA on 31 October 2011, one of the key factors in it being taken seriously as an open standard has been getting the right level of adoption. Before we launched, we approached a number of customers, other vendors and competitors, from whom we secured our launch partners: J.P. Morgan, Bank of America Merrill Lynch, Exegy, Fixnetix and EMC. These launch partners, along with NYSE Technologies, formed a steering committee to drive the direction and future of OpenMAMA.
From that point forward, each of the organizations on that committee has had a stake in OpenMAMA. The API is open source under the LGPL 2.1 license, so it is now owned by the open source community. With participation from Interactive Data, Dealing Object Technologies and TS-Associates as well, we now have a group ten strong, a global mix drawn from different industries. Whereas before the API was driven largely by NYSE Technologies and our commercial use cases, it is now being driven forward as an industry standard. The more people who adopt and participate, the greater the likelihood of achieving that.
At the FIXGlobal Face2Face Forum in Seoul, Korean firms announced the formation of a FIX working group and the Korea Exchange’s intention to build an ultra low latency trading platform.
The opening speaker at the FIXGlobal Face2Face Forum Korea was keenly anticipated by the 200+ delegates (a quarter of whom came from the buy-side and a third from the sell-side), as he raised many of the issues that surround the HFT arena but are rarely touched on at industry events in Korea. By placing HFT in context, Edgar Perez, author of the recently published “The Speed Traders”, highlighted many of the opportunities and challenges that markets around the world face in the low latency trading environment. Not least, he pointed out the colossal task, and associated technology costs, facing regulators just to monitor high-frequency trading post-trade, let alone in real time.
A recurring theme throughout the day, latency was covered by most of the presentations, especially in the context of FIX. Deutsche Börse’s Hanno Klein and NYSE Technologies Asia Pacific CEO Daniel Burgin stressed that FIX standards are quite at home in the low latency environment, with exchanges around the world already using FIX for their low latency systems. As Mr. Burgin pointed out, “FIX is not slow, but through poor implementation, it can be made slow – and this has happened in various markets”. These comments rang true with the attendees, especially as Mr. Kyung Yoon, Division Head of the Financial Investment IT Division at KOSCOM, outlined plans not only to implement the latest version of FIX at the Korea Exchange, but also to make speeds as low as 70 microseconds the benchmark when the new exchange system is rolled out in 2013. As the icing on the cake, Mr. Yoon then expressed KOSCOM’s commitment to helping establish a FIX liaison group in Korea that will ensure a highly ‘standard’ implementation of the FIX Protocol.
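Mr. Burgin’s point is easy to illustrate. The Java sketch below is our own illustration, not any particular engine’s code: both methods extract OrderQty (tag 38) from the same FIX message, but the first allocates a string per field on every message, while the second scans the raw bytes in place. The protocol is identical; only the implementation quality differs.

public class FixParsing {
    private static final byte SOH = 0x01; // FIX field delimiter

    // Naive: allocates an array plus a String per field, on every message.
    static long naiveQty(String msg) {
        for (String field : msg.split("\u0001"))
            if (field.startsWith("38=")) return Long.parseLong(field.substring(3));
        return -1;
    }

    // Allocation-free: walk the bytes once and parse the digits in place.
    static long fastQty(byte[] msg) {
        for (int i = 0; i + 2 < msg.length; i++) {
            boolean atFieldStart = (i == 0 || msg[i - 1] == SOH);
            if (atFieldStart && msg[i] == '3' && msg[i + 1] == '8' && msg[i + 2] == '=') {
                long qty = 0;
                for (int j = i + 3; j < msg.length && msg[j] != SOH; j++)
                    qty = qty * 10 + (msg[j] - '0');
                return qty;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        String msg = "35=D\u000138=500\u0001"; // fragment of a new order, qty 500
        System.out.println(naiveQty(msg));           // prints 500
        System.out.println(fastQty(msg.getBytes())); // prints 500, without per-field garbage
    }
}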
MC for the day, FIXGlobal’s Edward Mangles (also FPL Asia Pacific Regional Director), welcomed the announcement, stating that he and the FPL Asia Pacific group looked forward to working more closely with KOSCOM, KRX and the Korean trading community as a whole. With delegates staying put to hear the bilingual presentations and discussions throughout the day (a few afternoon speakers actually commented that the crowd in the room was unusually large for the final sessions), the updates on algorithmic trading (Josephine Kim, BAML) and TCA (Ofir Geffin, ITG) provoked a number of follow-up questions and discussions, indicating the delegates’ appetite for these issues.
The FPL Americas Electronic Trading Conference is always a year-end highlight for those in electronic trading, and this year was no exception. Sara Brady, Program Manager, FPL Americas Conference, Jordan & Jordan, thanks all the sponsors, exhibitors and speakers who made this year’s conference a huge success.
The 6th Annual FPL Americas Electronic Trading Conference took place at the New York Marriott Marquis in Times Square on November 4th and 5th, 2009. John Goeller, Co-Chair of the FPL Americas Regional Committee, aptly set the tone for the event in his opening remarks: “We’ve lived through a number of challenging times… and we still have quite a bit of change in front of us.” After a difficult year marked by economic turmoil, the remarkable turnout at the event was proof that the industry is back on its feet and ready to move forward with the changes to the electronic trading space set forth in 2009.
Market Structure and Liquidity
Two topics clearly stood out as key issues that colored many of the discussions at the conference: regulatory impact on the industry, and market structure as influenced by liquidity and high frequency trading. An overview of industry trends demonstrated that the current challenges facing the marketplace are dominated by these two elements. Market players are still trying to digest the events of 2008 and early 2009, adjusting to the new landscape and assessing the changing pockets of liquidity amidst constrained resources and regulatory scrutiny. The consistent prescription for dealing with this confluence of events is to take things slowly and understand any proposed changes holistically before acting on them and encountering unintended consequences.
The need for a prudent approach towards change and reform was expressed by many panelists, including Owain Self of UBS. According to Self, “Everyone talks about reform. I think ‘reform’ may be the wrong word. Reform would imply that everything is now bad, but I think that we’re looking at a marketplace which has worked extremely efficiently over this period.”
What the industry needs is not an overhaul but perhaps more of a fine-tuning. Liquidity is one such area that needs carefully considered fine-tuning. Any impulsive regulatory change to a pool of liquidity could negatively impact the industry. The problem is not necessarily with how liquidity is accessed, but with the lack of liquidity that produced the downward price movements that marked a nightmarish 2008. Regulations against dark liquidity and the threshold for display sizes are important issues requiring serious discussion.
Rather than moving forward with regulatory measures that may sound politically correct, there needs to be a better understanding of why this liquidity is trading dark. While there is encouraging dialogue occurring between industry players and regulatory bodies, two things are for sure. We can be certain that the evolution of new liquidity venues is evidence that the old market was not working and that participants are actively seeking new venues. We can also be assured that the market as a messaging mechanism will continue to be as compelling a force as it has been over the last two decades.
Risk
One of the messages that the market seems to be sending is that sponsored access, particularly naked access, is an undesirable practice. Presenting the broker-dealer perspective on the issue, Rishi Nangalia of Goldman Sachs noted that while many agree that naked sponsored access is not a desirable practice, it still occurs within the industry. A panel on systemic risk and sponsored access identified four types of the latter: naked access, exchange sponsored access, sponsored access via broker-managed risk systems (also referred to as SDMA or enhanced DMA) and broker-to-broker sponsored access.
According to the U.S. Securities and Exchange Commission (SEC), the commission’s agenda includes a look specifically into the practice of naked access. David Shillman of the SEC weighed in on the commission’s concern over naked access by noting, “The concern is, are there appropriate controls being imposed by the broker or anyone else with respect to the customer’s activity, both to protect against financial risk to the sponsored broker and regulatory risk, compliance with various rules?” Panelists agreed that the “appropriate” controls will necessarily adapt existing rules to catch up with the progress made by technology.
On October 23, NASDAQ filed what they believe to be the final amendment to the sponsored access proposal they submitted last year. The proposal addresses the unacceptable risks of naked access and questions of obligations with respect to DMA and sponsored access. The common element of both approaches is that both systems have to meet the same standards of providing financial and regulatory controls. Jeffrey Davis of NASDAQ commented on his suggested approach: “There are rules on the books now; we think that they leave the firms free to make a risk assessment. The new rules are designed to impose minimum standards to substitute for these risk assessments. This is a very good start for addressing the systemic risk identified.”
These steps may be headed in the right direction, but are they moving fast enough? Shillman added that since sponsored access has grown in usage there are increasing concerns and a growing sense of urgency to ensure a commission level rule for the future, hopefully by early next year. This commission proposal would address two key issues – should controls be pre-trade (as opposed to post-trade) and an answer to the very important question, “Who controls the controls?”
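What “pre-trade” controls mean in practice can be sketched quite simply. The Java example below is a minimal illustration of the concept only; the specific limits and checks are assumptions, not the SEC’s or any exchange’s actual rules. The key property is that every order is tested against size, notional and credit limits before it is released to the market, rather than reconciled after the fact.

public class PreTradeRiskGate {
    private static final long   MAX_ORDER_QTY = 100_000;   // per-order size cap
    private static final double MAX_NOTIONAL  = 5_000_000; // per-order notional cap
    private double remainingDailyCredit = 50_000_000;      // running credit limit

    // Returns true only if the order passes every check; called before
    // the order is released to the exchange, never after.
    public synchronized boolean accept(long qty, double price) {
        double notional = qty * price;
        if (qty <= 0 || qty > MAX_ORDER_QTY) return false; // fat-finger size check
        if (notional > MAX_NOTIONAL) return false;         // single-order notional check
        if (notional > remainingDailyCredit) return false; // credit/exposure check
        remainingDailyCredit -= notional;                  // reserve credit pre-trade
        return true;
    }

    public static void main(String[] args) {
        PreTradeRiskGate gate = new PreTradeRiskGate();
        System.out.println(gate.accept(500, 25.0));     // true: within all limits
        System.out.println(gate.accept(500_000, 25.0)); // false: exceeds size cap
    }
}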
Scott Fitzpatrick, Vice President / Business Manager of FIX Marketplace for NYSE Technologies, breaks down the increase in post-trade FIX allocations, with buy- and sell-side commentary from Wellington Management and Nomura.
Recently there has been a growing trend of post-trade allocations being delivered from the buy-side to their brokers via the FIX Protocol. Over the past year, we have witnessed significant growth in the number of allocations being sent via FIX through our FIX Marketplace community. Comparing the first half of 2009 to the first half of 2010, we have found that allocations sent via FIX have grown by over 70%. We believe this growth could be the start of a true paradigm shift in how the allocation process is treated.
Post-trade allocations are the breakdowns of a block trade – executed in any asset class – to a buy-side firm’s underlying client funds. Historically, allocations have been communicated to the broker by a variety of means, such as phone, email or fax, as well as various other electronic systems.
These methods are failing to keep up with the faster pace of today’s trading requirements and with the need to strip risk and cost out of a firm’s trading processes. As the financial industry continues to grow in new directions, we have reached a critical juncture in how post-trade allocations are handled, and many forward-thinking financial institutions are turning to FIX to help solve this issue.
Why Change is Afoot
Today’s trading systems handle thousands of client orders and instructions in mere microseconds. Yet with such lightning-fast systems in place, the post-trade settlement process still takes days to complete. The global banking and market crisis of recent years should drive financial market regulators and practitioners to look seriously at changing the settlement process by reducing the risks and costs associated with the post-trade process.
For example, in Europe, where pan-European trading platforms are becoming the norm, pan-European settlement will eventually follow. In this case and others, reducing risk and costs will change the lifecycle of the trade and increase the demand for much shorter settlement cycles – possibly even settling transactions on the day of the trade.
Even without the market dictating change, firms are still looking to reduce costs in this area as every cent spent is under laser focus in order to meet investor and client demands.
Today, we see two main ways in which institutions are trying to reduce this friction through FIX:
Moving traditional middle-office functions, such as allocations, closer to the point of execution
Modifying current post-trade practices to include all asset classes, particularly futures, options and Foreign Exchange
Bringing Allocations Closer to the Point of Execution
Because of the industry’s demand for immediately available information, firms today are looking at trade allocations – once considered a purely middle-office function – as an integrated part of the trading process. Buy-side firms are moving middle-office functions like allocations onto the trading floor as they look for new ways to communicate trade details between two trading counterparties.
By moving the allocation process to the front office, allocation errors can be recognized earlier in the settlement process. This, in turn, helps to mitigate trading errors as the trade moves through settlement. This shift in workflow requires new ways to communicate information between trading counterparties. Since FIX is a well-established protocol, ingrained in the current trading workflow for order generation, buy- and sell-side firms can take advantage of existing technologies to push FIX into the post-trade process. This has led innovative firms to begin communicating allocations and, in the future, possibly even confirmation details using the FIX Protocol.
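For a flavor of what such a message looks like on the wire, the Java sketch below hand-builds a FIX 4.4 AllocationInstruction (MsgType 35=J) that splits a 10,000-share block across two sub-accounts. The CompIDs, account names and quantities are illustrative assumptions, and a production system would use a FIX engine rather than assembling tag=value strings by hand; the body-length and checksum arithmetic, however, follow the FIX specification.

public class AllocationExample {
    private static final char SOH = '\u0001'; // FIX field delimiter

    public static void main(String[] args) {
        // Body: a 10,000-share block allocated across two client accounts.
        String body =
            "35=J"                 + SOH + // MsgType: AllocationInstruction
            "49=BUYSIDE"           + SOH + // SenderCompID (illustrative)
            "56=BROKER"            + SOH + // TargetCompID (illustrative)
            "34=42"                + SOH + // MsgSeqNum
            "52=20100701-14:30:00" + SOH + // SendingTime
            "70=ALLOC001"          + SOH + // AllocID
            "71=0"                 + SOH + // AllocTransType: New
            "626=1"                + SOH + // AllocType: Calculated
            "55=IBM"               + SOH + // Symbol
            "54=1"                 + SOH + // Side: Buy
            "53=10000"             + SOH + // Quantity of the block
            "6=120.50"             + SOH + // AvgPx
            "78=2"                 + SOH + // NoAllocs: two sub-accounts follow
            "79=FUND_A" + SOH + "80=6000" + SOH + // AllocAccount / AllocQty
            "79=FUND_B" + SOH + "80=4000" + SOH;

        // BodyLength (9) counts every byte after its own field, up to tag 10;
        // CheckSum (10) is the byte sum of everything before it, modulo 256.
        String header = "8=FIX.4.4" + SOH + "9=" + body.length() + SOH;
        String msg = header + body + "10=" + checksum(header + body) + SOH;
        System.out.println(msg.replace(SOH, '|')); // '|' substituted for readability
    }

    private static String checksum(String s) {
        int sum = 0;
        for (char c : s.toCharArray()) sum += c;
        return String.format("%03d", sum % 256);
    }
}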
Kevin McPartland, Senior Analyst, TABB Group, explains the metamorphosis of the exchange today and how the very definition of an “Exchange” is being transformed.
High frequency traders are not the only ones trying to get faster. The last few years have seen exchanges enter an arms race for speed that rivals the most sophisticated trading shops in the world. The focus on reducing latency and increasing bandwidth is so extreme that we are watching the definition of “Exchange” transform right before our eyes – not because physical trading floors in city centers have been replaced with massive data centers in out-of-the-way industrial areas, but because the exchange business model has fundamentally changed from one that is transaction-based to one that is technology-driven. The reasons why are quite simple. Execution fees have been driven down by competition, largely brought on by field-leveling regulations (read Reg NMS and MiFID), making technology the real differentiator. The exchanges are desperate both to retain and to attract more liquidity, but with execution fees often below zero and market monopolies consigned to history, only a serious investment in technology will ensure life throughout the next decade – and investments in technology certainly are being made.
The most serious technology investments have been made by the world’s largest equity exchanges. NYSE Euronext purchased Wombat and NYFIX, among others, to create the newly branded NYSE Technologies; NASDAQ merged with OMX to create an exchange technology provider with global reach; the London Stock Exchange (LSE) recently purchased MillenniumIT to rebuild its matching engine and serve as its technology arm; and Deutsche Boerse has long been a technology provider in its own right. Chi-X, the largest multilateral trading facility (MTF), has a separate technology arm in the form of Chi-Tech. Most recently, the Tokyo Stock Exchange (TSE) launched its long-awaited Arrowhead platform with hopes of entering the low latency trading world. CME Group, BATS and numerous others have also invested heavily in ensuring they have the latest and greatest technology. Some are working to maintain their dominant market position, while others continue their quest to take liquidity from the incumbents.
Exchange differentiation under the new paradigm is not easy. Twenty years ago NYSE traded NYSE-listed securities and OTC securities were traded via NASDAQ; shares in UK-based companies were traded on the LSE, shares in German companies were traded on Deutsche Boerse, and so on. Now, especially for US and European equities, shares of almost anything can be traded virtually anywhere. I’m over-simplifying of course, but with globalization flattening the world of stock exchanges, regulations keeping everyone on a level playing field, and everyone measuring speed in microseconds, the only obvious differentiator left is the name of the venue. Even if we assume traders naturally migrate to where liquidity is deep and spreads are narrow, only through an understanding of exchange technology can one rationalize what causes that situation to occur.
It is a poorly kept secret that high frequency trading firms and proprietary trading desks at investment banks co-locate to shave off microseconds. This practice is at the heart of exchange-client connectivity. These high speed orders are generated within the servers of proprietary trading desks and hedge funds, and are sent via a high speed network into the exchange’s matching engine, all literally residing under one roof. This practice generates the majority of order volume in the US and increasingly in Europe.
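The physics behind those microseconds is straightforward. A signal in optical fiber travels at roughly two-thirds the speed of light, about 200 km per millisecond, or about 5 microseconds per kilometer one way. The short Java sketch below, with illustrative distances, shows how quickly the wire alone dominates once a server sits outside the exchange’s data center.

public class PropagationDelay {
    // Light in fiber: ~2/3 of c, i.e. ~200 km per millisecond,
    // i.e. ~5 microseconds per kilometer one way.
    private static final double US_PER_KM_ONE_WAY = 5.0;

    public static void main(String[] args) {
        double[] distancesKm = {0.1, 10, 50, 320}; // same hall .. a few hundred km out
        for (double km : distancesKm) {
            double roundTripUs = 2 * km * US_PER_KM_ONE_WAY;
            System.out.printf("%7.1f km -> %8.1f microseconds round trip%n", km, roundTripUs);
        }
        // At 320 km the wire alone costs ~3,200 microseconds round trip --
        // orders of magnitude more than a co-located path measured in microseconds.
    }
}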
Large agency orders from traditional buy-side sources are also important to an exchange’s success, but it is more often the job of the broker to ensure connectivity to the data center for their client flow. Simply put, the sell-side handles inter-data center connectivity and the exchanges handle intra-data center connectivity. TABB Group estimates that North American spending on market connectivity sits at just over $2 billion annually, with 70% of that number coming from the sell-side.