Senior Executive Vice President Michael Lin of the Taiwan Stock Exchange goes through new developments and ongoing challenges for the TWSE.
What was your implementation procedure with your new connectivity platform, and what were you looking to achieve? Basically, the implementation of FIX/FAST was for two reasons. One was that we see the following of international standards as a major part of our agenda for the development of our IT systems. Secondly, in the Taiwan capital market, the number of foreign institutional investors has been steadily increasing.
Foreign institutional investors already account for one-third of the market capitalisation, and they are very active in investment activity for their clients. Furthermore, many Taiwanese investors are also interested in investing in offshore products. So I think these two reasons are basically why we really wanted to implement FIX/FAST.
What is the current level of FIX Protocol implementation? We implemented the FIX Protocol three years ago. But, honestly speaking, at the current level of implementation, I don’t think much trading activity is placed through the FIX Protocol. The reason is that, for brokers, the problem lies in our internal system: we still use our proprietary protocol, TMP. This means that even though brokers can place an order using the FIX Protocol, when the message enters the trading system a conversion to TMP is needed. As a result, for messages placed through the FIX Protocol, the performance is not good enough due to the need for conversion.
However, we are now gradually getting more brokers interested in placing their orders through the FIX Protocol. We are also constructing a continuous trading mechanism; under the new system, the conversion from the FIX Protocol to TMP will no longer be necessary.
Where is the Taiwan Stock Exchange amongst its global peers, as well as amongst its Asian counterparts? We have seen some clear trends in the development of connectivity, especially around the FIX Protocol and FIX/FAST.
Along with the rapid growth of globalised capital markets, we can find some exchanges that are eager to establish closer market connectivity. They are looking to build up close market connectivity for brokers; for example, the New York Stock Exchange has proposed establishing connectivity between itself and the Taiwan Stock Exchange.
The purpose of this connectivity is that it allows brokers in Taiwan or in New York to access the other market easily, just through the connectivity built by the two exchanges. Previously for any brokers in Taiwan, if they wanted to trade on the New York market, they had to find a broker in New York, and they needed to allocate resources to the connectivity solutions through service providers. That was a huge expense, and it was also very time-consuming.
However, once the connectivity between NYSE and TWSE is established, then we can make the connection for the brokers much easier and less costly.
Other exchanges are also looking at the same idea, for example, the London Stock Exchange, and even in Asia we can see the Tokyo, Korea and Singapore exchanges exploring connectivity solutions.
So, I think we should focus on this area. Internally, we have also held discussions on helping brokers in Taiwan to extend their global market reach. IT investment and IT management are not easy for many of the local brokers as they are quite small in scale. So we are now really focused on looking into this area.
What are you trying to achieve this year? This year there are several big projects for us. The first is that we are building our so-called “next generation trading system.” We finished the coding at the end of last year, and this year we are working on the testing. We hope that by the end of this year our new trading system can go live. This new trading system really matters because, first of all, we have changed the traditional proprietary platform to an open architecture. We changed the language from COBOL to C++ and we also introduced new middleware.
In terms of application structure, we have taken this opportunity to make the system more structured, separated into modules, and more flexible. We hope this change will make our business development easier.
Secondly, we are also building a new data centre. We thought about having a new data centre for many years and, fortunately, we are now building it. Construction is underway and we hope to finish the building next year, and to start operations in early 2015. The significance of this data centre is that it is not a data centre only in the traditional sense: we have designed it around the “cloud concept”. This means we have prepared the building to very, very high specifications in terms of cooling, power consumption, etc.
Martin Sexton of London Market Systems Limited examines the benefits of the SKOS network for product classifications.
Being top of the game is the aim of any trade association, and FIX members are not scared of putting their heads above the parapet when it comes to identifying a clean and simple solution to an industry issue. Not only has FPL put forward a proposal that meets the OTC derivative product reporting requirements associated with Dodd-Frank, it is also working with others such as ISITC and X9 to create a viable solution to meet the need for a globally acceptable classification scheme under the ISO banner. There is an overlap between these two initiatives in that financial products need to be appropriately classified, and this is where semantics has a role to play.
As soon as ontology is mentioned, a concern is always raised about the potential impact on a project due to the risk of adding latency to the delivery timeline. There is an impression that any semantic analysis is complex and yields few tangible benefits. So what is the missing link? Could it be as simple as identifying a framework capable of linking a semantic model onto a physical model?
Simple Knowledge Organisation System (SKOS) is just that framework; it allows the user to give meaning and understanding to financial product classifications and, where appropriate, a visual representation can be used to help convey intended use and ensure the industry exploits its capabilities to maximum potential.
If we take the recent FIX Protocol regulatory reporting proposal, SKOS allows us to link the FIX terms with those outlined in the final rule (17 CFR Part 43 – Real-Time Public Reporting of Swap Transaction Data). ISDA, for its part, has developed a set of taxonomies based on four key terms: under each asset class, a strict hierarchy comprises three related terms, namely Base Product, Sub-Product and Transaction Type. A further extension of this scheme is provided for commodities to support the Settlement Type (which can take the value “cash” or “physical delivery”). The FIX proposal covers all asset classes and base products, whilst the ISDA proposal focuses only on OTC derivative products.
The CFTC Rule identifies two key facets, namely asset class and contract type, each of which can be broken down further by sub-asset class and contract sub-type respectively. SKOS provides the ability to manage the relationships between all the terms and, most importantly, link them to the associated FIX equivalent tags.
At a high level the relationships between the CFTC’s final rule, the FIX and the ISDA proposals can be represented in a diagrammatic form.
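As a rough illustration of those relationships, a SKOS-style mapping between the three schemes can be sketched in a few lines of Python. All concept names, relation targets and the FIX tag label below are invented for illustration; they are not taken from the actual FIX, ISDA or CFTC documents. SKOS itself would express these links in RDF using mapping properties such as skos:exactMatch and skos:broadMatch.

```python
# An ISDA-style taxonomy entry: an asset class plus its strict hierarchy,
# with the commodities-only Settlement Type extension.
isda_concept = {
    "asset_class": "Commodity",
    "base_product": "Energy",
    "sub_product": "Oil",
    "transaction_type": "Swap",
    "settlement_type": "Physical delivery",  # commodities extension
}

# Minimal stand-in for SKOS mapping relations between schemes:
# (source concept, relation, target scheme, target concept).
skos_mappings = [
    ("isda:Commodity", "exactMatch", "cftc", "Asset class: Commodity"),
    ("isda:Energy",    "broadMatch", "cftc", "Contract type"),
    ("isda:Commodity", "exactMatch", "fix",  "AssetClass (illustrative tag)"),
]

def matches_for(concept: str):
    """Return every mapped equivalent of a concept across the other schemes."""
    return [(rel, scheme, target)
            for src, rel, scheme, target in skos_mappings
            if src == concept]

print(matches_for("isda:Commodity"))
```

The point of the sketch is only that each term lives in its own scheme, and the mappings, not the schemes themselves, carry the cross-references that SKOS manages.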
Carl Weir of HSBC and Gregg Drumma of Gamma Three Trading discuss how the newly formed Global Cross Asset Committee (GCAC) is helping to support existing FPL asset class focused committees in their best-practice, educational and promotional efforts.
In early 2012, the FPL Global Steering Committee (GSC) discussed the need for a new committee to oversee multi-asset class activities and strategic considerations given many of the common issues and overlapping initiatives of FPL’s Derivatives, Fixed Income and Foreign Exchange committees. The GSC proposed the creation of the Global Cross Asset Committee (GCAC) to oversee the Global Fixed Income, Global Foreign Exchange, and Global Derivatives committees and report directly to the GSC.
While equity markets have matured and been electronic for some time, there has been rapid movement towards a more electronic workflow in other asset classes. Historically, each asset class was segregated, but times have changed and are continuing to do so quickly. Exchange and marketplace consolidations have expanded global liquidity to offer multiple assets. Buy-side firms are increasingly diversifying their portfolios to add a wide range of assets to manage risk and improve returns. Sell-side counterparties are offering increased access and merging desks. A subsequent need for order/execution management systems to handle combined asset coverage has driven vendors to provide solutions to support wide asset coverage. This also includes an ever-expanding range of over-the-counter (OTC) products and instruments. Once the GSC recognized the need for a committee, a call for nominations for the newly formed GCAC was sent to FPL members in March 2012, with an election held soon after. Gregg Drumma, Founder and President, Gamma Three Trading, LLC and Carl Weir, EMEA Head of Cross Asset FIX Connectivity, HSBC Global Banking and Markets were elected as co-chairs for a two-year term. Both co-chairs have years of experience in electronic multi-asset trading.
At the time of writing, over 50 representatives from almost 40 different FPL member firms had joined the committee. The members represent a global presence and cover a complete cross section of single and multi-asset exchanges, products, sell-side, buy-side and vendor firms. The committee’s success is driven by its members and the voices from all markets and participants over the entire trade lifecycle.
The GCAC will provide oversight, guidance and help coordinate the efforts of the Derivatives, Fixed Income and Foreign Exchange committees, and provide its findings to the GSC for consideration in specifications, best-practices, and educational and promotional efforts. As a first step towards this goal, an initial survey was distributed to the GCAC members on a range of topics relating to FIX and multi-asset trading, education and forward-thinking technology crossover integration. The results are being collated to help direct the committee to the immediate needs of its members and will be published shortly for group review. The survey questions covered specific asset classes and technical concerns, regulatory issues, instruments, and the discussions around professional certification and training, in addition to certification of the FIX Protocol, and also offered room for comment on topics not covered.
Olumide Lala of The Nigerian Stock Exchange shares the exchange’s plan to upgrade its trading system and adopt FIX for trading.
Nothing has a greater impact on the future of organizations than the ability to harness technology. As a result of the increased application of technology to market processes, there is today a shift from trading floors to screen-based systems. Electronic trading platforms are increasingly in direct competition with traditional exchanges, amongst other associated developments. Electronic trading, long established and used by the leading stock exchanges, is now the norm across much of the developing world. Although the leading African exchanges, notably the Johannesburg Stock Exchange (JSE, South Africa), Cairo-Alexandria (Egypt), Casablanca (Morocco) and The Nigerian Stock Exchange (NSE, Lagos), made the switch over ten years ago, the rest of Africa has been slow to catch up.
Increased Electronic Trading
The gap between the trading technologies used by developed, emerging and frontier markets has been contracting rapidly over the last two to three years. Africa’s more peripheral stock exchanges have sought to modernize, largely in response to the interest shown in frontier markets by international investors.
During the 13th annual Conference of the African Securities Exchange Association (ASEA) in Nigeria in 2010, the ASEA identified Africa’s generally positive growth prospects as an opportunity to attract increased foreign investment at a time when growth prospects remain negative or flat elsewhere. For the host nation, the technology roadmap has been a positive one; with the recent change in management, there has been a major drive to transform the exchange with respect to its business model and technology framework.
Growth in the Nigerian Capital Market
The recent growth witnessed in the Nigerian capital market can be attributed to the introduction of remote trading via Electronic Communications Networks (ECNs) in 2005. This system enabled brokers to trade from the comfort of their offices without having to come to the trading floor. To date, there are 235 remote trading connections in Lagos, besides those deployed in branches across the country. In addition, an increasing number of dealing member firms are accessing the exchange system remotely.
The ongoing transformation with respect to technology initiatives is one of the major driving forces behind the development of the Nigerian capital market. The long-term vision is to make Nigeria one of the 20 largest economies in the world by 2020. The Financial System Strategy 2020 blueprint will be used to achieve these goals: developing and transforming Nigeria’s financial sector into a growth catalyst and engineering Nigeria’s evolution into an international financial centre.
Justin Llewellyn-Jones of Fidessa explains how connectivity is adapting to meet the concerns of brokers and traders.
The ability to integrate electronic trading and FIX connectivity into receiving platforms is a minimum requirement for trading. As such, it has become something of a commoditized service. The industry no longer refers to an OMS specific connection, for example, because it does not exist. Instead, brokers rely on consolidators that eliminate the need for individual connections, and provide both asset- and application-neutral connectivity from any client.
Interest in connectivity is being revived because of the growing complexities in marrying the need for fail-safe connectivity with the need to increase revenues and cut costs in the face of persistent downward pressure on margins.
Managing connectivity requires considerable effort: monitoring and capturing order failures and rejections, identifying the source of a problem, repairing it and going back to the client and convincing them to re-send the order is a constant challenge. The costs, time and resources required to maintain the communications infrastructure and relationships with telecommunications and application service providers can be significant.
What’s more, FIX expertise is a very specialized knowledge set that commands a correspondingly high price. The FIX language is really more of a framework than a firm protocol. Many participants use the language to their own ends to perform tasks such as interfacing with proprietary systems, which makes expert knowledge an absolute requirement.
As an expensive, commoditized service, connectivity appears to be a technology that should be outsourced in its entirety to specialist providers. Not surprisingly, a number of brokers have questioned the value they get from owning connectivity themselves. The infrastructure, the physical network connections, the relationships with telecommunication network providers do not add value, nor do they provide better quality of execution or improved relationships with clients.
But the wholesale outsourcing of connectivity management has not happened, which leaves brokers with ‘half and half’ solutions where, for example, a vendor’s consolidator and router might be used, but relationships with network service providers are maintained in-house. Brokers want to maintain a certain level of control, and do not trust vendors to deliver a cost effective solution, or time sensitive responses and customer service.
It is also considered that core services should not be outsourced at a time when it is becoming apparent that connectivity is a critical service, and one that is hard to separate from essential revenue-raising activity.
Brokers need to improve margins by attracting more flow from increasingly selective buy-sides. They need to do so by generating new investment returns, securing recognizable market differentiation or providing liquidity. Once more, we are seeing a definite trend of the buy-side taking ownership of trading strategy determination. What buy-side traders need is the ability to confirm a particular broker’s trading strategy and then to customize it to a particular portfolio manager or investment fund. The implication is that buy-sides will increasingly demand instant access to new strategies.
Wendy Rudd of the Investment Industry Regulatory Organization of Canada (IIROC) describes the Canadian approach to circuit breakers, minimum size and increment requirements and the role of dark liquidity.
What is currently driving the regulatory policy agenda with regard to circuit breakers? Globally, and Canada is no exception, we have seen the introduction of new rules in several areas related to the mitigation of volatility. Circuit breakers are just one of those areas. While some reforms may have been in the works already, the Flash Crash of May 2010 certainly served as a catalyst for a broader debate about market structure, trading activity and the reliability and stability of our equity trading venues.
Volatility is inevitable, so when does it become a regulatory concern? From our perspective – and we regulate all trading activity on Canada’s three equity exchanges and eight alternative trading systems – we see it as a priority to mitigate the kind of short-term volatility that interrupts a fair and orderly market. We do not expect to handle this role alone; it is a shared responsibility that includes appropriate order handling by industry participants and consistent volatility controls at the exchange/ATS level.
What are the benefits of harmonizing circuit breaker rules with US markets? One main advantage to a shared or complementary approach is that it limits the potential for certain kinds of regulatory arbitrage in markets that operate in the same time zone. Many Canadian-listed stocks also trade in the US, and roughly half of the dollar value traded in those shares takes place on US markets each day.
Which approaches are you considering taking for market-wide circuit breakers? We are monitoring developments in the US, where regulators have proposed changes which include lower trigger thresholds calculated daily using the S&P 500 (instead of the Dow Jones Industrial Average), and shorter pauses when those thresholds are triggered. We are currently exploring options for market-wide circuit breakers which include continuing our existing policy of harmonizing with the US, pursuing a ‘made-in-Canada’ alternative or identifying a hybrid approach that does a little bit of both. At this stage, we are soliciting industry feedback on the merits of these three approaches. With the help of that feedback, we expect to be able to choose the appropriate path soon. It is important to note that these kinds of circuit breakers are an important control but have traditionally acted more as insurance – they have only been tripped once in the US and Canada since being introduced in 1988.
How similar is IIROC’s new Single-Stock Circuit Breaker (SSCB) rule to the US rules? Single-stock circuit breakers are relatively new for both jurisdictions. The US and Canada have implemented SSCBs which are similar in that a five-minute halt is triggered when a stock swings 10% within a five-minute period. Otherwise, the Canadian approach differs in several ways. For example, our SSCB does not trigger on a large price swing if a stock is trading on widely disseminated news after a formal regulatory halt.
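The shared core of the two regimes – halt for five minutes when a stock moves 10% within a rolling five-minute window – can be sketched as follows. This is a simplified illustration only: the actual IIROC and US rules use defined reference prices, eligibility criteria and exemptions (such as the news-after-halt carve-out described above) that are not modelled here.

```python
from collections import deque

WINDOW_SECONDS = 5 * 60     # rolling five-minute look-back
SWING_THRESHOLD = 0.10      # 10% price swing
HALT_SECONDS = 5 * 60       # five-minute trading halt

class SingleStockCircuitBreaker:
    """Simplified SSCB sketch: halt when price swings 10% within 5 minutes."""

    def __init__(self):
        self.prices = deque()      # (timestamp, price) pairs inside the window
        self.halted_until = None   # timestamp when the halt lifts, if halted

    def on_trade(self, ts: float, price: float) -> bool:
        """Record a trade; return True if this trade trips the breaker."""
        if self.halted_until is not None and ts < self.halted_until:
            return False  # trading is halted; ignore the print
        # Drop observations that have aged out of the rolling window.
        while self.prices and ts - self.prices[0][0] > WINDOW_SECONDS:
            self.prices.popleft()
        self.prices.append((ts, price))
        low = min(p for _, p in self.prices)
        high = max(p for _, p in self.prices)
        if low > 0 and (high - low) / low >= SWING_THRESHOLD:
            self.halted_until = ts + HALT_SECONDS
            self.prices.clear()
            return True
        return False

sscb = SingleStockCircuitBreaker()
sscb.on_trade(0, 100.0)            # baseline print
sscb.on_trade(60, 105.0)           # 5% swing: no halt
print(sscb.on_trade(120, 111.0))   # 11% swing inside the window: halt
```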
Do you believe circuit breakers, market-wide or single-stock, have a deterrent effect on momentum trading? We did not set out with a prescriptive approach to influence or change trading behaviour or strategy. IIROC’s circuit breaker policies were developed to provide added insurance against extraordinary short-term volatility. We intend to study the impact of any changes and we may be able to learn more about the impact of policy changes on trading behaviour.
TS-Associates’ Henry Young, Co-chair of the FIX IPL Working Group, discusses the anticipated impact of the new FIX Inter Party Latency (FIX IPL) standard.
The FIX Inter Party Latency (FIX IPL) standard, version 1.0, will hit the streets shortly after this issue has gone to press. Now that all the hard work of designing, formulating and testing the standard has been completed, thoughts turn naturally to issues of adoption and impact on the market for latency monitoring solutions. But let’s first revisit the motivation for FIX IPL.
The 1.0 release of FIX IPL is designed to achieve two things:
• The standardisation of where latency is measured, and
• Interoperability between latency monitoring solutions.
The first point enables latency statistics published by different firms to be compared more meaningfully – ‘apples to apples’ style. The second point enables the latency monitoring solutions operated by different firms to be interconnected or ‘peered’, so that inter party latency can be measured without requiring each firm to operate a latency monitoring solution supplied by the same vendor.
The next point to consider is the likely adoption of such an interoperability standard. What’s in it for financial market participants, and for latency monitoring solution vendors? As participation in the FIX IPL Working Group has demonstrated, both constituents see advantages in such a standard existing and being widely adopted. It will make inter party latency monitoring easier, reduce costs for participants and lower the barriers to entry for new solution vendors, thereby creating a larger, broader and faster growing market for latency monitoring solutions. This will be to everybody’s advantage. Those who follow the proprietary route and fail to adopt the standard will be left out in the cold. We expect support for FIX IPL to become a standard tick box item in latency monitoring RFPs.
FIX IPL Architecture
The FIX IPL architecture has been designed to support latency monitoring for price and order flows, both FIX and non-FIX, in a wide variety of situations. This schematic shows a simple case that demonstrates the advantages of the FIX IPL interoperability standard. It shows an order flow between two trading parties, A and B. These could be either a buy-side client and a broker, or a broker and an exchange. The flow is monitored in each party’s domain using a FIX IPL Source, which observes each message in the flow, extracts some content from each message for unique identification purposes and associates a time stamp with each message observation. A FIX IPL Source transmits a series of observation messages, each of which contains the unique identification content from the observed message and an observation time stamp.
The FIX IPL messages generated by each FIX IPL Source are then brought together by a FIX IPL Sink, which could be operated by either of parties A or B, or by an entirely different party C, as shown in the schematic. The FIX IPL Sink then correlates the observation messages from each of the FIX IPL Sources, matching up the message observations at A and B, and calculating latency by subtracting the observation time stamps. The resulting per-message or per-order latency metrics can then be aggregated into time interval statistics describing the latency properties of the order flow.
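The Source/Sink roles described above reduce to a simple correlation step, sketched below. The field names and the observation format are invented for illustration; they are not the actual FIX IPL message layout, which is defined by the standard itself.

```python
def observe(party: str, msg_id: str, timestamp_ns: int) -> dict:
    """A FIX IPL Source emits one observation per message it sees:
    the message's unique identification content plus a time stamp."""
    return {"party": party, "id": msg_id, "ts": timestamp_ns}

def sink_correlate(obs_a: list, obs_b: list) -> dict:
    """A FIX IPL Sink matches observations of the same message made at
    A and B, and computes per-message latency by subtracting the
    observation time stamps."""
    at_a = {o["id"]: o["ts"] for o in obs_a}
    return {o["id"]: o["ts"] - at_a[o["id"]]
            for o in obs_b if o["id"] in at_a}

# Two observations of the same order message, made 1.2 ms apart:
a = [observe("A", "ORD-1", 1_000_000)]
b = [observe("B", "ORD-1", 2_200_000)]
print(sink_correlate(a, b))  # {'ORD-1': 1200000}
```

Because the Sink only needs observation messages in the standard format, it can correlate Sources supplied by different vendors – which is precisely the interoperability the standard is after.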
Feargal O’Sullivan and Jamie Hill of NYSE Technologies discuss OpenMAMA, the open source Middleware Agnostic Messaging API they hope will expedite innovation in services, reduce vendor lock-in and minimize implementation time and cost.
Solving a Problem
Choosing a market data vendor because of their API alone is not sound practice. The issue of how to come up with a standard way of accessing market data that allows clients to select a vendor for any range of reasons – other than the API that the vendor happens to offer – has been a struggle for a long time. Something that should be low on any decision-making tree has unfortunately tended to be much more important. There are a number of consolidated market data vendors, including some obvious names like Thomson Reuters or Bloomberg, and there is also a range of direct feed or ticker plant vendors, where instead of going to a consolidator, feeds are accessed directly from an individual exchange.
In selecting a vendor, users must write all their code to suit that vendor’s particular way of accessing the data. Changing to a different vendor requires opening up the source code and altering everything to match how the other vendor wants to access the market data. With a consolidated feed for broad international access and a direct feed for low-latency algo trading in US equities, for example, many users have to write to two to four different APIs. This has been a significant problem for the industry, and with OpenMAMA we are trying to drive the industry towards a standard.
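The lock-in problem, and the middleware-agnostic answer to it, can be illustrated with a small adapter sketch: the application codes once against a neutral interface, and each vendor feed sits behind a thin adapter. All class and field names below are hypothetical illustrations, not the actual MAMA API.

```python
from abc import ABC, abstractmethod

class MarketDataFeed(ABC):
    """Neutral interface the application writes against exactly once."""
    @abstractmethod
    def subscribe(self, symbol: str, on_tick) -> None: ...

class ConsolidatedFeedAdapter(MarketDataFeed):
    """Adapter wrapping a hypothetical consolidated vendor API."""
    def subscribe(self, symbol, on_tick):
        # A real adapter would translate the vendor's callbacks; we fake one tick.
        on_tick({"symbol": symbol, "source": "consolidated", "price": 101.50})

class DirectFeedAdapter(MarketDataFeed):
    """Adapter wrapping a hypothetical low-latency direct exchange feed."""
    def subscribe(self, symbol, on_tick):
        on_tick({"symbol": symbol, "source": "direct", "price": 101.49})

def collect_ticks(feed: MarketDataFeed, symbol: str) -> list:
    """Application logic: identical regardless of which vendor is plugged in."""
    ticks = []
    feed.subscribe(symbol, ticks.append)
    return ticks

# Swapping vendors is a one-line change; the application code is untouched.
print(collect_ticks(ConsolidatedFeedAdapter(), "AAPL")[0]["source"])
print(collect_ticks(DirectFeedAdapter(), "AAPL")[0]["source"])
```

This is the shape of the benefit claimed for OpenMAMA: the rewrite cost of changing vendor collapses from the whole application to a single adapter.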
User Base
This API is an eight-year-old standard that was initially developed by NYSE Technologies as the Middleware Agnostic Messaging API (MAMA), and it is quite heavily deployed in the financial services industry; close to 200 clients already use this API in their custom applications, so today it has an established installed base. We have opened that up and made it a standard by taking the source code for the APIs these firms are using today and providing it to The Linux Foundation, which will physically host the code as a neutral body.
During this process we worked with multiple parties that would not ordinarily use our API. Since the launch of OpenMAMA on 31 October 2011, one of the key factors in it being taken seriously as an open initiative has been getting the right level of adoption. Before we launched, we approached a number of customers, other vendors and competitors, from whom we established our launch partners: J.P. Morgan, Bank of America Merrill Lynch, Exegy, Fixnetix and EMC. These launch partners, along with NYSE Technologies, formed a steering committee to drive the direction and future of OpenMAMA.
From that point forward, each of the organizations on that committee has a stake in OpenMAMA. The API is open source under the LGPL 2.1 licence, so it is now owned by the open source community. With participation from Interactive Data, Dealing Object Technologies and TS-Associates as well, we now have a group ten strong, with a global mix comprising different industries. Whereas before the API was driven largely by NYSE Technologies and our commercial use cases, it is now being driven forward as an industry standard. The more people who adopt and participate, the higher the likelihood of achieving that.