Quant and prop traders share perspectives with Weng Cheah, Managing Director of Xinfin, about the evolution of high frequency trading.
It is unsurprising that we feel swamped by our rapidly changing industry. However, to ground these words in experience, I had a number of conversations with quantitative and proprietary trading professionals who are responsible for managing money, either for themselves or in a fund. Whilst it is not appropriate to name these individuals, the following reflects some of their perspectives.
Trading has changed dramatically in the last 25 years; firstly in that we are no longer physically present in the pit. One US-based hedge fund manager I spoke to went so far as to say that the industry had “never seen so much change in one person’s lifetime.”
This ‘electronification’ of the markets was the necessary catalyst for what has been a continuous evolution in trading, in which technology has been a constant companion. However tempting it is to assume otherwise, one thing is certain: where we are today did not start with someone saying “I need to be microsecond quick to win.”
Information Process
The investment process tries to manage uncertainty by seeking information that can be sorted into a model through which we can understand the value of an asset. Information is at the heart of all investment; what is curious is that not all investors select the same information.
There are those who will research the company and build fundamental models from its financial statements and returns as their basis for trading, with a tactical allocation model, based on macroeconomic trends, to set their trade quantum.
However, there are also traders who would look at asset price history and examine price action to set their strategy. As one US-based fund manager said, “the price of corn knows more about corn than I do”, reinforcing the idea that price is the source of all information.
Quantitatively, they recognise that they can increase their absolute return without taking on any additional risk by stepping up the frequency of trading. Although transaction costs are higher, these are more easily managed than market risk.
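The intuition can be sketched with back-of-the-envelope numbers. Assuming independent bets with a fixed per-trade edge (the figures below are purely illustrative, not drawn from any of the traders interviewed), total return grows linearly with the number of trades while volatility grows only with its square root, so risk-adjusted return improves with frequency:

```python
import math

# Hypothetical per-trade statistics (illustrative only).
mu = 0.0002     # expected return per trade: 2 basis points
sigma = 0.0010  # volatility per trade: 10 basis points

for trades_per_day in (1, 16, 256):
    daily_return = trades_per_day * mu             # grows linearly in N
    daily_vol = sigma * math.sqrt(trades_per_day)  # grows as sqrt(N)
    sharpe = daily_return / daily_vol              # improves as sqrt(N)
    print(trades_per_day, f"{daily_return:.4f}", f"{sharpe:.1f}")
```

On these assumptions, moving from 1 to 256 trades per day multiplies the daily risk-adjusted return by 16, while the risk taken on any single position is unchanged; the "no additional risk" in the text reads most naturally as this per-position risk, not total portfolio volatility.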
Corwin Yu, Director of Trading at PhaseCapital, sits down with FIXGlobal to discuss his trading architecture, the proliferation of Complex Event Processing (CEP) and why he would rather his brokers just not call.
FIXGlobal: What instruments does your system cover? Corwin Yu: At the moment, we trade the S&P 500, and we have expanded that to include the Russell 2000, although not as an individual instrument, but as an index. We also trade the E-Mini futures on the Russell 2000 and on the S&P 500. We have done some investigation into doing exactly the same type of trading with Treasuries, using the TIPS indices, the TIPS ETFs and a few of the similar futures as well. We are not looking at expanding the equity side except to consider adding ETFs, indices, or futures of indices.
FG: Anything you would not add to your list? CY: We gravitate to liquid instruments with substantial historical market data, because we really do not enter into a particular trading strategy unless there is market data to do sufficient back testing. Equities were a great fit because they have history behind them and great technology for market data; likewise for futures, where market data coverage has recently expanded. Options are a possibility, but the other asset class that is liquid yet not a good fit is commodities. We shy away from emerging markets that are not completely electronic and do not have good market data. While we have not made moves into the emerging markets, we know that some other systematic traders have found opportunities there.
FG: How much of your architecture is original and how often do you review it for upgrades? CY: In terms of hardware, we maintain a two-year end-of-life cycle, so whatever we have that is two years old, we retire to the back-test pool and purchase new hardware. We are just past the four-year mark right now, so we have been through two hardware migrations. Usually this process is a wake-up call as to how technology has changed. When we bought our first servers, they were expensive four-core machines with a maximum memory of 64 GB. We just bought another system that can handle 256 GB through six-core processors. We are researching a one-year end-of-life cycle, because two years was a big leap in terms of technology and we could have leveraged some of that a year ago.
The Capital Markets Cooperative Research Centre (CMCRC)’s Alex Frino talks about his research over the past 18 months and the conclusions as to the truth about high-frequency trading.
What inspired you to focus your research on High Frequency Trading (HFT)?
There is a very poor understanding of the impact of HFTs on the market place. There is a lot of ill-informed opinion in circulation about the impact of HFT on price volatility, and their contribution to liquidity. I wanted to provide some hard data to help markets move forward and inform sensible evidence-based policy decisions.
There was also considerable interest in the idea of conducting HFT research from our regulator partners, including the FSA and ASIC.
What were your views on HFT at the outset of your research program?
When we first set about doing the research 18 months ago, I began by speaking to the investment management community to gather their views and insights into HFT and its impact on their trading. The feedback I got was overwhelmingly negative. One comment sums it up best – an investment manager said to me that “liquidity provided by the HFT community is like fog – you can see it, but when you reach out to grab it, it is not there.” So I began the program expecting to confirm these dominant views. To my surprise, we discovered that the realities of HFT are almost exactly the opposite of what the investment managers were telling me.
HFT liquidity has been described as ephemeral by many on the buy-side. What does your research suggest about the ability of the buy-side to interact with HFT liquidity?
We have done research with data from the LSE, ASX, SGX, NASDAQ and NYSE Euronext on exactly this subject. The exchanges furnished us with data that identifies when HFTs are present in the market place. We then looked at the make-take decision. HFTs make liquidity when they put up a quote that gets hit by someone on the other side of the trade. They take liquidity when they hit someone else’s quote. The data clearly showed that HFTs are net makers of liquidity.
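The make-take classification described here can be illustrated with a toy tally. Assuming each trade record tags which counterparty's order was resting on the book (the maker) and which crossed the spread (the taker) – the field names and figures below are hypothetical, not the actual exchange data used in the research:

```python
# Hypothetical client-tagged trade records (illustrative only).
trades = [
    {"maker": "HFT",  "taker": "INST",   "qty": 500},
    {"maker": "INST", "taker": "HFT",    "qty": 200},
    {"maker": "HFT",  "taker": "RETAIL", "qty": 300},
]

# Volume where an HFT quote was hit (made) vs. where an HFT hit
# someone else's quote (took).
made = sum(t["qty"] for t in trades if t["maker"] == "HFT")
taken = sum(t["qty"] for t in trades if t["taker"] == "HFT")
net = made - taken  # positive => net maker of liquidity

print(made, taken, net)  # 800 200 600
```

Aggregating this net figure across all HFT-flagged participants is the sense in which the research finds HFTs to be "net makers of liquidity".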
Interestingly some of our data also included information about when firms are trading through co-located servers within the exchanges. This data too showed that co-lo HFT activity was also a net provider of liquidity in those markets.
Co-location is described by some as an ‘unfair advantage’. What is your take on that given your research into the area?
My view is that if the advantage is being put to good use in providing liquidity, then it is not being misused. That pool of co-located flow is providing liquidity that would not be there otherwise, so I cannot see how that is a negative for markets.
Many market participants – including recent widely-quoted comments by Andrew Haldane of the Bank of England – are critical of the speed and sophistication of markets generally, using HFT as their example. They argue the playing field is not level and that markets should be slowed to take away perceived unfair advantages. What is your view?
I was frankly amazed by Haldane’s suggestion that markets should be slowed [by introducing speed limits and resting periods]. What he is in effect suggesting is that we should take markets backwards by a decade. That is astonishing to me because I just do not see the arguments. Market participants who do not have the technology to compete with other players can easily access brokers with algorithmic trading engines to help them execute their trades. If you cannot or do not want to build the technology yourself, you can outsource it fairly cheaply and very efficiently.
From an HFT perspective, our research demonstrates emphatically that the liquidity they provide is real and other participants interact with it constantly, so I cannot see a problem there either.
Direct Edge’s Kevin Carrai explains how they chose their new data center and what criteria were most important in making the decision.
Choosing the right data center is crucial to the success of any trading business. Gone are the budgets that supported co-locating with every market center, which forces firms to take a closer look at how to optimize their connectivity and infrastructure without sacrificing performance or profits. In addition, firms now have more options than ever for where to host their IT infrastructure, given the growth in the data center landscape driven by high frequency electronic trading, especially in northern New Jersey. There are several important things to consider when making this critical business decision: proximity to liquidity, cost savings, flexibility and scalability.
Proximity to Liquidity
Anyone familiar with the real estate market knows that the three most important factors are location, location and location. However, the factors that make one location better than another may change over the course of time. At one point in life, proximity to bars and restaurants may be a priority; after starting a family or getting a pet, proximity to good schools and parks may become more important. The same holds true when you are finding a home for your trading infrastructure. In the heyday of high volumes and deep pockets, firms colocated in multiple facilities regardless of cost in order to be as close as possible to liquidity destinations. In today’s trading environment, firms are reconsidering this need and are looking for a facility where they get the biggest ‘bang-for-their-buck’.
The New York/New Jersey/Connecticut tri-state area has become the location of choice for data centers that cater to the U.S. financial markets. Within this geographic area, if firms cannot be everywhere, where should they be? It is more advantageous for firms to be in a centralized location, where they can access many financial resources, rather than duplicating their infrastructure at multiple data centers. Facilities that host companies providing the same services foster healthy competition, thereby forcing vendors to improve functionality and curb costs.
Proximity to competitors, dark pools and other liquidity destinations was a key feature that attracted Direct Edge, a U.S. equity exchange that currently trades more than 9% of total consolidated volume, to NY4, an EQUINIX data center. Other market centers that have also chosen NY4 include International Securities Exchange (ISE), Hotspot FX, Boston Options Exchange (BOX) and CBOE’s new C2 Options Exchange. Having other liquidity destinations just a cross-connect away enables Direct Edge to have a robust, high performing, efficient connectivity infrastructure at a low cost.
Not only does NY4 provide proximity to a multitude of resources internally, it is well positioned physically in the NYC area. The latency between EQUINIX’s NY4 facility and Savvis, a third-party data center that hosts major exchanges and dark pools, is less than 100 microseconds.
Recent conventional wisdom has been that in order to ensure profitability, trading firms had to have a presence in every data center where there was accessible liquidity. As the number of market centers increased, so did the expense required to install and support trading and telecommunications infrastructure within each facility. Firms would gladly pay the increased expense while volatility and volumes were high, but they are now taking a hard look at this strategy.
In these uncertain market conditions, firms are more cost-conscious than ever. In order to remain profitable, firms have significantly reduced technology and telecommunication spending and can no longer support the trend of co-locating their trading infrastructure in every facility. With the increasing premiums imposed by many exchanges for co-location space, trading firms are trying to save money by reducing their footprints and minimizing hand-offs. Therefore, the selection of a few key facilities, or one facility, has become a realistic alternative to multi-facility co-location, especially with the advent of low latency connectivity between market centers.
Christian Zimmer, Head of Quantitative Trading and Research, and Hellinton Hatsuo Takada, Quantitative Trader, of Itaú Asset Management reveal the truth about high frequency trading in Brazil.
Conference panels, discussions and articles on High Frequency Trading (HFT) generally start with its definition. The term HFT is like ‘Cleopatra’ – sexy and mysterious, and everyone is keen to know more about it. But the term HFT speaks for itself, so is it a waste of time to go over it again?
Probably, because the term ‘high’ only has meaning relative to an external point of reference, just like cold, hot, sweet or other adjectives. This subjectivity is all the more interesting, as it is extremely difficult to measure an investor’s brief holding period in most financial markets and, therefore, determine if it really is ‘high’. Unlike in the US, where the exchanges do not register the origin of the trade, Brazilian regulation allows BM&FBOVESPA to identify the final client on every trade. Consequently, it is much easier to measure the holding period of an investor for each asset. Also, this rule is the means by which the exchange determines whether an investor’s trade is classified as a ‘day trade’ and is thus eligible for reduced fees.
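With client-tagged fills of the kind this rule enables, a holding period per client and asset can be measured directly. A minimal sketch, assuming a FIFO pairing of buys with subsequent sells (the client IDs, ticker and timestamps below are invented for illustration):

```python
from datetime import datetime

# Hypothetical client-tagged fills: (client, asset, side, qty, time).
fills = [
    ("C1", "PETR4", "buy",  100, datetime(2011, 4, 1, 10, 0, 0)),
    ("C1", "PETR4", "sell", 100, datetime(2011, 4, 1, 10, 0, 5)),
    ("C1", "PETR4", "buy",  100, datetime(2011, 4, 1, 10, 1, 0)),
    ("C1", "PETR4", "sell", 100, datetime(2011, 4, 1, 10, 1, 30)),
]

# Pair each buy with the next sell (FIFO) and record the holding time.
open_times, periods = [], []
for client, asset, side, qty, ts in fills:
    if side == "buy":
        open_times.append(ts)
    else:
        periods.append((ts - open_times.pop(0)).total_seconds())

avg_holding = sum(periods) / len(periods)
print(avg_holding)  # 17.5 seconds on this toy tape
```

The same pairing also yields the exchange's ‘day trade’ test: a position is day-traded when its open and close fall on the same date.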
Naturally, BM&FBOVESPA does not classify a trader opening a position in the morning and closing it at the end of the day as a high frequency trader. There should be far more trading than this to qualify as HFT. But how much more? It depends on the exchange’s criteria and reference point for ‘high’.
Figures for HFT published by BM&FBOVESPA in their April 2011 report show that 3.9% of the BM&F segment is high frequency, along with 5.9% of the BOVESPA segment. Consequently, the reduced fees are presented to the Brazilian trading community as less of an issue, as they say there is evidence of HFT taking hold. But HFT volume is not really increasing, and is still far off the US figures, which are often cited at around 60-70%. After carefully observing BM&FBOVESPA market prices, it is easy to conclude that it would take some time (possibly hours) for prices to move sufficiently to pay the transaction costs. Remember that HFT strategies are very sensitive to transaction costs.
Our suggestion is to step away from making subjective references to ‘high frequency’. Instead, one should look at the underlying trading strategies. The incentives an exchange should create to attract flow must be adjusted to the strategies that are really needed. Each strategy deserves a different set of policies and this will help the diversification of the traders’ strategies.
A trader using a market maker strategy can live with exchange fees as long as the bid-ask spread is sufficiently wide. If the spread narrows, the costs become crucial and the exchange must lower the fees in order to keep this client in the market. On the other hand, a directional trader has different issues: if the fees are high, the trader must wait longer for a relevant price move before they can capitalize on their position. Contrary to the market maker, the directional trader loves to see narrow bid-ask spreads. There would be no need to lower fees when the spread is tight. The same is true for statistical arbitrage traders.
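The difference in economics can be made concrete with simple breakeven arithmetic. The fee and spread levels below are purely illustrative, not BM&FBOVESPA's actual schedule:

```python
# Hypothetical fee and spread levels, per share, in currency units.
fee = 0.01     # exchange fee per side
spread = 0.05  # bid-ask spread

# Market maker: buys at the bid, sells at the ask, pays the fee twice.
mm_round_trip = spread - 2 * fee
print(round(mm_round_trip, 4))   # 0.03: viable while the spread is wide

# Directional trader: crosses the spread to enter and exit, so the
# price must move by more than the spread plus two fees to break even.
breakeven_move = spread + 2 * fee
print(round(breakeven_move, 4))  # 0.07: higher fees mean a longer wait
```

If the spread narrows to twice the fee, the maker's round trip earns nothing at the same fee level, which is why fee cuts matter most to market makers; the directional trader's breakeven barely changes in relative terms.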
When looking at third-party analyses of HFT in the international markets, we often see that the most common strategy is the market maker approach. This fact is strongly influenced by market fragmentation, which we do not have in Brazil. Fragmentation creates new intermarket trades, which could qualify as arbitrage trades, but not necessarily as market maker trades. Fragmentation also makes exchanges and other venues compete for the customers that provide liquidity and, as a result, offer incentives to market makers. As mentioned above, Brazil does not have a fragmented market, and BM&FBOVESPA does not see it as necessary to ask for more liquidity – at least not as long as international capital flows are strong and increasing. Liquidity is needed in second-tier shares and below.
Can there possibly be a silver lining in the current financial meltdown? John Knuff of Equinix argues that now is the time to upgrade your investment, allowing the Asia Pacific region to catch up with its US and European peers.
While the global financial crisis has inevitably had an impact on investment, most commentators seem to agree that the Asia Pacific market will see on-going development, particularly as it continues to invest in the infrastructure and technologies that will allow it to match its US and European counterparts in key areas such as execution speed, easier market access and direct data feeds.
Analysts such as Celent see the current downturn as a significant opportunity for Asian exchanges, even suggesting in a recent report that Asian exchanges have the potential to overtake their US counterparts in the near future. Before this can happen, however, there needs to be sustained investment in the technology, skills and processes that will enable lower latency, easier access and faster data feeds across the region.
One of the key challenges remains the diversity of the region. While the geographical diversity, and vast distances involved, will always make it hard for traders to gain low latency access to multiple market centers, the added complexity of local regulations and last mile access make region-wide performance goals even more difficult to achieve. Nevertheless, factors such as direct market access, the increasing presence of alternative trading systems and the introduction of crossing networks, will have a tremendous impact shortly after local regulations ease.
Investing to close the gap with other global markets
To assume that the different markets in the Asia Pacific region will progress seamlessly together towards a more deregulated and open environment would be unrealistic. The global financial crisis is already leading some Asia Pacific exchanges and regulators to be more defensive in their outlook. However, it also provides an opportunity for more traditional venues to develop and implement their own alternative trading strategies to compete more favourably with new market entrants as conditions improve.
It is this imperative to remedy the handicap of limited bandwidth and slower trading platforms that is driving Asia Pacific financial institutions to continue to update their technology infrastructure. As many of the incumbent exchanges re-tool their matching engines and foster technology partnerships with global leaders like NASDAQ OMX and NYSE Euronext, the broker/dealer communities are quickly positioning themselves to be the partner of choice for many of their US and European counterparts.
Given this background, we believe it’s important for Asian market participants to ensure they are making the right infrastructure and connectivity choices today, to allow them to compete more effectively tomorrow.
A world of more end points and more trading venues
In an Asia Pacific market driven by the continued growth of automated and algorithmic trading, the emergence of new liquidity opportunities and increasing numbers of order destinations and market data sources, we’re increasingly going to see financial firms trading a much wider range of asset classes and instruments across broader geographies.
In those countries where the incumbent exchanges still handle the majority of trading, these new market developments will have a significant impact as local traders, limited by their current choices, increasingly send order flows to more accessible and transparent electronic markets. All this translates into more end points and execution venues, and is driving demand among financial services firms for a greater choice of networks with low latency/high bandwidth capabilities to enable these higher message rates and optimise throughput.
With the landscape of the Asia Pacific market’s different trading centers evolving so quickly, it’s becoming increasingly apparent that a strong element of foresight, and much broader connectivity options, will play as important a role as proximity when it comes to making location decisions across the Asia Pacific region.
Making the right technology decisions
This is important given the highly volatile and competitive nature of today’s financial markets. Trading volumes are shifting dramatically, new entrants are changing the market opportunities, and there’s continued growth in automated, algorithmic and alternative trading strategies. At the same time, many markets are fragmenting away from their traditional single venue exchange-based structure, while major players continue to join forces in line with the trend of globalization and consolidation.
Whatever your line of business, there’s a pressing requirement for a stable, global infrastructure that assists you in achieving your market goals. Whether you’re an asset management firm or a hedge fund, you depend on the ability to access continuous streams of global market data, messaging, news, history and analysis to ensure successful execution of your trading strategy. Or you could be an exchange that needs to be able to quickly connect to participants or clients, receive orders over a range of different financial extranets and broker connections, post or match the orders almost instantly, and respond in milliseconds (or increasingly microseconds).