Quant and prop traders share perspectives with Weng Cheah, Managing Director of Xinfin, about the evolution of high frequency trading.
It is unsurprising that we feel swamped by our rapidly changing industry. However, to bring some experience to these words, I had a number of conversations with quantitative and proprietary trading professionals who are responsible for managing money for themselves, or in a fund. Whilst it is not appropriate to name these individuals, the following reflects some of their perspectives.
Trading has changed dramatically in the last 25 years; firstly in that we are no longer physically present in the pit. One US-based hedge fund manager I spoke to went so far as to say that the industry had “never seen so much change in one person’s lifetime.”
This ‘electronification’ of the markets was the necessary catalyst for what has been a continuous evolution in trading, where technology has been a constant companion. However tempting it is to assume otherwise, one thing is certain: where we are today did not start with someone saying “I need to be microsecond quick to win.”
Information Process
The investment process tries to manage uncertainty by seeking information that can be sorted into a model through which we can understand the value of an asset. Information is at the heart of all investment; what is curious is that not all investors select the same information.
There are those who will research a company, building fundamental models from its financial statements and returns as their basis for trading, and a tactical allocation model, based on macroeconomic trends, to set their trade quantum.
However, there are also traders who would look at asset price history and examine price action to set their strategy. As one US-based fund manager said, “the price of corn knows more about corn than I do,” reinforcing the idea that price is the source of all information.
Quantitatively, they recognise that by stepping up the frequency of trading they can increase their absolute return without taking on any additional risk. Although transaction costs are higher, these are more easily managed than market risk.
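A toy calculation (my own illustration, not from the interviews) shows why this scaling works: if each trade carries the same small edge and trades are roughly independent, expected P&L grows linearly with trade count while risk grows only with its square root, so the return-to-risk ratio improves as frequency rises.

```python
import math

def return_to_risk(per_trade_edge, per_trade_vol, trades_per_year):
    """Return/risk ratio assuming independent trades with identical edge."""
    expected_pnl = per_trade_edge * trades_per_year        # grows ~ N
    pnl_risk = per_trade_vol * math.sqrt(trades_per_year)  # grows ~ sqrt(N)
    return expected_pnl / pnl_risk

daily = return_to_risk(0.0001, 0.002, 250)       # one trade per day
frequent = return_to_risk(0.0001, 0.002, 25000)  # one hundred trades per day
print(round(frequent / daily, 1))  # 100x the trades -> 10x the ratio
```

The 10x improvement is just the square root of the 100x increase in trade count; in practice correlation between trades and rising transaction costs erode it, which is the trade-off the interviewees describe.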
Michele Patron, Senior Quantitative Trader, AllianceBernstein talks to Stuart Baden Powell, Head of European Electronic Trading Strategy, RBC Capital Markets about sell-side algorithms, efficient sourcing of liquidity, the need for pre- and post-trade transparency and high frequency trading.
Stuart Baden Powell, RBC: Recently, there has been much discussion about improvements in sell-side agency algorithms: some would argue that the core ‘building blocks’ of scheduled and opportunistic algorithms remain virtually identical, built around the same underlying models; others would point to a more radical shift away from mere incremental enhancements. Regardless of view, what is clear is that the buy-side is taking control of its execution destiny. Concerns about a reduction in trust, together with insufficient transparency of internal operations from many brokers, have contributed towards the shift. Whilst some buy-side firms will purchase off-the-shelf, canned algorithms from the sell-side, marginally tweak them and call them their own, other institutional firms are taking matters more into their own hands. Quantitative trading has been of huge importance to hedge funds over recent years. However, there are now a few select long-only houses moving to incorporate quantitative trading in-house and link this to their own fundamental trading strategies. AllianceBernstein would fall into that latter bracket – Michele, you have worked at both CQS and BGI and now run European Quantitative Trading at AllianceBernstein. Could you talk us through what you are up to?
Michele Patron, AllianceBernstein: I think that the discussion about how much buy-side firms should rely on the sell-side for trading research should have a definite answer by now: it is well-recognised that saving transaction costs represents an important source of alpha – even for medium turnover strategies. In addition, the wealth of information that buy-side firms have about their own flow cannot be achieved by counterparties, especially in multi-broker interaction scenarios: an accurate estimate of an alpha decay profile, which could be based on simple internal factors (i.e. order reason or PM strategy), will give the buy-side an important trading advantage.
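As a hypothetical sketch of such an alpha decay profile (the functional form and all numbers here are my assumptions, not AllianceBernstein's), one simple model decays an order's expected alpha exponentially, so a desk can see how much edge survives a slower execution schedule:

```python
def remaining_alpha_bps(initial_alpha_bps, half_life_min, elapsed_min):
    """Alpha left after elapsed_min, under an exponential-decay assumption."""
    return initial_alpha_bps * 0.5 ** (elapsed_min / half_life_min)

# An order tagged with 20 bps of expected alpha and a 60-minute half-life:
print(remaining_alpha_bps(20.0, 60.0, 60.0))   # 10.0 bps after one half-life
print(remaining_alpha_bps(20.0, 60.0, 180.0))  # 2.5 bps after three
```

Estimating the half-life per order reason or PM strategy, as the text suggests, is exactly the kind of internal information a counterparty cannot see.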
At AllianceBernstein, we see our counterparties as partners, both in the high and low touch space. Within Quant Trading – which is globally headed by Dmitry Rakhlin – we continually try to analyse and customize execution algorithms, after we have had open discussions with our key counterparties. The ‘building blocks’ that you referred to earlier are, in principle, easy to understand, and all the algo offerings out there can be bucketed into a few categories: the sell-side can offer us an edge with smarter technology and expertise around execution tactics. There are some very smart options available in the market to minimize the latency arbitrage effect on client flows. A good solution can be achieved without tweaking the most relevant variable for this problem – system latency.
Two key tasks for a buy-side trading desk are: sourcing liquidity efficiently – especially for high-ADV orders – and managing momentum – for low-ADV orders. Having the ability to provide electronic solutions to address the latter gives traders the opportunity to concentrate on the first task.
Richard Nelson, Head of EMEA Trading for AllianceBernstein, shares his perspectives on navigating volatility, prospects for developing exchanges, new regulation and the balance between transparency and best execution.
FIXGlobal: How much does volatility affect the way that you trade and what are you using to measure volatility on the desk?
Richard Nelson, AllianceBernstein: We use an implementation shortfall benchmark, so the longer we take to execute an order, the wider the range of possible execution outcomes. Volatility, in particular intraday volatility, increases that potential range, so you could see very good or very poor execution outcomes as a result. In reaction to that, we take a more conservative execution strategy or stretch the order out over a longer time period. And, for instance, if we get a hit on a block crossing network, we will not go in with as large a quantity as we would in a less volatile market. In that way we try to dampen down the potential effects that volatility might have on the execution outcome.
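A small Monte Carlo (an illustration under simplified assumptions, not the desk's actual model) makes the point: if the arrival-to-completion price move is modelled as a random walk, the dispersion of implementation shortfall outcomes scales directly with intraday volatility, so the same schedule produces a much wider range of outcomes on a volatile day.

```python
import random
import statistics

def shortfall_spread(vol_per_min, horizon_min, n_sim=5000, seed=7):
    """Std dev of simulated arrival-to-completion price moves (random walk)."""
    rng = random.Random(seed)
    moves = [sum(rng.gauss(0.0, vol_per_min) for _ in range(horizon_min))
             for _ in range(n_sim)]
    return statistics.stdev(moves)

calm = shortfall_spread(vol_per_min=0.0005, horizon_min=30)
volatile = shortfall_spread(vol_per_min=0.0015, horizon_min=30)
# Tripling per-minute volatility roughly triples the range of outcomes,
# which is why the desk slows down or sizes down in volatile markets.
```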
FG: How is AllianceBernstein using technology to improve performance and cut costs on the trading desk?
RN: It plays quite an important part and has done so for quite a while. We are pretty lucky in that we have a team of quant trading analysts. Most of them are in New York, but we have one here on the desk in London, and they help us to analyze the changing market environment and recommend the best ways we can adapt to it. Our usage of electronic trading has increased in the last year, and we benefit from the quant trading analysts looking at the results we are achieving with our customized algorithms. We are more confident about getting good consistent execution outcomes because they are monitoring the process and making the necessary changes to ensure the results are what we are expecting. This, in turn, increases the productivity of the traders I have on the desk. They can place their suitable orders into these algorithms and let them run, which allows us to focus on trying to get better outcomes on our larger, more liquidity-demanding orders.
On top of that, as market liquidity has dropped significantly, we are trying to make sure we reach as much potential liquidity as possible, and ideally we want to do that under our own name rather than go to a broker who then goes to another venue. We believe that going directly into a pool of liquidity is better done under your own name rather than via a broker because we can then access the ‘meaty’ bits of the pool rather than the ‘froth’. We are looking into ways of doing that but one of the problems is that, potentially, you get a lot of executions from a number of different venues, which results in multiple tickets for settlement. Our goal is to access all these potential liquidity pools, yet also control our ticketing costs, which are a drag on performance for clients.
FG: Was it an intentional change to increase electronic trading or was it a byproduct?
RN: It was a little of both. Our quant trader has been with us for two years and when he first arrived he had to sort out the data issues that exist in Europe and to clean things up. Once the data integrity was sorted out, we looked at different ways of employing quantitative analyses. Having somebody here who is constantly monitoring the execution outcomes means we can proceed down this path with real confidence. As a London firm, we were a little behind in our adoption of electronic trading, but now we are in the middle of the pack in terms of usage. It makes sense from a business and productivity perspective: there are many orders that do not need human oversight and are best done in algorithms.
Guosen Securities’ Shen Tao reveals the latest trends in algo usage by Chinese asset managers, domestic mutual funds and Qualified Foreign Institutional Investors (QFIIs).
Who are the primary customers for algorithmic products in China?
Algorithmic trading started in the Chinese A-share market some time in 2007. In 2005, the first commercial FIX engine went live to accommodate the A-share execution needs of Qualified Foreign Institutional Investors, or QFIIs, as part of the plan by the Chinese government to allow regulated capital market investment by foreign investors. After an initial experimental phase of FIX connectivity with global trading networks, the local FIX trading platform became solid enough to interface with a real algo engine. In 2007, some leading global investment banks (predominantly, QFIIs from the sell-side) began to offer algorithmic trading facilities for their clients and their own proprietary trading desks. Most of these facilities were located offshore (e.g. Hong Kong) and connected to the Chinese brokers’ FIX gateways via a financial trading network such as Bloomberg.
The earliest providers and users of algo trading in the Chinese market were solely QFIIs and their clients. In 2008, although the global market was in turmoil and many infrastructure budgets were cut across the international financial community, there were still some firms seeking expansion opportunities for the future. Among them, some global banks with local brokerage joint venture subsidiaries began to build their onshore algo facilities. At about the same time, some leading purely local brokers also started their efforts in algo development, Guosen among them. We started in March 2008 and also targeted QFII investors for algorithmic trading; however, we understood that the future of algorithmic trading in the Chinese market would rest on the domestic mutual fund industry. In late 2009, the Guosen algo platform was almost ready and the aforementioned onshore algo facilities run by the sell-side joint ventures of global banks also went live. The day of the algo had finally arrived for China.
In 2010, with support from Hundsun, a leading buy-side OMS vendor, Guosen and UBS began their efforts by offering algo solutions to local mutual fund companies. In November 2010, UBS won its first success with two Beijing-based mutual fund companies, with Guosen securing a third six months later. Since that time, more than a dozen mutual fund companies have started using algorithms from UBS and Guosen. From a local perspective, 2010 was the first year of the algo. Currently, the momentum of mutual fund companies adopting algo platforms continues. We estimate that by the end of 2011, in terms of assets under management, over 40% of the local mutual fund industry could be covered by broker-provided algo services.
In retrospect, QFII investors were the founders of the market, but soon, the local mutual fund industry will become the primary user of algos. In addition, we foresee insurance companies adopting algo trading soon.
Daniel Ciment of J.P. Morgan details the development of Brazilian algos and outlines the most effective strategies for trading in Brazil.
Using Algos in Brazil
International buy-side traders are already accustomed to trading with algorithms, or using algorithms to trade strategies, in markets around the world, and as they look to Brazil they want to trade there in the same way they have traded elsewhere. Even though having just one exchange makes the data feed more streamlined, the low liquidity profile of certain stocks in Brazil means you cannot use algorithms to trade all stocks electronically. For the more liquid names, many traders are using benchmark algorithmic strategies, like VWAP, percentage of volume, or arrival price. Most algorithmic strategies are based on benchmarks for now, as buy-side traders seek to replicate the methods they use elsewhere, while obviously taking into account the intricacies of the market structure. In the end, if they trade with algorithms in the US, Europe and Asia, they want to trade with algorithms in Brazil as well.
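As a sketch of what a benchmark strategy like VWAP does under the hood (a generic illustration, not J.P. Morgan's implementation, with a made-up volume curve), the parent order is sliced across time bins in proportion to the expected intraday volume profile:

```python
def vwap_schedule(total_shares, volume_profile):
    """Split a parent order across time bins in proportion to expected volume."""
    total_volume = sum(volume_profile)
    slices = [int(total_shares * v / total_volume) for v in volume_profile]
    slices[-1] += total_shares - sum(slices)  # absorb rounding in the last bin
    return slices

# A hypothetical U-shaped volume curve: heavy open and close, quiet midday.
profile = [30, 15, 10, 15, 30]
print(vwap_schedule(10000, profile))  # [3000, 1500, 1000, 1500, 3000]
```

Tuning the algorithm to Brazil, as described below, amounts in part to replacing that profile with curves estimated from local liquidity data.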
Infrastructure and Volume Spikes
This is one of the challenges that we face as an industry. As you are building electronic infrastructures, you have to build for growth and not just for where we are today. When we look at a market, whether it is Brazil or more developed markets like the US, Europe or Asia, we know what we are trading today, but we have to build to accommodate what we will trade in a year or two years, and what we think the peak might be. Just because a market trades a couple of hundred million shares in a day – or, in the US, 8 billion shares a day – does not mean you build your plan to support only that volume, because a year from now the figure might be 20% higher.
More so, if a major event happens next week, then that figure might double, so you need to build sufficient headroom. Right now, we can handle a lot more than what we manage on a daily basis, but that is on purpose to make sure that at times of stress we are there for our clients and that they can trade through us with full confidence.
DMA or Boots-on-the-Ground?
To be successful in a market like Brazil, brokers need to have people on-site who know the local investor and financial communities. J.P. Morgan has a major trading presence in Sao Paulo, and that is just one piece of the offering in Brazil. For small firms that want access, outsourcing is a realistic option, but if you are going to be big in a market, especially one like Brazil, an in-country trading team is required.
Technical Challenges
Reliable trading requires market data and telecommunications systems, which are present in Brazil, along with data center space and algorithms that are tuned to the local market and market structure. This tuning includes the liquidity profiles of the stocks as well as the rules and regulations of the exchange; you cannot apply the same algorithms from one region to another and expect them to work. We spend a lot of time and effort fine-tuning our algorithms, testing them on our desk and then rolling them out to clients. It is not just copy-and-paste.
CIBC’s Thomas Kalafatis maps out the new CSA rules regarding direct electronic access and suggests their potential effects on brokers and institutional traders.
Are the updated Direct Electronic Access (DEA) requirements a response to patterns endemic to Canada or are they a response to patterns observed elsewhere?
Given the existing Investment Industry Regulatory Organization of Canada (IIROC) rules and the timing of the DEA rule proposal from the Canadian Securities Administrators (CSA), it is fair to say that the rules proposed by our regulators are intended to maintain consistency with changes in other jurisdictions and prevent regulatory arbitrage. We do not believe that the rules are the result of a specific effort to solve a localized Canadian problem, but rather a preventative measure to ensure structural issues that have arisen elsewhere will not take root in Canada.
The issues around direct electronic access raised in the United States (who is accessing marketplaces directly, and how they are ensuring automated systems will not malfunction) are less of a concern in Canada. TMX rule 2-501 limits who is eligible to receive DEA access, restricting DEA to well-capitalized firms, or firms that are registered and regulated in certain other jurisdictions.
IIROC Notice 09-0081 addresses how automated systems should be managed to mitigate the risk of malfunctions. It requires brokers to manage the risk of electronic trading by clients in the same way that they manage the risk of their own electronic trading. This includes ensuring that automated risk filters are in place, that order flow from an automated system can be interrupted/switched off by the broker, and that strategies are tested prior to being deployed to market. These basic, principles-based protections have been effective at mitigating risk in Canada since well before the wave of automation hit our markets in 2008.
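A minimal sketch of such a pre-trade filter (the limits, field names, and thresholds are illustrative assumptions, not values mandated by any regulator) checks each order against size, notional, and price-collar limits, and includes the broker-controlled switch for interrupting flow:

```python
class PreTradeFilter:
    """Toy pre-trade risk filter: size, notional, and price-collar checks."""

    def __init__(self, max_shares, max_notional, price_band_pct):
        self.max_shares = max_shares
        self.max_notional = max_notional
        self.price_band_pct = price_band_pct
        self.enabled = True  # the broker's kill switch for this client's flow

    def check(self, qty, price, reference_price):
        if not self.enabled:
            return (False, "flow interrupted by broker")
        if qty > self.max_shares:
            return (False, "order size limit exceeded")
        if qty * price > self.max_notional:
            return (False, "notional limit exceeded")
        if abs(price - reference_price) / reference_price > self.price_band_pct:
            return (False, "price outside collar")
        return (True, "accepted")

f = PreTradeFilter(max_shares=50000, max_notional=1_000_000, price_band_pct=0.05)
print(f.check(1000, 20.00, 20.10))  # (True, 'accepted')
print(f.check(1000, 25.00, 20.10))  # (False, 'price outside collar')
```

The "direct and exclusive control" requirement discussed later means a broker must run a layer like this itself, even when the client operates its own.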
The proposed DEA rules are a movement away from the IOSCO principles-based approach that has traditionally been taken in Canada, towards a more prescriptive regime akin to Rule 15c3-5 introduced by the SEC in the United States this year. This builds consistency between the Canadian and American jurisdictions, which are so closely intertwined.
Automated pre-trade risk filters are in place for many broker-dealers. How difficult will this regulation be to implement?
Broker-dealers will need to monitor the proposed rules closely, particularly with regard to their Sponsored Direct Market Access (SDMA) clients. These clients have their own sophisticated automated risk management systems in place – as required by UMIR rules and, more importantly, as a result of their own risk aversion. They connect directly to exchanges to minimize latency. The DEA rule proposes to change this, in parallel to Rule 15c3-5 in the US, in that brokers will need to have “direct and exclusive control” over the risk filters on client flow; this means that a duplicative set of filters operated by the broker will have to be put in place.
In this case, Canadian brokers benefit from the earlier adoption of Rule 15c3-5 in the United States, where various technologies have been developed to meet SEC rules that went into effect in the summer of 2011. Depending on the needs of its client base, a Canadian broker can choose between several types of risk filter offerings operating in a latency range from the low milliseconds to the low microseconds. The only differentiator is cost, with a significant premium on the single-digit microsecond lowest latency offerings.
Generally, it is not economic for a Canadian broker to develop the ultra-low latency solutions in-house, and the Canadian broker community benefits from the availability of third party technologies developed to meet the US rules that came into effect earlier this year.
Corwin Yu, Director of Trading at PhaseCapital, sits down with FIXGlobal to discuss his trading architecture, the proliferation of Complex Event Processing (CEP) and why he would rather his brokers just not call.
FIXGlobal: What instruments does your system cover?
Corwin Yu: At the moment, we trade the S&P 500, and we have expanded that to include the Russell 2000, although not as individual instruments, but as an index. We also trade the E-Mini futures on the Russell 2000 and the S&P 500. We have done some investigation into doing exactly the same type of trading with Treasuries, using the TIPS indices, the TIPS ETFs and a few similar futures as well. We are not looking at expanding the equity side except to consider adding ETFs, indices, or futures on indices.
FG: Anything you would not add to your list?
CY: We gravitate to liquid instruments with substantial historical market data because we really do not enter into a particular trading strategy unless there is market data to do sufficient back-testing. Equities were a great fit because they have history behind them and great technology for market data; likewise for futures, where market data coverage has recently expanded. Options are a possibility, but the other asset class that is liquid yet not a good fit is commodities. We shy away from emerging markets that are not completely electronic and do not have good market data. While we have not made moves into the emerging markets, we know that some other systematic traders have found opportunities there.
FG: How much of your architecture is original and how often do you review it for upgrades?
CY: In terms of hardware, we maintain a two-year end-of-life cycle, so whatever we have that is two years old, we retire to the back-test pool and purchase new hardware. We are just past the four-year mark right now, so we have been through two hardware migrations. Usually this process is a wake-up call as to how technology has changed. When we bought our first servers, they were expensive four-core machines with a maximum memory of 64 GB. We just bought another system that can handle 256 GB through six-core processors. We are researching a one-year end-of-life cycle because two years was a big leap in terms of technology and we could have leveraged some of that a year ago.
J.P. Morgan’s Frank Troise sat down with FIXGlobal to chart the expansion of electronic trading tools available to the buy-side and point out which new tools will make the difference in the months to come.
In what way has the trader’s desktop improved?
Over the last few years, the biggest improvements have been the inclusion of more multi-asset class execution capabilities and the inclusion of additional analytics. Desktop trading platforms that support equities, options, futures and FX trading, with the ability to track all of those orders in the market and give aggregated profit and loss, are much more prevalent. More trader desktops incorporate pre-trade analytics measures, such as market impact estimates, as well as post-trade execution information.
What do your clients say they want most from their analytics?
Clients want a combination of real-time and post-trade analytics. Prior to starting the trade, clients want tools that help their investment decision process. Once an investment decision is made, pre-trade market impact and trade scheduling tools can help traders develop an implementation game plan. Through the course of the trade, clients like to see real-time analytics that can help them improve the performance of their trade; for example, abnormalities around volatilities and volumes. Post-trade, clients want performance reports measuring actual execution costs against various benchmarks on a daily, monthly, and quarterly basis.
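One common form of pre-trade market impact estimate is the square-root rule of thumb (a generic model with illustrative numbers, not J.P. Morgan's proprietary analytics): expected impact scales with the stock's volatility and the square root of order size relative to average daily volume (ADV).

```python
import math

def impact_bps(order_shares, adv_shares, daily_vol_bps, coefficient=1.0):
    """Square-root market impact estimate, in basis points."""
    return coefficient * daily_vol_bps * math.sqrt(order_shares / adv_shares)

# Quadrupling participation (1% -> 4% of ADV) only doubles expected impact.
small = impact_bps(order_shares=10_000, adv_shares=1_000_000, daily_vol_bps=150)
large = impact_bps(order_shares=40_000, adv_shares=1_000_000, daily_vol_bps=150)
print(round(small, 1), round(large, 1))  # 15.0 30.0
```

An estimate like this, shown pre-trade, is what lets a trader weigh a faster schedule (more impact) against a slower one (more alpha decay and timing risk).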
How does putting so much technology in the hands of the trader change the role of the broker? How does the broker add value in addition to the electronic tools?
In the electronic broker business, our value added comes in our role as execution consultant and our ability to educate clients on the use of pre- and post-trade analytics and execution tools. I look at the roles and responsibilities of the people on our electronic client trading desk as helping clients implement their investment ideas. When a client has a trade to execute, it is up to our team to educate that client on the tools they can use to put together a plan, present them with the tools to execute the trade and while they are executing the trade, provide information that can be used to improve their plan throughout the execution period.
After the trade is executed, we work with clients to evaluate how well they did against their plan and help them improve their trading process in the future. We focus on creating and enhancing client products continuously. Our goal is to make it easy for them to use analytics and execution tools to achieve best execution. The better we understand a client’s goals and objectives, the more we can collaborate with the client on custom solutions and training. Electronic trading products are very different from traditional equities execution capabilities. A key differentiating characteristic is that the products reside at the client site and are used there by the client. In the traditional model, virtually no broker technology-oriented product existed at the client site.
The communication mechanism for order delivery was the telephone and execution occurred in the broker/dealer environment. Electronic brokering is a very intrusive business. Our products exist in the client’s technology infrastructure. This has led to changing core competencies of brokerage firms. We now have to be experts at delivering products into the client site. This has implications on training and technology integration.
How does the electronic broker assist clients in locating liquidity, either through tools or the consulting process?
Liquidity has been, and continues to be, a top priority for clients. They have always come to brokers to find liquidity in as ‘quiet’ a way as possible. In today’s landscape, much of that liquidity exists in electronic form and is fragmented. The result has been a proliferation of tools (e.g., algos, routers) that help clients navigate liquidity pools to logically consolidate the fragmented liquidity. To assist in that process we have created a pool to concentrate order flow across various trading desks, retail segments of the broader J.P. Morgan Chase organization, transition management flow, and third party broker dealer flow. I refer to it as a centralized electronic merchandise hub.