Are the markets crumbling or multiplying? With Josephine Kim, Director, Asia Pacific Electronic Sales – Global Execution Services, Bank of America Merrill Lynch.
How is market fragmentation changing and developing across Asia? Japan had the first mover advantage in Asia, being the first market to welcome fragmentation, but it has not really blossomed compared to Australia, where we have seen a lot more volume growth on the new venues and execution channels. Hong Kong is quite interesting because most people really want to see fragmentation in that market, but there are limitations such as regulation and the proliferation of fees. Brokers do offer internal crossing engines, however, so a lot of clients are benefiting from those.
India is also interesting because it had two exchanges for years and recently got a third. We do not see that much activity, but at least it is creating a lot of buzz in terms of how fragmented the market is. It will be interesting to watch.
How are other Asian markets developing? It typically depends on how the exchanges are preparing themselves. If you look at Japan for example, they opened up to various alternative venues and were pretty open-minded in terms of sharing their liquidity with other independent venues. But then the exchanges decided to merge to be more competitive as they realised that liquidity is something they want to keep. Australia is forcing the exchanges to share liquidity with best execution rules.
Korea and Singapore are the two markets most likely to come next, although there are other markets, like Indonesia and Malaysia, where a few broker-dealers offer limited crossing engines. Korea is a very active market, especially for futures and options, so a lot of people are interested in trading there, but then again, it does have investor ID restrictions and is trying to implement a financial transaction tax, which is going to hinder the attractiveness of trading.
Singapore is slightly different because the market depth is not as attractive as Korea's. Singapore used to be a hub for US and European-based high-frequency trading firms, but it seems to be losing its ground as a hub. Having the longest trading hours in Asia may open more doors for investors from different time zones, but without the market depth it will still be a challenge for Singapore to attract investors and independent liquidity providers. Minimum crossing rules also draw mixed opinions: some believe they will improve control over market participants and reduce the toxicity of the pool, albeit at the cost of an overall reduction in actual crossing opportunities.
How does that variation across markets affect the trading environment? The buy-side used to choose an execution broker based on the level and quality of research and their trading ideas. Today, the buy-side tends to go with the broker that has the liquidity. Buy-side traders are often watching the market and their stock, so they can see who is on the order book panel, and they tend to route their entire orders to the brokers with the most liquidity. A lot of this change is tied to unbundling, but it also has to do with liquidity and facilitation, as they often find it difficult to trade when there is less liquidity available in the market.
Are changes such as CSAs enabling that unbundling and that separation between research and execution? Are these tools coming into existence to meet a desired change, or are they enabling that change? These tools are definitely opening up options for traders. Because of this policy, buy-side head traders have the independence to choose the best execution brokers and feel less obligated to trade with the best research providers.
On the sell-side everyone is becoming more liquidity sensitive. Buy-side trading instructions are becoming more complicated; the buy-side still want the baseline of a simple VWAP or POV as their first and second algos, but when liquidity comes in, they do not want to miss out on that opportunity, so a liquidity-seeking type of smart algo, combined with a base benchmark, is being used more commonly.
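As a simple illustration of the VWAP benchmark mentioned above, the calculation can be sketched as follows; the fills are invented numbers, not market data:

```python
def vwap(fills):
    """Volume-weighted average price over a list of (price, volume) fills."""
    total_value = sum(price * volume for price, volume in fills)
    total_volume = sum(volume for _, volume in fills)
    return total_value / total_volume

# Three hypothetical fills during the day:
fills = [(10.00, 100), (10.10, 300), (10.05, 100)]
print(round(vwap(fills), 4))  # 10.07
```

An algo targeting VWAP tries to slice its order so its own average fill price lands at or better than this figure for the interval.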
How are the liquidity profiles of those venues changing? Let’s use Australia as an example. It is mandatory there to provide best execution to the client. That means it is the broker’s responsibility to find the best execution price across dark and lit venues; it is not a choice anymore, it is an obligation. Because of that, we started looking at the quality of liquidity pools. The number of liquidity pools has gone up, with Chi-X Australia launching in 2011, and there are numerous exchange-provided dark and lit pools – see Table A-1. The quality of each venue has risen as various enhancements and improvements have come online. We now care more about where orders are getting crossed within the dark liquidity, whether they are crossed at mid or better, and whether there is any price reversion after the fill has been made. So, I think that quality is top of mind now.
If you look at the market share, we are still talking about a small portion of the pie – see Table A-2. One has to also have context of this and understand how dark liquidity is performing in the US and Europe – see Tables B1 and B2.
Ben Read, Electronic Sales, Merrill Lynch, examines the ongoing trends and potential pitfalls in electronic trading in Australia.
In the year ended July 31, turnover on the ASX had fallen 46% year on year, with eight down months out of the 12 (please see Diagram 1 on the next page). August provided some relief, with turnover up 23%, but global equity volumes continued to decline through the seasonally quiet summer months, driven by a continued rotation out of equities amidst global economic uncertainty – the European sovereign debt crisis, the US election and “fiscal cliff”, and Chinese growth moderation at the same time as a once-in-a-decade leadership change all weigh heavily.
Beyond the macroeconomic (cyclical) challenges of the past 18 months, there has been a significant structural shift in the Australian market. Most significantly, in executing orders on behalf of investors, participants now have two public, or lit, venues as well as exchange-run and broker dark pools to consider. Best execution obligations set out in the corresponding ASIC Market Integrity Rules bear resemblance to the European Union’s Markets in Financial Instruments Directive (MiFID). These rules require every market participant to have a policy in place that sets out the framework in which it will meet its best execution obligations to its clients. The similarity to MiFID lies in the principles-based nature of the best execution requirement, which allows each broker to decide how it determines the best outcome for clients.
This is driving electronic trading/execution to become a more significant part of how Australian markets trade. Globally, we have witnessed this over the past decade and recent data estimates that 75% of US and 55% of European volumes are now being executed electronically. In Australia, we estimate this figure to be closer to 35%; up significantly over the past 2-3 years, but still very low by global standards, which, to us, implies that there is still substantial growth ahead. ASIC has been following this shift towards electronic, or direct, execution and in its recent consultation paper (CP184) provided guidance and rules on automated trading. One point was to ensure that all participants maintain robust filters and controls across all platforms that access the market, with a key requirement to have a kill switch in place that can shut down parts of the execution infrastructure when behaviour outside accepted patterns is detected.
A kill switch can take two forms: it can either be software coded, or hardware-based. The former is a functionality that exists across the execution platform. It allows the trading desk, in combination with oversight functions, to quickly shut down either flow from a particular client, or a particular algorithm, that they believe is not behaving correctly. A hardware-based solution is more complex and involves the physical disconnection of parts of a market participant’s order routing platform from the exchange infrastructure. This is typically accomplished by closing or blocking network switches that exist between the two.
Also vitally important is the real-time monitoring that alerts trading desks to any form of execution occurring outside acceptable parameters. Most market participants with established electronic trading platforms have pre-trade filters based on a percentage of ADV, the value of single orders, total notional and net delta. Taking this further is the implementation of systems that can alert on abnormal patterns of messages, trades and realised profit and loss. Material deviations from historical patterns can significantly speed up the decision on whether to invoke a kill switch.
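The layered pre-trade checks described above (percentage of ADV, single-order value, total notional), combined with a software kill switch flag, might be sketched as follows. All class names, limits and figures here are illustrative, not any firm's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    price: float

@dataclass
class RiskLimits:
    max_pct_adv: float         # max order size as a fraction of average daily volume
    max_order_value: float     # max notional value of any single order
    max_total_notional: float  # max cumulative notional for the session

class PreTradeFilter:
    """Illustrative pre-trade gate; real platforms layer many more checks."""

    def __init__(self, limits: RiskLimits, adv_by_symbol: dict):
        self.limits = limits
        self.adv = adv_by_symbol
        self.total_notional = 0.0
        self.kill_switch_engaged = False  # flipped by desk/oversight functions

    def check(self, order: Order) -> bool:
        # A software kill switch blocks all flow once engaged.
        if self.kill_switch_engaged:
            return False
        notional = order.quantity * order.price
        adv = self.adv.get(order.symbol, 0)
        if adv and order.quantity > self.limits.max_pct_adv * adv:
            return False  # order too large relative to ADV
        if notional > self.limits.max_order_value:
            return False  # single-order value breach
        if self.total_notional + notional > self.limits.max_total_notional:
            return False  # cumulative notional breach
        self.total_notional += notional
        return True
```

A hardware-based kill switch, by contrast, sits outside any such code path: it severs network connectivity between the routing platform and the exchange.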
After the recent events in the US involving Knight Capital, SEC Chairman Mary Schapiro commented that “when broker-dealers with access to markets use computers to trade, trade fast, or trade frequently, they must check those systems to ensure they are operating properly.”1
In Australia, the investment community is supportive of such initiatives to ensure that overall market integrity and confidence is maintained and that investors and participants are protected from similar events occurring. In the immediate aftermath of Knight Capital, sell-side lines of business right across the industry conducted thorough reviews of their internal risk frameworks and there was plenty of dialogue with clients on what their trading limits were with different trading desks they faced. The majority of this work started before ASIC formally published CP184.
Bank of America Merrill Lynch’s Josephine Kim lays out the reasons why the close price benchmark is so important and how experienced traders can utilize technology to meet the benchmark.
Close Price Benchmark
Mutual Funds Interest in the closing price is due to the structure of the market. Mutual funds are marked at the end of the day at the closing price. This is the published net asset value (NAV): the mutual fund is priced once a day, and all purchases and sales of that mutual fund are marked at that price. This is a relic of the market structure of mutual funds. Mutual funds are not priced continuously throughout the day – buys and sells can be entered at any point during the day, but the price is set at the market close. If a mutual fund trades on any given day, the mutual fund trader is going to attempt to get the closing price if at all possible.
Transition Trades Another way in which a trader would try to beat the closing price is via a transition trade. This is when a pension fund moves assets from one manager to another by selling the assets in one manager’s holding in order to buy the assets for another. The pension fund may go to a specialist transition manager, who takes all the holdings from the current manager, sells them, uses the proceeds to buy the holdings for the new manager, and then passes those holdings back to the new manager. This process requires a mark for valuation, and the closing price is that mark.
Index Rebalancing The final type of trade that allows traders to beat the closing price is index rebalancing. Asset managers may opt for a guarantee on their trade. Brokers will bid aggressively for those trades with the same tradeoffs occurring. Index rebalancing trades have an additional nuance in that everyone in the market typically knows whether the street has to buy or sell a stock for the rebalance, so brokers will put out research ahead of the rebalance.
The brokers then have to make a tradeoff, knowing there is potential demand to buy or sell these stocks from many different buyers and sellers. Traders must decide whether to trade early because they know everyone is going to buy the stocks when the index weight goes up, or trade them later (or even the next day), because everyone is going to sell. In the last few years, more and more investors have the requisite information to make the right decision on rebalancing trades.
Implications of Missing the Close Price The implications of missing the close price benchmark are straightforward. If the trader purchases below the close price, the new investor pays a higher price (the published NAV) than the fund actually paid for the assets, which creates a transfer of wealth from the new purchaser to the existing shareholders of the fund. Conversely, if the fund is not able to beat the close price and the purchase requires funds in excess of the new investment, then the existing shareholders subsidize the purchase.
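To make that subsidy concrete, here is a hypothetical worked example (all numbers invented for illustration):

```python
# A new investor subscribes $1,000,000 at the published NAV (the close
# price benchmark), but the fund's trader executes the corresponding
# purchases at an average price 20 basis points above the close.
subscription = 1_000_000.0
slippage_bps = 20

# Extra cost of investing the new money at the worse price:
extra_cost = subscription * slippage_bps / 10_000  # dollars

# That cost is borne by the fund as a whole, so the existing
# shareholders effectively subsidize the new purchase.
print(f"Subsidy from existing shareholders: ${extra_cost:,.0f}")  # $2,000
```

The transfer runs the other way when the trader beats the close: the fund acquires assets for less than the NAV charged, benefiting existing holders.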
In a transition trade, missing the close price on the funds to sell and the funds to buy creates numerous problems. For transition trades, brokers will very often give an assessment of the expected range of slippage ahead of time, based on the liquidity of the portfolio they are trying to liquidate and the portfolio they are trying to invest in. The broker clearly hopes to land within that range. If they are outside that range, however, the transition manager might refund some of their fee, or even negotiate a profit share ahead of time to incentivize them to minimize slippage. Unless the transition manager is able to buy at a discount, missing the close price means a loss to the fund. This slippage can have major implications for the value of assets retained by the client.
Feargal O’Sullivan and Jamie Hill of NYSE Technologies discuss OpenMAMA, the open source Middleware Agnostic Messaging API they hope will expedite innovation in services, reduce vendor lock-in and minimize implementation time and cost.
Solving a Problem Choosing a market data vendor because of their API alone is not sound practice. The issue of how to come up with a standard way of accessing market data – one that allows clients to select a vendor for any range of reasons other than the API the vendor happens to offer – has been a struggle for a long time. Something that should be low on any decision-making tree has unfortunately tended to be much more important. There are a number of different consolidated market data vendors, including some obvious names like Thomson Reuters or Bloomberg, and there is also a range of direct feed or ticker plant vendors, where instead of going to a consolidator, feeds are accessed directly from an individual exchange.
In selecting a vendor, users must write all their code to suit that vendor’s particular way of accessing the data. Changing to a different vendor requires opening up the source code and altering everything to match how this other vendor wants to access the market data. With a consolidated feed for broad international access and a direct feed for low latency algo trading in US equities, for example, many users have to write according to two to four different APIs. This has been a significant problem for the industry and with OpenMAMA we are trying to drive the industry towards a standard.
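The idea of writing application code once against a neutral interface, with thin per-vendor adapters behind it, can be sketched as follows. This mirrors the concept behind a middleware-agnostic API, not MAMA's actual C interface; the vendor classes and the stubbed tick are placeholders:

```python
from abc import ABC, abstractmethod
from typing import Callable

class MarketDataSource(ABC):
    """Vendor-neutral interface; adapters hide each vendor's native API."""

    @abstractmethod
    def subscribe(self, symbol: str,
                  on_tick: Callable[[str, float], None]) -> None:
        ...

class VendorAAdapter(MarketDataSource):
    # In practice this would wrap vendor A's proprietary SDK.
    def subscribe(self, symbol, on_tick):
        on_tick(symbol, 101.5)  # stubbed tick for illustration

class VendorBAdapter(MarketDataSource):
    # A different vendor, same neutral interface.
    def subscribe(self, symbol, on_tick):
        on_tick(symbol, 101.5)

def run_strategy(feed: MarketDataSource):
    # Application code is written once against the neutral interface;
    # swapping vendors means swapping adapters, not rewriting the app.
    feed.subscribe("BHP.AX", lambda sym, px: print(sym, px))

run_strategy(VendorAAdapter())
```

Under this pattern, supporting a second consolidated feed or a direct exchange feed means adding one adapter rather than touching every application.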
User Base This API is an eight-year-old standard that was initially developed by NYSE Technologies as the Middleware Agnostic Messaging API (MAMA), and it is quite heavily deployed in the financial services industry; close to 200 clients already use this API in their custom applications, so today it has an established installed base. We have opened that up and made it a standard by taking the source code for the APIs these firms use today and providing it to The Linux Foundation, which physically hosts the code as a neutral body.
During this process we worked with multiple parties that would not ordinarily use our API. Since the launch of OpenMAMA on 31 October 2011, one of the key factors in this being taken seriously as an open installation has been getting the right level of adoption. Before we launched, we approached a number of customers, other vendors and competitors, from whom we established our launch partners: J.P. Morgan, Bank of America Merrill Lynch, Exegy, Fixnetix and EMC. These launch partners, along with NYSE Technologies, formed a steering committee to drive the direction and future of OpenMAMA.
From that point forth, each of the organizations on that committee has a stake in OpenMAMA. The API is open source under the LGPL 2.1 licence, so it is now owned by the open source community. With participation from Interactive Data, Dealing Object Technologies and TS-Associates as well, we now have a group ten strong, and it is a global mix comprising different industries. Whereas before the API was driven largely by NYSE Technologies and our commercial use cases, now it is being driven forward as an industry standard. The more people who adopt and participate, the higher the likelihood of achieving that.
Despite the rapid advances in sophisticated trading tools, Bank of America Merrill Lynch’s Anthony Victor argues that in times of volatility a knowledge of the basics has never been more important.
While market structure and technologies may have transformed, basic trading skills are still critical to success. You do not succeed in any career unless you learn the basics and build a solid foundation in the fundamentals. For a role in electronic trading, the basics include understanding trading mechanics, market structure and technology, as well as the platforms that clients use to trade. Those old monochrome Quotron machines that were prevalent on trading floors when I started my career are now on the trash heap, replaced by state-of-the-art technology, including touch-screen order management systems and workstations that supply news, market data, and analytics.
Since 2000, there have been some very dramatic changes in market structure such as decimalization, the advent of Reg. NMS and an increasing number of liquidity pools. Some of these changes resulted in a reduction of bid/offer spreads and a decrease in average trade size, which in turn, pushed market participants to use more advanced trading technology and ultimately algorithmic trading.
Algorithmic trading strategies, initially used by Portfolio Desks to manage large baskets of stocks, were eventually rolled out directly to the buy side. Trading no longer required multiple phone calls with instructions to execute an order. With a mere push of a button, these instructions could be sent electronically and trading goals could be efficiently realized. However, use of these tools still requires a good understanding of the basics and a team of support professionals that understand the nuances of how algorithmic strategies operate in the marketplace.
Within the last five years, Electronic Sales Trading (EST) desks have emerged on Wall Street to support the clients' electronic trading activity. Unlike traditional sales trading, which focuses on what clients are trading, electronic sales trading puts the emphasis on how they are trading. Most EST desks have evolved from a pure internal support role into a client-facing, direct coverage role that assesses a client’s performance via real-time benchmark monitoring and post-trade transaction cost analysis. Electronic Sales Traders need to understand market structure and their firm’s algorithmic offering (and that of the competition) to successfully support the trading platform.
While market structure, trading tools, and trading desk responsibilities have all evolved over time, basic trading skills are still critical to success. Part of that skill set includes the ability to maintain a disciplined approach to trading amidst a barrage of news, overall market fragmentation, and a huge volume of market data. Algorithms have assisted the buy-side in coping with the complexity of the marketplace, but the choice of strategy ultimately belongs to the trader.
In Q4 2008, when volatility (the amount of uncertainty or risk about the size of changes in a security's value) peaked and preyed upon market participants’ emotions, many traders moved to more aggressive electronic trading strategies. In that tough environment, traders migrated away from passive strategies, like VWAP and low-participation algorithms, to more aggressive liquidity-seeking strategies, including an increased use of ‘dark pool’ aggregators. In a more volatile environment, traders felt pressure to make a stand or risk higher opportunity costs.
However, sometimes intuitive approaches do not work as expected. When analyzing the effects of the volatile market and looking at the slippage – the difference between the execution price and the price at the time of order receipt (arrival price slippage) – these aggressive strategies proved less successful than passive strategies, primarily due to the effects of volatility on spreads and depth of book. During that period, our research shows that S&P 500 spreads widened an average of 72% and book depth decreased 42% compared to Q1. Wide spreads and decreased book depth created a treacherous environment for aggressive, liquidity-seeking strategies, but favored more passive strategies.
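Arrival price slippage as defined above can be computed as a signed figure in basis points, so that a positive number means a worse outcome than the arrival price for either side of the trade; a minimal sketch with invented prices:

```python
def arrival_slippage_bps(arrival_price: float, exec_price: float, side: str) -> float:
    """Slippage versus the price at order receipt, in basis points.
    Positive = worse than arrival; negative = price improvement."""
    if side == "buy":
        return (exec_price - arrival_price) / arrival_price * 10_000
    # For a sell, executing below arrival is the adverse case.
    return (arrival_price - exec_price) / arrival_price * 10_000

# A buy filled at 50.10 against a 50.00 arrival price: 20 bps of slippage.
print(arrival_slippage_bps(50.00, 50.10, "buy"))
```

Averaging this measure across many orders is one way an aggressive strategy's cost can be compared against a passive one's in a given volatility regime.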
The FPL Americas Electronic Trading Conference, for those in electronic trading, is always a year-end highlight and this year was no exception. Sara Brady, Program Manager, FPL Americas Conference, Jordan & Jordan thanks all the sponsors, exhibitors and speakers who made this year’s conference a huge success.
The 6th Annual FPL Americas Electronic Trading Conference took place at the New York Marriott Marquis in Times Square on November 4th and 5th, 2009. John Goeller, Co-Chair of the FPL Americas Regional Committee, aptly set the tone for the event in his opening remarks: “We’ve lived through a number of challenging times… and we still have quite a bit of change in front of us.” After a difficult year marked by economic turmoil, the remarkable turnout at the event was proof that the industry is back on its feet and ready to move forward with the changes to the electronic trading space set forth in 2009.
Market Structure and Liquidity Two topics clearly stood out as key issues that colored many of the discussions at the conference: regulatory impact on the industry, and market structure as influenced by liquidity and high-frequency trading. An overview of industry trends demonstrated that the current challenges facing the marketplace are dominated by these two elements. Market players are still trying to digest the events of 2008 and early 2009, adjusting to the new landscape and assessing the changing pockets of liquidity amidst constrained resources and regulatory scrutiny. The consistent prescription for dealing with this confluence of events is to take things slowly and understand any proposed changes holistically before acting on them and encountering unintended consequences.
The need for a prudent approach towards change and reform was expressed by many panelists, including Owain Self of UBS. According to Self, “Everyone talks about reform. I think ‘reform’ may be the wrong word. Reform would imply that everything is now bad, but I think that we’re looking at a marketplace which has worked extremely efficiently over this period.”
What the industry needs is not an overhaul but perhaps more of a fine-tuning. Liquidity is one such area that needs carefully considered fine-tuning. Any impulsive regulatory change to a pool of liquidity could negatively impact the industry. The problem is not necessarily with how liquidity is accessed, but with the lack of liquidity that results in the downward price movements that marked a nightmarish 2008. Regulations against dark liquidity and the threshold for display sizes are important issues requiring serious discussion.
Rather than moving forward with regulatory measures that may sound politically correct, there needs to be a better understanding of why this liquidity is trading dark. While there is encouraging dialogue occurring between industry players and regulatory bodies, two things are for sure. We can be certain that the evolution of new liquidity venues is evidence that the old market was not working and that participants are actively seeking new venues. We can also be assured that the market as a messaging mechanism will continue to be as compelling a force as it has been over the last two decades.
Risk One of the messages that the market seems to be sending is that sponsored access, particularly naked access, is an undesirable practice. Presenting the broker-dealer perspective on the issue, Rishi Nangalia of Goldman Sachs noted that while many agree that naked sponsored access is not a desirable practice, it still occurs within the industry. A panel on systemic risk and sponsored access identified four types of the latter: naked access, exchange sponsored access, sponsored access via broker-managed risk systems (also referred to as SDMA or enhanced DMA) and broker-to-broker sponsored access.
According to the U.S. Securities and Exchange Commission (SEC), the commission’s agenda includes a look specifically into the practice of naked access. David Shillman of the SEC weighed in on the commission’s concern over naked access by noting, “The concern is, are there appropriate controls being imposed by the broker or anyone else with respect to the customer’s activity, both to protect against financial risk to the sponsored broker and regulatory risk, compliance with various rules?” Panelists agreed that the “appropriate” controls will necessarily adapt existing rules to catch up with the progress made by technology.
On October 23, NASDAQ filed what it believes to be the final amendment to the sponsored access proposal it submitted last year. The proposal addresses the unacceptable risks of naked access and the questions of obligations with respect to DMA and sponsored access. The common element of both approaches is that both systems have to meet the same standards of providing financial and regulatory controls. Jeffrey Davis of NASDAQ commented on his suggested approach: “There are rules on the books now; we think that they leave the firms free to make a risk assessment. The new rules are designed to impose minimum standards to substitute for these risk assessments. This is a very good start for addressing the systemic risk identified.”
These steps may be headed in the right direction, but are they moving fast enough? Shillman added that since sponsored access has grown in usage there are increasing concerns and a growing sense of urgency to ensure a commission level rule for the future, hopefully by early next year. This commission proposal would address two key issues – should controls be pre-trade (as opposed to post-trade) and an answer to the very important question, “Who controls the controls?”
Has the industry found its latest villain in the form of dark pools? Not so, argued a group of traditional and alternative trading venue operators over dinner in Singapore last week. Dark pools are nothing new; they’re just finding their feet in Asia’s rugged exchange landscape. FIXGlobal’s Becky Merrett took a look at developments in the industry.
Caught in the middle of a seasonal Singapore early evening downpour, a group of regional specialists make the dash from their taxis to the warmth of an Italian restaurant. The roll-call reads like the Who’s Who of trade execution – Singapore Exchange (or SGX as it is most commonly known), BlocSec, Liquidnet, Chi-X, ITG, Bank of America Merrill Lynch and Credit Suisse. Have dark pools taken over from hedge funds as the baddie-du-jour in Asia? Should dark pools and exchanges compete, cooperate or co-exist?
Before the first cork was pulled opinions were flying, fuelled by the discussion’s volunteer ‘devil’s advocate’ in the form of Credit Suisse’s James Rae, (also Co-Chair of the FPL Singapore Working Group). “What is the purpose of a dark pool versus a traditional exchange? Why do we need alternative venues in Asia? And how do they interchange?” he asked.
“Dark pools have been around for a number of years,” argued Bank of America Merrill Lynch’s Mark Wheatley, fresh off a plane from Japan. “They’re not new in Asia. It’s mostly an extension of the internal broker systems. The alternative trading systems (ATS) we see now in Asia are the industry responding to the demands of their clients by creating a more formalised system.”
Discussing whether ATS should or should not exist seemed pointless, as I toyed with my antipasti. “These are market and client-driven initiatives. All markets evolve, and financial markets evolve faster than most. To try and backtrack is both unwanted and unwarranted. Judging from the response from the markets in the US and Europe, ATS are here to stay,” Chi-X’s Rob Rooks stated emphatically.
Competitors or complementary? Before the starters were finished, we’d killed the notion that ATS were going to slip quietly away into the night. Instead the conversation turned to the respective roles of traditional exchanges versus off-exchange platforms.
Liquidnet’s Greg Henry weighed in. “An exchange is about price discovery, it’s about listing and taking companies to market. Our focus is on efficiency, latency, liquidity and best execution.”
Unsurprisingly, it was a common view among the alternative venues around the table. Trading activity on the NYSE, they argued, now accounted for less than 30 percent of revenue. The role of traditional exchanges was increasingly focused on listing and sourcing capital. “The stringent regulations on listing, the information required, it provides a comfort blanket for investors,” Henry argued.
“At the end of the day, we all have to create the structure that works for our clients,” Henry concluded.
The right structure for your client? It was a theme that emerged again and again over the evening. The overriding – although not unanimous – feeling was that dark pools catered for one kind of investor, while exchanges provided security and solace for others.
“We don’t want to list organisations. The compliance involved in the process doesn’t fit with our business model. We’re more interested in a symbiotic relationship between ATS and exchanges. We attract different investors with different strategies. The investors trading through our venue are more likely than not to only hold a position for 10 minutes or less,” explained Rooks.
It was time for our lone exchange operator to pitch in. “We’re comfortable with the competition. Although, if we see a proliferation of venues, such as we’ve seen in the US, this is not going to help the region,” said SGX’s Bob Caisley. “We feel that the best way to move forward is to understand what dark pools offer and to let our clients access this technology,” he added.
It was an understandable position, given the recent announcement of a joint venture between SGX and the multilateral trading facility Chi-X. The deal, which was inked in August, aims to launch the Chi-East non-displayed liquidity pool by June 2010. Clearly the move has raised the stakes, as it is the first time in Asia that a dark pool has had the backing of a regional exchange.
Timothy Furey, Goldman Sachs, Neal Goldstein, Nomura and John Goeller, Bank of America Merrill Lynch, shed light on the process of managing risk in electronic trading.
At the start of this year, FPL announced the completion of an initial set of guidelines, which recommends risk management best practices in electronic trading for institutional market participants. In the third quarter of 2010, FPL launched a group to raise awareness regarding the implications of electronic trading on risk management and to develop standardized best practices for industry consideration. Over the last few months, the group, which consists of a number of senior leaders in electronic trading from the major sell-side firms, has been working on developing this set of guidelines to encourage broker-dealers to incorporate a baseline set of standardized risk controls.
The objective of the guidelines is to provide information on risk management and to encourage firms to incorporate best practices in support of their electronic trading platforms. In today’s volatile marketplace, the automation of complex electronic trading strategies increasingly demands a rational set of pre-trade, intra-day and pattern risk controls to protect the interests of the buy-side client, the sell-side broker and the integrity of the market. Applying electronic order risk controls prevents situations where a client, the broker and/or the market can be adversely impacted by flawed electronic orders.
The risk controls included in the guidelines cover electronic orders delivered directly to an algorithmic trading product or to a Direct Market Access (DMA) trading destination. These recommended controls provide the financial services community with a set of suggested guidelines that systematically minimize the inherent risk of executing electronic algorithmic and DMA orders.
In what area are sell-side and buy-side firms’ risk controls most in need of improvement?
Timothy Furey, Managing Director, Goldman Sachs and FPL Risk Management Committee Co-Chair: One of the observations coming from the FPL risk sessions was that the buy-side and sell-side had really given considerable thought to their own individual firm’s risk controls. That said, both the sell-side and the buy-side should continue to focus on pulling together a standard, consistent base set of controls that their respective firms can reasonably implement. Therefore, it is more a question of standardization than a need for specific improvement.
John Goeller, FPL Americas Regional Committee Co-Chair and Managing Director, Global Execution Services, Bank of America Merrill Lynch: This effort was not necessarily to address an apparent deficiency in how the buy-side or the sell-side handles risk management, but to codify a set of best practices for all firms to use. It was generally accepted when we started this process that all firms implement some level of risk controls around their business. Our goal was to identify the most common ones and ensure that we have a base set of controls that all firms can implement.
Neal Goldstein, Managing Director, Nomura Securities International and FPL Risk Management Committee Co-Chair: It is important for the buy-side community to recognize that their efforts to implement risk management controls for electronic trading will be more effective when made in collaboration with their sell-side executing brokers. For algorithmic and conventional (low-frequency) DMA orders, the first line of defense should be the risk controls incorporated within the buy-side OMS/EMS. The most effective risk control is to prevent a questionable order from leaving the buy-side OMS/EMS in the first place. A specific factor that the buy-side should be looking at more closely is the impact a given order has on available liquidity. While the order validation employed by many buy-side clients accounts for notional value and order quantity, another factor that needs more consideration is the Average Daily Volume (ADV) during the trading interval. Creating an order whose participation rate is excessive relative to ADV for a given interval can have a significant adverse impact on execution price and algorithmic performance, particularly for illiquid names.
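The baseline validation described above can be illustrated with a minimal sketch. All names, limits and the `Order` type here are hypothetical examples, not part of the FPL guidelines or any firm's actual implementation; the point is simply that notional, quantity and ADV-participation checks can be applied before an order leaves the OMS/EMS.

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int
    price: float

def validate_order(order, max_notional, max_quantity,
                   interval_adv, max_participation):
    """Pre-trade checks on a single order.

    interval_adv     -- expected volume for the trading interval (shares)
    max_participation -- maximum allowed fraction of interval_adv
    Returns (ok, list_of_failed_check_names).
    """
    checks = {
        "notional": order.quantity * order.price <= max_notional,
        "quantity": order.quantity <= max_quantity,
        # Reject orders that would consume too large a share of
        # the volume expected over the interval.
        "participation": order.quantity <= max_participation * interval_adv,
    }
    failed = [name for name, ok in checks.items() if not ok]
    return (not failed, failed)
```

For example, a 50,000-share order in a name expected to trade 200,000 shares over the interval would pass notional and quantity limits yet still be rejected on participation at a 10% cap, which is exactly the liquidity-impact case the order quantity and notional checks alone would miss.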
What role, if any, should the exchanges play in implementing risk controls?
John Goeller: Most exchanges provide technology solutions for risk management (in certain situations their use is mandatory). In some cases, these tools are optional and only apply when accessing a particular exchange. Regardless of whether a firm is using exchange-provided, home-grown, or vendor-supplied tools, it can still leverage our efforts to assess whether those tools implement industry best practices.
Bank of America Merrill Lynch’s James Wardle takes a look at adverse selection in public dark pools.
Dark pools provide sources of non-displayed liquidity, facilitating anonymous matching between counterparties, which helps reduce market impact costs and minimise information leakage. The market share of pan-European dark volume has risen dramatically over the past few years, hitting a year high of 2.5% in May 2010 (see Figure 1). As executed volumes in the dark continue to rise and high-frequency volumes increase, uncertainty over fill quality is at the front of everyone’s mind: at what cost does accessing dark liquidity come?
High-frequency (HF) participants (e.g. hedge funds, market makers, etc.) have an investment horizon that is typically much shorter than that of the traditional long-only institution. It is the coming together of these two distinct flows that exacerbates the occurrence of adverse selection. HF traders and other market participants use short-term alpha forecast models to try to execute opportunistically at temporary lows for buys and highs for sells; the opposing counterparty may therefore be liable to early execution at local price maxima (minima) for buys (sells) and thus becomes the victim of adverse selection (see Figure 2).
Adverse selection can also arise from gaming. This is where the presence of a large block in the dark is detected by a number of ping orders (small orders looking for size), after which the informed trader waits for a temporary adverse price spike before sending a large order to consume the liquidity found in the dark. The informed trader thereby obtains size at a price favourable to them, at the expense of the uninformed trader, who again becomes the victim of adverse selection. Across large orders, the cost of adverse selection can add up and seriously damage returns.
Post-trade, adverse selection can be identified in two main ways: measuring the performance of the fill against short-term price movements, and looking for reversion patterns post-fill. Short-term price movements are well captured by the Time-Weighted Average Mid-price (TWAM) measure, which takes the time-weighted average mid-price over the T seconds before the fill and the T seconds after the fill. Varying the TWAM time-frame helps us to identify any executions that may have occurred at temporary price spikes. A positive return of the fill price to TWAM implies we have filled at a price better than short-term price movements (positive selection), and a negative return implies that we have filled at a worse price (negative, or adverse, selection).
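The TWAM benchmark can be sketched in a few lines of Python. The quote format and function names below are illustrative assumptions, not the author's implementation; time-weighting is approximated by holding each quote's mid-price until the next quote arrives.

```python
def twam(quotes, fill_time, window):
    """Time-weighted average mid-price over [fill_time - window, fill_time + window].

    quotes -- list of (timestamp, bid, ask), sorted by timestamp.
    Each quote's mid is assumed to prevail until the next quote.
    """
    start, end = fill_time - window, fill_time + window
    total, weight = 0.0, 0.0
    for i, (t, bid, ask) in enumerate(quotes):
        t_next = quotes[i + 1][0] if i + 1 < len(quotes) else end
        # Clip this quote's lifetime to the TWAM window.
        lo, hi = max(t, start), min(t_next, end)
        if hi > lo:
            mid = (bid + ask) / 2.0
            total += mid * (hi - lo)
            weight += hi - lo
    return total / weight if weight else float("nan")

def fill_vs_twam_bps(fill_price, side, quotes, fill_time, window):
    """Signed return of the fill relative to TWAM, in basis points.

    Positive = filled better than short-term price movements
    (positive selection); negative = adverse selection.
    """
    ref = twam(quotes, fill_time, window)
    sign = 1.0 if side == "buy" else -1.0
    return sign * (ref - fill_price) / ref * 1e4
```

Sweeping `window` over a range of values (e.g. 1s to 60s) reproduces the idea in the text of varying the TWAM time-frame to expose fills that occurred at temporary price spikes.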