Capital Group’s Brian Lees is driving efforts to ask more questions of brokers, and for more data on where an order is shown before it executes, but can the buy-side handle the resulting deluge?
The current work you are doing on venue reporting analysis
Our first push was simply to try to collect information about ‘where’ we were executing and a little bit about ‘how’ we were executing, namely, did we post or did we take liquidity. Having done that, the question was where do we go from there? As a result, the topic of requesting more data on where we didn’t execute and what order types were used started to be raised by some representatives on the FPL Americas Buy-Side Working Group. Some participants had already started down this road with brokers, asking post-trade for information about where the algorithms sprayed their orders out to, what types of orders were placed on exchanges and which exchanges they were on. That’s where the conversation began, and that’s why we reached out to Jeff Alexander and Linda Giordano, because Barclays had already spearheaded this conversation.
What we are looking to achieve, either in real time or post-trade, is to standardise a format for brokers to tell us how our order interacted with the market: when the order was placed, what order types were used, where it was placed in the markets and whether or not we got hits. The concern is not so much whether we can get the information, because if we sign enough non-disclosure agreements we can get it from the brokers. Some brokers are concerned about that information getting out and somebody reverse-engineering their algorithms, but from the buy-side perspective, I think the biggest concern is whether we can manage the volume of data that we would receive.
The resources to store and analyse data and make some sort of good use of it
With the original data that we were getting, on where the execution took place, we talked a lot with smaller firms who were using TCA vendors to help them analyse this information. If we went a step further with this type of information, the brokers would not want us sending it out to TCA firms, because it shows the methodology behind how their algorithms behave. I was in New York several weeks ago and took the opportunity to meet up with Jeff and Linda. We invited Jeff to join one of our conference calls for the buy-side committee, which he did, and he talked about what they’ve been proposing. He showed proposals both for the real-time collection of data via FIX messages, in fact proposing a whole new FIX message to be created for this purpose and sent in real time, and, alternatively, for a standardised format for collecting the information post-trade which, as a spreadsheet, would tell us what we want to see. We’re trying to standardise how you ask for the data and what format it will be in, by creating best practices for how to get the data from the brokers. That way the brokers don’t have to keep coming up with a different format for every client that asks for it. The best practices do specify that ISO MIC codes are the standard for identifying the exchange you executed on, but we said nothing about what you should do with the data once you get it.
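To make the idea of a standardised post-trade venue report concrete, the sketch below aggregates a hypothetical broker spreadsheet by venue MIC. The column names, liquidity flag values and CSV layout are illustrative assumptions only; the actual format is whatever the FPL best practices and each broker agree.

```python
import csv
from collections import defaultdict

# Hypothetical column layout for a broker's post-trade venue report
# (illustrative only; the FPL best practices define the real format):
# parent_id, venue_mic, order_type, liquidity_flag (ADDED/REMOVED), placed_qty, filled_qty

def summarise_venue_report(path):
    """Aggregate placements and fills per venue MIC so a desk can see
    where orders were shown and how often they actually executed."""
    stats = defaultdict(lambda: {"placed": 0, "filled": 0, "added": 0, "removed": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["venue_mic"]]          # ISO 10383 MIC, e.g. XNYS
            s["placed"] += int(row["placed_qty"])
            s["filled"] += int(row["filled_qty"])
            if row["liquidity_flag"] == "ADDED":
                s["added"] += int(row["filled_qty"])
            else:
                s["removed"] += int(row["filled_qty"])
    for mic, s in sorted(stats.items()):
        fill_rate = s["filled"] / s["placed"] if s["placed"] else 0.0
        print(f"{mic}: placed={s['placed']} filled={s['filled']} "
              f"fill_rate={fill_rate:.1%} added={s['added']} removed={s['removed']}")

# summarise_venue_report("venue_report.csv")
```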
Exchange involvement in the conversation
We did talk to some exchanges when we were first trying to standardise how to identify the venues, because when we first standardised on MIC codes they did not cover all the exchanges; not all of them had registered with the ISO organisation, and we wanted them to.
We had a little trouble differentiating the dark order books from the lit order books at exchanges that have both. These exchanges consider themselves a hybrid book, and they didn’t want to be known as two different things, and we didn’t have a way to differentiate the dark and the lit flow without introducing yet another FIX tag. That back and forth fed into the registration authority’s decision to come out with the new market segment concept, under which an exchange can be defined with child MIC codes that differentiate different segments of the market. We’re beginning conversations with exchanges about this topic, but that’s the extent to which we’ve had any discussion with them.
Broker willingness to participate in the process
With the first half of this, just getting the information about where you executed, the brokers didn’t have any problem, because it’s public record once it executes. When we started talking about the more detailed reporting, they did raise concerns about the information being sent out, and asked for NDAs so that you, as a client, are not going to send the data out to a third party. But because other firms had already started down this road, we talked about the purpose of this, which is just to have someone looking over their shoulder to make sure that they are acting in the best interest of the client and not potentially favouring rebates over best execution; they can’t really argue with that logic. Somebody should have some oversight as to whether or not the right decisions are being made.
Asia’s market structure creates demand for increasingly granular trading information – as Kent Rossiter, Head of Asia Pacific Trading at Allianz Global Investors, and Michael Corcoran, Managing Director at ITG, discuss, FIX can help.
Asia Pacific faces different liquidity challenges to other regions, particularly given that spreads are often much wider and are therefore an even more significant contributor to overall trading costs (see chart). As the trading environment evolves in the region and the focus on managing costs grows, the requirement for transparency and feedback on trading increases. This is happening in parallel with the emergence of new trading venues in the region, particularly dark pools. Buy-side traders now want a greater level of detail on their dark pool fills to help them understand the behavior of their orders and manage their execution venues proactively to get the best trading result.
Kent Rossiter heads up the Asia Pacific trading desk of Allianz Global Investors and is constantly looking for ways to improve the efficiency of the firm’s process and minimise the costs of trading. From his perspective, while post-trade TCA is now well established, a particular growth area is the requirement for more detailed data on a shorter timeframe. He explains: “We as buy-side traders are now trading an increasing amount of our orders ourselves using the electronic tools available, and when we do so we want more granularity and data fed back to us: which venues are our orders being executed in, at what price, and how aggressively. We want information that helps us adjust strategies on the fly for better trading outcomes, or quickly review the results so we can manage our future performance.”
One result of this is new demand in the region for analysis of maker/taker indicators on orders so that a trader can identify how often they are crossing the spread to find liquidity. Allianz Global Investors has been working with ITG and other brokers in the region to implement support of maker/taker analysis to help the trading desks improve their insight into market conditions and get more transparency into the behavior of their orders in dark venues.
Understanding Maker/Taker
Understanding whether an order is making or taking liquidity is important, particularly in wide-spread environments such as many of the Asian markets. Michael Corcoran, Managing Director of ITG, says: “Traders want to know instantly whether they are providing liquidity or taking it, instead of retrospectively having to compare fills and timestamps manually against what the market was trading at. This can be very useful information to help them adjust the trading strategy in real time to the market conditions and the liquidity available. It can also help determine what kind of ‘throttle’ they should put on their strategy or their algo to find the right level of aggressiveness for the orders they are working. In addition, it can be a very valuable tool for sell-side firms, helping to refine the development and rules of algorithmic strategies and improve strategic ideas that will work for certain clients or order types.”
This is of growing relevance in a multi-venue environment, for example in Asia where over the past few years a lot more broker dark pools have been developed. Many buy-side firms now choose to use a dark aggregator to help improve their efficiency in accessing multiple venues, and here some kind of maker/taker liquidity analysis can be a helpful data point for assessing the type of outcome a trader is getting in those pools. Corcoran explains: “Both ITG as a dark aggregator, and our buy-side clients themselves, want to understand whether orders are consistently making or taking liquidity in a specific dark venue so that the impact can be assessed – for example if our client’s orders always take liquidity in a certain venue we would review that to understand why. If we can pass that data directly back to the clients they can then make a decision about whether they want to be removed from that venue or change the distribution of their order flow across different pools. Likewise, if we see orders taking liquidity and then see an unexpected change in the stock’s trading profile, this can be a useful warning indicator about the participants in a specific pool.”
FIX Tag 851 – a Potential Solution
A specific FIX tag, 851, or Last Liquidity Indicator, has been developed by FIX Protocol Ltd (FPL) as an identifier of maker/taker behavior. The US appears to have the most established support of liquidity-indicating tags, with exchanges able to pass the data back to brokers and most of those brokers able to pass it on to clients. In Europe, the large exchanges and brokers can likewise support this, although support is thinner among mid-sized and smaller brokers.
However, in Asia the tag is sparsely supported, if at all, by the exchanges, alternative lit trading venues and many of the broker dark pools. Firms therefore have to come up with interpretive solutions and workarounds to give their buy-side clients a higher level of detail and transparency on their trading, particularly in dark pool aggregation.
Rossiter would prefer an industry-wide approach to improving transparency and the availability of maker/taker data, one which includes vendors, brokers and, most importantly, the exchanges: “Typically the actual FIX tag for this information is supposed to be generated by the exchange or trading venue, and it is passed to the brokers, who need to be able to identify and accept that tag and then pass it into the vendor EMS or OMS platform that the client is using. So there are a number of parties within the workflow who are affected and they need to collaborate to bring in changes. An industry-wide adoption of the relevant FIX tag would definitely be a good solution.”
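Where the tag is supported, consuming it is straightforward. The sketch below is a minimal illustration of tallying filled quantity by LastLiquidityInd (tag 851) and venue (LastMkt, tag 30) from raw execution reports; it assumes SOH-delimited FIX strings are already to hand, whereas a production desk would read these fields from its FIX engine or EMS rather than splitting strings.

```python
SOH = "\x01"

# Standard LastLiquidityInd (tag 851) values: 1 = Added, 2 = Removed, 3 = Routed Out
LIQ_LABEL = {"1": "added", "2": "removed", "3": "routed_out"}

def parse_fix(msg):
    """Split a raw SOH-delimited FIX message into a tag -> value dict."""
    return dict(field.split("=", 1) for field in msg.strip(SOH).split(SOH) if field)

def maker_taker_totals(exec_reports):
    """Tally filled quantity by liquidity indicator and venue across execution reports."""
    totals = {}
    for raw in exec_reports:
        f = parse_fix(raw)
        if f.get("150") not in ("1", "2", "F"):   # ExecType: partial fill, fill, trade
            continue
        venue = f.get("30", "UNKNOWN")            # LastMkt, ideally an ISO MIC
        label = LIQ_LABEL.get(f.get("851"), "unreported")
        qty = float(f.get("32", 0))               # LastQty
        totals.setdefault(venue, {}).setdefault(label, 0.0)
        totals[venue][label] += qty
    return totals
```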
Carlos Oliveira, Electronic Trading Solutions at Brandes Investment Partners examines the process of choosing a TCA provider, and the role of FPL.
We use Markit’s Execution Quality Manager (formerly known as QSG) for equity trading TCA. Our decision to switch providers was based on increased algorithm usage, a desire for more functionality, greater execution transparency and most importantly, the availability of more granular data for analysis via FIX.
We FTP our data daily and the results are available to us no later than US market open the next day. Trades are reviewed against traditional and custom benchmarks. We grant access to every trader and risk member, so that they can construct their own views as desired. Typically on a quarterly basis, we conduct our own studies and adapt broker studies to better understand the impact of our orders.
The implementation process
We evaluated four providers before making our final decision. We wanted a flexible platform that would accommodate self-service custom reporting needs to the greatest extent possible; minimal ongoing maintenance or upgrades requiring internal resources; and flexibility on custom solutions, such as the proper measurement of our ADR creation activity.
One vendor offered a very rich solution that was beyond our needs. For two others, we were not comfortable with the process for submitting data and how much work we would need to do internally. A key determinant was the overall level of commitment to the implementation, which we concluded Markit’s Managing Director Tim Sargent clearly demonstrated. It took us roughly two months to solidify the extract process and we went live on January 1, 2011.
TCA has become a key component of our trading process and we continue to realise value, primarily for post-trade at the moment. The value comes from the constant learning about our orders, what has worked well or not, and the adapting and improving of trading.
The large amount of data to analyse can be overwhelming at first and, if one is not careful, easily misinterpreted.
Frequent and honest dialogue with the vendor and the traders, as well as tapping other sources of knowledge (i.e. broker TCA contacts and industry publications), is key to a successful implementation. Many reports went through several iterations, sometimes a quarter or two apart, before we got them to a meaningful and actionable state.
To avoid too one-sided a perspective, we often compare broker-provided TCA reports with our vendor’s. This helps the dialogue with both the brokers and the vendor – it keeps both parties engaged and attentive.
The role of FPL
Our interaction with FPL began with the TCA implementation.
In late 2010, in conferences as well as in industry press, many parties were encouraging the buy-side to gain a better understanding of broker SOR practices and where the orders were getting executed, but with no actionable recommendations outside a specific platform. Being broker-neutral, the FIX execution venue reporting best practices proposed in early 2011 by the FPL Americas Buy-side Working Group helped us to move forward with this goal in the TCA platform. FPL Membership has enabled further contact with other buy-side firms and knowledge sharing not available otherwise to a smaller firm.
We started by asking for Tag 30, LastMarket. Responses to the data request varied greatly across brokers and regions. Correspondence spanned many months and contacts, particularly when we asked for MIC codes as opposed to proprietary values. We understand the queue priorities of brokers’ systems and the demands of larger clients, and are very appreciative of what they have done thus far.
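When brokers do send proprietary venue codes in Tag 30 instead of MICs, the receiving desk ends up maintaining its own translation. The sketch below shows one way to normalise such values; the mapping entries are hypothetical examples and would have to be agreed bilaterally with each broker.

```python
# Hypothetical mapping from broker-proprietary LastMarket (tag 30) values to ISO 10383
# MIC codes; the values a given broker actually sends vary by firm.
PROPRIETARY_TO_MIC = {
    "N": "XNYS",     # NYSE
    "NQ": "XNAS",    # Nasdaq
    "ARCA": "ARCX",  # NYSE Arca
}

def normalise_last_market(tag30_value):
    """Return an ISO MIC where possible, otherwise flag the value for follow-up."""
    value = tag30_value.strip().upper()
    if len(value) == 4 and value.isalnum():
        return value                              # already looks like a MIC (heuristic)
    return PROPRIETARY_TO_MIC.get(value, f"UNMAPPED:{value}")
```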
Some of our broker relationships have been exceptionally supportive in this effort, leading to enhanced dialogue on routing practices and more meaningful, targeted market structure content calls. Though not perfect, it is a significant improvement from just a year ago.
Ideally we would like to move forward and obtain data for Tag 851, but we are very much aware of the mapping challenges from exchanges to the brokers and to the OMS/EMS systems. We tabled this for 2012, but plan on revisiting it in 2013.
Huw Gronow, Director, Equities Trading, and Mark Nebelung, Managing Director of Principal Global Investors, make the case that TCA should be part of pre-trade, intra-trade and post-trade analysis.
Transaction Cost Analysis (TCA) has evolved significantly with the advent of technology in trading, and with it the ability to capture incrementally higher quality data. Historically the preserve of compliance departments, which examined explicit costs only as a way of governing portfolio turnover, this evolution now provides institutional asset managers with several opportunities: the ability to quantitatively assess the value of the trading desk, and the tools to form implementation strategies that improve prioritisation, reduce trading costs and therefore improve alpha returns to portfolios.
Cost analysis models, methods and techniques have blossomed in this environment, propagated not only by technological advancements but also by the explosion of data available in modern computerised equity trading.
The benefits of applying cost analysis to the execution function are manifold. It empowers the traders to make informed decisions on strategy choice, risk transfer, urgency of execution and ultimately to manage the optimisation of predicted market impact and opportunity costs.
Although maturing, the TCA industry still has some way to go to fully evolve, and that is largely a function of a characteristically dynamic market environment and non-standardised reporting of trades and market data (the so-called “consolidated tape” issue). Moreover, with the advent and growth of ultra-low latency, high-frequency, short-term alpha market participants (“HFT”), which now account for the majority of trading activity on US exchanges, the exponential increase in orders being withdrawn before execution (with ratios of cancelled to executed trades regularly as high as 75:1) implies an effect on market impact which is as yet unquantified, yet empirically must be real. Finally, the fragmentation of equity markets, both in the US and Europe, presents a real and new challenge in terms of true price discovery, and this must also by extension be reflected in the post-trade arena.
Nevertheless, waiting for the imperfections and inefficiencies in market data to be ironed out (and they surely will be in time, whether by the industry or by regulatory intervention) means the opportunity to control trading costs is wasted. You cannot manage what you don’t measure. Therefore, with a practitioner’s understanding allied to sound analytical principles, and while avoiding the usual statistical traps of unsound inferences and false positives/negatives, it is straightforward to progress quickly from an anecdotal approach to a more evidence-based process.
On the trading desk, the ability to leap forward from being a clerical adjunct of the investment process to presenting empirical evidence of implementation cost control, and therefore trading strategy enhancement, is presented by this new avalanche of post-trade data, which of course then becomes tomorrow’s pre-trade data. The benefit of being able to enrich one’s analysis through a systematic and consistent harvest of one’s own trading data via FIX tags is well documented. The head of trading then arrives at a straight choice: is this data and its analysis solely the preserve of the execution function, or can the investment process as a whole benefit from extending its usage? We aim to demonstrate that both execution and portfolio construction functions can reap significant dividends in terms of enhanced performance.
PM Involvement
Portfolio managers’ involvement in transaction cost analysis tends to be a post-trade affair at many firms, on a quarterly or perhaps monthly basis, that inspires about as much excitement as a trip to the dentist. It may be viewed as purely an execution or trading issue, independent of the investment decision-making process. However, there is one key reason why portfolio managers should care about transaction costs: improved portfolio performance. The retort might be that this is the traders’ area of expertise, coupled with a feeling of helplessness about how they could possibly factor transaction costs in. The answer lies in including pre-trade transaction cost estimates to adjust (reduce) the expected alpha signal by some reasonable estimate of implementation costs. You can then make investment decisions based on realisable expected alphas rather than purely theoretical ones.
A key characteristic of many investment processes that make some use of quantitative alpha signals is that you always have more stocks (on a stock-count basis) in the small- and micro-cap end of the investable universe; there are simply more stocks that rank well. This is also the part of the universe where liquidity is lowest and implementation shortfall is highest. If you don’t properly penalise the alpha signals with some form of estimated transaction cost, your realised alpha can be more than eroded by the implementation costs.
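As a minimal illustration of this penalisation step, the sketch below subtracts a per-stock cost estimate (in basis points) from the raw alpha signal before portfolio construction. The tickers, numbers and the simple subtraction are assumptions for illustration; the authors’ simulations use ITG’s ACE Neutral cost model inside an optimiser’s objective function rather than a flat deduction.

```python
def cost_adjusted_alphas(alphas_bps, est_costs_bps):
    """Reduce each stock's expected alpha by its estimated implementation cost so that
    portfolio construction works from realisable, not purely theoretical, alpha."""
    return {ticker: alphas_bps[ticker] - est_costs_bps.get(ticker, 0.0)
            for ticker in alphas_bps}

# Illustrative numbers only: a small-cap name with 80 bps of raw alpha but 90 bps of
# estimated cost ends up with negative realisable alpha and should not be bought.
raw_alpha = {"LARGE_CAP": 40.0, "SMALL_CAP": 80.0}
est_cost = {"LARGE_CAP": 10.0, "SMALL_CAP": 90.0}
print(cost_adjusted_alphas(raw_alpha, est_cost))
# {'LARGE_CAP': 30.0, 'SMALL_CAP': -10.0}
```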
Proving the Point
To illustrate the impact of including transaction cost estimates in the pre-trade portfolio construction decision-making process, consider the following two simulations. Both are based on exactly the same starting portfolio, alpha signals and portfolio construction constraints. The only difference is that in the TCs Reflected simulation, transaction costs were included as a penalty to alpha in the optimisation objective function, whereas in the TCs Ignored simulation, pre-trade transaction cost estimates were ignored. The simulations were for a Global Growth strategy using MSCI World Growth as the benchmark, running from January 1999 through the end of June 2012 (13.5 years) with weekly rebalancing. They were based on purely objective (quantitative) alpha signals and portfolio construction (optimisation) with no judgment overlay. Transaction cost estimates were based on ITG’s ACE Neutral transaction cost model. Starting AUM was $150 million. Post-transaction cost returns reflect the impact of the transaction cost estimates for each trade.
As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets and that we are not penalised by misguided efforts to force volume onto lit markets, abolish dark pools, or suppress volume-enhancing participants such as certain HFT activities. In such a low-volume, low-commission environment I feel the stakes are especially high to get this right from the first day, and not let it drag on into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, it feels like the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first was to employ the same model as it did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second option was for a prescribed, non-profit entity to manage the CT, and the third was a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where APAs (approved publication arrangements) collect and pass on the data to Consolidated Tape Provider(s) (CTPs). But if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives
However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues which contribute more than 50% of total turnover. Their solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices provide details on how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices effort also brings together wide representation from across the industry and has been concentrating on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge) I worry it will ironically signal the failure of the free-market approach and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT, but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to try to come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, which would potentially remove the commercial benefits for exchanges and vendors.
Being a CTP will in itself be a tough business though, and I wonder if there’s such a thing as a commercially viable CTP proposition. Not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT towards direct feeds from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15-minute lag and get the data for free.
FIXGlobal speaks with the buy-side in China about the prospects for China’s equity market, IPOs and how new technology and competition will improve domestic trading.
GDP and Trading Volumes
The property market might continue to cool down in 2012, but it is not reasonable to expect the Chinese economy to shrink significantly this year, because the Chinese government will allocate resources to other sectors of the economy. Because of the Lunar New Year effect, it looks as though the Chinese Consumer Price Index (CPI) is heading upwards. Based on adjusted CPI, the property asset bubble is a political issue rather than an economic one. The Chinese government has pledged to continue monitoring property prices, and its strong fiscal position gives it various options in terms of how it addresses this situation. Trading volumes are expected to be much the same as in 2011 and inflation should be heading downwards.
Major Driver: IPOs or Economics?
There has been a rapid increase in the number of IPOs in China, but the regulators are questioning the quality of some of the IPO companies. Among companies newly listed in 2011, valuations declined quite significantly. Investors used to think an IPO was like a lottery – buying new shares virtually guaranteed a profit. Many investors did not consider the actual valuation and quality of the company, and many are now realising that not all investments are worth their list price.
The Chinese equity markets are in a transition stage; they are moving from being somewhat amateur to being much more economic and investor-driven. There were instances of listed companies changing industries after the IPO (often moving into property development), and occasionally changing the company’s name, leaving investors uncertain about their strategy and focus.
Listed companies used to have considerable power, but the market is changing in a positive direction. However, we do not know how quickly the market will become transparent and trustworthy. The regulators, media and institutional investors are now more serious about issues of valuation, transparency, corporate governance, etc. The regulators should consider increasing Qualified Foreign Institutional Investor (QFII) quotas and finding ways of improving the dissemination of information to investors, in order to set a good example in the domestic market.
A primary focus of the Chinese Securities Regulatory Commission (CSRC) this year is insider trading. Addressing this matter will improve the quality of listed companies and give investors greater protection. The regulators are working on improving access to information for investors and institutional funds will benefit significantly from this transparency. Regulators are concerned with addressing both the difficulty of access to information and the quality of information about IPOs, and it is quite likely that they will be able to improve both aspects.
Applying New Technology
The biggest technology upgrade implemented in the past six months has been algorithmic trading. Most Chinese buy-side firms use their brokers’ algos, but in China domestic mutual funds are not allowed to route orders to brokers. What many dealing desks have done, therefore, is to install the broker’s algo engine on their own side, so that for every algo they choose, the order goes through their own server and on to the exchange. In this way, dealers achieve efficiency in their algo usage because they do not use any brokerage; as a dealer, they are almost their own broker. Algo trading also provides the buy-side with more precise post-trade analysis; specifically, the ability to analyse how much alpha has been captured and the transaction costs involved.
The primary benchmark used by most Chinese buy-side traders is Implementation Shortfall (IS), which is used to generate information to help the fund manager improve their investment strategies. For example, it might provide data about the delay cost created by an investment decision made an hour after the market opens, showing the fund manager that if the decision had been made earlier they could have saved a certain amount on the investment.
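As a minimal sketch of that kind of feedback, the example below decomposes implementation shortfall against the decision price into a delay component and a trading component, expressed in basis points. The function name, sign convention and single-order framing are illustrative assumptions rather than any particular vendor’s methodology.

```python
def implementation_shortfall_bps(decision_px, arrival_px, avg_exec_px, side="BUY"):
    """Decompose implementation shortfall versus the decision price into a delay cost
    (decision -> arrival at the desk) and a trading cost (arrival -> executed price).
    Values are in basis points of the decision price; positive numbers are costs."""
    sign = 1 if side == "BUY" else -1
    delay = sign * (arrival_px - decision_px) / decision_px * 1e4
    trading = sign * (avg_exec_px - arrival_px) / decision_px * 1e4
    return {"delay_bps": delay, "trading_bps": trading, "total_bps": delay + trading}

# Example: the PM decides at 10.00, the order reaches the desk an hour later at 10.05,
# and executes at an average of 10.08 -> 50 bps of delay cost plus 30 bps of trading cost.
print(implementation_shortfall_bps(10.00, 10.05, 10.08))
```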
Courtney McGuinn, FIX Protocol, Ignatius John, Sapient Global Markets, and Bill Hebert of the FPL Global Education & Marketing Committee discuss the role FIX has played in pre-trade and trading and look forward to its application in the post-trade space.
When the FIX Protocol was introduced to the financial services industry in the early 1990s, the primary, though not exclusive, focus of its founders and early adopters was the electronic communication of equity-related pre-trade indications, order and execution messages between buy- and sell-side firms. In itself this was revolutionary for an industry which saw itself as increasingly reliant on electronic trading solutions at the market and exchange levels, yet which until that point had never actually embraced a uniform standard that competitors at all levels (asset management, broker-dealer, exchange and vendor) could openly agree upon.
The rest of the story is a progressive history of collaborative cooperation or ‘coopetition’ as some would refer to the joint and often volunteer efforts of a competitive universe of industry participants. As time went on, non-equity asset class instruments (derivatives, fixed income, foreign exchange, commodities etc.) were provided for in newer versions of the FIX Protocol and supported workflows expanded to illustrate more comprehensive aspects of the trading systems lifecycle including post-trade processing.
Post-trade allocations were supported in a fundamental message type as early as FIX 2.7. Certain buy-side firms and their sell-side trade execution partners worked together to support communication of FIX allocations, but widespread adoption was limited. This was due in part to the then-existing limitations in the FIX post-trade formats, more traditional third-party product and network options, competing technical development priorities and budget constraints among all involved parties.
Traditionally, traders on both the buy- and sell-sides viewed trading responsibilities as completed once an order had been executed. From that point on all downstream processing and any reconciliation issues were handed off to the middle and back office operational staff and the respective internal and external systems including third party solutions that supported their activities.
Allocations and other post-trade transaction types were added and enhanced in progressive versions of FIX, and in 2004 FPL launched FIX 4.4, which opened new opportunities: not only the ability to send block-level trade and allocation details, but also to confirm and affirm at the block trade and account level.
Initially the enhanced post-trade functionality in the FIX Protocol gained little visibility, as firms both regionally and globally were satisfactorily using a major industry utility to confirm and affirm trades with their counterparties before sending the settlement details to the custodian banks.
Richard Nelson, Head of EMEA Trading for AllianceBernstein, shares his perspectives on navigating volatility, prospects for developing exchanges, new regulation and the balance between transparency and best execution.
FIXGlobal: How much does volatility affect the way that you trade and what are you using to measure volatility on the desk?
Richard Nelson, AllianceBernstein: We use an implementation shortfall benchmark, so the longer we take to execute an order, the wider the range of possible execution outcomes. Volatility, in particular intraday volatility, increases that potential range, so you could see very good or very poor execution outcomes as a result. In reaction to that, we take a more conservative execution strategy or stretch the order out over a longer time period. And, for instance, if we get a hit on a block crossing network, we will not go in with as large a quantity as we would in a less volatile market. In that way we try to dampen down the potential effects that volatility might have on the execution outcome.
FG: How is AllianceBernstein using technology to improve performance and cut costs on the trading desk?
RN: It plays quite an important part and has done for quite a while. We are pretty lucky in that we have a team of quant trading analysts. Most of them are in New York, but we have one here on the desk in London, and they help us to analyze the changing market environment and recommend the best ways we can adapt to it. Our usage of electronic trading has increased in the last year, and we benefit from the quant trading analysts looking at the results we are achieving with our customized algorithms. We are more confident about getting good, consistent execution outcomes because they are monitoring the process and making the necessary changes to ensure the results are what we are expecting. This, in turn, increases the productivity of the traders I have on the desk. They can place suitable orders into these algorithms and let them run, which allows us to focus on trying to get better outcomes on our larger, more liquidity-demanding orders.
On top of that, as market liquidity has dropped significantly, we are trying to make sure we reach as much potential liquidity as possible, and ideally we want to do that under our own name rather than go to a broker who then goes to another venue. We believe that going directly into a pool of liquidity is better done under your own name rather than via a broker because we can then access the ‘meaty’ bits of the pool rather than the ‘froth’. We are looking into ways of doing that but one of the problems is that, potentially, you get a lot of executions from a number of different venues, which results in multiple tickets for settlement. Our goal is to access all these potential liquidity pools, yet also control our ticketing costs, which are a drag on performance for clients.
FG: Was it an intentional change to increase electronic trading or was it a byproduct?
RN: It was a little of both. Our quant trader has been with us for two years, and when he first arrived he had to sort out the data issues that exist in Europe and clean things up. Once the data integrity was sorted out, we looked at different ways of employing quantitative analyses. Having somebody here who is constantly monitoring the execution outcomes means we can proceed down this path with real confidence. As a London firm, we were a little behind in our adoption of electronic trading, but now we are in the middle of the pack in terms of usage. It makes sense from a business and productivity perspective: there are many orders that do not need human oversight and are best done in algorithms.
At the FIXGlobal Face2Face Forum in Seoul, Korean firms announced the formation of a FIX working group and the Korean Exchange’s intention to build an ultra low latency trading platform.
The opening speaker at the FIXGlobal Face2Face Forum Korea was keenly anticipated by the 200+ delegates (a quarter of whom were from the buy-side and a third from the sell-side), as he was raising many of the issues that surround the HFT arena but are rarely touched on at industry events in Korea. By placing HFT in context, Edgar Perez, author of the recently published “The Speed Traders”, highlighted many of the opportunities and challenges that markets around the world face in the low latency trading environment. Not least, he pointed out the colossal task facing regulators, and the associated technology costs, just to monitor high-frequency trading post-trade, let alone in real time.
A recurring theme throughout the day, latency was covered by most of the presentations, especially in the context of FIX. Deutsche Börse’s Hanno Klein and NYSE Technologies Asia Pacific CEO Daniel Burgin stressed that FIX standards are quite at home in the low latency environment, with exchanges around the world already using FIX for their low latency systems. As Mr. Burgin pointed out, “FIX is not slow, but through poor implementation, it can be made slow – and this has happened in various markets”. These comments rang true with the attendees, especially as Mr. Kyung Yoon, Division Head of the Financial Investment IT Division of KOSCOM, outlined plans not only to implement the latest version of FIX at the Korean Exchange, but also, when the new exchange system is rolled out in 2013, to target speeds as low as 70 microseconds as their benchmark. As the ‘icing on the cake’, Mr. Yoon then expressed KOSCOM’s commitment to helping establish a FIX liaison group in Korea that will ensure a highly ‘standard’ implementation of the FIX Protocol.
MC for the day, FIXGlobal’s Edward Mangles (also FPL Asia Pacific Regional Director), welcomed the announcement, stating that he and the FPL Asia Pacific group looked forward to working more closely with KOSCOM, KRX and the Korean trading community as a whole. With delegates staying put to hear the bilingual presentations and discussions throughout the day (a few afternoon speakers actually commented that the crowd in the room was unusually large for the final sessions), the updates on algorithmic trading (Josephine Kim, BAML) and TCA (Ofir Geffin, ITG) provoked a number of follow-up questions and discussions, indicating the delegates’ appetite for these issues.