Huw Gronow, Director of Equities Trading, and Mark Nebelung, Managing Director, at Principal Global Investors, make the case that TCA should be part of pre-trade, intra-trade and post-trade analysis.
Transaction Cost Analysis (TCA) has evolved significantly with the advent of technology in trading, and with it the ability to capture incrementally higher-quality data. Historically the preserve of compliance departments, which examined explicit costs only as a way of governing portfolio turnover, TCA now offers institutional asset managers several opportunities: the ability to quantitatively assess the value of the trading desk, and the tools to form implementation strategies that improve prioritisation, reduce trading costs and therefore improve alpha returns to portfolios.
Cost analysis models, methods and techniques have blossomed in this environment, propelled not only by technological advances but also by the explosion of data available in modern computerised equity trading.
The benefits of applying cost analysis to the execution function are manifold. It empowers traders to make informed decisions on strategy choice, risk transfer and urgency of execution, and ultimately to optimise the trade-off between predicted market impact and opportunity cost.
Although maturing, the TCA industry still has some way to go to fully evolve, largely because of a characteristically dynamic market environment and non-standardised reporting of trades and market data (the so-called “consolidated tape” issue). Moreover, with the rise of ultra-low-latency, high-frequency, short-term-alpha market participants (“HFT”), which now account for the majority of trading activity on US exchanges, the exponential increase in orders withdrawn before execution (with ratios of cancelled to executed trades regularly as high as 75:1) implies an effect on market impact that is as yet unquantified but empirically must be real. Finally, fragmentation of equity markets, in both the US and Europe, presents a new challenge for true price discovery, and this too must be reflected in the post-trade arena.
Nevertheless, waiting for the imperfections and inefficiencies in market data to be ironed out (and they surely will be in time, whether by the industry or by regulatory intervention) means the opportunity to control trading costs is wasted. You cannot manage what you don’t measure. With a practitioner’s understanding allied to sound analytical principles, it is straightforward to progress quickly from an anecdotal approach to an evidence-based process, while avoiding the usual statistical traps of unsound inference and false positives and negatives.
On the trading desk, this avalanche of post-trade data, which of course becomes tomorrow’s pre-trade data, presents the opportunity to leap from being a clerical adjunct of the investment process to presenting empirical evidence of implementation cost control and hence trading strategy enhancement. The benefit of enriching one’s analysis through a systematic and consistent harvest of one’s own trading data through FIX tags is well documented. The head of trading then faces a straight choice: is this data and its analysis solely the preserve of the execution function, or can the investment process as a whole benefit from extending its usage? We aim to demonstrate that both execution and portfolio construction functions can reap significant dividends in terms of enhanced performance.
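To make that FIX-tag harvest concrete, here is a minimal sketch of pulling TCA-relevant fields out of raw execution reports. The pipe delimiter and the message handling are illustrative assumptions; tags 35 (MsgType), 55 (Symbol), 31 (LastPx), 32 (LastQty) and 60 (TransactTime) are standard FIX fields.

```python
# Minimal sketch: harvesting execution data from raw FIX messages for TCA.
# Assumes messages arrive as '|'-delimited strings (the standard SOH
# delimiter swapped for readability); the tag numbers are standard FIX fields.

def parse_fix(message: str, delimiter: str = "|") -> dict:
    """Split a raw FIX message into a tag -> value dictionary."""
    return dict(field.split("=", 1) for field in message.split(delimiter) if "=" in field)

def harvest_fills(messages: list) -> list:
    """Keep only execution reports (35=8) and extract the fields TCA needs."""
    fills = []
    for msg in messages:
        tags = parse_fix(msg)
        if tags.get("35") == "8" and "31" in tags:  # execution report carrying a fill price
            fills.append({
                "symbol": tags.get("55"),              # tag 55: Symbol
                "price": float(tags["31"]),            # tag 31: LastPx
                "quantity": float(tags.get("32", 0)),  # tag 32: LastQty
                "time": tags.get("60"),                # tag 60: TransactTime
            })
    return fills
```

Fills harvested this way, stored consistently day after day, become the raw material for the post-trade analysis described above and, in turn, for the next day’s pre-trade estimates.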
PM Involvement
Portfolio managers’ involvement in transaction cost analysis tends to be a post-trade affair at many firms, on a quarterly or perhaps monthly basis, and it inspires about as much excitement as a trip to the dentist. It may be viewed as purely an execution or trading issue, independent of the investment decision-making process. However, there is one key reason why portfolio managers should care about transaction costs: improved portfolio performance. The retort might be that this is the traders’ area of expertise, coupled with a feeling of helplessness about how transaction costs could possibly be factored in. The answer lies in using pre-trade transaction cost estimates to adjust (reduce) the expected alpha signal by some reasonable estimate of implementation costs. You can then make investment decisions based on realisable expected alphas rather than purely theoretical ones.
A key characteristic of many investment processes that make use of a quantitative alpha signal is that, on a stock-count basis, more names rank well at the small- and micro-cap end of the investable universe. This is also the part of the universe where liquidity is lowest and implementation shortfall is highest. If you don’t properly penalise the alpha signals with some form of estimated transaction cost, your realised alpha can be more than eroded by the implementation costs.
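As a stylised illustration of that erosion (all figures are hypothetical, not the output of any particular cost model), netting a pre-trade cost estimate off the raw signal can flip an apparently attractive illiquid position to a negative realisable alpha:

```python
# Stylised sketch: netting pre-trade cost estimates off raw alpha signals.
# All numbers are hypothetical and purely illustrative.

expected_alpha_bps = {"LARGE_CAP": 40.0, "SMALL_CAP": 90.0, "MICRO_CAP": 120.0}

# Hypothetical pre-trade cost estimates (bps); illiquid names cost the most.
estimated_cost_bps = {"LARGE_CAP": 8.0, "SMALL_CAP": 60.0, "MICRO_CAP": 150.0}

realisable_alpha_bps = {
    name: expected_alpha_bps[name] - estimated_cost_bps[name]
    for name in expected_alpha_bps
}
print(realisable_alpha_bps)
# {'LARGE_CAP': 32.0, 'SMALL_CAP': 30.0, 'MICRO_CAP': -30.0}
# The micro cap ranks best on raw alpha but worst once costs are reflected.
```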
Proving the Point
To illustrate the impact of including transaction cost estimates in the pre-trade portfolio construction decision-making process, consider the following two simulations. Both are based on exactly the same starting portfolio, alpha signals and portfolio construction constraints. The only difference is that in the “TCs Reflected” simulation, transaction costs were included as a penalty to alpha in the optimisation objective function, whereas in the “TCs Ignored” simulation, pre-trade transaction cost estimates were ignored. The simulations were for a Global Growth strategy using MSCI World Growth as the benchmark, running from January 1999 through the end of June 2012 (13.5 years) with weekly rebalancing. They were based on purely objective (quantitative) alpha signals and portfolio construction (optimisation) with no judgment overlay. Transaction cost estimates were based on ITG’s ACE Neutral transaction cost model. Starting AUM was $150 million. Post-transaction cost returns reflect the impact of the transaction cost estimates for each trade.
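Conceptually, the “TCs Reflected” simulation adds a cost term to the optimiser’s objective. The sketch below shows the general shape of such an objective function; it is not ITG’s ACE model, and the linear-plus-square-root cost form, the parameters and the variable names are all illustrative assumptions.

```python
import numpy as np

# Stylised sketch of a TC-aware optimisation objective: expected alpha,
# less a quadratic risk penalty, less an estimated transaction cost on the
# trade from current weights w_old to proposed weights w_new. This is NOT
# the ACE model used in the article's simulations; the cost form and
# parameters are illustrative assumptions.

def objective(w_new, w_old, alpha, cov, adv_frac,
              risk_aversion=5.0, half_spread=0.0005, impact_coef=0.1):
    trade = w_new - w_old                          # weight changes to implement
    participation = np.abs(trade) / adv_frac       # trade size vs. available liquidity
    cost = np.sum(np.abs(trade) * half_spread      # linear spread cost
                  + impact_coef * np.abs(trade) * np.sqrt(participation))  # impact cost
    risk = risk_aversion * (w_new @ cov @ w_new)   # quadratic risk penalty
    return alpha @ w_new - risk - cost             # the optimiser maximises this
```

The “TCs Ignored” variant simply drops the cost term from this objective.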
As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets and that we are not penalised by misguided efforts to force volume onto lit markets, or to abolish dark pools or volume-enhancing participants such as certain HFT strategies. In such a low-volume, low-commission environment, I feel the stakes are especially high to get this right from the first day, and not let it drag on and into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes the European Commission (EC) wanted to take us down. The first was to employ the same model as for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second was for a prescribed not-for-profit entity to manage the CT, and the third was a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow in which APAs (approved publication arrangements) collect and pass on the data to Consolidated Tape Provider(s) (CTPs). But if market forces alone could find a compromise between cost and implementation, we would already have an affordable and reliable European Consolidated Tape (ECT) in place, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives
However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues that contribute more than 50% of total turnover. Their solution establishes a new commercial model for consolidated tape data which lowers its cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices detail how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices recommendations also bring together wide representation from across the industry, and have concentrated on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them, and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge), I worry it will ironically signal the failure of the free-market approach, and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT, but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to come to a mutually beneficial solution before the regulator steps in and forces a US-style consolidated tape, which would potentially remove the commercial benefits for exchanges and vendors.
Being a CTP will in itself be a tough business, though, and I wonder if there’s such a thing as a commercially viable CTP proposition: not only will CTPs operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT towards direct feeds from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect many investors might just accept a 15-minute lag and get the data for free.
Paul Squires, Head of Trading at AXA Investment Managers, systematically analyses the consequences of structural market change and sell-side headcount reduction across the street.
Amid the current market and trading environment, the expression “a perfect storm” springs to mind because, clearly, the entire industry has seen decreasing margins and volumes since 2008. At the same time, there has been an arms race to invest in technology just to maintain position. Those two things aren’t exactly the best backdrop for cash equities. Furthermore, even if the cash equity business is seen as a loss-leader for other, more profitable asset classes, Basel III and global banking reforms seem to be impinging upon those commercial realities as well. 2012 was a very tough year for banks and brokers, and I think we’re finally seeing a little fallout from that in terms of strategic reorganisation. It’s not just a seasonal thing now; many in the industry had sustained hope that it was just a tough period, that we would come out of it and that volumes would return to 2008 levels. However, I believe people are realising it’s much more structural.
Splitting Hairs
Commission Sharing Agreements (CSAs) are increasingly an essential facility for the buy-side. CSAs were really the avenue that enabled CP176 (the FSA consultation paper on unbundling). They reduce the extent to which trade execution might be constrained to wherever fund managers are getting their advice and service. In other words, the buy-side executes with a broker where they have a CSA (provided that broker can give good execution), paying both an execution and an advisory component at the same time, thus building the advisory pot, which can then be used to pay for independent research (or gives the fund managers the freedom to pay for advisory services from a broker whose execution service is not as strong).
More recently, the FSA has said that fund managers weren’t embracing the opportunity to split the different commission components as much as they had hoped, and the FSA is pushing again for that to happen. This is entirely appropriate from a client’s perspective, in my view. It’s the client’s money that’s being used every time the buy-side trades; if you’re paying a bundled commission, that’s effectively the client paying for the execution service and the advisory service and they should expect the best decision for both elements of that.
There are a couple of areas of focus within the current consolidation of advisory and execution services. On the execution side, we’ve seen the most impact from a more strategic ‘top-down’ view of sales trading. In the past, electronic sales trading was seen as supplementary to traditional cash equity sales trading. There has been a hard push to set up the provision of algorithms, and that has created a duplicated set of execution services. Now, my desk has taken a decision to focus our contact on our primary cash equity sales traders, while enabling them to see our algorithmic flow. This means that if we’re trading a significant volume of a stock and the cash sales trader can see what we’re doing, we’re optimising all our execution avenues: there is electronic access to multiple venues, but there is also the traditional broker distribution channel.
As part of our regular trading reviews, we explained to our brokers that our cash sales trader is the one who knows our account and our style of trading. We’ve had the historic relationship with them, and they are best placed to disseminate the most relevant market information to us very quickly. For example, if we self-direct an algorithmic order, they can see that we are looking to buy a chunk of a French small cap; if they happen to see flow in that stock from another source internally, they can pick up the phone and say, “I know you’re working an algorithm, but if you’re interested, we’ve got the natural seller.” This is the level of service we want, but it’s taking the market quite a long time to get to that point. Based on our conversations, we’ve discovered that we are in the minority in wanting the sales trader to see the algorithmic flow; many on the buy-side, by contrast, see anonymity as the key benefit of an algorithm.
Much of the buy-side uses algorithms primarily for that anonymity, which means they end up with duplicated coverage across electronic and cash sales trading. Clearly, that’s an expensive way to organise coverage for a typical asset manager whose volumes have declined substantially in the past couple of years. Therefore, I think we will continue to see brokers moving their electronic teams much closer to the program team or the cash sales traders.
Sell-Side Headcount Changes
The impact on the buy-side isn’t just that sales trading headcount has shrunk; it is also that the number of clients has expanded. There are now so many small hedge funds and boutique asset managers. In many instances, the sales trader is doing his best to pick up the phone and put the orders into the system but, in our view, he often no longer has time to scrutinise the markets and stocks as closely as we used to benefit from.
At AXA Investment Managers, we trade with approximately one hundred brokers a year. But within that, it’s a very concentrated focus with our top 20 or so brokers being absolutely key. In addition, there’s a significant tail of brokers that we need access to less frequently for very specific orders.
There will likely always be two or three brokers, who, while market consensus suggests a certain direction, may feel that it’s worth their while taking a different view and who see an opportunity to gain market share by going against the trend.
Does a buy-side firm these days need more than two or three execution-only brokers, the traditional agency guys who are very driven to get your flow? They do a lot of work to be close to the market; they talk to a lot of people; they give you a lot of market colour. I think the emphasis has shifted from the buy-side trader picking up the phone to someone on the sell-side who then directs how to execute the order, to the buy-side trader now having all the relevant tools. This concept of “best selection” as a process is, for us, one of the main aspects of “best execution”; in other words, the due diligence before we decide exactly how we are going to trade the order. Do we pick up the phone because, in fact, we just want a risk price, instant liquidity and the immediacy of execution? Do we want to park it passively in a couple of dark pools, knowing a particular algorithm will do that for us? Do we want to just pick up the phone to the sales trader and say, “Keep it to yourself for a while, but I’m looking to buy a chunk of this particular stock in case you see anything in it”? Maybe we want to have a look around the shareholder list, see who might have been active in it, and see if we’ve got any opportunities to cross a block naturally. There are so many different ways to execute now, hence this concept of ‘total liquidity management’.
Mizuho Securities’ Spyridon Mentzas discusses the status of the Japanese exchange merger and offers thoughts on how well the two systems will merge and the benefits investors can expect.
Compatibility
The merger of the Tokyo Stock Exchange (TSE) and the Osaka Securities Exchange (OSE) is not yet finalized, but it appears they will merge at the beginning of 2013, with the details yet to be specified. The first impression is that they have nearly identical trading rules with some minor differences, such as the OSE trading until 3:10 while the TSE closes at 3:00. When the TSE decided to shorten the lunch break in November, the OSE did the same. When one of the exchanges (usually the TSE) changes the rules, the other moves in tandem: for example, changing tick sizes. If the merger does go ahead, it is likely they will use the TSE’s cash system, arrowhead, and the OSE’s J-GATE for derivatives. Rather than running the old systems in parallel, the redundant systems will be retired, which will achieve a reduction in cost because they will not have to maintain two systems.
Further Industry Consolidation
The ECNs in the US enjoyed technological superiority over the classic exchanges; NYSE’s latency, for example, was significantly slower than Arca’s. This alone would have been reason enough for the TSE to consider buying a PTS, but with arrowhead’s current latency of less than 2 milliseconds (and another upgrade in the next few months targeting less than a millisecond), simply buying a PTS would not give them a noticeable advantage, because the TSE and OSE are on par with the PTSs. The reason PTSs are increasing their market share is that, unlike in the US and UK, where Reg NMS and MiFID have pushed trading towards the venue with the best price, in Japan the PTSs draw volume through decimal pricing and smaller tick sizes than the incumbents.
For example, Mizuho Financial Group might trade on the TSE at 105 yen bid, 106 yen offer. That one-yen spread is close to 100 basis points, whereas a PTS can quote in 0.1-yen increments. This is a major incentive for investors to buy and sell on the PTSs, whose smaller increments reduce market impact and trading costs. From the beginning, the regulators have not been overly concerned with the PTSs deciding to trade in decimal places and offer 0.1-yen ticks. It was always up to the PTSs to decide, and the TSE could do the same. If anything, I think the new exchange would rather reduce its tick sizes than merge again.
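The arithmetic behind that incentive is straightforward; a quick check with the quote above:

```python
# Spread, in basis points, on a one-yen tick versus a 0.1-yen tick,
# using the illustrative 105/106 quote from the text.
bid, ask = 105.0, 106.0
mid = (bid + ask) / 2

print((ask - bid) / mid * 1e4)   # ~94.8 bps: the one-yen spread, close to 1%
print(0.1 / mid * 1e4)           # ~9.5 bps: a 0.1-yen spread, ten times tighter
```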
However, not all participants would be happy to see new tick sizes, for example, some of the proprietary houses or small firms that trade with retail, as altering their downstream systems to handle decimal places would be costly.
This will also create a fragmentation of liquidity across price points. The bids and offers on the TSE are often thick, with something like 50 billion shares sitting on the bid side, so with 0.1-yen ticks the average order size at each level might fall to 1 million or 3 million shares. Traders who want to buy a large lot will have to scroll up and down to find out how far they have to go up to absorb the available liquidity. For traditional long-only traders, this might mean an increased scattering of liquidity. There is sufficient liquidity in the market at present, even for stocks trading at a low price; there are market makers trying to make 1% during the day. If smaller tick sizes are introduced, that liquidity will likely be scattered or disappear.
AFME’s Securities Trading Committee Chairman Stephen McGoldrick unpacks the latest MiFID proposals and looks at the rules for Organised Trading Facilities, algo trading and a consolidated tape.
Organised Trading Facilities (OTFs)
The OTF regime began life as a specific regulatory wrapper to put around broker crossing systems (which are a new mechanism for delivering an existing service). Crossing, which is almost the definition of a broker, has become highly automated. Whilst most crossing activities have not changed, other aspects of the industry were seen to require regulation, namely increased automation and greater scope of crossing. The initial proposals outlined an umbrella category of systems called OTFs, with one category created to hold broker crossing systems and another to hold the systems for G20 commitments around derivatives trading.
When the MiFID II proposals came out at the end of 2011, the ‘umbrella’ aspect had been simplified into a structure intended to be ‘all things to all people’, which is where it has come undone. MiFID II has created a regulatory receptacle for a practice and the two things differ in shape. The broker crossing system does not fit into the receptacle that has been created for it because much of the trading is against the books of the system’s operators, which is prohibited under the current proposals.
The regulators do not want speculative, proprietary trading within these systems, but unwinding risk created by clients is both useful and risk-reducing. An opt-in mechanism for compliance, allowing traders to decide if they want their orders traded this way, may be a solution. Conflict management of this sort is common in the financial sector, as it ensures that any discretion is not exercised against the interests of the client. Certainly, when it comes to weighing the client’s interests against those of the operator of an OTF, it is absolutely unambiguous that the client’s interests must come first; any exercise of discretion that disadvantages the client relative to the operator is already prohibited. A formal, documented process to ensure that segregation stays in place is good, but to effectively prohibit the vast majority of trading on broker crossing systems seems to abandon the regulators’ objectives: to increase transparency and protect clients.
Furthermore, trades allowed into a broker crossing system would be instantly reported, creating post-trade transparency. The current proposals call for OTFs to be treated in the same way as Multilateral Trading Facilities (MTFs), which fosters uncertainty about the waivers for pre-trade transparency. Currently, there are clear criteria for granting a waiver to a platform: one is that orders are large in size, another is taking reference prices from a third-party platform. The Commission will not, however, be making the decisions about waivers; they have been handed to the European Securities and Markets Authority (ESMA) to determine. There is a danger in specifying too-stringent limits for these waivers, which would create a very different landscape from that explicitly envisaged by MiFID I.
Systematic Internalisers (SIs)
Our understanding is that regulators did not want to split activity that was in an OTF into two, but rather to regulate the broker crossing systems and to remove the subjectivity of SIs. The current SI proposal is aimed at regulating automated market making by banks, so that institutions make markets by reference to market conditions, not by reference to their clients. In MiFID I, the SI regime was introduced to protect retail investors, but this seems subsequently to have changed. When the European Commission (EC) was asked by the Committee of European Securities Regulators (CESR) to clarify the rationale for an SI regime, it declined to do so. As a result there is a distinct lack of clarity regarding the intent of the SI rules. If we had a clearer vision of the direction in which the regulators wished to take the market, it would be far easier to assess whether the regulations were moving us in the right direction – or not.
Richard Nelson, Head of EMEA Trading for AllianceBernstein, shares his perspectives on navigating volatility, prospects for developing exchanges, new regulation and the balance between transparency and best execution.
FIXGlobal: How much does volatility affect the way that you trade and what are you using to measure volatility on the desk?
Richard Nelson, AllianceBernstein: We use an implementation shortfall benchmark, so the longer we take to execute an order, the wider the range of possible execution outcomes. Volatility, in particular intraday volatility, increases that potential range, so you could see very good or very poor execution outcomes as a result. In reaction to that, we take a more conservative execution strategy or stretch the order out over a longer time period. And, for instance, if we get a hit on a block crossing network, we will not go in with as large a quantity as we would in a less volatile market. In that way we try to dampen down the potential effects that volatility might have on the execution outcome.
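For reference, implementation shortfall measures the average execution price against the price prevailing when the order arrived (or the decision was made), signed by side. A minimal sketch, with illustrative numbers:

```python
# Minimal sketch of an implementation shortfall calculation in basis points.
# 'side' is +1 for a buy, -1 for a sell; 'fills' is a list of
# (price, quantity) pairs. All numbers below are illustrative.

def implementation_shortfall_bps(arrival_price, fills, side):
    filled_qty = sum(qty for _, qty in fills)
    avg_price = sum(px * qty for px, qty in fills) / filled_qty
    return side * (avg_price - arrival_price) / arrival_price * 1e4

# A buy arriving at 100.00 and filled at an average of 100.12
# shows a 12 bps shortfall:
print(implementation_shortfall_bps(100.0, [(100.10, 500), (100.14, 500)], +1))
```

Stretching an order out widens the distribution of this number, which is exactly the volatility effect described above.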
FG: How is AllianceBernstein using technology to improve performance and cut costs on the trading desk?
RN: It plays quite an important part and has done so for quite a while. We are pretty lucky in that we have a team of quant trading analysts. Most of them are in New York, but we have one here on the desk in London, and they help us analyze the changing market environment and recommend the best ways we can adapt to it. Our usage of electronic trading has increased in the last year, and we benefit from the quant trading analysts looking at the results we are achieving with our customized algorithms. We are more confident about getting consistently good execution outcomes because they are monitoring the process and making the necessary changes to ensure the results are what we expect. This, in turn, increases the productivity of the traders I have on the desk. They can place suitable orders into these algorithms and let them run, which allows us to focus on trying to get better outcomes on our larger, more liquidity-demanding orders.
On top of that, as market liquidity has dropped significantly, we are trying to make sure we reach as much potential liquidity as possible, and ideally we want to do that under our own name rather than go to a broker who then goes to another venue. We believe that going directly into a pool of liquidity is better done under your own name rather than via a broker because we can then access the ‘meaty’ bits of the pool rather than the ‘froth’. We are looking into ways of doing that but one of the problems is that, potentially, you get a lot of executions from a number of different venues, which results in multiple tickets for settlement. Our goal is to access all these potential liquidity pools, yet also control our ticketing costs, which are a drag on performance for clients.
FG: Was it an intentional change to increase electronic trading or was it a byproduct?
RN: It was a little of both. Our quant trader has been with us for two years, and when he first arrived he had to sort out the data issues that exist in Europe and clean things up. Once the data integrity was sorted out, we looked at different ways of employing quantitative analysis. Having somebody here who is constantly monitoring the execution outcomes means we can proceed down this path with real confidence. As a London desk, we were a little behind in our adoption of electronic trading, but now we are in the middle of the pack in terms of usage. It makes sense from a business and productivity perspective: there are many orders that do not need human oversight and are best done in algorithms.
Daiwa Capital Markets’ David deGraw catalogs the movements of Japanese markets in 2011 and discusses the various approaches Japan could take with regard to dark pools and High Frequency Trading (HFT).
Volume and Liquidity in Japan
Right now, contagion from Europe and the turmoil in the United States have depressed equity transaction volumes across the globe. Once a recovery starts to gain steam, Asia will be the driver for growth and Japan will be a quality play. Given the perennial underweighting of Japan, I expect volumes there would quickly surpass pre-crisis levels in such a scenario. With exchange volumes so low, non-traditional liquidity is playing an increasingly important role. We have seen transaction volumes on our non-displayed liquidity pool, as well as PTS volumes, continue to grow relative to exchange volumes. We are trying to bring the benefits of crossing to as many client types as possible, and our unique position as a principal domestic investment bank enables us to access semi- to non-professional liquidity sources, such as corporate and religious entities, educational endowments, quasi-public institutions, agricultural cooperatives, and retail investors.
Role of PTSs in Japan
The role of PTSs has grown steadily since the start of this year, with market share reaching as high as 7-8%. The success of SBI Japannext and Chi-X Japan shows that the market is rewarding the innovation and efficiency created by increased openness and competition. Conversely, the closing of Kabu.com’s PTS shows that a PTS’s revenue model may not be sustainable over an extended period of low trading volume. It is therefore critical for participants to carefully evaluate the viability of a venue so that the large upfront technology investments are not wasted.
The implementation of centralized clearing through the JSCC was critical in allowing the existing PTSs to rapidly and dramatically expand their share in 2011. Since August, however, growth has slowed somewhat along with the rest of the market. Having said that, there are still very good reasons to expect future growth in PTS market share. Both PTSs are working aggressively to on-board new participants, and Chi-X has recently announced the introduction of liquidity rebates in Japan. Chi-X has a successful record of growing market share in Europe with liquidity rebates, and such economic incentives are sure to be strong drivers of growth in Japan as well. In fact, it should open the door for a totally new class of venue-fee arbitrageurs to trade Japanese equities. Furthermore, domestic institutions are expected to allow smart order routing to PTSs once regulations are amended to exempt PTSs from the 5% TOB rule.
Matteo Cassina of Citadel Execution Services Europe comments on the development of a European consolidated tape as well as a unified concept of best execution.
The long awaited proposals on the review of the Markets in Financial Instruments Directive (MiFID) were published in October 2011. The so-called MiFID II and MiFIR proposals aim to address, among other things, changes in the European market structure and competition between trading venues. Whilst the proposals, in their current form, do not provide as much detail as market participants had hoped, they represent a unique opportunity to address fundamental issues impacting the efficient functioning of Europe’s equity markets.
The EU legislative process is such that the European Parliament and the European Council will agree their negotiating positions, before embarking on a trialogue process mediated by the European Commission. The final legislative text may not be ready for implementation until as late as 2014, but this timeframe represents a good opportunity for the rules and their impact to be given adequate consideration. In particular, the issues of best execution and consolidated tape need to be given greater prominence during this review process, if policymakers are to honour the original objectives of MiFID, protect the retail investor and ensure Europe’s equity markets become efficient and competitive.
A key benefit of regulation is that it drives standardisation of behaviour, but thus far this has not materialised in the retail broker community in relation to best execution requirements. Large institutions have the capability to take advantage of the proliferation of alternative trading venues and are benefitting from cost reductions by executing their orders on the venue offering the best price for a security at a given time. The majority of retail investors, however, are still either unaware of alternative trading venues or do not have the opportunity to access them. This means they do not always benefit from prices equal to, or better than, those available on primary venues.
While the principle of best execution is reiterated in MiFID II, it is not included in MiFIR, which means that, once again, best execution is a principle, not a rule, and is therefore open to interpretation at the national level. This is in stark contrast to the best execution model in the US, where the requirements are much more stringent. Currently, a retail broker in Europe can choose to route all of its trading to a single venue on the basis that it has a good commercial relationship with that venue, or that it is too costly for the broker to connect to multiple venues. The broker may choose to send all orders to the venue with the highest chance of offering the best price, without any guarantee that it is the best price at that moment in time. This is an unfair outcome for the retail investor, and the MiFID II/MiFIR proposals, regrettably, do not go far enough to redress this.
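The price-comparison step at the heart of best execution is conceptually simple, which is what makes single-venue routing hard to defend. A sketch for a buy order (venue names and quotes are hypothetical):

```python
# Sketch of best-price venue selection for a buy order: take the venue
# showing the lowest offer at this moment. Venue names and quotes are
# hypothetical.

offers = {"PrimaryExchange": 25.43, "MTF_A": 25.41, "MTF_B": 25.42}

best_venue = min(offers, key=offers.get)
print(best_venue, offers[best_venue])   # MTF_A 25.41
```

In practice, best execution also weighs fees, speed and likelihood of execution, but a broker that never compares prices at all cannot meet even this minimal test.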
Enforcing best execution will take time and will depend on broader market harmonization, but now is the time for regulators and retail investors to demand a more compelling definition of best execution. In particular, greater clarity is required around the execution policies provided by retail brokers to their clients. These policies are documents in which retail brokers explain how their best execution obligations are fulfilled under MiFID. Trading venues and brokers should also be required to provide execution quality statistics, detailing how well they performed in achieving best execution. This much needed clarity would, for example, result in firms having to justify — to both regulators and clients — why certain trading platforms are listed on their best execution policy and, why others have been omitted. In short, how and why some orders are routed to specific venues and not to those with the best price.
Daniel Ciment of J.P. Morgan details the development of Brazilian algos and outlines the most effective strategies for trading in Brazil.
Using Algos in Brazil
International buy-side traders are already accustomed to trading with algorithms in markets around the world, so as they look to Brazil they want to trade there the same way they have traded elsewhere. Even though having just one exchange makes the data feed more streamlined, the low liquidity profile of certain Brazilian stocks means you cannot trade every stock electronically with algorithms. For the more liquid names, many traders use benchmark algorithmic strategies such as VWAP, percentage of volume, or arrival price. Most algorithmic strategies are benchmark-based for now, as buy-side traders seek to replicate the methods they use elsewhere while taking into account the intricacies of the local market structure. In the end, if they trade with algorithms in the US, Europe and Asia, they want to trade with algorithms in Brazil as well.
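Mechanically, a benchmark strategy like VWAP slices a parent order across the day in proportion to an expected volume curve. A minimal sketch; the U-shaped curve is an illustrative assumption, not a calibrated Brazilian profile:

```python
# Minimal sketch of VWAP-style slicing: split a parent order across the
# trading day in proportion to an expected intraday volume curve. The
# curve below is an illustrative assumption, not a calibrated profile.

parent_order = 100_000  # shares to buy

# Hypothetical U-shaped intraday volume profile (fractions sum to 1.0).
volume_curve = [0.20, 0.12, 0.08, 0.07, 0.08, 0.10, 0.15, 0.20]

slices = [round(parent_order * frac) for frac in volume_curve]
print(slices)        # [20000, 12000, 8000, 7000, 8000, 10000, 15000, 20000]
print(sum(slices))   # 100000
```

This is also why such strategies suit only the liquid names: in a thin stock, even volume-proportional slices can dominate the book.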
Infrastructure and Volume Spikes
This is one of the challenges that we face as an industry. As you build electronic infrastructure, you have to build for growth and not just for where we are today. When we look at a market, whether it is Brazil or more developed markets like the US, Europe or Asia, we know what we are trading today, but we have to build to accommodate what we will trade in a year or two, and what we think the peak might be. Just because a market trades a couple of hundred million shares a day, or, in the US, 8 billion shares a day, it does not mean you build your plan to support only that volume, because a year from now the figure might be 20% higher.
What is more, if a major event happens next week, that figure might double, so you need to build in sufficient headroom. Right now, we can handle much more than we manage on a daily basis, but that is deliberate, to make sure that at times of stress we are there for our clients and they can trade through us with full confidence.
DMA or Boots-on-the-Ground?
To be successful in a market like Brazil, brokers need people on-site who know the local investor and financial communities. J.P. Morgan has a major trading presence in Sao Paulo, and that is just one piece of the offering in Brazil. For small firms that want access, outsourcing is a realistic option, but if you are going to be big in a market, especially one like Brazil, an in-country trading team is required.
Technical Challenges
Reliable trading requires market data and telecommunications systems, which are present in Brazil, along with data center space and algorithms tuned to the local market and market structure. This tuning covers the liquidity profiles of the stocks as well as the rules and regulations of the exchange; you cannot apply the same algorithms from one region to another and expect them to work. We spend a lot of time and effort fine-tuning our algorithms, testing them on our desk and then rolling them out to clients. It is not just copy-and-paste.