Capital Group’s Brian Lees is driving efforts to ask more questions of brokers, and for more data on where an order is shown before it executes, but can the buy-side handle the resulting deluge?
The current work you are doing on venue reporting analysis Our first push was simply to collect information about ‘where’ we were executing and a little about ‘how’ we were executing, namely, did we post or did we take liquidity. Having done that, the question was where do we go from there? The topic of requesting more data on where we didn’t execute, and what order types were used, started to be raised by some representatives on the FPL Americas Buy-Side Working Group. Some participants had already started down this road with brokers, asking post-trade for information about where the algorithms sprayed their orders, what types of orders were placed on exchanges, and which exchanges those orders went to. So that’s where the conversation began, and that’s why we reached out to Jeff Alexander and Linda Giordano, because Barclays had already spearheaded this conversation.
What we are looking to achieve, either in real time or post-trade, is a standardised format for brokers to tell us how our order interacted with the market: when the order was placed, what order types were used, where it was placed in the markets and whether or not we got hits. The concern is not so much whether we can get it, because if we sign enough non-disclosure agreements we can get the information from the brokers. Some brokers have concerns about that information getting out and somebody reverse-engineering their algorithms, but from the buy-side perspective, I think the biggest concern is whether we can manage the volume of data that we would get.
The resources to store and analyse data and make some sort of good use of it With the original data we were getting, on where the execution took place, we talked a lot with smaller firms who were using TCA vendors to help them analyse this information. If we went a step further with this new type of information, the brokers would not want us sending it out to TCA firms, because it shows the methodology for how their algorithms behave. I was in New York several weeks ago and took the opportunity to meet up with Jeff and Linda. We invited Jeff to join one of our conference calls for the buy-side committee, which he did, and he talked about what they’ve been proposing. He showed proposals both for the real-time collection of data via FIX messages, actually proposing a whole new FIX message to be created for this purpose, and, alternatively, for a standardised format for collecting the information post-trade which, as a spreadsheet, would tell us what we want to see. We’re trying to standardise how you ask for the data and what format it will be in, by creating best practices for how to get the data from the brokers. That way the brokers don’t have to keep coming up with a different format for every client that asks. The best practices do specify that ISO MIC codes would be the standard for identifying the exchange you executed on, but we said nothing about what you should do with the data once you get it.
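As a toy illustration of what such a standardised post-trade report could enable, the sketch below aggregates fills by ISO MIC and by whether liquidity was posted or taken. The field names and sample venues are hypothetical, not the working group's actual format:

```python
from collections import defaultdict

# Hypothetical post-trade fill records a broker might return under a
# standardised format: venue identified by ISO 10383 MIC, plus whether
# the fill added (posted) or removed (took) liquidity.
fills = [
    {"mic": "XNYS", "qty": 500, "liquidity": "added"},
    {"mic": "XNAS", "qty": 300, "liquidity": "removed"},
    {"mic": "XNYS", "qty": 200, "liquidity": "removed"},
]

def venue_summary(fills):
    """Aggregate executed quantity per venue and per liquidity flag."""
    summary = defaultdict(lambda: {"added": 0, "removed": 0})
    for f in fills:
        summary[f["mic"]][f["liquidity"]] += f["qty"]
    return dict(summary)
```

With a common format like this, the same aggregation runs unchanged across every broker's report, which is exactly the maintenance saving the best practices aim for.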
Exchange involvement in the conversation We did talk to some exchanges when we were first trying to standardise how to identify them, because when we first standardised on MIC codes they did not cover all the exchanges; not all of them had registered with the ISO organisation, and we wanted them to.
We had a little trouble differentiating the dark order books from the lit order books on exchanges that have both. These exchanges consider themselves a hybrid book and didn’t want to be known as two different things, and we had no way to differentiate the dark and the lit flow without introducing yet another FIX tag. That back and forth fed into the registration authority’s decision to come out with the new market segment concept, under which an exchange can be defined with child MIC codes that differentiate different segments of its market. We’re beginning conversations with exchanges about this topic, but that’s the extent of our discussion with them so far.
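The parent/child relationship the market segment concept introduces can be modelled as a simple lookup. The NASDAQ codes below are real registered MICs, but the table is illustrative only; a production system should load the official ISO 10383 list rather than hard-code it:

```python
# Illustrative operating-MIC -> segment-MIC table in the spirit of the
# ISO 10383 market segment concept. XNAS is the operating MIC for
# NASDAQ, with segment MICs for its listing tiers.
SEGMENTS = {
    "XNAS": ["XNGS", "XNMS", "XNCM"],
}

def operating_mic(mic, table=SEGMENTS):
    """Resolve a segment MIC (or an operating MIC itself) back to its
    parent operating MIC; None if the code is unknown."""
    for parent, children in table.items():
        if mic == parent or mic in children:
            return parent
    return None
```

A lookup like this lets a venue report roll dark and lit segment flow up to one exchange, or keep the segments separate, without inventing a new FIX tag.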
Broker willingness to participate in the process For the first half of this, just getting the information about where you executed, the brokers didn’t have any problem, because it’s public record once it executes. When we started talking about the more detailed reporting, they did raise concerns about the information being sent out, and asked for NDAs so that you, as a client, are not going to send the data to a third party. But because other firms had already started down this road, we talked about the purpose of this, which is simply to have someone looking over their shoulder to make sure they are acting in the best interest of the client and not potentially favouring rebates over best execution; they can’t really argue with that logic. Somebody should have some oversight as to whether or not the right decisions are being made.
Asia’s market structure creates demand for increasingly granular trading information – as Kent Rossiter, Head of Asia Pacific Trading, Allianz Global Investors, and Michael Corcoran, Managing Director, ITG, discuss, FIX can help.
Asia Pacific faces different liquidity challenges to other regions, particularly given that spreads are often much wider and are therefore an even more significant contributing factor to overall trading costs (See Chart). As the trading environment evolves in the region and the focus on managing costs grows, the requirement for transparency and feedback on trading increases. This is happening in parallel with the evolution of new trading venues in the region, particularly dark pools. Buy-side traders now want a greater level of detail on their dark pool fills to help them understand the behavior of their orders and manage their execution venues proactively to get the best trading result.
Kent Rossiter heads up the Asia Pacific trading desk of Allianz Global Investors, and is constantly looking for ways to improve the efficiency of their process and minimise the costs of trading. From his perspective, while post-trade TCA is now well-established, a particular growth area is the requirement for more detailed data on a shorter timeframe. He explains “We as buy-side traders are now trading an increasing amount of our orders ourselves using the electronic tools available, and when we do so we want more granularity and data fed back to us: which venues are our orders being executed in, at what price, and how aggressively. We want information that helps us adjust strategies on the fly for better trading outcomes, or quickly review the results so we can manage our future performance.”
One result of this is new demand in the region for analysis of maker/taker indicators on orders so that a trader can identify how often they are crossing the spread to find liquidity. Allianz Global Investors has been working with ITG and other brokers in the region to implement support of maker/taker analysis to help the trading desks improve their insight into market conditions and get more transparency into the behavior of their orders in dark venues.
Understanding Maker/Taker Understanding whether an order is making or taking liquidity is important, particularly in wide-spread environments such as many of the Asian markets. Michael Corcoran, Managing Director of ITG, says “Traders want to know instantly whether they are providing liquidity or taking it, instead of retrospectively needing to compare fills and timestamps manually against what the market was trading at. This can be very useful information to help them adjust the trading strategy in real-time to the market conditions and the liquidity available. It can also help determine what kind of ‘throttle’ they should put on their strategy or their algo to find the right level of aggressiveness for the orders they are working. In addition to that it can also be a very valuable tool for sell-side firms, helping to refine the development and rules of algorithmic strategies and improve strategic ideas that will work for certain clients or order types.”
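As a rough sketch of the kind of analysis described, the hypothetical helper below tallies maker versus taker quantity from fill records, using the numeric convention of the FIX liquidity indicator (1 = added liquidity, 2 = removed liquidity):

```python
def maker_taker_ratio(fills):
    """Return (maker_qty, taker_qty, taker_share) from fill records.

    Each fill is (qty, indicator), where 1 means the order added
    liquidity (maker) and 2 means it removed liquidity (taker);
    other indicator values are ignored in this sketch.
    """
    maker = sum(q for q, ind in fills if ind == 1)
    taker = sum(q for q, ind in fills if ind == 2)
    total = maker + taker
    return maker, taker, (taker / total if total else 0.0)
```

A desk watching the taker share climb intraday has a direct, quantitative cue to throttle back the aggressiveness of a strategy, which is the real-time adjustment Corcoran describes.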
This is of growing relevance in a multi-venue environment, for example in Asia where over the past few years a lot more broker dark pools have been developed. Many buy-side firms now choose to use a dark aggregator to help improve their efficiency in accessing multiple venues, and here some kind of maker/taker liquidity analysis can be a helpful data point for assessing the type of outcome a trader is getting in those pools. Corcoran explains “Both ITG as a dark aggregator, and our buy-side clients themselves, want to understand whether orders are consistently making or taking liquidity in a specific dark venue so that the impact can be assessed – for example if our client’s orders always take liquidity in a certain venue we would review that to understand why. If we can pass that data directly back to the clients they can then make a decision about whether they want to be removed from that venue or change the distribution of their order flow across different pools. Likewise, if we see orders taking liquidity then see an unexpected change in the stock’s trading profile, this can be a useful warning indicator about the participants in a specific pool.”
FIX Tag 851 – a Potential Solution A specific FIX tag, 851, the Last Liquidity Indicator, has been developed by FIX Protocol Ltd (FPL) as an identifier of maker/taker behavior. The US appears to have the most established support of liquidity-indicating tags, with exchanges able to pass the data back to brokers and most of those brokers able to pass it on to clients. In Europe the large exchanges and brokers can likewise support this, although there is less support among mid-sized and smaller brokers.
However, in Asia the tag is sparsely supported, if at all, by the exchanges, alternative lit trading venues and many of the broker dark pools. Firms therefore have to come up with interpretive solutions and workarounds to give their buy-side clients a higher level of detail and transparency on their trading, particularly in dark pool aggregation.
Rossiter would prefer an industry-wide approach to improving transparency and the availability of maker/taker data, one which includes vendors, brokers and, most importantly, the exchanges: “Typically the actual FIX tag for this information is supposed to be generated by the exchange or trading venue, and it is passed to the brokers, who need to be able to identify and accept that tag and then pass it into the vendor EMS or OMS platform that the client is using. So there are a number of parties within the workflow who are affected and they need to collaborate to bring in changes. An industry-wide adoption of the relevant FIX tag would definitely be a good solution”.
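To make the plumbing concrete, the toy parser below pulls LastMkt (tag 30) and the liquidity indicator (tag 851) out of a raw execution report fragment. It is deliberately minimal: a real FIX engine must also handle framing, sequence numbers and repeating groups, and the message here is invented for illustration:

```python
SOH = "\x01"  # FIX field delimiter

# Values 1-3 of the liquidity indicator per the FIX specification.
LIQUIDITY = {"1": "added", "2": "removed", "3": "routed out"}

def parse_fix(raw, delim=SOH):
    """Split a raw FIX message fragment into a tag -> value dict
    (toy parser; ignores repeating groups and framing fields)."""
    return dict(f.split("=", 1) for f in raw.strip(delim).split(delim))

# Fragment of a hypothetical execution report.
msg = SOH.join(["35=8", "30=XNAS", "851=2"]) + SOH
fields = parse_fix(msg)
```

Once every party in Rossiter's workflow passes these two tags through untouched, the venue and maker/taker detail arrives at the buy-side EMS with no workarounds needed.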
Carlos Oliveira, Electronic Trading Solutions at Brandes Investment Partners examines the process of choosing a TCA provider, and the role of FPL.
We use Markit’s Execution Quality Manager (formerly known as QSG) for equity trading TCA. Our decision to switch providers was based on increased algorithm usage, a desire for more functionality, greater execution transparency and most importantly, the availability of more granular data for analysis via FIX.
We FTP our data daily and the results are available to us no later than US market open the next day. Trades are reviewed against traditional and custom benchmarks. We grant access to every trader and risk member, so that they can construct their own views as desired. Typically on a quarterly basis, we conduct our own studies and adapt broker studies to better understand the impact of our orders.
The implementation process We evaluated four providers before making our final decision. We wanted a flexible platform that would accommodate self-service custom reporting needs; require minimal ongoing maintenance or upgrades from internal resources; and offer flexibility on custom solutions, such as the proper measurement of our ADR creation activity.
One vendor offered a very rich solution that was beyond our needs. For two others, we were not comfortable with the process for submitting data and how much work we would need to do internally. A key determinant was the overall level of commitment to the implementation, which we concluded Markit’s Managing Director Tim Sargent clearly demonstrated. It took us roughly two months to solidify the extract process and we went live on January 1, 2011.
TCA has become a key component of our trading process and we continue to realise value, primarily for post-trade at the moment. The value comes from the constant learning about our orders, what has worked well or not, and the adapting and improving of trading.
The large amount of data to analyse can be overwhelming at first, and is easily misinterpreted if one is not careful.
Frequent and honest dialogue with the vendor and the traders, as well as tapping other sources of knowledge (i.e. broker TCA contacts and industry publications), is key to a successful implementation. Many reports went through several iterations, sometimes a quarter or two apart, before we got them to a meaningful and actionable state.
To avoid too one-sided a perspective, we often compare broker-provided TCA reports with our vendor’s. This helps the dialogue with both the brokers and the vendor – it keeps both parties engaged and attentive.
The role of FPL Our interaction with FPL began with the TCA implementation.
In late 2010, in conferences as well as in industry press, many parties were encouraging the buy-side to gain a better understanding of broker SOR practices and where the orders were getting executed, but with no actionable recommendations outside a specific platform. Being broker-neutral, the FIX execution venue reporting best practices proposed in early 2011 by the FPL Americas Buy-side Working Group helped us to move forward with this goal in the TCA platform. FPL Membership has enabled further contact with other buy-side firms and knowledge sharing not available otherwise to a smaller firm.
We started by asking for Tag 30, LastMarket. Broker responses to the data request varied greatly across brokers and regions. Correspondence spanned many months and contacts, particularly when we asked for MIC codes as opposed to proprietary values. We understand the queue priorities of brokers’ systems and the demands of larger clients, and are very appreciative of what they have done thus far.
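Where brokers return proprietary venue codes in Tag 30 rather than MICs, one interim workaround is a per-broker translation table. The broker names and proprietary strings below are invented for illustration:

```python
# Hypothetical per-broker translations from proprietary Tag 30
# (LastMarket) values to ISO MIC codes.
BROKER_VENUE_MAP = {
    "brokerA": {"N": "XNYS", "NQ": "XNAS"},
    "brokerB": {"NYSE": "XNYS", "NSDQ": "XNAS"},
}

def normalise_last_market(broker, value):
    """Map a broker's proprietary LastMarket value to a MIC.

    Returns (code, mapped): falls back to the raw value, flagged as
    unmapped, so unrecognised codes surface in review rather than
    silently polluting venue statistics.
    """
    mic = BROKER_VENUE_MAP.get(broker, {}).get(value)
    return (mic, True) if mic else (value, False)
```

Normalising to MICs up front is what makes cross-broker venue comparison in the TCA platform meaningful; the unmapped flag gives the desk a worklist for the next round of broker correspondence.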
Some of our broker relationships have been exceptionally supportive in this effort, leading to enhanced dialogue on routing practices and more meaningful, targeted market structure content calls. Though not perfect, it is a significant improvement from just a year ago.
Ideally we would like to move forward and obtain data for Tag 851, but we are very much aware of the mapping challenges from exchanges to the brokers and to the OMS/EMS systems. We tabled this for 2012, but plan on revisiting it again in 2013.
Huw Gronow, Director, Equities Trading, and Mark Nebelung, Managing Director of Principal Global Investors, make the case that TCA should be part of pre-, during, and post-trade analysis.
Transaction Cost Analysis (TCA) has evolved significantly with the advent of technology in trading, and with it the ability to capture incrementally higher quality data. Historically the preserve of compliance departments, which examined explicit costs only as a way of governing portfolio turnover, TCA now provides institutional asset managers with several opportunities: the ability to quantitatively assess the value of the trading desk, and the tools to form implementation strategies that reduce trading costs and therefore improve alpha returns to portfolios.
Cost analysis models, methods and techniques have blossomed in this environment, propagated not only by technological advancements, but also by the explosion of data available in modern computerised equity trading.
The benefits of applying cost analysis to the execution function are manifold. It empowers the traders to make informed decisions on strategy choice, risk transfer, urgency of execution and ultimately to manage the optimisation of predicted market impact and opportunity costs.
Although maturing, the TCA industry still has some way to go, largely because of a characteristically dynamic market environment and non-standardised reporting of trades and market data (the so-called “consolidated tape” issue). Moreover, with the rise of ultra-low latency, high-frequency, short-term alpha market participants (“HFT”), who now account for the majority of trading activity on US exchanges, the exponential increase in orders withdrawn before execution (with ratios of cancelled to executed trades regularly as high as 75:1) must have an effect on market impact which is as yet unquantified, yet empirically must be real. Finally, fragmentation of equity markets, both in the US and Europe, provides a real and new challenge in terms of true price discovery, and this must by extension be reflected in the post-trade arena.
Nevertheless, waiting for the imperfections and inefficiencies in market data to be ironed out (and they surely will be in time, whether by the industry or by regulatory intervention) means the opportunity to control trading costs is wasted. You cannot manage what you don’t measure. Therefore, with a practitioner’s understanding allied to sound analytical principles, it is very straightforward to progress from an anecdotal approach to a more evidence-based process very quickly, while avoiding the usual statistical traps of unsound inferences and false positives/negatives.
On the trading desk, this new avalanche of post-trade data – which of course becomes tomorrow’s pre-trade data – presents the opportunity to leap from being a clerical adjunct of the investment process to presenting empirical evidence of implementation cost control and therefore trading strategy enhancement. The benefit of enriching one’s analysis through a systematic and consistent harvest of one’s own trading data through FIX tags is well documented. The head of trading then faces a straight choice: is this data and its analysis solely the preserve of the execution function, or can the investment process as a whole benefit from extending its usage? We aim to demonstrate that both execution and portfolio construction functions can reap significant dividends in terms of enhanced performance.
PM Involvement Portfolio managers’ involvement in transaction cost analysis tends to be a post-trade affair at many firms, on a quarterly or perhaps monthly basis, that inspires about as much excitement as a trip to the dentist. It may be viewed as purely an execution or trading issue and independent of the investment decision making process. However, there is one key reason why portfolio managers should care about transaction costs: improved portfolio performance. The retort might be that this is the traders’ area of expertise coupled with a feeling of helplessness on how they could possibly factor transaction costs in. The answer lies in including pre-trade transaction costs estimates to adjust (reduce) your expected alpha signal with some reasonable estimate of implementation costs. You can now make investment decisions based on realisable expected alphas rather than purely theoretical ones.
A key characteristic of many investment processes that make use of a quantitative alpha signal is that you always have more stocks (on a stock count basis) at the small- and micro-cap end of the investable universe; there are simply more stocks that rank well. This is also the part of the universe where liquidity is lowest and implementation shortfall is highest. If you don’t properly penalise the alpha signals with some form of estimated transaction cost, your realised alpha can be more than eroded by the implementation costs.
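The penalty mechanism can be sketched in a few lines; the basis-point figures below are purely illustrative, not from any actual cost model:

```python
def net_alpha(gross_alpha_bps, cost_bps):
    """Penalise a gross alpha forecast (bps) by an estimated
    round-trip implementation cost (bps)."""
    return gross_alpha_bps - cost_bps

# Illustrative only: a micro-cap name may rank best on gross alpha
# yet worst once estimated costs are reflected.
candidates = {
    "large_cap": net_alpha(40, 15),   # strong net of costs
    "micro_cap": net_alpha(60, 90),   # gross alpha more than eroded
}
```

Ranking on the net figure rather than the gross one is exactly what lets the optimiser trade realisable expected alphas instead of purely theoretical ones.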
Proving the Point To illustrate the impact of including transaction cost estimates in the pre-trade portfolio construction decision making process, consider the following two simulations. Both are based on exactly the same starting portfolio, alpha signals and portfolio construction constraints. The only difference is that in the TCs Reflected simulation, transaction costs were included as a penalty to alpha in the optimisation objective function whereas in the TCs Ignored simulation, pre-trade transaction cost estimates were ignored. The simulations were for a Global Growth strategy using MSCI World Growth as the benchmark, running from January 1999 through the end of June 2012 (13.5 years) with weekly rebalancing. They were based on purely objective (quantitative) alpha signals and portfolio construction (optimisation) with no judgment overlay. Transaction cost estimates were based on ITG’s ACE Neutral transaction cost model. Starting AUM was $150 million. Post-transaction cost returns reflect the impact of the transaction cost estimates for each trade.
As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago – not just for industry insiders, but to show politicians and the outside world that these can be transparent markets and that we are not penalised by misguided efforts to force volume onto lit markets or to abolish dark pools and volume-enhancing factors such as certain HFT activities. In such a low-volume, low-commission environment I feel the stakes are especially high to get this right from the first day, and not let it drag on into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first route was to employ the same model as they did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second option was for a prescribed non-profit seeking entity to manage the CT, and the third option would be a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where APAs (approved publication arrangements) collect and pass on the data to Consolidated Tape Provider(s) (CTP). But, if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues contributing more than 50% of total turnover. Their solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices detail how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices work also brings together wide representation from across the industry, and has been concentrating on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them, and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge), it will ironically signal the failure of the free-market approach, and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to try and come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, thereby potentially removing the commercial benefits for exchanges and vendors.
Being a CTP in itself will be a tough business though, and I wonder if there’s such a thing as a commercially-viable CTP proposition: Not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT and instead getting direct fees from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15 minute lag and get the data for free.
David Dight, Senior Software Architect at Liquid Capital, looks at free and open source software development over his 25 years using and developing solutions.
A brief history of FOSS; major developments and definitions The term FOSS (free and open-source software) has only recently come into common use. Software made available free of charge has been variously referred to as just ‘free’ software, ‘public domain’ software and sometimes ‘freeware’. The ’80s and ’90s saw the proliferation of ‘try-before-you-buy’ software, also known as shareware (or even ‘crippleware’). The term ‘Open Source’ software came into use much later, to distinguish such software from ‘free’ software and to reflect its licensing terms. Today we use FOSS to refer to non-commercial and generally free software which may carry certain restrictions on copyright, commercial use and distribution. One great advantage of this set-up is that it is now used as a development model.
FOSS has been around longer than most people probably imagine, but it wasn’t really prominent until the 1980s. It was largely the domain of hobbyists and enthusiasts until the advent of Richard Stallman’s GNU project. Today there are close to 400 GNU software packages available. One of them, Stallman’s EMACS editor, is still popular (though perhaps controversial) and in wide use.
The early ’90s saw the first release of Linus Torvalds’ Linux, which has become the most widely used UNIX flavour today — especially in the finance sector. My first exposure to FOSS was with Slackware Linux (at that time released on floppy). Having already used costly SCO Xenix and then SCO Unix for some years, Linux offered the first free, community-supported UNIX, and our team jumped on it.
Why do organisations choose FOSS? Lots of reasons; not least of which is it’s free! But just because something is free doesn’t mean it’s necessarily any good, so first let’s look at the other reasons for choosing FOSS.
If you trade on the buy-side, it’s likely you’ll need to talk FIX.
FIX has had a long (informal) link with FOSS. One of the earliest Open Source implementations, QuickFIX still enjoys popularity. It remains popular because of the critical mass of QuickFIX users out there, the low barriers to entry and the abundance of support – both community and commercial. More fundamentally, QuickFIX performs well and is easy to use. Competing with QuickFIX are many commercial FIX offerings and, to be sure, many of them are excellent in terms of the critical discriminators such as low latency and interoperability. So why does QuickFIX continue to capture so much of the market?
Here emerges one of the main reasons FOSS can be the right choice. In my view, it’s about software sovereignty. A typical medium to large buy-side organisation will generally have a team of highly skilled engineers with the capability and experience to develop pretty much anything asked of them. Give them a FOSS package and the right resources and they can modify, extend and enhance it to suit business requirements. In some cases, just having the source available allows developers to quickly fix bugs and get back in production after an outage. In other cases, the source can be instructive, providing insights into techniques and methodologies that influence future designs and implementations.
These are key distinguishing features of FOSS versus commercial offerings (aka closed source). You can’t just change a commercial product to suit individual business needs and idiosyncrasies, and it is usually extremely difficult to get a vendor to add the specific features you want. Even when they do agree to a feature request, it’s often expensive, may take some time, and is unlikely to precisely match your requirements.
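To give a feel for the low-level work a FIX engine such as QuickFIX handles for you, here is a toy framing routine that wraps a message body with BeginString (8), BodyLength (9) and CheckSum (10). Real engines also manage sessions, sequence numbers and recovery; the sender/target values are invented:

```python
SOH = "\x01"  # FIX field delimiter

def frame(body_fields, begin_string="FIX.4.2"):
    """Frame FIX body fields per the standard header/trailer rules:
    BodyLength counts the bytes after the 9= field's delimiter up to
    the CheckSum field, and CheckSum is the byte sum mod 256 of
    everything before the 10= field, zero-padded to three digits."""
    body = SOH.join(body_fields) + SOH
    head = f"8={begin_string}{SOH}9={len(body)}{SOH}"
    partial = head + body
    checksum = sum(partial.encode()) % 256
    return f"{partial}10={checksum:03d}{SOH}"

# A heartbeat (35=0) between two hypothetical counterparties.
msg = frame(["35=0", "49=BUYSIDE", "56=BROKER"])
```

Having this logic in readable source is precisely the software-sovereignty point: a desk's engineers can inspect, fix or extend it rather than waiting on a vendor.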
In an increasingly globalised world, Shane Neal, Head Trader of Matthews International Capital Management talks to FIXGlobalTrading about the benefits, and challenges, of trading in Asia from the US.
What are some of your key issues and opportunities being a US buy-side operating in Asia?
I think the majority of the issues Matthews faces trading in Asia are faced by any firm trading the region; liquidity, volatility and anonymity, to name a few. We have a propensity for identifying smaller- and medium-size companies to invest in, so a disciplined, and often patient, approach to trading is required. Market participants in Asia are sometimes less inclined than traders in the US or Europe to trade in block form; whether due to a lack of trader discretion or because they are benchmarked to a specific participation rate or volume weighted average price (VWAP). Block specialist desks, and dark pools designed for block trading, have started to help change this attitude and demonstrate the usefulness in increasing average execution size and lowering implicit trading costs.
Matthews’ trading desk maintains live trading coverage during all Asia and US market hours from our headquarters in San Francisco. Matthews is the largest dedicated Asia investment specialist in the US, and having our traders and investment team based in a strategic location in San Francisco allows our investment team enough distance from the noise in the market to focus on our long-term investment objectives. Our open work environment encourages collaboration among traders and portfolio managers and has been vital to fostering dialogue about execution strategies as part of the investment process. When covering the “night” trading desk (Pacific Time), we are often the “eyes and ears” for the investment team regarding day-to-day market colour, and act as a hub for disseminating pertinent information and filtering much of the market noise.
The advantages of market connectivity through an execution management system (EMS), a sophisticated order management system (OMS), compliance capabilities, and direct market access make remotely trading into Asia an efficient and cost-effective operation. I’m unclear as to the value proposition for some firms to relocate their Asia trading operation to the region. The argument for improved service quality or an information advantage doesn’t hold, in my opinion; I don’t see how trading into India is made easier simply because one’s office is in Hong Kong, as opposed to the US or Europe. For smaller firms that don’t have dedicated staffing during Asian market hours, time zone differences may pose a hurdle when it comes to potential settlement issues, compliance needs or short windows of opportunity for interesting liquidity, like stock placements. Fortunately, at Matthews, we have a deep and diverse team with a range of perspectives and expertise. Our singular focus on investing in Asia means that everyone at the firm, from back office staff, IT to Compliance is committed to supporting our investment objectives in Asia.
How does the regulatory environment and differences between meeting SEC regs and Asian regs present a challenge?
We apply the “best execution” guidelines, familiar in the US, to our Asia trading efforts. As liquidity continues to fragment in Asia, the buy-side will increasingly rely on smart order routers in seeking the best price and utilising transaction cost analysis (TCA) for measurement, as well as other qualitative factors that contribute to broker execution quality.
Foreign exchange transaction costs have been a topic of discussion in the US recently. The lack of transparency in the OTC FX market has motivated the buy-side to work with their foreign exchange dealers and banks to improve time stamping, which is necessary for measuring execution quality. The benefits of negotiating FX trades and the efficiencies of multibank FX tools are increasingly important. Because FX trading in the restricted markets of Asia must be executed in compliance with local market regulations, such as proof of an underlying security requirement, limited market hours and central bank reporting, market practices have necessitated custodian and sub-custodian involvement. Improved transparency around the costs associated with these trades is important for the buy-side where competitive FX pricing is an unrealistic option.
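Accurate timestamps are what make FX execution-quality measurement possible at all: with a time on each trade, slippage can be computed against the benchmark mid prevailing at that moment. A minimal sketch, using a hypothetical helper and an invented tick series:

```python
import bisect

def fx_slippage_pips(exec_time, exec_rate, benchmark, side, pip=0.0001):
    """Slippage versus the benchmark mid prevailing at exec_time.

    benchmark is a time-sorted list of (timestamp, mid_rate) ticks;
    we take the last tick at or before the execution timestamp.
    Positive values represent a cost to the client.
    """
    times = [t for t, _ in benchmark]
    i = bisect.bisect_right(times, exec_time) - 1
    mid = benchmark[i][1]
    signed = (exec_rate - mid) if side == "buy" else (mid - exec_rate)
    return signed / pip

# Invented benchmark mids for a currency pair, keyed by timestamp.
ticks = [(0, 1.3000), (10, 1.3010), (20, 1.3020)]
```

Without the timestamp, none of this is computable, which is why improved time stamping is the buy-side's first ask of custodians and dealers.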