J.P. Morgan Asset Management’s Head of Trading, Kristian West, discusses the role technology plays on the trading desk, including how much time is devoted to technology, how tools are evaluated, and how their value is conveyed to portfolio managers.
IT and the Trader’s Diary

The technology function in many firms was previously seen as a necessary evil, whereas now it is seen as part of the core of what we do. At J.P. Morgan Asset Management, we have eight technology specialists on the trading desk, so quite a lot of time is spent working with IT by virtue of them being on the desk. The trading function has become much more technology intensive and we want it to be at the forefront of what we do. As a percentage, I probably spend around 30% of my time specifically on IT-related initiatives. We spend a significant amount of time thinking about and planning technology initiatives for their effectiveness and value, as well as best practices.
In addition to this, there are a variety of obligations that keep me away from the desk, such as regulator meetings, compliance meetings, customer presentations, broker meetings, client reviews, TCA reviews, etc. We also have to ensure we have the relevant oversight and controls over our trading practices, which requires us to focus on relevant due diligence measures. Due to MiFID and various other potential regulatory changes, we spend a lot of time making sure that we have oversight and knowledge of all available options. We then take a view of what we think will be the likely outcome and focus our attention accordingly. Again, this requires the business and technology to work very closely together.
Communication with Portfolio Managers

The central trading function is responsible for achieving the best possible outcome for the customer. There is no specific guidance from investors on how to trade or where to trade. However, from a communication perspective, there is a great deal of dialogue when we have an order on the desk. Certainly, if it is a multi-day order, then as much communication as possible is encouraged so that there is a fluid relationship between the investor and the trader, maximising our trading opportunities. More formally, we have monthly and quarterly reviews with all the CIOs and investment teams to discuss market activities, flows, broker relationships, transaction costs, etc.
We are able to optimise our performance when we understand the intentions and motives behind the orders. This allows us to adjust our level of participation in the market. In addition, we need to consider upcoming events or corporate actions, as acting quickly helps reduce market impact and slippage.
Active or Passive? When to Pick up an Order

We have a defined flow process which is focused on liquidity, so if we have flow that meets certain criteria, it is automated and no trader is involved with it. The characteristics of the strategy that is chosen can actually be defined by the investor, allowing them to specify how aggressive they want to be. With regard to the allocation of flow, the OMS knows which area of the trading desk to send orders to, and the automated engine will take orders that it feels it can execute. The automated engine then takes over and will alter its behaviour over the life of the order, depending on what is happening in the market.
A large chunk of business is fully automated, but if it does not fit defined criteria, it will go to the program trading team who will try and use their liquidity and the liquidity exposed to them to minimise market impact. If the order is too large (we have certain thresholds) or there is no natural liquidity in the market, then it will go to the single stock trading team. At that point we speak to specialists in those names. As a result of this process we have changed the way we interact with the market. We take on a lot more ownership and responsibility for the quality of execution and thus, less is left to the discretion of the broker. So whilst our execution process is not fully automated, the allocation process almost always is.
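The tiered allocation described above can be sketched as a simple routing function. This is an illustrative reconstruction, not J.P. Morgan Asset Management's actual logic; the thresholds and field names are hypothetical stand-ins for the "certain criteria" and size limits mentioned in the text.

```python
# Hypothetical sketch of a three-tier order allocation process:
# small, liquid flow is fully automated; mid-sized flow with natural
# liquidity goes to program trading; the rest goes to single-stock
# specialists. All threshold values are illustrative assumptions.

def route_order(quantity, adv, natural_liquidity):
    """Return the desk that should handle the order.

    quantity          -- order size in shares
    adv               -- average daily volume of the stock
    natural_liquidity -- whether natural contra-side liquidity exists
    """
    pct_adv = quantity / adv
    if pct_adv <= 0.05:                 # small, liquid flow: no trader involved
        return "automated_engine"
    if pct_adv <= 0.25 and natural_liquidity:
        return "program_trading"        # use exposed liquidity, minimise impact
    return "single_stock"               # too large, or no natural liquidity

print(route_order(10_000, 1_000_000, True))    # automated_engine
print(route_order(150_000, 1_000_000, True))   # program_trading
print(route_order(600_000, 1_000_000, False))  # single_stock
```

In this sketch, as in the text, the automated engine only "takes orders that it feels it can execute"; everything else escalates to a human desk.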
As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank examines the issues behind the tape, and what the buy-side wants.
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets, and that we are not penalised by misguided efforts to force volume onto lit markets, abolish dark pools, or suppress volume-enhancing activity such as certain HFT strategies. In such a low-volume, low-commission environment, I feel the stakes are especially high to get this right from day one, and not let it drag on into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as higher costs for the buy-side are ultimately passed on to the broader investing public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first route was to employ the same model as they did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second option was for a prescribed non-profit seeking entity to manage the CT, and the third option would be a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where APAs (approved publication arrangements) collect and pass on the data to Consolidated Tape Provider(s) (CTP). But, if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives

However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues contributing more than 50% of total turnover. Its solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best practices recommendations developed by FPL and other industry working groups. The best practices detail how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best practices recommendations also bring together wide representation from across the industry, and have concentrated on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them, and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge), I worry it will ironically signal the failure of the free-market approach, and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to try and come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, and by doing so, potentially remove the commercial benefits for exchanges and vendors.
Being a CTP in itself will be a tough business though, and I wonder if there’s such a thing as a commercially-viable CTP proposition: Not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT and instead getting direct fees from the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15 minute lag and get the data for free.
IOSCO Secretary General, David Wright, discusses the major factors influencing global markets, and the future of the global regulatory framework.
IOSCO is the International Organization of Securities Commissions. It brings together securities regulators the world over. We have 200 members representing the vast majority of regulators. Of all the international organisations covering financial regulation at the global level, we are the most inclusive because we have all the emerging market countries with us. This organisation has been running for 30 years and we have become the voice of the global securities regulatory community. For example, we have a series of global standards, what we call the IOSCO standards, which are the benchmarks for any securities market.
These standards form the foundation of all the reviews of financial regulation covered by the FSAP process led by the IMF. Our standards are the global benchmark. We have many interesting pieces of work right now; the multilateral memorandum of understanding, which about 90 of our members have signed, is basically a memorandum whereby all the participants agree to share information for enforcement purposes. The memorandum was used to solicit the exchange of information during the recent LIBOR scandal.
We have a lot of policy positions and we have a critical role in the whole process of global financial repair and reform. A lot of the work that you see referenced in the G20 or working with the Financial Stability Board is of IOSCO origin.
We have worked on high frequency trading, we work on shadow banking, OTC derivatives and credit rating agencies, we work on market structure, we work in accounting and auditing, enforcement and so forth. Of course, as a result of this crisis our work is particularly important. The other thing that we do, which is unlike other organisations, is that we provide technical assistance, and education and training for emerging market countries.
One of the things that we’re going to be working on is to build an IOSCO Foundation in which we seek support from the private sector to develop our members’ markets.
Should regulation be leading or following, who should decide what gets regulated, what gets left up to the market and to what extent do those forces interact?
Historically regulation has been following rather than leading. I think it’s right to say that this crisis shows that a significant number of incentives were wrong in the financial markets. I think the depth and scale of damage in this financial crisis, not in all parts of the world, but in certain parts of the world, show that serious repair is necessary, and that is the focus of the G20 and the Financial Stability Board agenda, which we are major contributors to.
The industry can’t complain, to the extent that they are primarily responsible for what happened, so there is a huge amount of work going on at the global level to try to make the financial system safer and less systemically risky. We are going to work on resolution frameworks; we’re going to work on OTC derivatives, driving more OTC transactions onto exchanges and through clearing systems. We are working intensively on making the shadow banking system, which I think has surprised everybody with its scale, estimated at $65 trillion or 25% of all global banking assets, safer and more understandable; we are looking at money market funds, securitisation, and non-banking organisations, which can build up large amounts of leverage. Those are certainly among the most important areas of work, on top, of course, of bank capital, which is set by the Basel Committee. The world has lost 15% of GDP so far; there are very serious worries of severe damage to certain economies, and so we need very strong collective efforts at the local and global levels to try to put that right, and to make the system safer and more sustainable.
Are regulators struggling to keep up in terms of spending, and does this impact their oversight?
Regulators in general around the world always feel they are under-resourced. Look at one of the better-resourced authorities, the FSA in London, which has over 3,000 people. But then compare that to the former headcount of Citibank, which was 300,000 plus, and that is just one organisation!

So when you multiply that across all the firms, big and small, that they have to regulate and supervise, regulators in general feel under-resourced. I think there are some good things happening, though, which may help them. For example, the project being developed by the FSB called the Legal Entity Identifier, a numbering system for all participants in financial markets. That, I think, would greatly simplify tracking market abuse, tracking data in markets, and looking for systemic risk building up.
In general, IT is helping regulators detect market abuse, but there are huge markets to regulate and supervise. One of the problems has arisen particularly in the big, complex, developed markets because, as has become clear, neither market participants nor the regulators and supervisors of those markets fully understood how they functioned.
We are now in year six of this crisis and, at the global regulatory level, we are still struggling our way through the shadow banking system. Shadow banking is of enormous proportions, and we are still working it out. You can’t supervise or regulate a market unless you fully understand it.
I think that one lesson of this crisis should be this: unless you fully understand not just a product but how that product interacts and interconnects with other products, how risk can be propagated and, if things start to get difficult, what the effects would be on liquidity, on credit provision and on the system as a whole, then those products and processes should be held back until we are sure we understand them.
Another area is measuring the impact of regulation; looking at the costs and benefits of regulatory change in highly interconnected complex markets, which is extremely difficult. Yet regulators should understand as far as they can the impact before calibrating final regulatory measures.
Edouard Vieillefond of the Autorité des Marchés Financiers looks at the factors that contribute to financial stability and how investor choice needs to be balanced with investor protection, market fairness and efficiency concerns.
FIXGlobal: How can the Commission and the European Securities and Markets Authority (ESMA) ‘encourage’ institutions to trade via multilateral facilities?
Edouard Vieillefond, Autorité des Marchés Financiers (AMF): Market transparency, efficiency and integrity are essential to financial stability and to ensure that financial markets continue to play their core role of financing the real economy.
In the context of the financial crisis, in 2009 the G20 leaders declared that “all standardized over-the-counter (OTC) derivative contracts should be traded on exchanges or electronic trading platforms, where appropriate, and cleared through central counterparties by end-2012 at the latest”. In order to implement these objectives, in 2012 the International Organization of Securities Commissions (IOSCO) identified some key characteristics that electronic trading platforms should fulfil in this context, amongst which were pre- and post-trade transparency and “the opportunity for platform participants to seek liquidity and trade with multiple liquidity providers within a centralised system”. We believe that this multilateral criterion, on which regulators have not yet reached consensus, is absolutely essential in defining what a trading venue is and in ensuring the real efficiency of the price formation process in financial markets.
As regards the perspective of the MiFID review, in Europe the Commission proposes an obligation for derivatives to be traded on multilateral trading venues, which shows progress in the right direction. On cash securities, unregulated trading has developed over recent years, including in the fully OTC bilateral space. The Commission’s aim of catching all these new trading spaces within a new EU regulatory framework is a positive one. However, without clearly defining the boundaries of the European trading environment, it leaves aside the possibility for new trading concepts to be developed, including bilateral ones. It also leaves aside more structural issues – such as the role that we want financial markets to play in the near future with regards to the real economy. An essential first step for legislators and regulators in Europe would therefore be to define in greater detail what the EU trading space shall consist of; and then to incentivize trading of standardized and sufficiently liquid financial instruments on genuine trading venues such as exchanges and multilateral trading facilities (MTF).
FG: Where is the balancing point between investor choice and encouragement towards certain venues?
EV: Investor choice is of course to be kept fully flexible but also, on the regulatory side, to be balanced with investor protection, market fairness and efficiency concerns.
In Europe, MiFID has led to excessive market fragmentation, despite the legitimate intention of the directive to enhance competition between exchanges and multilateral trading facilities. This approach has produced very mixed results, including no real overall cost reduction for final investors, an increase in dark trading and a decrease in the quality of pre- and post-trade transparency to the detriment of the market as a whole.
If financial markets are to remain a reference and to serve investors and the real economy, an essential step in reviewing MiFID is to ensure that orders be primarily executed on genuine trading venues. So, a clear distinction must be made between trading venues where prices are formed according to transparent, non-discretionary and publicly known principles that reflect real supply and demand (exchanges and MTFs), and the other trading spaces. To that extent, it is not possible to consider broker crossing networks (BCNs) and therefore organised trading facilities (OTFs) as equivalent to regulated markets (RMs) and MTFs as they do not offer the same degree of transparency (and hence efficiency) of the price formation process. Crossing networks should at best be considered as an intermediate way to execute transactions, for residual transactions that do not constitute addressable liquidity or with a very strict ceiling above which those BCNs should be transformed into truly multilateral MTFs.
Mizuho Securities’ Spyridon Mentzas discusses the status of the Japanese exchange merger and offers thoughts on how well the two systems will merge and the benefits investors can expect.
Compatibility

The merger of the Tokyo Stock Exchange (TSE) and Osaka Securities Exchange (OSE) is not yet finalized, but it appears they will merge at the beginning of 2013, with the details yet to be specified. The first impression is that they have nearly identical trading rules with some minor differences, such as the OSE trading until 3:10 while the TSE closes at 3:00. When the TSE decided to shorten the lunch break in November, the OSE did the same. When one of the exchanges (usually the TSE) changes the rules, the other moves in tandem: for example, changing tick sizes. If the merger does go ahead, it is likely that they will use the TSE’s cash system, arrowhead, and the OSE’s J-GATE for derivatives. Rather than running the old systems in parallel, consolidating onto one platform for each asset class will reduce costs, because they will not have to maintain two systems.
Further Industry Consolidation

The ECNs in the US enjoyed technological superiority over the classic exchanges, where the NYSE’s latency was significantly slower than Arca’s. This would have been reason enough for the TSE to consider buying a PTS, but with arrowhead’s current latency of less than 2 milliseconds (and another upgrade in the next few months targeting less than a millisecond), simply buying a PTS would not give them a noticeable advantage because the TSE and OSE are on par with the PTSs. The reason the PTSs are increasing their market share is that, unlike in the UK and US, where Reg NMS and MiFID have pushed trading towards the venue with the best price, in Japan the PTSs draw volume through decimal pricing and smaller tick sizes than the incumbents.
For example, Mizuho Financial Group might trade on the TSE at 105 yen bid, 106 yen offer. That one yen spread is close to 100 basis points, or almost one percent, whereas a PTS quotes in 0.1 yen increments. This is a major incentive for investors to buy and sell on the PTSs, with their smaller increments, to reduce market impact and trading costs. From the beginning, the regulators have not been overly concerned with the PTSs deciding to trade in decimal places and have 0.1 yen ticks. It was always up to the PTSs to decide, and the TSE could do the same. If anything, I think the new exchange would rather reduce its tick sizes than merge again.
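The spread arithmetic in this example is easy to verify. A minimal check, in which the 105.4/105.5 PTS quote is a hypothetical illustration of what a 0.1 yen tick allows:

```python
# Worked check of the quoted-spread arithmetic in the example above.
# The PTS quote below is a hypothetical illustration of a 0.1 yen tick.

def spread_bps(bid, ask):
    """Quoted spread in basis points relative to the mid price."""
    mid = (bid + ask) / 2
    return (ask - bid) / mid * 10_000

tse = spread_bps(105.0, 106.0)   # 1 yen minimum increment on the TSE
pts = spread_bps(105.4, 105.5)   # 0.1 yen increment on a PTS

print(f"TSE spread: {tse:.1f} bps")  # ~94.8 bps, "close to 100"
print(f"PTS spread: {pts:.1f} bps")  # ~9.5 bps, roughly a tenth of the cost
```

The order-of-magnitude difference in the minimum quotable spread is the incentive the text describes for routing flow to the PTSs.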
However, not all participants would be happy to see new tick sizes, for example, some of the proprietary houses or small firms that trade with retail, as altering their downstream systems to handle decimal places would be costly.
This will also create a fragmentation of liquidity in tick sizes. The bids and offers on the TSE are often thick, with something like 50 billion shares sitting on the bid side, so with 0.1 yen ticks, the average order size might move to 3 million or 1 million shares. Traders who want to buy a large lot will have to scroll up and down to find out how much they have to go up to absorb the available liquidity. I think for the traditional long-only traders, this might mean an increased scattering of liquidity. There is sufficient liquidity in the market at present; even for stocks trading at a low price – there are market makers trying to make 1% during the day. If smaller tick sizes are introduced, that liquidity will likely be scattered or disappear.
AFME’s Securities Trading Committee Chairman Stephen McGoldrick unlocks the latest MiFID proposals and looks at the rules for Organized Trading Facilities, algo trading and a consolidated tape.
Organized Trading Facilities (OTFs)

The OTF regime began life as a specific regulatory wrapper to put around broker crossing systems (which are a new mechanism for delivering an existing service). Crossing, which is almost the definition of a broker, has become highly automated. Whilst most crossing activities have not changed, other aspects of the industry were seen to require regulation, namely increased automation and the greater scope of crossing. The initial proposals outlined an umbrella category of systems called OTFs, with one category created to hold broker crossing systems and another to hold the systems for G20 commitments around derivatives trading.
When the MiFID II proposals came out at the end of 2011, the ‘umbrella’ aspect had been simplified into a structure intended to be ‘all things to all people’, which is where it has come undone. MiFID II has created a regulatory receptacle for a practice and the two things differ in shape. The broker crossing system does not fit into the receptacle that has been created for it because much of the trading is against the books of the system’s operators, which is prohibited under the current proposals.
The regulators do not want speculative, proprietary trading within these systems, but unwinding risk created by clients is both useful and risk-reducing. An opt-in mechanism, allowing traders to decide whether they want their orders traded this way, may be a solution. Conflict management of this sort is common in the financial sector, as it ensures that any discretion is not exercised against the interests of the client. Certainly, when it comes to weighing the client’s interests against those of the operator of an OTF, it is absolutely unambiguous that the client’s interests must come first. Therefore, any exercise of discretion that disadvantages the client relative to the operator is already prohibited. A formal, documented process to ensure that segregation stays in place is good, but to effectively prohibit the vast majority of trading on broker crossing systems seems to abandon the regulators’ objectives: to increase transparency and protect clients.
Furthermore, trades allowed into a broker crossing system would be instantly reported, creating post-trade transparency. The current proposals call for OTFs to be treated in the same way as Multilateral Trading Facilities (MTFs), which fosters uncertainty about the waivers for pre-trade transparency. Currently, there are clear criteria for granting a waiver to a platform: one is that orders are large in size; another is that reference prices are taken from a third-party platform. The Commission will not, however, be making the decisions about waivers; these have been handed to the European Securities and Markets Authority (ESMA) to determine. There is a danger in specifying too stringent limits for these waivers, which would create a very different landscape from that explicitly envisaged by MiFID I.
Systematic Internalisers (SIs)

Our understanding is that regulators did not want to split activity that was in an OTF into two, but rather to regulate the broker crossing systems and to remove the subjectivity of SIs. The current SI proposal is aimed at regulating automated market making by banks, so that institutions make markets by reference to market conditions, not by reference to their clients. In MiFID I, the SI regime was introduced to protect retail investors, but subsequently this seems to have changed. When the European Commission (EC) was asked by the Committee of European Securities Regulators (CESR) to clarify the rationale for an SI regime, it declined to do so. As a result there is a distinct lack of clarity regarding the intent of the SI rules. If we had a clearer vision of the direction in which the regulators wished to take the market, it would be far easier to assess whether the regulations were moving us in the right direction – or not.
Wendy Rudd of the Investment Industry Regulatory Organization of Canada (IIROC) describes the Canadian approach to circuit breakers, minimum size and increment requirements and the role of dark liquidity.
What is currently driving the regulatory policy agenda with regard to circuit breakers?

Globally, and Canada is no exception, we have seen the introduction of new rules in several areas related to the mitigation of volatility. Circuit breakers are just one of those areas. While some reforms may have been in the works already, the Flash Crash of May 2010 certainly served as a catalyst for a broader debate about market structure, trading activity and the reliability and stability of our equity trading venues.
Volatility is inevitable, so when does it become a regulatory concern?

From our perspective – and we regulate all trading activity on Canada’s three equity exchanges and eight alternative trading systems – we see it as a priority to mitigate the kind of short-term volatility that interrupts a fair and orderly market. We do not expect to handle this role alone; it is a shared responsibility that includes appropriate order handling by industry participants and consistent volatility controls at the exchange/ATS level.
What are the benefits of harmonizing circuit breaker rules with US markets?

One main advantage of a shared or complementary approach is that it limits the potential for certain kinds of regulatory arbitrage in markets that operate in the same time zone. Many Canadian-listed stocks also trade in the US, and roughly half of the dollar value traded in those shares takes place on US markets each day.
Which approaches are you considering for market-wide circuit breakers?

We are monitoring developments in the US, where regulators have proposed changes which include lower trigger thresholds calculated daily using the S&P 500 (instead of the Dow Jones Industrial Average) and shorter pauses when those thresholds are triggered. We are currently exploring options for market-wide circuit breakers, which include continuing our existing policy of harmonizing with the US, pursuing a ‘made-in-Canada’ alternative or identifying a hybrid approach that does a little of both. At this stage, we are soliciting industry feedback on the merits of these three approaches. With the help of that feedback, we expect to be able to choose the appropriate path soon. It is important to note that these kinds of circuit breakers are an important control but have traditionally acted more as insurance – they have been tripped only once in the US and Canada since being introduced in 1988.
How similar is IIROC’s new Single-Stock Circuit Breaker (SSCB) rule to the US rules?

Single-stock circuit breakers are relatively new in both jurisdictions. The US and Canada have implemented SSCBs which are similar in that a five-minute halt is triggered when a stock swings 10% within a five-minute period. Otherwise, the Canadian approach differs in several ways. For example, our SSCB does not trigger on a large price swing if a stock is trading on widely disseminated news after a formal regulatory halt.
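The shared trigger condition, a 10% swing within five minutes causing a five-minute halt, can be sketched as follows. This is an illustrative simplification: the exemptions just described (such as post-halt news trading) and the real rules' reference-price details are omitted.

```python
# Simplified sketch of a single-stock circuit breaker: halt for five
# minutes when the price swings 10% or more within a rolling
# five-minute window. Not the actual IIROC or US implementation.

from collections import deque

class SingleStockCircuitBreaker:
    WINDOW = 5 * 60      # look-back window, seconds
    THRESHOLD = 0.10     # 10% swing
    HALT = 5 * 60        # halt duration, seconds

    def __init__(self):
        self.prices = deque()      # (timestamp, price) pairs
        self.halted_until = None

    def on_trade(self, ts, price):
        """Record a trade; return True if it trips a halt."""
        if self.halted_until is not None and ts < self.halted_until:
            return False           # trading is already halted
        self.prices.append((ts, price))
        while self.prices and self.prices[0][0] < ts - self.WINDOW:
            self.prices.popleft()  # drop trades outside the window
        ref = self.prices[0][1]    # oldest price still in the window
        if abs(price - ref) / ref >= self.THRESHOLD:
            self.halted_until = ts + self.HALT
            self.prices.clear()
            return True
        return False

cb = SingleStockCircuitBreaker()
cb.on_trade(0, 100.0)
print(cb.on_trade(120, 95.0))   # 5% move within 5 min: False
print(cb.on_trade(180, 89.0))   # 11% move within 5 min: True
```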
Do you believe circuit breakers, market-wide or single-stock, have a deterrent effect on momentum trading?

We did not set out with a prescriptive approach to influence or change trading behaviour or strategy. IIROC’s circuit breaker policies were developed to provide added insurance against extraordinary short-term volatility. We intend to study the impact of any changes, and we may be able to learn more about the effect of policy changes on trading behaviour.
Richard Nelson, Head of EMEA Trading for AllianceBernstein, shares his perspectives on navigating volatility, prospects for developing exchanges, new regulation and the balance between transparency and best execution.
FIXGlobal: How much does volatility affect the way that you trade and what are you using to measure volatility on the desk?
Richard Nelson, AllianceBernstein: We use an implementation shortfall benchmark, so the longer we take to execute an order, the wider the range of possible execution outcomes. Volatility, in particular intraday volatility, increases that potential range, so you could see very good or very poor execution outcomes as a result. In reaction to that, we take a more conservative execution strategy or stretch the order out over a longer time period. And, for instance, if we get a hit on a block crossing network, we will not go in with as large a quantity as we would in a less volatile market. In that way we try to dampen down the potential effects that volatility might have on the execution outcome.
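For context, an implementation shortfall benchmark measures the slippage of the realised average fill price against the price when the order arrived on the desk; volatility widens the distribution of that slippage, which is the "range of possible execution outcomes" described above. A minimal sketch of the calculation, with illustrative numbers rather than AllianceBernstein's actual methodology:

```python
# Illustrative implementation shortfall calculation: average fill
# price versus the arrival (decision) price, in basis points.
# Sign convention: positive = cost for a buy order. Example data
# is hypothetical.

def implementation_shortfall_bps(arrival_price, fills, side="buy"):
    """fills: list of (quantity, price) executions."""
    qty = sum(q for q, _ in fills)
    avg = sum(q * p for q, p in fills) / qty
    slip = (avg - arrival_price) / arrival_price * 10_000
    return slip if side == "buy" else -slip

# A buy order arriving at 10.00, filled in two clips as the price drifts up.
fills = [(5_000, 10.02), (5_000, 10.06)]
print(f"{implementation_shortfall_bps(10.00, fills):.0f} bps")  # 40 bps
```

The longer an order is worked, the more room the price has to drift from arrival, so higher intraday volatility fattens both tails of this number, which is why a more conservative strategy can be the right response.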
FG: How is AllianceBernstein using technology to improve performance and cut costs on the trading desk?
RN: It plays quite an important part and has done so for quite a while. We are pretty lucky in that we have a team of quant trading analysts. Most of them are in New York, but we have one here on the desk in London, and they help us to analyze the changing market environment and recommend the best ways we can adapt to it. Our usage of electronic trading has increased in the last year, and we benefit from the quant trading analysts looking at the results we are achieving with our customized algorithms. We are more confident about getting good, consistent execution outcomes because they are monitoring the process and making the necessary changes to ensure the results are what we expect. This, in turn, increases the productivity of the traders I have on the desk. They can place suitable orders into these algorithms and let them run, which allows us to focus on getting better outcomes on our larger, more liquidity-demanding orders.
On top of that, as market liquidity has dropped significantly, we are trying to make sure we reach as much potential liquidity as possible, and ideally we want to do that under our own name rather than go to a broker who then goes to another venue. We believe that going directly into a pool of liquidity is better done under your own name rather than via a broker because we can then access the ‘meaty’ bits of the pool rather than the ‘froth’. We are looking into ways of doing that but one of the problems is that, potentially, you get a lot of executions from a number of different venues, which results in multiple tickets for settlement. Our goal is to access all these potential liquidity pools, yet also control our ticketing costs, which are a drag on performance for clients.
FG: Was it an intentional change to increase electronic trading or was it a byproduct?
RN: It was a little of both. Our quant trader has been with us for two years, and when he first arrived he had to sort out the data issues that exist in Europe and clean things up. Once the data integrity was sorted out, we looked at different ways of employing quantitative analyses. Having somebody here who is constantly monitoring the execution outcomes means we can proceed down this path with real confidence. As a London firm, we were a little behind in our adoption of electronic trading, but now we are in the middle of the pack in terms of usage. It makes sense from a business and productivity perspective: many orders do not need human oversight and are best handled by algorithms.