David Morgan, Marketing Director, Trading and Client Connectivity, SunGard’s capital markets business, in a Q&A on FPL’s recommended risk guidelines and the SunGard position paper “Implementing effective electronic trading risk controls”.
What is your general opinion on the FPL Guidelines?
We were very pleased to see them. Clearly, any initiative from an organisation with so much credibility, looking to promote best practices in the marketplace, must be a good thing; and, from a selfish point of view as a software vendor, those best practices need good software to support them.
This particular area of pre-trade risk management is one where we’ve been active for a long time; we feel we have some particular advantages with our well-developed products. We were very keen, when the first issue of the FPL guidelines was published in 2011, to use them as a benchmark to check whether we were covering the major items being brought to light by FPL as best practice recommendations.
So we went through that as an exercise, and we have done the same thing again with the updated 2012 guidelines, which provide more detail on derivatives-specific requirements.
So the value is that it gives you a benchmark for comparison?
Yes, it gives a basis for discussions with individual clients when looking at how the product line should be moved forward, because different points will have different importance to different clients based on the nature of their business. There are some guidelines that 99% of people were already following, at the level of fat finger checks etc. At the other extreme you’ve got some points in the guidelines which I would say very few people are doing and even fewer are doing them on a pre-trade basis, as it might be impractical to do so. Others are of a more specialist nature where it would depend on the nature of the business as to whether they are relevant or a priority. So there is quite a variety in there from the absolute vanilla to the quite exotic.
Is there anything you think the FPL guidelines missed or could have done better, given that the organisation always welcomes industry feedback? Or is there a deliberate intent to leave gaps for others to fill?
They appear to be pretty comprehensive. They are fairly prescriptive; in the second edition of the guidelines you can almost take it as your outline product specification and start writing the code; they don’t leave much to the imagination, which is a good thing. This isn’t an area where one should mess around with vague discussions.
There are a couple of minor areas that our products cover that the FPL guidelines do not. One is in strategy trading, as strategies are not easy to pre-validate. Normally buy and sell legs cover each other, so validation of the whole strategy has to be done: if you validate each leg independently you will be too restrictive. We cover this, but FPL doesn’t mention it. It’s important in many derivatives trading contexts, and for equity pairs trading.
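The point about leg-by-leg versus whole-strategy validation can be sketched roughly as follows. This is a hypothetical illustration, not SunGard’s actual implementation; the order and limit structures are invented for the example.

```python
# Hypothetical sketch: why validating each leg of a spread strategy
# independently is too restrictive. A buy leg and a sell leg largely
# offset each other, so the strategy's *net* exposure is what matters.

def leg_exposure(leg):
    """Signed notional exposure of one leg: positive for buys, negative for sells."""
    sign = 1 if leg["side"] == "buy" else -1
    return sign * leg["qty"] * leg["price"]

def validate_per_leg(legs, limit):
    """Too restrictive: every leg must individually fit within the limit."""
    return all(abs(leg_exposure(leg)) <= limit for leg in legs)

def validate_strategy(legs, limit):
    """Validate the net exposure of the whole strategy instead."""
    return abs(sum(leg_exposure(leg) for leg in legs)) <= limit

# A spread-like strategy: buy one leg, sell the other.
legs = [
    {"side": "buy",  "qty": 100, "price": 50.0},   # +5,000 exposure
    {"side": "sell", "qty": 100, "price": 49.5},   # -4,950 exposure
]

# Per-leg validation rejects the order against a 1,000 limit...
assert validate_per_leg(legs, limit=1_000) is False
# ...while strategy-level validation passes it (net exposure is only 50).
assert validate_strategy(legs, limit=1_000) is True
```

In practice the offset between legs depends on correlation and contract specifications, but the structural point holds: the check must be applied at the strategy level.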
Second is the area of alerts, where FPL doesn’t talk about their use. Before getting to the point where you have to block an order, it is often useful to alert the trader that he has reached a certain percentage of a limit: we provide this option.
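A minimal sketch of the alert-before-block idea, assuming a simple notional usage limit (the function and threshold names here are hypothetical):

```python
# Hypothetical pre-trade check with an alert threshold: warn the trader
# at a percentage of the limit before blocking at the limit itself.

def check_order(current_usage, order_value, limit, alert_pct=0.8):
    """Return 'block', 'alert', or 'ok' for a proposed order.
    alert_pct is the fraction of the limit at which a warning is raised."""
    projected = current_usage + order_value
    if projected > limit:
        return "block"
    if projected >= alert_pct * limit:
        return "alert"
    return "ok"

assert check_order(current_usage=0, order_value=500, limit=1_000) == "ok"
assert check_order(current_usage=400, order_value=450, limit=1_000) == "alert"  # 85% of limit
assert check_order(current_usage=800, order_value=300, limit=1_000) == "block"
```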
Neal Goldstein, J.P. Morgan, Timothy Furey, Goldman Sachs and Greg Wood elaborate on the forthcoming FPL Risk Subcommittee’s Risk Management Guidelines including their extension to cover DMA, symbology and futures.
While margin checks do not fit into the typical pre-trade risk check, how can traders assimilate the risk limit functionality of FIX with their margin-level risk monitoring?
Neal Goldstein, J.P. Morgan:
Pre-trade risk checks are a key element of the comprehensive risk management strategy applied to business lines like prime brokerage. For electronic trading relationships where a client is offered leverage based on some level of collateral, real-time positions for each client are usually calculated from start-of-day positions and intraday drop copies of execution reports. A typical risk control is to link the post-trade position checks with the pre-trade checks applied at the gateway. If a client’s intraday position approaches a level that exceeds the pre-arranged leverage or margin agreements, the post-trade system can send a cut-off signal to the pre-trade gateway. The client would then be allowed to liquidate the position to reduce the long/short positions, but not go any further long or short.
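The post-trade-to-pre-trade link described here can be sketched as a “liquidate only” mode at the gateway: once the cut-off signal arrives, orders that shrink the position pass and orders that extend it are rejected. This is an illustrative sketch under that assumption, not any bank’s actual gateway logic.

```python
# Hypothetical gateway check in "liquidate only" mode: after a margin
# cut-off signal, only position-reducing orders are allowed through.

def gateway_allows(order_side, order_qty, position, liquidate_only):
    """position is the signed net position (+ long / - short)."""
    if not liquidate_only:
        return True
    signed_qty = order_qty if order_side == "buy" else -order_qty
    new_position = position + signed_qty
    # Allowed only if the order reduces the absolute size of the position.
    return abs(new_position) < abs(position)

# Client is long 1,000 shares and has tripped the margin threshold:
assert gateway_allows("sell", 400, position=1_000, liquidate_only=True) is True   # reduces
assert gateway_allows("buy", 100, position=1_000, liquidate_only=True) is False   # extends
# Before the cut-off signal, both directions pass:
assert gateway_allows("buy", 100, position=1_000, liquidate_only=False) is True
```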
The basic definition of DMA trading is that brokers provide access to a venue in the most efficient and effective way possible. What can brokers do to ensure they do not miss client risk limits, internal counterparty checks, rule 15c3-5 requirements, etc while maintaining speed of access?
Timothy Furey, Goldman Sachs:
Whether using algorithms, smart order routing and/or DMA to access the market, it is important to make sure that the rules are optimized and that automated testing and checkout processes are in place to verify that they are working. Appropriate risk controls are a key part of execution and are baked into the process. With all the advances in technology, development teams have the ability not only to better optimize the execution path for speed and efficiency, but also to provide benefits like automated testing to check that controls are functioning properly.
How important is symbology validation to equity risk controls? Can better technology remove fat finger errors from trading?
Greg Wood: Symbology validation is very important to any type of electronic order flow since the broker must clearly identify the instrument being traded by the client. An erroneous validation of a symbol could have serious repercussions in how the order is executed in the market, including inadvertent disruption to the market. One of the key rules of engagement when a broker certifies a FIX connection with a client or vendor is for both parties to agree what symbology is being used on the session and then not to deviate from that without a subsequent recertification.
Risk management technology is definitely evolving alongside trading technology to provide better controls for the way people are trading now. A simple fat finger check can prevent an inadvertently large order being sent direct to the market. However, clients are increasingly using algos to trade large orders over a longer duration, or using different types of interaction with the market. In this situation the fat finger check is deliberately large to allow the order to be submitted to the algo. The algo then needs to assess whether the parameters of the order - instrument, aggression, duration, time of day, etc. - are suitable for the size of the order. If a large order has parameters that are too aggressive relative to the average daily volume of the instrument and the desired timeframe for execution, then the algo should either reject or pause the order to avoid impacting the market. If this happens, the broker and client should discuss how to adjust the parameters of the order.
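The two layers described here can be sketched as a coarse fat finger limit at order entry followed by an algo-level sanity check of order size against average daily volume (ADV) for the requested duration. The thresholds and the participation-rate approach below are illustrative assumptions, not Deutsche Bank’s actual parameters.

```python
# Hypothetical two-layer check: a deliberately generous fat finger limit,
# then an algo-level check of order size vs. ADV over the execution window.

FAT_FINGER_MAX_QTY = 5_000_000  # deliberately large, so big parent orders reach the algo

def fat_finger_ok(order_qty):
    return order_qty <= FAT_FINGER_MAX_QTY

def algo_check(order_qty, adv, duration_days, max_pov=0.25):
    """Pause if the order implies more than max_pov participation of ADV
    over the desired execution window; otherwise accept."""
    capacity = adv * duration_days * max_pov
    return "accept" if order_qty <= capacity else "pause"

# A 1M-share order in a stock trading 2M shares/day:
qty, adv = 1_000_000, 2_000_000
assert fat_finger_ok(qty)                                  # passes the coarse check
assert algo_check(qty, adv, duration_days=1) == "pause"    # 50% of ADV in a day: too aggressive
assert algo_check(qty, adv, duration_days=4) == "accept"   # ~12.5% of ADV per day: acceptable
```

A paused order would then go back to the broker and client for a discussion about parameters, as described above.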
Fabricio Oliveira, Head of Risk Management at Mirae Asset Global Investments Brazil, discusses his approach to pre-trade risk controls and how local market structure influences the occurrence of risk.
Market Open
At Mirae we do much of our trading with offshore entities. For example, we have funds that are administered in Hong Kong, Luxembourg, Brazil, the US and Korea, and this geographical disparity creates operational risk. Differences in settlement price, currency and the timing of financial transfers are all aspects that must be considered when using offshore funds. The ability to settle a US trade in the US and not in another time zone is also important. This is particularly true of Hong Kong, as our time difference is a huge barrier to trades in Asia. It is almost impossible to book these trades in Hong Kong even though our traders here see the opportunity to do so.
When I focus on the risks for open trading, the settlement movement is an important concern. Whether you are focused on market risk or liquidity risk, all risks need to be monitored, so you can have a clear view of what potential risks lie ahead.
High Frequency Trading
There is much discussion in the industry and at conferences about high frequency trading (HFT) in Brazil, but we are not yet ready for high frequency strategies. The industry is starting to see how HFT works, but liquidity in Brazil across asset classes is insufficient to support these strategies. There are approximately 300 listed companies in equities and about half that number in derivatives, whether in bonds or yield curves or currency. The local players who run HFT strategies focus on the few stocks and derivatives with liquidity, which does not give them many options to find alpha over short periods. It will be interesting to see how it works in North America and Europe and for us to consider what might be possible in Brazil. For now, I do not see many players in HFT and I can count on one hand the number of funds using HFT.
Our pre-trade risk controls have not had to account for HFT volumes and speeds yet, so we have focused more on core control mechanisms. We have some vendors who can produce risk controls for the current liquidity. If we have liquid stocks, derivatives or OTC products, then we can define our own risk controls. Fund houses with hundreds of funds will have difficulty in applying those controls to the trading systems, but as Mirae mainly focuses on equities, our implementation burden is much lower. Today, all our pre-trade risk controls are done in real-time, including automatic limits. Beyond this, we still have a layer of control in the trader on the desk.
Working with Brokers
When discussing risk controls, it is important to mention that in Brazil all brokers employ significant risk controls on their side, to prevent them from taking on more risk than they can carry. When the brokers start to trade with the exchange, the exchange provides them with risk guidelines and limits. As clients of the sell-side, buy-side desks cannot exceed their assigned broker limits, and their orders will be automatically paused if the broker’s limits are reached. The broker’s risk controls are complete; they will not take on risk. As a result, their clients do not have much help in implementing their own controls. This is exacerbated because a fund house may trade with many brokers – in our case we deal with 35. It is impossible to implement one solution per broker, so we rely on our OMS provider to connect with the brokers and to match up risk controls.
Brian Ross of FIX Flyer talks to Buy- and Sell-side presenting the latest lessons on high frequency trading and algorithms from the Indian market.
India’s capital markets are experiencing increased interest from local and global firms and new rules are set to attract high frequency trading (HFT).
The capital markets regulator, the Securities and Exchange Board of India (SEBI), the exchanges, brokers and many investors are in favor of abolishing the Securities Transaction Tax (STT). Eliminating STT would have a positive impact on market turnover and help high frequency traders be more profitable, while narrower spreads should drive up trading volumes.
STT has been levied since 2004 on all transactions, domestic or foreign, in both the equities and derivatives markets. At the time, the purpose was to generate tax revenue and to protect market integrity by slowing down the pace of technological advancement by a few well-funded players. Revenue generated by STT amounted to around USD 1.5bn in 2011.
It is widely expected that STT will be eliminated this spring, bringing new opportunities for HFT in one of the world’s biggest and fastest growing capital markets.
To better understand the situation, we asked five panelists who are leading the charge in HFT in India to share their insights with us.
You never forget your first algo. When you first got involved in algorithmic trading, what problem were you trying to solve? What was your decision process, and what technologies did you use?
Sanjay Rawal, Open Futures: We started off using algos for trading purposes and the first one we built was for a specific type of arbitrage that was getting difficult to run using manual input. We used third party software for the exchange connectivity and wrote our algo in C#.
Vishal Rana, IIFL Capital: My first experience with HFT was trying to create a straight-arb model on a real-time basis. Although it was a simple model, the most difficult thing was to clean the data. We got the data dumps and it took a lot of effort to clean it. Most of the coding was done using C++.
Rohit Dhundele, Edelweiss: At the outset of the project, the easiest yet most important task was gathering the business intelligence to be subsequently converted into algorithms. Some of the more intricate decisions were the selection of order, execution and risk management systems to ensure a stable backbone for the platform. Other equally important criteria were a flexible programming environment and a user-friendly interface. To achieve these objectives, we had to decide whether to build or buy this technology.
At Edelweiss, we realized relatively quickly that there is a sweet spot between the two extremes of in-house vs. outsourced solutions. We have since been following this model – combining the best of both worlds, which has helped us deliver customized solutions within acceptable turnaround times, whilst still protecting our IP.
Sanjay Awasthi, Eastspring Investments (Singapore) Limited: In the Indian markets, propelled as they are by rapid information dissemination, anonymity becomes a key factor in efficient trading. It was this need for anonymity that drew us towards algorithmic trading. Continued use and familiarity led to further benefits by way of better execution control. Algorithmic trading has thus become an important part of our execution arsenal.
Chetan Pandya, Kotak Securities:
The first algo I worked on and put into production was calendar rolls for derivatives. Our trading desk had huge positions to roll from the current month to the next, and manual execution was leading to slippages and, at times, erroneous executions. Using NSE’s two-legged order type, we created a simple algorithm that would roll the position at the desired spread.
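The core decision in such a roll can be sketched very simply: fire the two-legged order only when the quoted roll cost is at or better than the desired spread. This is a hypothetical illustration of the logic described, not Kotak’s actual algorithm.

```python
# Hypothetical sketch of a calendar-roll trigger: sell the front month at
# its bid, buy the next month at its ask, only when the spread is acceptable.

def roll_spread(front_bid, next_ask):
    """Cost of rolling a long position: sell front at bid, buy next at ask."""
    return next_ask - front_bid

def should_roll(front_bid, next_ask, desired_spread):
    """Fire the two-legged roll order when the spread is at or below target."""
    return roll_spread(front_bid, next_ask) <= desired_spread

assert should_roll(front_bid=100.00, next_ask=100.40, desired_spread=0.50) is True
assert should_roll(front_bid=100.00, next_ask=100.70, desired_spread=0.50) is False
```

Submitting both legs as a single exchange-supported two-legged order, as described above, removes the legging risk of one side filling without the other.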
My first observation regarding algorithmic trading was the difference between an individual trading manually and a machine trading automatically. There are so many things that come naturally to a human being but need to be spelled out for the machine. Sometimes I wonder whether an algorithm can ever fully replace a human being: there are nuances of the market, and events that lead to erratic market behaviour, that cannot be fully programmed for.
Also, I had to ensure that there was no room for error when trading on an algo platform, primarily because of the sheer number of orders it can process in a single second and the inability to spot something going awry with the naked eye at that speed. Hence, I also had to think about the risk management capabilities of the algorithmic platform, while ensuring that risk management does not lead to inefficient execution due to latency.
In terms of technology, we were limited to applications that conformed to our market regulations. Once we had the base framework and architecture ready, we integrated it rapidly with our existing applications for order routing and downstream workflows.
CIBC’s Thomas Kalafatis maps out the new CSA rules regarding direct electronic access and suggests its potential effects on brokers and institutional traders.
Are the updated Direct Electronic Access (DEA) requirements a response to patterns endemic to Canada or a response to patterns observed elsewhere? Given the existing Investment Industry Regulatory Organization of Canada (IIROC) rules and the timing of the Canadian Securities Administrators’ (CSA) DEA rule proposal, it is fair to say that the rules proposed by our regulators are intended to maintain consistency with changes in other jurisdictions and prevent regulatory arbitrage. We do not believe the rules are the result of a specific effort to solve a localized Canadian problem; rather, they are a preventative measure to ensure that structural issues which have arisen elsewhere do not take root in Canada.
The issues around direct electronic access raised in the United States (who is accessing marketplaces directly, and how they are ensuring automated systems will not malfunction) are less of a concern in Canada. TMX rule 2-501 limits who is eligible to receive DEA access, restricting DEA to well-capitalized firms, or firms that are registered and regulated in certain other jurisdictions.
IIROC Notice 09-0081 addresses how automated systems should be managed to mitigate the risk of malfunctions. It requires brokers to manage the risk of electronic trading by clients in the same way that they manage the risk of their own electronic trading. This includes ensuring that automated risk filters are in place, that order flow from an automated system can be interrupted or switched off by the broker, and that strategies are tested prior to being deployed to market. These basic, principles-based protections have been effective at mitigating risk in Canada since well before the wave of automation hit our markets in 2008.
The proposed DEA rules are a move away from the IOSCO principles-based approach that has traditionally been taken in Canada, towards a more prescriptive regime closer to the Rule 15c3-5 requirements introduced by the SEC in the United States this year. This builds consistency between the Canadian and American jurisdictions, which are so closely intertwined.
Automated pre-trade risk filters are in place for many broker-dealers. How difficult will this regulation be to implement? Broker-dealers will need to monitor the proposed rules closely, particularly with regard to their Sponsored Direct Market Access (SDMA) clients. These clients have their own sophisticated automated risk management systems in place – as required by UMIR rules and, more importantly, as a result of their own risk aversion. They connect directly to exchanges to minimize latency. The DEA rule proposes to change this, in parallel to Rule 15c3-5 in the US, in that brokers will need to have “direct and exclusive control” over the risk filters on client flow; this means that a duplicative set of filters operated by the broker will have to be put in place.
In this case, Canadian brokers benefit from the earlier adoption of Rule 15c3-5 in the United States, where various technologies have been developed to meet SEC rules that went into effect in the summer of 2011. Depending on the needs of its client base, a Canadian broker can choose between several types of risk filter offerings operating in a latency range from the low milliseconds to the low microseconds. The only differentiator is cost, with a significant premium on the single-digit microsecond, lowest-latency offerings.
Generally, it is not economic for a Canadian broker to develop ultra-low latency solutions in-house, and the Canadian broker community benefits from the availability of third party technologies developed to meet the US rules that came into effect earlier this year.
AllianceBernstein’s Global Head of Quantitative Trading, Dmitry Rakhlin, discusses the problem of fragmentation and what makes a good aggregator, along with Ned Phillips of Chi-East, Greg Lee of Deutsche Bank, Steve Grob of Fidessa and Instinet’s Glenn Lesko.
Dmitry Rakhlin, AllianceBernstein
How does aggregation improve trading and best execution? Institutional traders usually demand (remove) liquidity from the markets, which in turn creates market impact. Being able to interact with aggregated liquidity (i.e. all available liquidity) lowers this market impact. Aggregated liquidity also gives a trader the ability to interact with many more liquidity sources, randomizing the way the liquidity is taken from the market. This decreases the amount of information leakage and protects the trade from being exploited by predatory strategies.
Does aggregation spell the end of fragmented markets? No. The US equity market is highly fragmented, yet all liquidity centers are interconnected, which allows traders to build various aggregator strategies. No doubt, there is cost and complexity associated with this. Fragmentation also introduces so called latency arbitrage and a potential increase in information leakage (the information leakage can be drastically reduced by using appropriate trading strategies).
The positive aspect of fragmentation is that it creates rich market microstructure (traditional exchanges and exchanges with inverted fee structures, block crossing networks, auctions, conditional order types, aggregators of retail liquidity, etc.). These choices give the buy-side the ability to match their investment strategies to the appropriate liquidity sources and ultimately benefit by being able to trade more nimbly and at lower cost.
From your perspective, is aggregation about greater access to liquidity or reducing trading costs? Both.
Ned Phillips, Chi-East
How does aggregation improve trading and best execution?
A good aggregator brings order to fragmented markets by concentrating order flows, and liquidity, from a large number of matching venues. It is a tool that allows all participants to access multiple venues from one easily accessible point, reducing the technology costs and other difficulties involved in monitoring different trading venues.
Does aggregation spell the end of fragmented markets?
No. Even if one good aggregator attracts a majority of trading flows, it would not represent a throw-back to a single exchange monopoly. Aggregators are there to make the process of using multiple markets easier and more efficient and can only exist as long as participants have a choice of matching venues.
What are the risks inherent in aggregation and how can an aggregator ensure improved execution?
Theoretically there is a risk that an aggregator will be so successful that it monopolises the market but competition and risk management would keep things in check.
An aggregator ensures improved execution by concentrating liquidity which reduces spreads and improves execution.
Brown Brothers Harriman’s Garvin Young explains the decision to adopt a Software as a Service (SaaS) trading system in lieu of traditional on-site architecture.
In its capacity as a global custodian, Brown Brothers Harriman (BBH) takes a holistic view of its trade execution process. This view includes front-end connectivity and execution, all the way through to settlement. The firm continually assesses the current and future needs of its clients to ensure that its products and solutions fully meet their requirements.
Searching for a Cutting-Edge Solution
In late 2010, given the rapidly changing landscape of the brokerage industry with respect to connectivity, regulation, algorithms and back-office efficiencies, we initiated an RFP process to identify an order management system that could best position our clients for the future.
Specific details of the project included a buy versus build analysis, cost/resource considerations, client retention rates, etc. Given the timeframe that we had set for implementation, it became clear that a build-from-scratch solution would have been both costly and impractical. Such a solution would have required BBH to add staff, incur IT spend, expand occupancy space, and bear significant ongoing maintenance costs.
Through the RFP process, we looked for a provider with a reputation for stability. In an environment of microsecond execution, an OMS must be reliable, stable and flexible. The ability to customize the solution was also important. The solution had to include a robust front-end while also keeping with BBH’s requirements of high-quality middle- and back-office processing. Our integrated execution and settlement product required a solution provider with strong expertise around maintaining high straight-through processing levels and real-time client reporting.
As a privately held organization, BBH maintains a high focus on risk management, which meant that a strong track record of regulatory reporting and risk management tools was also critical. The firm’s global and sophisticated client base has complex connectivity requirements, such as Reuters, Bloomberg, ULLINK, SWIFT and virtual private networks (VPNs), to name a few. Further, its clients have specific FIX tag requirements and run multiple versions of the FIX Protocol. We required a solution that was able to meet all these demands.
Identifying the Right Provider
BBH narrowed the search to six top providers of equity execution platforms and went on to select Fidessa. BBH’s Investor Services clients recognize us as a leader in technology solutions, with the capability of offering them a sustainable, long-term and flexible solution that allows them to access new markets to grow their business. We determined that Fidessa’s platform aligned well with these needs and offered an ideal complement to our existing proprietary solutions.
Put three men and FIXGlobal’s Edward Mangles around a table, serve them lunch and let the tapes roll. FIXGlobal listened in on a conversation that ranged from regulators to risk, and from FX to FIX.
Edward: In defense of the regulator … how should they know what’s going on when neither the sell nor buy-side seem to know?
Vincent: Recent events have shown the divide between financial market participants and the regulator. For example, the Lehman minibond issue has forced a strong dialogue between the regulator and, in particular, the broker side. But the engagement is slow.
Kent: Retail brokers tend to have a strong voice here in Hong Kong and over the years have developed a strong working relationship with the regulators. Local brokers can at times be pretty outspoken and have proven on many occasions to be an effective lobbying group. From our perspective international brokers tend to be less visible in some of these debates. We see certain common characteristics across Asia where understandably there is a good deal of focus on protecting the retail investor given the high retail investor participation in many of the stock markets in Asia including Taiwan and Korea. The challenge has certainly been in the retail space where there is an overlap of regulatory responsibility in approving and offering products.
Edward: Are we asking the impossible of the regulator to create the same rule book for retail and institutional investors?
Kent: The general principle is that retail investors are less savvy and experienced, and regulations need to be explicit. There is a general assumption that, as professional investors, institutions can operate with greater flexibility since they can understand the risks in a more sophisticated way. Taking account of this framework, it will not be possible to standardize for both types of investor. The risk is that setting minimum requirements to protect the retail investor may not suit the way business is transacted at an institutional level. Here we advocate consultation and support stronger trade associations.
Vincent: I don’t think you can realistically expect the same regulations for retail traders as for big institutional investors. That’s a utopia that’s never going to exist. These two groups of investors have different needs. Many regulators – in Europe for example and Luxembourg in particular with their efforts to push through the UCITS 4 protocol – understand that you need different protocols for retail investors.
Kent: But Vincent, every investor has the same goal: making money. It’s only the detailed requirements that are different.
Gerry: There’s certainly a larger burden on the big firms to uphold ethical, legal and fiduciary standards.
Kent: Yes. Retail investors don’t generally have the same constraints on their activities. Institutional investors need a more developed investment process and must ensure fair treatment across all clients regardless of size and fees. Institutional investors will undoubtedly be looking at different investor objectives – for one, they need to be able to implement their strategies in much greater volumes, and in scale, for example.
Edward: How about the role of regulators in curtailing short-selling in many markets? Knee jerk or long-term strategy?
Kent: I’d like to see the ability to short-sell fully resumed as soon as practically possible. We’re now in a situation where some markets have suspended it, and some are allowing it again. This is not ideal. I certainly see the temporary prohibition as a knee-jerk reaction and understandable given the groundswell of public opinion and corporate pressure as the financial crisis took hold – not all of this opinion was entirely rational. In fact, short-selling restrictions can reduce volumes for trading in the markets overall. For one, we have a 130-30 fund. So in this fund, if we’re limited in the number of attractive long-short pair trades we can put on then we’ll just end up trading less. So it’s business that never happens and the unknown would-be client on the other side of our trade – whether they’re institutional or retail – through the exchange, never gets to take advantage of the liquidity. What we need is a greater understanding of how shorting operates. There is a lot of misconception around this issue.
Gerry: I see the value and merit in allowing short selling in varied markets. In markets that don’t allow it, the regulators need to develop this functionality. It encourages more liquidity and volume. But I do understand that in the current environment the regulators have little choice. We won’t know the full impact until later on.
Vincent: The problem is that there’s no consistency among the regulators. Some only forbid short selling on financials. It’s a disruption to competitiveness between various sectors.
Kent: Yes. And not being able to short will reduce derivatives trading. The fact is, a lot of the shorting that goes on isn’t just one-way, but part of a strategy with a ‘long’ component to it as well. And funds that relied on the little performance boost from securities lending fees have also seen their returns diminished. The equity finance desks at the brokers have seen a real drop-off in trade volumes because of this.
Vincent: Now the regulators are trying to encourage investors to buy again in a bear market – and there’s a lot of inconsistency between the messages they’re sending now and what they were telling us six months ago.