Neal Goldstein of J.P. Morgan, Timothy Furey of Goldman Sachs and Greg Wood elaborate on the forthcoming FPL Risk Subcommittee’s Risk Management Guidelines, including their extension to cover DMA, symbology and futures.
While margin checks do not fit into the typical pre-trade risk check, how can traders assimilate the risk limit functionality of FIX with their margin-level risk monitoring?
Neal Goldstein, J.P. Morgan:
Pre-trade risk checks are a key element of the comprehensive risk management strategy applied to business lines like prime brokerage. For electronic trading relationships where a client is offered leverage based on some level of collateral, real-time positions for each client are usually calculated from start-of-day positions and intra-day drop copies of execution reports. A typical risk control is to link the post-trade position checks with the pre-trade checks applied at the gateway. If a client’s intra-day position approaches a level that exceeds the pre-arranged leverage or margin agreements, the post-trade system can send a cut-off signal to the pre-trade gateway. The client would then be allowed to trade only to reduce existing long/short positions, but not to go any further long or short.
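The cut-off mechanism described above can be sketched as follows. This is an illustrative sketch only, not any broker's actual implementation; the class name, the 95% trip point and the notional-based limit are all assumptions.

```python
# Hypothetical sketch: a post-trade monitor flips a client into
# "liquidate-only" mode once intra-day exposure approaches the agreed
# margin limit, and the pre-trade gateway consults that state.

class ClientRiskState:
    def __init__(self, margin_limit: float, cutoff_ratio: float = 0.95):
        self.margin_limit = margin_limit   # max allowed gross exposure (assumed notional)
        self.cutoff_ratio = cutoff_ratio   # trip point as a fraction of the limit
        self.position = 0.0                # signed net position, built from drop copies
        self.liquidate_only = False

    def on_drop_copy(self, signed_fill_notional: float) -> None:
        """Update the intra-day position from an execution-report drop copy."""
        self.position += signed_fill_notional
        if abs(self.position) >= self.cutoff_ratio * self.margin_limit:
            self.liquidate_only = True     # cut-off signal to the pre-trade gateway

    def pre_trade_check(self, signed_order_notional: float) -> bool:
        """Accept an order only if it would not breach the limit; in
        liquidate-only mode, accept only orders that reduce the position."""
        projected = self.position + signed_order_notional
        if self.liquidate_only:
            return abs(projected) < abs(self.position)
        return abs(projected) <= self.margin_limit
```

In this sketch the gateway never needs to know the margin terms itself; it only honours the liquidate-only flag set by the post-trade system.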
The basic definition of DMA trading is that brokers provide access to a venue in the most efficient and effective way possible. What can brokers do to ensure they do not overlook client risk limits, internal counterparty checks, Rule 15c3-5 requirements, etc., while maintaining speed of access?
Timothy Furey, Goldman Sachs:
Whether using algorithms, smart order routing and/or DMA to access the market, it is important to make sure that the rules are optimized and that automated testing and checkout processes are in place to verify that they are working. Appropriate risk controls are a key part of execution and are baked into the process. With all the advances in technology, development teams have the ability not only to better optimize the execution path for speed and efficiency, but also to provide benefits like automated testing to check that controls are functioning properly.
How important is symbology validation to equity risk controls? Can better technology remove fat finger errors from trading?
Greg Wood: Symbology validation is very important to any type of electronic order flow since the broker must clearly identify the instrument being traded by the client. An erroneous validation of a symbol could have serious repercussions in how the order is executed in the market, including inadvertent disruption to the market. One of the key rules of engagement when a broker certifies a FIX connection with a client or vendor is for both parties to agree what symbology is being used on the session and then not to deviate from that without a subsequent recertification.
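As an illustration of that rule of engagement, a gateway might pin each FIX session to the SecurityIDSource (tag 22) value agreed at certification and reject any order that deviates. The session names and mapping below are hypothetical; the tag 22 values (4 = ISIN, 5 = RIC) are standard FIX.

```python
# Illustrative sketch, not a broker's actual implementation: each session
# is locked to the symbology certified for it, and any deviation is rejected
# until a recertification updates the mapping.

# Hypothetical mapping of session -> certified SecurityIDSource (FIX tag 22)
CERTIFIED_SYMBOLOGY = {
    "CLIENT_A": "4",   # ISIN
    "CLIENT_B": "5",   # RIC
}

def validate_symbology(session: str, security_id_source: str) -> bool:
    """Accept an order only if its symbology matches the session's certification."""
    return CERTIFIED_SYMBOLOGY.get(session) == security_id_source
```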
Risk management technology is definitely evolving alongside trading technology to provide better controls for the way people are trading now. A simple fat finger check can prevent an inadvertently large order being sent direct to the market. However, clients are increasingly using algos to trade large orders over a longer duration, or using different types of interaction with the market. In this situation the fat finger limit is deliberately set high so that the parent order can be submitted to the algo. The algo then needs to assess whether the parameters of the order (instrument, aggression, duration, time of day, etc.) are suitable for its size. If a large order has parameters that are too aggressive relative to the average daily volume of the instrument and the desired timeframe for execution, then the algo should either reject or pause the order to avoid impacting the market. If this happens, the broker and client should discuss how to adjust the parameters of the order.
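The second-stage check Wood describes might look like the following sketch: the fat-finger limit has already let the parent order through, and the algo then tests whether the implied participation rate is plausible for the schedule. The thresholds and the pause/reject split are illustrative assumptions, not any firm's actual limits.

```python
# Hypothetical sketch of an algo-side suitability check: compare the order's
# implied participation rate against the volume expected during its schedule.

def assess_algo_order(quantity: int, adv: int, duration_frac: float,
                      max_participation: float = 0.25) -> str:
    """Return 'accept', 'pause' or 'reject' based on implied participation.

    adv: average daily volume of the instrument.
    duration_frac: fraction of the trading day the order is scheduled over.
    """
    expected_volume = adv * duration_frac      # volume available in the window
    if expected_volume <= 0:
        return "reject"
    participation = quantity / expected_volume
    if participation <= max_participation:
        return "accept"
    if participation <= 2 * max_participation:
        return "pause"                         # flag for broker/client discussion
    return "reject"
```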
FIXGlobal speaks with the buy-side in China about the prospects for China’s equity market, IPOs and how new technology and competition will improve domestic trading.
GDP and Trading Volumes The property market may continue to cool down in 2012, but it is not reasonable to expect the Chinese economy to shrink significantly this year, because the Chinese government will allocate resources to other sectors of the economy. Because of the Lunar New Year effect, the Chinese Consumer Price Index (CPI) appears to be heading upwards. Based on adjusted CPI, the property asset bubble is a political issue rather than an economic one. The Chinese government has pledged to continue monitoring property prices, and its strong fiscal position gives it various options for addressing the situation. Trading volumes are expected to be much the same as in 2011 and, once the seasonal effect passes, inflation should head downwards.
Major Driver: IPOs or Economics? There has been a rapid increase in the number of IPOs in China, but the regulators are questioning the quality of some of the IPO companies. Of those companies newly listed in 2011, valuation declined quite significantly. Investors used to think an IPO was like a lottery – buying new shares virtually guaranteed a profit. Many investors did not consider the actual valuation and quality of the company, and many are now realizing that not all investments are worth their list price.
The Chinese equity markets are in a transition stage; they are moving from being somewhat amateur to being much more economically rational and investor-driven. There were instances of listed companies that changed industries after the IPO (often moving into property development) and occasionally even changed the name of the company, leaving investors uncertain about their strategy and focus.
Listed companies used to have considerable power, but the market is changing in a positive direction. However, we do not know how quickly the market will become transparent and trustworthy. The regulators, media and institutional investors are now more serious about issues of valuation, transparency, corporate governance, etc. The regulators should consider increasing Qualified Foreign Institutional Investor (QFII) quotas and improving the dissemination of information to investors in order to set a good example in the domestic market.
A primary focus of the Chinese Securities Regulatory Commission (CSRC) this year is insider trading. Addressing this matter will improve the quality of listed companies and give investors greater protection. The regulators are working on improving access to information for investors and institutional funds will benefit significantly from this transparency. Regulators are concerned with addressing both the difficulty of access to information and the quality of information about IPOs, and it is quite likely that they will be able to improve both aspects.
Applying New Technology The biggest technology upgrade implemented in the past six months has been algorithmic trading. Most of the Chinese buy-side use their brokers’ algos, but in China domestic mutual funds are not allowed to route orders to brokers. Many dealing desks have therefore installed the brokers’ algo engines on their own side: for every algo they choose, the order goes through their own server and on to the exchange. In this way, dealers achieve efficiency in their algo usage because they do not route through a broker; as a dealer, they are almost their own broker. Algo trading also provides the buy-side with more precise post-trade analysis; specifically, the ability to analyze how much alpha has been captured and the transaction costs involved.
The primary benchmark used by most Chinese buy-side traders is Implementation Shortfall (IS), which is used to generate information to help the fund manager improve their investment strategies. For example, it might provide data about the delay cost created by an investment decision made an hour after the market opens, showing the fund manager that if the decision had been made earlier they could have saved a certain amount on the investment.
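The delay-cost example above can be illustrated with a simplified IS calculation. The split into delay cost (decision to arrival) and trading cost (arrival to average fill) is a common convention rather than something prescribed in the text, and the prices in the test are made up.

```python
# Simplified illustration of Implementation Shortfall decomposition.
# All shortfalls are expressed in basis points relative to the decision price.

def implementation_shortfall_bps(decision_px: float, arrival_px: float,
                                 avg_fill_px: float, side: str = "buy"):
    """Return (total, delay, trading) shortfall in bps.

    delay:   cost of the gap between the investment decision and order arrival.
    trading: cost incurred while working the order in the market.
    """
    sign = 1 if side == "buy" else -1
    delay = sign * (arrival_px - decision_px) / decision_px * 1e4
    trading = sign * (avg_fill_px - arrival_px) / decision_px * 1e4
    return delay + trading, delay, trading
```

A fund manager who decided at 10.00 but whose order arrived at 10.05 an hour later can see directly that roughly 50 bps of the total shortfall was delay cost, i.e. savable by deciding earlier.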
Senrigan’s Head of Trading, John Tompkins, and RBS’ Andrew Freyre-Sanders discuss the way event based funds use liquidity and the effect of ID markets in Asia.
Andrew Freyre-Sanders, RBS: What would you say Senrigan is known for among Asia hedge funds?
John Tompkins, Senrigan: What we are most known for now is being an event-driven fund that is entirely based out of Asia. Nick Taylor founded Senrigan in 2009, and he is known for doing event-driven trading and has been very successful at it. Nick was at Goldman Sachs and Credit Suisse, where he ran Modal Capital Partners for nine years before going to Citadel with his team. Senrigan’s capital raising and first-year metrics made the first two years a success.
AFS: I know you trade in the US and Europe as well, so is the global fund entirely based out of Asia?
JT: The entire firm is based in Hong Kong, although we have some analysts who spend extended periods of time in the regions of focus. If we do any US and European trading, it always has an Asian bent to it; for example, a UK or European listed company that has a large percentage of its business located in Asia. A few examples are Renault-Nissan, all the Chinese Depository Receipts (DRs) in the US and some Canadian companies doing M&A into Australia.
AFS: Event driven funds require quick access to liquidity. How does the type of deal or event catalyst affect the relative weighting of these items?
JT: The exchanges and companies are smarter, so they generally halt or suspend the names coming into the announcement, and then you have a short window until a given stock starts to trade up towards the terms. Any reasonably-sized fund is not going to be able to get anything done in that time period. After the event, the main concern is your targeted rate of return for the particular deal, which is impacted by the closing timeframe, surrounding risk, regulatory approval, dividend payments, etc, and you set levels where you want to be involved.
Traditionally safe deals with very tight spreads are viewed as the simplest way to risk-reduce, so people take those off and we give liquidity then because we are comfortable with what we are taking on. A lot of people think about the event as just the announcement on the day, but it is actually the time between when you see it and the range gets set. Only if it closes sporadically do you need access to greater liquidity; most of the time, you just need to be in touch with providers rather than have direct access.
AFS: From a trading perspective, once a deal is gone, it is not about that deal. The only speed liquidity advantage is in having systems that can take advantage of the spreads when they may be moving around a certain level. Is that the case for you?
JT: It definitely is. The big differences between Europe and Asia are the number of auctions and the number of times stocks stop trading, which is quite significant. Between three and four distinct times a day, you will have dislocations in spreads for a variety of reasons, and this is an opportunity to improve. Beyond that, a majority of sell-side firms are setting up their own dark pools and there are alternative exchanges in Japan. In those venues, we deal with liquidity providers and market makers who do not care about the individual mechanics of a name; they simply care about the level of spread that they can access.
The most relevant thing is making sure you have the connectivity turned on to access all the forms of liquidity that exist. There is a big differentiation between counterparties in Asia from an executing broker’s standpoint: e.g. what is their default, what do they turn on for you right away, what countries do they have their crossing engines in, who do they have in their pool as liquidity providers? You have to know to ask those questions, and it has been very helpful to do that.
The FPL Americas Electronic Trading Conference, for those in electronic trading, is always a year-end highlight and this year was no exception. Sara Brady, Program Manager, FPL Americas Conference, Jordan & Jordan thanks all the sponsors, exhibitors and speakers who made this year’s conference a huge success.
The 6th Annual FPL Americas Electronic Trading Conference took place at the New York Marriott Marquis in Times Square on November 4th and 5th, 2009. John Goeller, Co-Chair of the FPL Americas Regional Committee, aptly set the tone for the event in his opening remarks: “We’ve lived through a number of challenging times… and we still have quite a bit of change in front of us.” After a difficult year marked by economic turmoil, the remarkable turnout at the event was proof that the industry is back on its feet and ready to move forward with the changes to the electronic trading space set forth in 2009.
Market Structure and Liquidity Two topics clearly stood out as key issues that colored many of the discussions at the conference: regulatory impact on the industry and market structure as influenced by liquidity, and high-frequency trading. An overview of industry trends demonstrated that the current challenges facing the marketplace are dominated by these two elements. Market players are still trying to digest the events of 2008 and early 2009, adjusting to the new landscape and assessing the changing pockets of liquidity amidst constrained resources and regulatory scrutiny. The consistent prescription for dealing with this confluence of events is to take things slowly and understand any proposed changes holistically before acting on them and encountering unintended consequences.
The need for a prudent approach towards change and reform was expressed by many panelists, including Owain Self of UBS. According to Self, “Everyone talks about reform. I think ‘reform’ may be the wrong word. Reform would imply that everything is now bad, but I think that we’re looking at a marketplace which has worked extremely efficiently over this period.”
What the industry needs is not an overhaul but perhaps more of a fine-tuning. Liquidity is one such area that needs carefully considered fine-tuning. Any impulsive regulatory change to a pool of liquidity could negatively impact the industry. The problem is not necessarily how liquidity is accessed, but the lack of liquidity that results in the downward price movements that marked a nightmarish 2008. Regulations against dark liquidity and the threshold for display sizes are important issues requiring serious discussion.
Rather than moving forward with regulatory measures that may sound politically correct, there needs to be a better understanding of why this liquidity is trading dark. While there is encouraging dialogue occurring between industry players and regulatory bodies, two things are for sure. We can be certain that the evolution of new liquidity venues is evidence that the old market was not working and that participants are actively seeking new venues. We can also be assured that the market as a messaging mechanism will continue to be as compelling a force as it has been over the last two decades.
Risk One of the messages that the market seems to be sending is that sponsored access, particularly naked access, is an undesirable practice. Presenting the broker-dealer perspective on the issue, Rishi Nangalia of Goldman Sachs noted that while many agree that naked sponsored access is not a desirable practice, it still occurs within the industry. A panel on systemic risk and sponsored access identified four types of the latter: naked access, exchange sponsored access, sponsored access via broker-managed risk systems (also referred to as SDMA or enhanced DMA) and broker-to-broker sponsored access.
According to the U.S. Securities and Exchange Commission (SEC), the commission’s agenda includes a look specifically into the practice of naked access. David Shillman of the SEC weighed in on the commission’s concern over naked access by noting, “The concern is, are there appropriate controls being imposed by the broker or anyone else with respect to the customer’s activity, both to protect against financial risk to the sponsored broker and regulatory risk, compliance with various rules?” Panelists agreed that the “appropriate” controls will necessarily adapt existing rules to catch up with the progress made by technology.
On October 23, NASDAQ filed what it believes to be the final amendment to the sponsored access proposal it submitted last year. The proposal addresses the unacceptable risks of naked access and the questions of obligations with respect to DMA and sponsored access. The common element of both of these approaches is that both systems have to meet the same standards of providing financial and regulatory controls. Jeffrey Davis of NASDAQ commented on his suggested approach: “There are rules on the books now; we think that they leave the firms free to make a risk assessment. The new rules are designed to impose minimum standards to substitute for these risk assessments. This is a very good start for addressing the systemic risk identified.”
These steps may be headed in the right direction, but are they moving fast enough? Shillman added that as sponsored access has grown in usage, there are increasing concerns and a growing sense of urgency to ensure a commission-level rule for the future, hopefully by early next year. This commission proposal would address two key issues: whether controls should be pre-trade (as opposed to post-trade), and the very important question, “Who controls the controls?”
Timothy Furey, Goldman Sachs, Neal Goldstein, Nomura and John Goeller, Bank of America Merrill Lynch, shed light on the process of managing risk in electronic trading.
At the start of this year, FPL announced the completion of an initial set of guidelines, which recommends risk management best practices in electronic trading for institutional market participants. In the third quarter of 2010, FPL launched a group to raise awareness regarding the implications of electronic trading on risk management and to develop standardized best practices for industry consideration. Over the last few months, the group, which consists of a number of senior leaders in electronic trading from the major sell-side firms, has been working on developing this set of guidelines to encourage broker-dealers to incorporate a baseline set of standardized risk controls.
The objective of the guidelines is to provide information around risk management and encourage firms to incorporate best practices in support of their electronic trading platforms. In today’s volatile marketplace, the automation of complex electronic trading strategies increasingly demands a rational set of pre-trade, intra-day and pattern risk controls to protect the interests of the buy-side client, the sell-side broker and the integrity of the market. The objective of applying electronic order risk controls is to prevent situations where a client, the broker and/or the market can be adversely impacted by flawed electronic orders.
The scope of the particular set of risk controls included in the guidelines covers electronic orders delivered directly to an algorithmic trading product or to a Direct Market Access (DMA) trading destination. The recommended risk controls provide the financial services community with a set of suggested guidelines that will systemically minimize the inherent risk of executing electronic algorithmic and DMA orders.
In what area are sell-side and buy-side firms’ risk controls most in need of improvement?
Timothy Furey, Managing Director, Goldman Sachs and FPL Risk Management Committee Co-Chair: One of the observations coming from the FPL risk sessions was that the buy-side and sell-side had really given considerable thought to their own individual firm’s risk controls. That said, both the sell-side and the buy-side should continue to focus on pulling together a standard, consistent base set of controls that their respective firms can reasonably implement. Therefore, it is more a question of standardization than a need for specific improvement.
John Goeller, FPL Americas Regional Committee Co-Chair and Managing Director, Global Execution Services, Bank of America Merrill Lynch: This effort was not necessarily to address an apparent deficiency in how the buy-side or the sell-side handles risk management, but to codify a set of best practices for all firms to use. It was generally accepted when we started this process that all firms implement some level of risk controls around their business. Our goal was to identify the most common ones and ensure that we have a base set of controls that all firms can implement.
Neal Goldstein, Managing Director, Nomura Securities International and FPL Risk Management Committee Co-Chair: It is important for the buy-side community to recognize that their efforts to implement risk management controls for electronic trading will be more effective when a collaborative effort is made with their sell-side executing brokers. For algorithmic and conventional (low frequency) DMA orders, the first line of defense should be the risk controls incorporated within the buy-side OMS/EMS. The most effective risk control is to prevent a questionable order from leaving the buy-side OMS/EMS. A specific factor that the buy-side should be looking at more closely is the impact a given order has on available liquidity. While the order validation employed by many buy-side clients accounts for notional value and order quantity, another factor that needs more consideration is the Average Daily Volume (ADV) during the trading interval. Creating an order whose size implies a participation rate beyond what the ADV for a given interval can support can have a significant adverse impact on execution price and algorithmic performance, particularly for illiquid names.
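The first-line OMS/EMS checks Goldstein describes might be sketched as below: notional, quantity and interval-ADV participation are all validated before the order leaves the buy-side. All limit values are illustrative assumptions, not recommended settings.

```python
# Hypothetical sketch of buy-side OMS/EMS order validation: the first line
# of defense is to stop a questionable order before it leaves the desk.

def validate_order(qty: int, price: float, interval_adv: int,
                   max_notional: float = 5_000_000,
                   max_qty: int = 500_000,
                   max_participation: float = 0.20):
    """Return a list of violated checks; an empty list means the order passes.

    interval_adv: average volume traded in the intended trading interval.
    """
    violations = []
    if qty * price > max_notional:
        violations.append("notional")
    if qty > max_qty:
        violations.append("quantity")
    if interval_adv > 0 and qty / interval_adv > max_participation:
        violations.append("participation")
    return violations
```

Returning the full list of violations, rather than failing on the first, lets the desk see every reason an order was held back.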
What role, if any, should the exchanges play in implementing risk controls?
John Goeller: Most exchanges have technology solutions around risk management (in certain situations their use is mandatory). In some cases, these tools are optional and only work when accessing a particular exchange. Regardless of whether a firm is utilizing exchange-provided, home-grown, or vendor-supplied tools, it can still leverage our efforts to understand whether those tools implement industry best practices.
Recent predictions from industry groups suggest US equity option quote volumes will nearly double over the next twelve months, severely straining the technology fabric that underpins the industry’s quoting, trading and risk management systems. We see particular vulnerabilities in processes and systems that require a direct un-throttled options market data feed, such as smart routing engines, algorithmic trading engines and portfolio risk systems, as well as the infrastructure that supports these processes and systems.
Options market data fed from the Options Price Reporting Authority (OPRA) has been steadily increasing at a current annual rate of about 40%. We currently see around 1.3 million messages per second, with a recent high-water mark of almost 1.5 million messages per second in December. The Financial Information Forum (FIF) projects growth to reach 1.8 million messages per second in the next twelve months. The projection does not take into account the ramp-up of new exchanges or any added expansion of the Penny Pilot, and we know that as these changes take full effect, growth will only accelerate. We expect that with the new symbology, clients will be able to execute against more strikes in all underliers. The above projections also do not take into account the added granularity in striking, which we anticipate will lead to more products filling our screens, adding to the technology crunch.
Given the recent OCC symbology changes, new and existing exchanges will be fighting for order flow with new products and different business models. Some new models include hybrids of payment for order flow and maker/taker for certain names. Some exchanges are trying to attract new business by introducing non-standard options that allow clients to trade options in new ways, such as binary options. More products on more exchanges will increase the need for better, more efficient technologies. The technical hurdles to maneuver this business will likely only get higher and higher as we move forward.
Firms’ needs vary from order routing through an EMS, some of which showcase advanced options analytics, to proprietary technology systems that require a huge amount of technical horsepower to consume the ever-growing OPRA market data feed. These systems handle algorithmic options orders that spawn hundreds of child orders, as well as advanced volatility quoting strategies.
As needs grow, it will become increasingly important for firms with trading needs to partner with vendors and broker-dealers who have developed specialties in these spaces and who understand how to efficiently handle the sheer mass of messages now being sent in terms of orders and market data. The underlying technology has become so specialized that it is no longer a matter of throwing money at a software or hardware problem, but of finding the best combination of hardware and software.
It will also be important for all firms involved to smartly throttle the feed to processes that do not necessarily need every tick, to ensure that subsystems are not oversaturated. We expect that almost every existing process that consumes OPRA market data will need to be bolstered or re-engineered. There has also been a recent push towards publishing the depth of the options market, provided as direct feeds from the exchanges, to trading front ends and algorithmic engines. The thought is that with the proliferation of pennies, the current OPRA feeds, which only reflect the top of book at each exchange, are less useful when trying to identify liquidity for larger block executions. Besides providing more clarity into the book, direct feeds also tend to be faster than the feeds through OPRA. Tools designed to obtain blocks in the electronic markets will become important when chasing institutional, larger block trades. There is also some thought that using the depth of book to derive analytics will give customers clarity into where they may get filled.
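One common way to “smartly throttle” a feed for consumers that only need the latest state is conflation: buffer updates per symbol and deliver only the most recent one each flush cycle. This is a generic sketch, not OPRA-specific code, and the quote representation is an assumption.

```python
# Illustrative conflating throttle: downstream systems that do not need
# every tick see only the latest quote per symbol at each flush, which
# caps their message rate regardless of how fast the raw feed runs.

class ConflatingThrottle:
    def __init__(self):
        self._latest = {}            # symbol -> most recent quote

    def on_update(self, symbol: str, quote: tuple) -> None:
        """Overwrite any pending quote; intermediate ticks are dropped."""
        self._latest[symbol] = quote

    def flush(self) -> dict:
        """Deliver one quote per symbol and reset the buffer."""
        out, self._latest = self._latest, {}
        return out
```

A timer (say, every 50 ms) would call `flush()` and fan the result out to subsystems that cannot keep up with the raw feed, while latency-critical consumers keep taking the unthrottled stream.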
With wider use and availability of depth of book, we expect to see development in pre-trade execution analytics for those clients who need more liquidity than is published at the top of book. Customers, in turn, can get a sense of the average price they are likely to achieve if they sweep the book. Over time, this should lead to increased confidence in the likelihood of filling larger block orders electronically, which will likely draw the chunkier flow that is important to this business. We expect that if market depth becomes important for execution, this will only multiply the resources needed to handle the complete options market data feed.
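The sweep-price analytic described above can be sketched simply: walk the depth-of-book ladder best level first and accumulate cost until the target size is filled. The ladder in the test is made up for illustration.

```python
# Minimal sketch of a pre-trade sweep analytic: estimate the average price
# of executing `quantity` against a depth-of-book ask ladder.

def sweep_avg_price(levels, quantity):
    """levels: list of (price, size) tuples, best level first.

    Returns (avg_px, filled); avg_px is None if nothing could be filled,
    and filled may be less than quantity if the book is too thin.
    """
    cost = 0.0
    filled = 0
    for price, size in levels:
        take = min(size, quantity - filled)   # take what this level offers
        cost += take * price
        filled += take
        if filled == quantity:
            break
    return (cost / filled if filled else None), filled
```

The same walk run against top-of-book-only data would stop after the first level, which is exactly why the text argues OPRA's top-of-book feed is less useful for sizing larger blocks.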