Summing up the year – FPL Americas Conference


By Sara Brady
The FPL Americas Electronic Trading Conference is always a year-end highlight for those in electronic trading, and this year was no exception. Sara Brady, Program Manager, FPL Americas Conference, Jordan & Jordan, thanks all the sponsors, exhibitors and speakers who made this year’s conference a huge success.

The 6th Annual FPL Americas Electronic Trading Conference took place at the New York Marriott Marquis in Times Square on November 4th and 5th, 2009. John Goeller, Co-Chair of the FPL Americas Regional Committee, aptly set the tone for the event in his opening remarks: “We’ve lived through a number of challenging times… and we still have quite a bit of change in front of us.” After a difficult year marked by economic turmoil, the remarkable turnout at the event was proof that the industry is back on its feet and ready to move forward with the changes to the electronic trading space set forth in 2009.
Market Structure and Liquidity
Two topics clearly stood out as key issues that colored many of the discussions at the conference: the impact of regulation on market structure and liquidity, and high frequency trading. An overview of industry trends demonstrated that the current challenges facing the marketplace are dominated by these two elements. Market players are still trying to digest the events of 2008 and early 2009, adjusting to the new landscape and assessing the shifting pockets of liquidity amidst constrained resources and regulatory scrutiny. The consistent prescription for dealing with this confluence of events is to take things slowly and to understand any proposed change holistically before acting on it and encountering unintended consequences.

The need for a prudent approach towards change and reform was expressed by many panelists, including Owain Self of UBS. According to Self, “Everyone talks about reform. I think ‘reform’ may be the wrong word. Reform would imply that everything is now bad, but I think that we’re looking at a marketplace which has worked extremely efficiently over this period.”
What the industry needs is not an overhaul but perhaps more of a fine-tuning. Liquidity is one area that needs carefully considered fine-tuning. Any impulsive regulatory change to a pool of liquidity could negatively impact the industry. The problem is not necessarily how liquidity is accessed, but the lack of liquidity that fed the downward price movements of a nightmarish 2008. Regulation of dark liquidity and the threshold for display sizes are important issues requiring serious discussion.
Rather than moving forward with regulatory measures that may sound politically correct, there needs to be a better understanding of why this liquidity is trading dark. While there is encouraging dialogue occurring between industry players and regulatory bodies, two things are certain. The evolution of new liquidity venues is evidence that the old market was not working and that participants are actively seeking new venues. And the market as a messaging mechanism will continue to be as compelling a force as it has been over the last two decades.
Risk
One of the messages that the market seems to be sending is that sponsored access, particularly naked access, is an undesirable practice. Presenting the broker-dealer perspective on the issue, Rishi Nangalia of Goldman Sachs noted that while many agree that naked sponsored access is not a desirable practice, it still occurs within the industry. A panel on systemic risk and sponsored access identified four types of the latter: naked access, exchange sponsored access, sponsored access through broker-managed risk systems (also referred to as SDMA or enhanced DMA) and broker-to-broker sponsored access.

According to the U.S. Securities and Exchange Commission (SEC), the commission’s agenda includes a look specifically into the practice of naked access. David Shillman of the SEC weighed in on the commission’s concern over naked access by noting, “The concern is, are there appropriate controls being imposed by the broker or anyone else with respect to the customer’s activity, both to protect against financial risk to the sponsored broker and regulatory risk, compliance with various rules?” Panelists agreed that defining the “appropriate” controls will necessarily mean adapting existing rules to catch up with the progress made by technology.
On October 23, NASDAQ filed what it believes to be the final amendment to the sponsored access proposal it submitted last year. The proposal addresses the unacceptable risks of naked access and the questions of obligations with respect to DMA and sponsored access. The common element of these approaches is that both systems have to meet the same standards of providing financial and regulatory controls. Jeffrey Davis of NASDAQ commented on his suggested approach: “There are rules on the books now; we think that they leave the firms free to make a risk assessment. The new rules are designed to impose minimum standards to substitute for these risk assessments. This is a very good start for addressing the systemic risk identified.”
These steps may be headed in the right direction, but are they moving fast enough? Shillman added that as sponsored access has grown in usage, there are increasing concerns and a growing sense of urgency to put a commission-level rule in place, hopefully by early next year. That proposal would address two key issues: whether controls should be pre-trade (as opposed to post-trade), and the very important question, “Who controls the controls?”
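To make the pre-trade versus post-trade distinction concrete, the short Python sketch below shows the kind of check a sponsoring broker might run before an order ever reaches the market. It is only an illustration; the field names, limits and function are hypothetical and are not drawn from any specific rule, exchange filing or vendor system.

    # Illustrative pre-trade risk check a sponsoring broker might apply before
    # forwarding a client's order. All limits and field names are hypothetical;
    # real systems enforce many more controls (restricted lists, credit,
    # locate requirements, etc.).

    MAX_ORDER_QTY = 50_000               # per-order share limit (hypothetical)
    MAX_NOTIONAL = 2_000_000.00          # per-order notional limit in USD (hypothetical)
    MAX_GROSS_EXPOSURE = 25_000_000.00   # running gross exposure cap (hypothetical)

    def pre_trade_check(order, gross_exposure):
        """Return (accepted, reason). `order` is a dict with symbol, qty, price."""
        notional = order["qty"] * order["price"]
        if order["qty"] > MAX_ORDER_QTY:
            return False, "order quantity exceeds per-order limit"
        if notional > MAX_NOTIONAL:
            return False, "order notional exceeds per-order limit"
        if gross_exposure + notional > MAX_GROSS_EXPOSURE:
            return False, "order would breach gross exposure cap"
        return True, "accepted"

    # Example: a client order checked before it is routed to the exchange.
    order = {"symbol": "XYZ", "qty": 10_000, "price": 25.50}
    accepted, reason = pre_trade_check(order, gross_exposure=24_900_000.00)
    print(accepted, reason)  # False: the order would breach the gross exposure cap

A post-trade (or “drop copy”) control, by contrast, would apply the same kind of limits only after the order has already reached the market, which is precisely why the pre-trade question matters.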
High Frequency Trading
Defining what constitutes high frequency trading is a necessary step towards removing some of the ambiguity surrounding the practice. When firms use sophisticated, automated, high-speed algorithmic strategies and seek to move in and out of the market very quickly, those strategies can generate high volumes of orders, message traffic and cancellations, resulting in a significant amount of turnover. High frequency trading is marked by the use of both historical and real-time data to exploit inefficiencies in the market, from within a single stock to across sectors and asset classes, and to leverage technology on a high-turnover basis. The practice becomes a bit clearer when it is differentiated from flash trading. Flash trading occurs when a market receives a marketable order that could be executed immediately at the best displayed quote, but the receiving market does not itself have that price.

The connection that flash orders can have to high frequency trading is twofold. It is probable that some of the market participants that receive flash order information are high frequency traders; it is also possible that high frequency traders, looking to earn the rebate offered by some exchanges for executing an order at the best displayed price, never really intend to route that order and instead turn it over (this process has since been banned). The SEC is opposed to the practice because it potentially creates a two-tiered market, can detract from publicly displayed quotes, and can prevent the people whose orders were displayed from receiving best execution.
There is additional concern around the short-term volatility created by these strategies, the ability of some firms to access the market more quickly than others, and the exploitation of services such as co-location, market data and market access. High frequency trading also has significant benefits; this type of activity can stabilize the marketplace by bringing a diversity of trading objectives, allowing counterparties to trade with parties whose goals differ from their own. In addition, these firms tend to be very aggressive price setters, and they promote faster execution. Tighter spreads, an increased amount of displayed liquidity in the marketplace, contributions to price discovery and reduced trading fees were all cited as additional benefits of high frequency trading.
There needs to be transparency in the market, and according to Tim Cox of Bank of America Merrill Lynch, “the market works best when everyone is allowed access to a similar playing field, and then you make decisions with your capital.” The industry needs to be careful when regulating these firms, because one of the benefits they bring is the tremendous amount of money they invest in systems and algorithmic strategies that allow them to execute very quickly.
Technological Considerations – Latency and Algorithmic Trading
Donal Byrne of Corvil, Ltd. succinctly summed up the complexities of achieving low latency when he stated, “In the latency game, what you see is not always what you get,” to which Conor Allen of NYSE Technologies added, “you can’t know something unless you can measure it.” The challenge in the latency game was unanimously expressed as the difficulty of obtaining true numbers and, once obtained, using those numbers effectively. Byrne noted that “fast” is primarily a relative term, in that latency is only low in relation to competitors. How fast a trade is made depends on the speed at which the market price is obtained and how quickly a trade can be executed on a given opportunity.
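As a minimal illustration of the “measure it” point, the Python sketch below times round trips to an order-entry interface and reports percentiles rather than a single average, since the tail is usually where the surprises hide. The send and acknowledgement functions are placeholder stubs for whatever gateway is being measured, not a real API.

    # Minimal sketch of measuring round-trip latency and reporting percentiles.
    # send_order() and await_ack() are hypothetical placeholders for the
    # order-entry interface under test; only the timing pattern matters here.
    import time
    import statistics

    def measure_round_trips(send_order, await_ack, samples=1000):
        latencies_us = []
        for _ in range(samples):
            start = time.perf_counter()
            send_order()     # placeholder: submit a test order
            await_ack()      # placeholder: wait for the acknowledgement
            elapsed = time.perf_counter() - start
            latencies_us.append(elapsed * 1_000_000)  # convert to microseconds
        latencies_us.sort()
        return {
            "median_us": statistics.median(latencies_us),
            "p99_us": latencies_us[int(0.99 * (len(latencies_us) - 1))],
            "max_us": latencies_us[-1],
        }

    # Example run with stub functions standing in for a real gateway.
    stats = measure_round_trips(lambda: None, lambda: time.sleep(0.0002))
    print(stats)

Even a toy measurement like this makes Byrne’s point: the numbers are only meaningful relative to where and how they are taken, and relative to what competitors are achieving.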

Expanding on the latency session, the panel on algorithmic trading addressed how users are interacting with algos and algo enrichment screens, and how brokers can distribute their algos more quickly. While there has always been ongoing evolution in the algo trading space, the question remains of how algos get from broker-dealer providers to the buy-side. Currently the industry lacks an efficient distribution mechanism, but panelists agreed that FIXatdl can bring efficiencies to this process by putting the industry on the path to faster, mass customization for all parties.
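As a rough sketch of what that distribution mechanism involves, the Python fragment below parses a simplified, made-up strategy definition: the broker describes an algo and its parameters in XML, and a buy-side OMS/EMS reads that description to render an order ticket. The element and attribute names are invented for illustration and do not reproduce the actual FIXatdl schema.

    # Simplified illustration of the FIXatdl idea: brokers publish algo
    # definitions as XML and buy-side systems render order tickets from them.
    # The XML below is a made-up, schema-less stand-in, not real FIXatdl.
    import xml.etree.ElementTree as ET

    ALGO_DEFINITION = """
    <Strategy name="ExampleVWAP" provider="ExampleBroker">
      <Parameter name="StartTime" type="time" required="true"/>
      <Parameter name="EndTime" type="time" required="true"/>
      <Parameter name="MaxPctVolume" type="int" required="false" default="10"/>
    </Strategy>
    """

    def load_strategy(xml_text):
        root = ET.fromstring(xml_text)
        params = [
            {
                "name": p.get("name"),
                "type": p.get("type"),
                "required": p.get("required") == "true",
                "default": p.get("default"),
            }
            for p in root.findall("Parameter")
        ]
        return {"strategy": root.get("name"),
                "provider": root.get("provider"),
                "parameters": params}

    print(load_strategy(ALGO_DEFINITION))

The appeal of a standard definition language is exactly this: the broker changes the XML, and every buy-side system that consumes it picks up the new algo without custom integration work.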
Panelists also projected that the winds of change are about to blow through the algo space, as the relatively stagnant algo arena of the past year picks up again and broker-dealers move away from rebuilding and resituating themselves. Beyond equities, foreign exchange was featured both on its own and paired with fixed income. Panelists conveyed their conviction that the fragmentation versus aggregation debate in this space is positively outdated; rather, fragmentation and aggregation are each both the result of and the driving force for the other.
In terms of derivatives, the listed market, as well as certain OTC products, fared well during last year’s credit crisis, though the bilateral and systemic risk faced by dealers was not fully understood by most of the industry. The Treasury pushed for standardized OTC derivatives to move into centralized clearing, while the increase in algorithmic trading of derivatives accompanied a big push for clearing. Regulators are now considering moving OTC derivatives, the “culprit” of the crisis, to be cleared on exchanges. Bills in the House mandate clearing of standardized OTC derivatives on exchanges, and regulators are pushing for transparency. If this happens, it will push derivatives volumes onto listed markets in terms of clearing and automation.
As in prior conferences, the end of the event marks the beginning of a series of important follow-up activities. Those that arose out of this year’s conference are the creation of a new buy-side working group, a latency measurement subgroup, and the introduction of new members to FPL who became familiar with FPL’s work through the conference. Streaming videos and PowerPoint presentations are now available online at www.jandj.com/fpl/2009/login.php for both FPL members and conference attendees.