The Content Cloud: A Silver Lining for Traders?

By Andrew Feig, Adrian Kunzle
Andrew Feig, UBS, and Adrian Kunzle, JP Morgan Chase, open up to FIXGlobal about the applications for cloud-based services in the electronic trading arena.
Andrew Feig and Adrian Kunzle represent their respective firms on the Open Data Center Alliance (ODCA) Steering Committee, an independent consortium of leading global IT managers who have come together to resolve key IT challenges and fulfill future cloud infrastructure needs by creating an open, vendor-agnostic Usage Model Roadmap.
Are concerns about the security of data on clouds unfounded?
Andrew Feig: It all depends on what type of clouds you are talking about. With public clouds, the concerns are well founded: multi-tenancy introduces a whole new level of security issues, from making sure the data is secure enough to understanding the associated risks. Realistically, not all workloads will be candidates. Private clouds present a much lower hurdle, since they usually serve a single corporation, though they may or may not sit outside the firm's own datacenter.
Adrian Kunzle: The concerns are not unfounded, but neither are they insurmountable. Many of our current security mechanisms are transferable.
Where will financial firms see cost reductions through using cloud-hosted services?
AK: Cost savings will vary depending on the size of the firm. At JP Morgan Chase, we would see the largest cost reductions through sizing. To elaborate, we currently scale all of our infrastructure to peak usage, so our systems are always prepared to cope with the greatest demand. Clouds will enable us to scale our infrastructure to average use and simply outsource during peak times. In effect, we currently pay for a service all year round, yet only use it a few times a year.
Smaller firms would see savings across the board and be able to run more efficiently all around. The difference in cost reductions lies in the fact that larger firms like JP Morgan Chase have economies of scale that smaller firms do not necessarily have.
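A rough, purely illustrative sketch of that sizing argument is shown below; every figure is invented for the example and none are JP Morgan Chase numbers.

HOURS_PER_YEAR = 8760  # hours in a (non-leap) year

# Assumed capacity profile: average-day demand versus the busiest days.
avg_servers = 400
peak_servers = 1000
peak_hours = 200  # hours per year actually spent at peak load

# Assumed unit costs: fully loaded internal server-hour vs. on-demand cloud server-hour.
owned_cost_per_server_hour = 0.30
cloud_cost_per_server_hour = 0.50

# Traditional model: provision the whole estate for peak, all year round.
scale_to_peak = peak_servers * HOURS_PER_YEAR * owned_cost_per_server_hour

# Hybrid model: own the average, burst the difference to a cloud only at peak.
scale_to_average = (
    avg_servers * HOURS_PER_YEAR * owned_cost_per_server_hour
    + (peak_servers - avg_servers) * peak_hours * cloud_cost_per_server_hour
)

print(f"Provision for peak:    ${scale_to_peak:,.0f}")
print(f"Average + cloud burst: ${scale_to_average:,.0f}")

Even allowing a premium for on-demand capacity, paying for peak hardware only during peak hours is what drives the savings Kunzle describes.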
How much should a financial firm migrate onto a cloud, as opposed to retaining traditional physical architecture?
AF: Whether a firm chooses to build its own private cloud or use a public one comes down to its specific situation. If the firm has enough scale, then it may make sense to run its own, or a hybrid model of private and some public. It will also come down to specific use cases, as some applications need a very high service level and it may not be cost effective to move them outside the walls of the firm.
AK: Financial firms need to consider which applications would work and what data they are considering migrating onto a cloud. Obviously some software will operate better in the cloud, while other software will not. Customer relationship management (CRM) systems, for instance, have proven to be more useful and effective in the cloud, and platforms that accommodate employees who are often on the road would also work well there. Where there is a competitive advantage to keeping our applications internal (in a private cloud or on physical hardware), we will.
For financial firms whose trading strategies rely on speed and high frequency trading, will cloud services aid their quest for speed?
AF: I envision specific cloud offerings being developed to address these requirements. For example, exchanges will do a lot more in the cloud space as the next evolution of their current offerings, such as co-location.
AK: Almost certainly not. Trading platforms for top-tier banks like JP Morgan Chase will most likely not go to the cloud.

How will cloud services, whether computational or archival, fit into the US regulatory mandate that traders achieve ‘best execution’ on behalf of their clients?
AK: There is a clear opportunity to source archival applications from the cloud. However, 'best execution' is a very specific regulation, with requirements tied to particular trading systems that do not fit into cloud services as they currently exist. Trading systems will most likely not be migrated to clouds at this time.
How easily can cloud-based services be integrated into a bank’s existing physical data/ computational architecture?
AF: Once again, that will depend on the firm. The Open Data Center Alliance is looking to make this much simpler. Hopefully, the adoption of the Usage Model Roadmap by vendors will make this a lot easier than it is now. Many of these challenges, around security, cloud on-boarding and so on, are going to be addressed by the roadmaps that the Alliance will begin to publish in early 2011.
AK: Integration with a cloud can be done relatively easily for computational architecture because it is stateless; there is no data attached. We have been working on public cloud-based services for computational architecture and expect to have a concept finalized by early to mid next year.
Cloud-based services that contain data will be more difficult to integrate. There are high costs as well as security issues wherever physical data is involved: we have to pay to move the data, pay to keep the data there, and then pay again to bring the data back.
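A minimal sketch of that three-way charge, using purely hypothetical volumes and rates rather than any vendor's actual pricing, makes the point:

# Hypothetical volumes and rates -- not any vendor's actual pricing.
data_tb = 50                           # size of the data set, in terabytes
months_hosted = 12                     # how long it stays in the cloud

egress_from_datacenter_per_tb = 40.0   # pay to move the data out
cloud_storage_per_tb_month = 25.0      # pay to keep the data there
egress_from_cloud_per_tb = 90.0        # pay again to bring the data back

move_out = data_tb * egress_from_datacenter_per_tb
keep_there = data_tb * cloud_storage_per_tb_month * months_hosted
bring_back = data_tb * egress_from_cloud_per_tb

print(f"Move out:   ${move_out:,.0f}")
print(f"Keep there: ${keep_there:,.0f}")
print(f"Bring back: ${bring_back:,.0f}")
print(f"Total:      ${move_out + keep_there + bring_back:,.0f}")

Unlike stateless compute, the data charges recur in both directions, which is why data-bearing workloads are the harder migration case.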
Jeffrey Banker, Executive Vice President, Real-Time Market Data and Trading Solutions for Interactive Data, considers how managed services via clouds enable an expansion of electronic trading and wealth management.
New, innovative technology and the need for efficiency and cost containment have continued to drive financial institutions to closely examine their business processes and determine new areas that could be more effectively managed as outsourced functions. It has become clear to many firms that financial information, especially real-time market data and related applications, can be outsourced.
Growth of managed services in electronic trading
Two forces at opposite ends of the spectrum are shaping institutional demand for hosted managed service offerings, and this has led to the rapid evolution of vendor- and exchange-supported product platforms and commercial models. Electronic trading, which is on a constant quest to reduce latency in the trading stack and demands rapid connection to an evolving set of markets reshaped by changing market structures, has expanded its use of managed service platforms. Aggregated direct market access, raw market data, risk management and low-latency backbone connectivity are now broadly available and offer a cost-effective access point to over 50 high-volume exchanges.
Historically, most firms developed and managed their own infrastructure to enable these capabilities, but the control premium has decayed as scalable vendor offerings have enabled more cost-effective, nimble access to the markets. Highly resourced firms will continue to manage and deploy their own assets, but many firms that use latency-sensitive trading strategies are increasingly adopting managed offerings. As the industry expands and smaller institutions enter the marketplace, outsourced solutions lower the barrier to entry while preserving the latency and strategy benefits enjoyed by top-tier firms. As arbitrage routes evolve between the cash, futures and FX markets, the cost of modifying infrastructure can be significant for self-enabled firms and requires constant analysis and attention to optimize.
Meeting the evolving needs of the wealth manager and active trader
On the other end of the spectrum are the challenges related to the changing roles of the wealth manager and the active trader. Electronic trading has pushed market data message rates beyond the ability of the human mind to interpret or react. According to statistics compiled by the Financial Information Forum (FIF), maximum OPRA messages per second (MPS), or ceiling rates, rose from 12,000 in 2000 to a staggering 2,053,000 in 2009 – and OPRA projects that these rates will rise to 5,067,000 MPS in 2012. The overall MPS for Interactive Data's own consolidated feed was less than 50,000 back in 2003; in October 2010 a new peak of approximately 4.4 million MPS was reached.
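To put those figures in perspective, the implied compound annual growth rates can be worked out directly from the numbers quoted above (a quick illustrative calculation, not FIF's own analysis):

# Growth rates implied by the OPRA ceiling figures quoted above:
# 12,000 MPS (2000), 2,053,000 MPS (2009), 5,067,000 MPS projected (2012).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two message-rate observations."""
    return (end / start) ** (1 / years) - 1

print(f"2000-2009 ceiling-rate growth: {cagr(12_000, 2_053_000, 9):.0%} per year")
print(f"2009-2012 projected growth:    {cagr(2_053_000, 5_067_000, 3):.0%} per year")

That works out to roughly 77 percent annual growth over the first period, easing to about 35 percent in the projection – either way, rates far beyond what any human can watch tick by tick.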
Increasingly, displayed market data is being used as a reference point for advisors interacting with clients and assets, and less for optimizing trade execution. The costs of supporting market data have risen rapidly, partly because higher message rates strain bandwidth across an enterprise, while the value proposition of displayed data has decreased owing to the microsecond, algo-based execution capabilities of many industry participants.
These and other factors are motivating many institutional sell-side firms to migrate to, or pilot, content cloud-based offerings, which essentially consolidate and manage all the necessary market data sources in a hosted environment accessible via Java, Flex or Silverlight front ends, as well as a variety of application programming interfaces (APIs).
Attractive model for market data consumers
Aside from the cost benefits that can be realized through shared infrastructure and bandwidth, there are significant advantages related to change management, disaster recovery, portability and agile product development. Resistance to adoption of the content cloud has previously been driven by negative perceptions of internet reliability and packet loss, as well as by the loss of control. Yet the internet is a considerable driver of revenue growth for most financial services firms and is broadly embraced on the consumer side, even while it is resisted in some circles for institutional applications. Given the increasing cost of delivering market data – and the reduced need to show every tick in displayed applications – the content cloud is an attractive model for serving internal and external market data consumers. Firms that have adopted this approach have reduced total cost of ownership by up to 60 percent, while maintaining reliability and improving mobility and application utility for their subscribers.
Although it is typically the high volume, high frequency data and applications that are most appealing to clients today, other types of data and applications are expected to be in demand via cloud offerings in the future.
As high-frequency and electronic trading become even more competitive, firms are looking for new, differentiated data sets for their real-time trading models, and the outsourcing of new real-time content sets can provide that competitive differentiation. These content sets include any data-intensive application, such as reference or event data, hosted tick data, depth-of-book data for direct market access (DMA), and more powerful analytical tools that can help institutions manage risk.
While many financial services firms are still evaluating the pros and cons of the content cloud and are working to identify applications that are more cost effective in the cloud than as part of in-house infrastructure, the benefits outlined above could be a turning point for many.
