By Gary Stone, Chief Strategist, Bloomberg Tradebook.
The French phrase déjà vu literally means "already seen." Have you ever gone to the cinema and felt that you had seen the story before in a different film? Well, Dances With Wolves (1990), FernGully (1992), The Last Samurai (2003), District 9 (2009) and Avatar (2009) are different films, yet their plots all resemble one another.
Sequels are another example. Story or plot “resemblance” is actually a common occurrence in the film industry — perhaps because producers believe that if the story was successful before, it will be successful again. Over the past 19 years, the equities markets have evolved and transformed. Now, a similar evolution is under way in the Foreign Exchange market. Will the FX story resemble the equities experience? Of course, the market structure and the details will be different but, like films, this story feels eerily similar.
The evolution of the equity market started in 1996. Over the ensuing 17 years, trading technology became more sophisticated. Now, other asset classes and markets across the globe appear to be following a similar evolutionary trajectory (Figure 1).
Equities started as a manual market where orders were communicated to voice brokers. Market makers in the OTC Nasdaq marketplace and members of the NYSE used electronic bulletin boards to display their IOIs or “axes.” Sell-side brokers developed and distributed in-house electronic platforms to communicate to their buy-side customers to capture their flow.
The FIX Protocol emerged, standardising how platforms could "communicate" with one another. This enabled vendors to develop broker-neutral platforms and to solve fragmented buy-side workflows. Electronic order routing then began as a workflow and straight-through-processing solution.
As transparency improved, so did trust and empowerment. The buy-side sought greater control over orders. Different pricing models (maker/taker) and innovative order types (e.g. Iceberg, Pegging and Discretion) were developed to help control execution. Competition in the marketplace emerged, fragmenting liquidity. Soon "smart order routing" aggregated the fractured liquidity, and early adopters on the buy-side started to deal directly in the marketplace.
The empowered marketplace that we now know in the equity markets gained significant momentum when algorithms that controlled all aspects of the execution were released. The buy-side and sell-side, rather than controlling each slice of a large order, were able to use algorithms to try to achieve a desired result. Early adopters significantly reduced implementation costs with these algorithms. The Association for Investment Management Research (AMIR/The CFA Institute) and mutual fund boards began to discuss electronic trading and algorithmic execution as part of “best practices” in buy-side trading rooms.
FX is standing on the shoulders of the equity experience. Many of the execution innovations developed in the equity market are being "tropicalised" and are starting to be adopted by FX market participants. During 2012, FX electronic trading grew both in usage and in sophistication. According to a 2012 StreamBase Systems survey of more than 240 institutional FX traders, primarily located in EMEA and the United States, 69% of participants said that they used multibank platforms for execution, up 17% from 2011. However, the marketplace is now starting to make the leap beyond simple order routing. This is why the FX "film" is starting to feel eerily similar to the equity evolution.
Both the StreamBase survey and the Bloomberg Tradebook FX experience suggest that buy-side and sell-side participants are demanding more algorithmic execution capabilities to enable them to control how their orders are transacted in the market.
The FX market is highly fragmented across different pools of liquidity. Banks have developed their own pools of liquidity and many ECNs have emerged as alternative sources. The StreamBase survey noted that "liquidity aggregation algorithms" (smart order routing) were among the execution algorithms most commonly used by buy-side (54%) and sell-side (64%) survey participants. More advanced order handling execution management algorithms – algorithmic trading strategies that manage all aspects of an order to achieve a desired result – are also starting to gain traction.
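Liquidity aggregation of this kind can be illustrated with a minimal sketch: hypothetical venue quotes are swept in price order until the parent order is filled. The venue names, prices and sizes below are invented for illustration; a production smart order router would also weigh fees, latency and fill probability.

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str    # hypothetical liquidity source
    price: float  # offer price for a buy order
    size: int     # displayed size available

def smart_order_route(quotes, qty):
    """Sweep fragmented liquidity, best price first,
    until the order quantity is filled."""
    fills = []
    for q in sorted(quotes, key=lambda q: q.price):
        if qty <= 0:
            break
        take = min(qty, q.size)
        fills.append((q.venue, q.price, take))
        qty -= take
    return fills, qty  # fills plus any unfilled remainder

quotes = [Quote("ECN-A", 1.3051, 300),
          Quote("Bank-B", 1.3050, 500),
          Quote("ECN-C", 1.3052, 1000)]
fills, leftover = smart_order_route(quotes, 1000)
# Bank-B is swept first (best price), then ECN-A, then ECN-C
```

The same sweep logic applies whether the pools are bank platforms or ECNs; only the quote feed differs.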
Algorithmic trading strategy usage topped 48% in 2012, a 14% increase from a year earlier. The survey also suggested that further growth can be expected, as 75% of the buy-side participants said that they plan to use, or increase their use of, algorithms for execution. Algorithms can be tactical or benchmark-driven. The survey noted that the buy-side has begun to leverage the more sophisticated algorithms, including (passive) Floating (30%), time-slice (29%), TWAP (22%) and VWAP (29%).
Tradebook's experience is consistent with StreamBase's findings. More and more traders are using algorithmic trading strategies of increasing sophistication to implement their institutions' investment approaches. Tradebook's clients' total notional executed with algorithms grew by more than 23% from Q1 2012 to the end of Q1 2013. Buy-side traders used tactical algorithms such as IF/Done, One-Cancels-Other, Economic Event and Target orders, which control when an order is released into the market. Usage of more sophisticated algorithms such as Reserve Scale Back (which efficiently manages the accumulation or distribution of a position or order), Discretion, Passive Pegging, Time Slice and TWAP also grew significantly.
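The simplest of these benchmark strategies, TWAP (time-weighted average price), just divides a parent order evenly across the trading interval. The sketch below is a toy scheduler, not any vendor's implementation; quantities and times are illustrative.

```python
def twap_schedule(total_qty, start_min, end_min, n_slices):
    """Split an order into equal time slices (TWAP).
    Returns a list of (release_time_minutes, child_qty)."""
    slice_qty, remainder = divmod(total_qty, n_slices)
    interval = (end_min - start_min) / n_slices
    schedule = []
    for i in range(n_slices):
        # spread any remainder one share at a time over the first slices
        qty = slice_qty + (1 if i < remainder else 0)
        schedule.append((start_min + i * interval, qty))
    return schedule

# 100,000 shares over 60 minutes in 6 slices:
# one child order roughly every 10 minutes
sched = twap_schedule(100_000, 0, 60, 6)
```

A VWAP strategy would replace the equal slices with weights drawn from a historical intraday volume curve; the scheduling skeleton is otherwise the same.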
Fabricio Oliveira, Head of Risk Management at Mirae Asset Global Investments Brazil, discusses his approach to pre-trade risk controls and how local market structure influences the occurrence of risk.
Market Open
At Mirae we do much of our trading with offshore entities. For example, we have funds administered in Hong Kong, Luxembourg, Brazil, the US and Korea, and this geographical disparity creates operational risk. Differences in settlement price, currency and the timing of financial transfers must all be considered when using offshore funds. The ability to settle a US trade in the US, and not in another time zone, is also important. This is particularly true of Hong Kong, as the time difference is a huge barrier to trades in Asia. It is almost impossible to book these trades in Hong Kong even though our traders here see the opportunity to do so.
When I focus on the risks for open trading, the settlement movement is an important concern. Whether you are focused on market risk or liquidity risk, all risks need to be monitored, so you can have a clear view of what potential risks lie ahead.
High Frequency Trading
There is much discussion in the industry and at conferences about high frequency trading (HFT) in Brazil, but we are not yet ready for high frequency strategies. The industry is starting to see how HFT works, but liquidity in Brazil across asset classes is insufficient to support these strategies. There are approximately 300 listed companies in equities and about half that number of derivatives, whether in bonds, yield curves or currencies. The local players who run HFT strategies focus on the few stocks and derivatives with liquidity, which does not give them many options for finding alpha over short periods. It will be interesting to see how it works in North America and Europe and to consider what might be possible in Brazil. For now, I do not see many players in HFT, and I can count on one hand the number of funds using it.
Our pre-trade risk controls have not had to account for HFT volumes and speeds yet, so we have focused more on core control mechanisms. We have some vendors who can produce risk controls for the current liquidity. If we have liquid stocks, derivatives or OTC products, then we can define our own risk controls. Fund houses with hundreds of funds will have difficulty in applying those controls to the trading systems, but as Mirae mainly focuses on equities, our implementation burden is much lower. Today, all our pre-trade risk controls are done in real-time, including automatic limits. Beyond this, we still have a layer of control in the trader on the desk.
Working with Brokers
When discussing risk controls, it is important to note that in Brazil all brokers employ significant risk controls on their side, to prevent them from taking on more risk than they can carry. When brokers start to trade with the exchange, the exchange provides them with risk guidelines and limits. As clients of the sell-side, buy-side desks cannot exceed their assigned broker limits, and their orders will be automatically paused if a broker's limits are reached. The brokers' risk controls are comprehensive; they will not take on risk. As a result, their clients do not get much help in implementing their own controls. This is exacerbated because a fund house may trade with many brokers – in our case, 35. It is impossible to implement one solution per broker, so we rely on our OMS provider to connect with the brokers and to match up risk controls.
Mizuho Securities’ Spyridon Mentzas discusses the status of the Japanese exchange merger and offers thoughts on how well the two systems will merge and the benefits investors can expect.
Compatibility
The merger of the Tokyo Stock Exchange (TSE) and the Osaka Securities Exchange (OSE) is not yet finalized, but it appears they will merge at the beginning of 2013, with the details yet to be specified. The first impression is that they have nearly identical trading rules with some minor differences, such as the OSE trading until 3:10 while the TSE closes at 3:00. When the TSE decided to shorten the lunch break in November, the OSE did the same. When one of the exchanges (usually the TSE) changes the rules, the other moves in tandem: for example, changing tick sizes. If the merger does go ahead, it is likely that they will use the TSE's cash system, arrowhead, and the OSE's J-GATE for derivatives. Consolidating on a single platform for each asset class, rather than running the old systems in parallel, will reduce costs because they will not have to maintain two systems.
Further Industry Consolidation
The ECNs in the US enjoyed technological superiority over the classic exchanges; NYSE's latency, for example, was significantly higher than Arca's. This would have been reason enough for the TSE to consider buying a PTS, but with arrowhead's current latency of less than 2 milliseconds (and another upgrade in the next few months targeting less than a millisecond), simply buying a PTS would not give them a noticeable advantage because the TSE and OSE are on par with the PTSs. The reason the PTSs are increasing their market share is that, unlike in the US and UK, where Reg NMS and MiFID have required trading on the venue with the best price, in Japan the PTSs draw volume through decimal pricing and smaller tick sizes than the incumbents.
For example, Mizuho Financial Group might trade on the TSE at 105 yen bid, 106 yen offer. That one-yen spread is close to 100 basis points, or almost one percent, whereas the PTSs quote in 0.1 yen increments. This is a major incentive for investors to buy and sell on the PTSs, where the smaller increments reduce market impact and trading costs. From the beginning, the regulators have not been overly concerned with the PTSs deciding to trade in decimal places and use 0.1 yen ticks. It was always up to the PTSs to decide, and the TSE could do the same. If anything, I think the new exchange would rather reduce its tick sizes than merge again.
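The arithmetic behind that comparison is simple: express the one-tick spread as a fraction of the mid-price, in basis points. The helper below uses the quoted 105/106 example.

```python
def spread_bps(bid, ask):
    """Quoted spread relative to the mid-price, in basis points."""
    mid = (bid + ask) / 2
    return (ask - bid) / mid * 10_000

tse = spread_bps(105, 106)      # one-yen tick: roughly 95 bps
pts = spread_bps(105.5, 105.6)  # 0.1-yen tick: roughly 9.5 bps
```

A tenfold reduction in tick size cuts the minimum quoted spread by the same factor, which is exactly the incentive the PTSs offer on low-priced stocks.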
However, not all participants would be happy to see new tick sizes, for example, some of the proprietary houses or small firms that trade with retail, as altering their downstream systems to handle decimal places would be costly.
This will also create a fragmentation of liquidity in tick sizes. The bids and offers on the TSE are often thick, with something like 50 billion shares sitting on the bid side, so with 0.1 yen ticks, the average order size might move to 3 million or 1 million shares. Traders who want to buy a large lot will have to scroll up and down to find out how much they have to go up to absorb the available liquidity. I think for the traditional long-only traders, this might mean an increased scattering of liquidity. There is sufficient liquidity in the market at present; even for stocks trading at a low price – there are market makers trying to make 1% during the day. If smaller tick sizes are introduced, that liquidity will likely be scattered or disappear.
AFME’s Securities Trading Committee Chairman Stephen McGoldrick unlocks the latest MiFID proposals and looks at the rules for Organized Trading Facilities, algo trading and a consolidated tape.
Organized Trading Facilities (OTFs)
The OTF regime began life as a specific regulatory wrapper to put around broker crossing systems (which are a new mechanism for delivering an existing service). Crossing, which is almost the definition of a broker, has become highly automated. Whilst most crossing activities have not changed, other aspects of the industry were seen to require regulation – namely increased automation and the greater scope of crossing. The initial proposals outlined an umbrella category of systems called OTFs, with one category created to hold broker crossing systems and another to hold the systems for the G20 commitments around derivatives trading.
When the MiFID II proposals came out at the end of 2011, the ‘umbrella’ aspect had been simplified into a structure intended to be ‘all things to all people’, which is where it has come undone. MiFID II has created a regulatory receptacle for a practice and the two things differ in shape. The broker crossing system does not fit into the receptacle that has been created for it because much of the trading is against the books of the system’s operators, which is prohibited under the current proposals.
The regulators do not want speculative, proprietary trading within these systems, but unwinding risk created by clients is both useful and risk-reducing. An opt-in mechanism for compliance, allowing traders to decide if they want their orders traded this way may be a solution. Conflict management of this sort is common in the financial sector, as it ensures that any discretion is not exercised against the interests of the client. Certainly, when it comes to measuring the client’s interests against the operator of an OTF, it is absolutely unambiguous that their interests must come first. Therefore, any exercise of discretion that disadvantages the client relative to the operator is already prohibited. A formal, documented process to ensure that segregation stays in place is good, but to effectively prohibit the vast majority of trading on broker crossing systems seems to abandon the regulators’ objectives – to increase transparency and protect clients.
Furthermore, trades allowed into a broker crossing system would be instantly reported, creating post-trade transparency. The current proposals call for OTFs to be treated in the same way as Multilateral Trading Facilities (MTFs), which fosters uncertainty about the waivers for pre-trade transparency. Currently, there are clear criteria for granting a waiver to a platform: one is that orders are large in size; another is taking reference prices from a third-party platform. The Commission will not, however, be making the decisions about waivers; they have been handed to the European Securities and Markets Authority (ESMA) to determine. There is a danger in specifying too-stringent limits for these waivers, which would create a very different landscape from that explicitly envisaged by MiFID I.
Systematic Internalisers (SIs)
Our understanding is that regulators did not want to split activity that was in an OTF into two, but rather to regulate the broker crossing systems and to remove the subjectivity of SIs. The current SI proposal is aimed at regulating automated market making by banks, so that institutions make markets by reference to market conditions, not by reference to their clients. In MiFID I, the SI regime was introduced to protect retail investors, but this seems subsequently to have changed. When the European Commission (EC) was asked by the Committee of European Securities Regulators (CESR) to clarify the rationale for an SI regime, it declined to do so. As a result there is a distinct lack of clarity regarding the intent of the SI rules. If we had a clearer vision of the direction in which the regulators wished to take the market, it would be far easier to assess whether the regulations were moving us in the right direction – or not.
Wendy Rudd of the Investment Industry Regulatory Organization of Canada (IIROC) describes the Canadian approach to circuit breakers, minimum size and increment requirements and the role of dark liquidity.
What is currently driving the regulatory policy agenda with regard to circuit breakers? Globally, and Canada is no exception, we have seen the introduction of new rules in several areas related to the mitigation of volatility. Circuit breakers are just one of those areas. While some reforms may have been in the works already, the Flash Crash of May 2010 certainly served as a catalyst for a broader debate about market structure, trading activity and the reliability and stability of our equity trading venues.
Volatility is inevitable, so when does it become a regulatory concern? From our perspective – and we regulate all trading activity on Canada's three equity exchanges and eight alternative trading systems – we see it as a priority to mitigate the kind of short-term volatility that interrupts a fair and orderly market. We do not expect to handle this role alone; it is a shared responsibility that includes appropriate order handling by industry participants and consistent volatility controls at the exchange/ATS level.
What are the benefits of harmonizing circuit breaker rules with US markets? One main advantage to a shared or complementary approach is that it limits the potential for certain kinds of regulatory arbitrage in markets that operate in the same time zone. Many Canadian-listed stocks also trade in the US, and roughly half of the dollar value traded in those shares takes place on US markets each day.
Which approaches are you considering taking for market-wide circuit breakers? We are monitoring developments in the US, where regulators have proposed changes which include lower trigger thresholds calculated daily using the S&P 500 (instead of the Dow Jones Industrial Average), and shorter pauses when those thresholds are triggered. We are currently exploring options for market-wide circuit breakers, which include continuing our existing policy of harmonizing with the US, pursuing a 'made-in-Canada' alternative, or identifying a hybrid approach that does a little of both. At this stage, we are soliciting industry feedback on the merits of these three approaches; with the help of that feedback, we expect to be able to choose the appropriate path soon. It is important to note that these kinds of circuit breakers are an important control but have traditionally acted more as insurance – they have only been tripped once in the US and Canada since being introduced in 1988.
How similar is IIROC's new Single-Stock Circuit Breaker (SSCB) rule to the US rules? Single-stock circuit breakers are relatively new for both jurisdictions. The US and Canada have implemented SSCBs which are similar in that a five-minute halt is triggered when a stock swings 10% within a five-minute period. Otherwise, the Canadian approach differs in several ways. For example, our SSCB does not trigger on a large price swing if a stock is trading on widely disseminated news after a formal regulatory halt.
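The common core of the two regimes (a 10% move within a rolling five-minute window triggering a five-minute halt) can be sketched as a simple state machine. This is a simplified illustration only; the actual IIROC and US rules include reference-price, eligibility and news-related exemptions omitted here.

```python
from collections import deque

class SingleStockCircuitBreaker:
    """Toy SSCB: halt trading for halt_s seconds when the price
    moves by threshold (up or down) within a rolling window_s window."""

    def __init__(self, threshold=0.10, window_s=300, halt_s=300):
        self.threshold = threshold
        self.window_s = window_s
        self.halt_s = halt_s
        self.ticks = deque()       # (time_s, price) within the window
        self.halted_until = None   # time the current halt expires

    def on_trade(self, t, price):
        if self.halted_until is not None and t < self.halted_until:
            return "halted"
        self.ticks.append((t, price))
        # drop observations older than the rolling window
        while self.ticks and t - self.ticks[0][0] > self.window_s:
            self.ticks.popleft()
        lo = min(p for _, p in self.ticks)
        hi = max(p for _, p in self.ticks)
        if (price - lo) / lo >= self.threshold or (hi - price) / hi >= self.threshold:
            self.halted_until = t + self.halt_s
            self.ticks.clear()
            return "halt-triggered"
        return "ok"

cb = SingleStockCircuitBreaker()
states = [cb.on_trade(0, 100.0),    # baseline
          cb.on_trade(60, 111.0),   # +11% in 60s: halt
          cb.on_trade(120, 111.0),  # still inside the 5-minute halt
          cb.on_trade(400, 111.0)]  # halt expired, trading resumes
```

Market-wide breakers follow the same pattern but key off an index level against thresholds fixed at the start of the day.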
Do you believe circuit breakers, market-wide or single-stock, have a deterrent effect on momentum trading? We did not set out with a prescriptive approach to influence or change trading behaviour or strategy. IIROC’s circuit breaker policies were developed to provide added insurance against extraordinary short-term volatility. We intend to study the impact of any changes and we may be able to learn more about the impact of policy changes on trading behaviour.
Matteo Cassina of Citadel Execution Services Europe comments on the development of a European consolidated tape as well as a unified concept of best execution.
The long awaited proposals on the review of the Markets in Financial Instruments Directive (MiFID) were published in October 2011. The so-called MiFID II and MiFIR proposals aim to address, among other things, changes in the European market structure and competition between trading venues. Whilst the proposals, in their current form, do not provide as much detail as market participants had hoped, they represent a unique opportunity to address fundamental issues impacting the efficient functioning of Europe’s equity markets.
The EU legislative process is such that the European Parliament and the European Council will agree their negotiating positions, before embarking on a trialogue process mediated by the European Commission. The final legislative text may not be ready for implementation until as late as 2014, but this timeframe represents a good opportunity for the rules and their impact to be given adequate consideration. In particular, the issues of best execution and consolidated tape need to be given greater prominence during this review process, if policymakers are to honour the original objectives of MiFID, protect the retail investor and ensure Europe’s equity markets become efficient and competitive.
A key benefit of regulation is that it drives standardization of behaviour, but thus far this has not materialised in the retail broker community in relation to best execution requirements. Large institutions have the capability to take advantage of the proliferation of alternative trading venues and are benefitting from cost reductions by executing their orders in the venue which offers the best price for a security at a given time. The majority of retail investors, however, are either unaware of alternative trading venues or have no opportunity to access them. This means they do not always benefit from prices equal to, or better than, those available on primary venues.
While the principle of best execution is reiterated in MiFID II, it is not included in MiFIR, which means that – once again – best execution is a principle, not a rule, and is therefore open to interpretation at the national level. This is in stark contrast to the best execution model in the US, where the requirements to achieve best execution are much more stringent. Currently, a retail broker in Europe can choose to route all of its trading to one single venue, on the basis that it has a good commercial relationship with that venue, or that it is too costly for the broker to connect to multiple venues. The broker may choose to send all orders to the venue with the highest chance of offering the best price, without any guarantee that it is the best price at that moment in time. This is an unfair outcome for the retail investor, and the MiFID II/MiFIR proposals, regrettably, do not go far enough to redress it.
Enforcing best execution will take time and will depend on broader market harmonization, but now is the time for regulators and retail investors to demand a more compelling definition of best execution. In particular, greater clarity is required around the execution policies provided by retail brokers to their clients. These policies are documents in which retail brokers explain how their best execution obligations are fulfilled under MiFID. Trading venues and brokers should also be required to provide execution quality statistics, detailing how well they performed in achieving best execution. This much needed clarity would, for example, result in firms having to justify — to both regulators and clients — why certain trading platforms are listed on their best execution policy and, why others have been omitted. In short, how and why some orders are routed to specific venues and not to those with the best price.
Daniel Ciment of J.P. Morgan details the development of Brazilian algos and outlines the most effective strategies for trading in Brazil.
Using Algos in Brazil
International buy-side traders are already accustomed to trading with algorithms in markets around the world, and as they look to Brazil they want to trade there in the same way they have traded elsewhere. Even though having just one exchange makes the data feed more streamlined, the low liquidity profile of certain Brazilian stocks means you cannot use algorithms to trade all stocks electronically. For the more liquid names, many traders are using benchmark algorithmic strategies, like VWAP, percentage of volume, or arrival price. Most algorithmic strategies are based on benchmarks for now, as buy-side traders seek to replicate the methods they use elsewhere, while taking into account the intricacies of the market structure. In the end, if they trade with algorithms in the US, Europe and Asia, they want to trade with algorithms in Brazil as well.
Infrastructure and Volume Spikes
This is one of the challenges that we face as an industry. As you build electronic infrastructure, you have to build for growth and not just for where we are today. When we look at a market, whether it is Brazil or more developed markets like the US, Europe or Asia, we know what we are trading today, but we have to build to accommodate what we will trade in a year, in two years, and at what we think the peak might be. Just because a market trades a couple of hundred million in a day, or, in the US, 8 billion shares a day, it does not mean you build your plan to support only 8 billion shares a day, because a year from now that figure might be 20% higher.
Moreover, if a major event happens next week, that figure might double, so you need to build in sufficient headroom. Right now, we can handle far more than we manage on a daily basis, but that is deliberate, to make sure that at times of stress we are there for our clients and that they can trade through us with full confidence.
DMA or Boots-on-the-Ground?
To be successful in a market like Brazil, brokers need people on-site who know the local investor and financial communities. J.P. Morgan has a major trading presence in Sao Paulo, and that is just one piece of the offering in Brazil. For small firms who want access, outsourcing is a realistic option, but if you are going to be big in a market, especially one like Brazil, an in-country trading team is required.
Technical Challenges
Reliable trading requires market data and telecommunications systems, which are present in Brazil, along with data center space and algorithms that are tuned to the local market and market structure. This tuning covers the liquidity profiles of the stocks as well as the rules and regulations of the exchange; you cannot apply the same algorithms from one region to another and expect them to work. We spend a lot of time and effort fine-tuning our algorithms, testing them on our desk and then rolling them out to clients. It is not just copy-and-paste.
Annie Walsh of CameronTec spoke to FX users to better understand the topical issues and challenges facing the OTC Foreign Exchange market and the central role FIX can play in addressing these challenges.
Undoubtedly, the capital markets in 2011 will be remembered for many history-making moments, including some of the largest currency moves the market can remember. We have witnessed the global foreign exchange market – the most liquid financial market in the world, with an average daily turnover in the vicinity of USD 4 trillion – bear the brunt of one political crisis after another, causing widespread volatility and difficult-to-pick currency moves.
Currency friction in Europe and between the US Administration and China will no doubt remain a prominent feature of the global economy for at least the next 1 – 2 years. On top of this remains uncertainty of government, particularly in Europe, and the implications for continuity of fiscal and monetary policy.
Many investment banks, too, in their search for alpha, have been left wondering "where did the black box get it wrong?" following lacklustre P&L performance, almost industry-wide, over recent months.
Without a formal open or close, the FX market presents a true ‘follow the sun’ global market, with inherent levels of opportunity and risk.
Against this uncertain backdrop, the FIX Protocol has great potential to feature centrally in what is undoubtedly the single greatest threat (or opportunity, if you prefer) facing the global OTC FX market: structural uncertainty compounded by impending regulatory change, ushered in courtesy of Dodd-Frank and MiFID II and III.
Because of the over-the-counter (OTC) nature of currency markets – with no unified or centrally cleared market for the majority of trades, and little cross-border regulation – FX is really a number of interconnected marketplaces where instruments in different currencies are traded. Inevitably, OTC FX will move, however grudgingly, away from its long-standing (self-serving) model of self-regulation toward greater transparency, regulatory oversight (direct or indirect) and centralised clearing.
A Two-Speed FX Market
As currently drafted, spot, outrights and swaps are to be exempt from Dodd-Frank's requirement to be traded via Swap Execution Facilities (SEFs) and centrally cleared; FX options, cross-currency (CCY) swaps and non-deliverable forwards (NDFs), however, are not. A perhaps unintended consequence of this two-speed approach is the potential for jurisdictional arbitrage, product/financial re-engineering and further fragmentation of execution venues and liquidity.
In the short term, it also means that the sell-side needs to fundamentally reconsider strategies for design, development and deployment of Single Dealer Platforms (SDPs). Multi asset class SDPs will now necessarily evolve to become simultaneously both an execution venue as a destination and a gateway to a SEF, depending on the instrument traded.
Raymond Russell, of the FIX Inter-Party Latency (FIXIPL) Working Group and Corvil lays out the use cases for the FIX Inter-Party Latency standard and the functionality of Version 1.0.
Goals for FIXIPL
The principal goal of the Inter-Party Latency Working Group is to ensure interoperability between different latency monitoring vendors. Interoperability is essential because latency monitoring is vital to running a low-latency service; the people building systems need confidence that they can start with one vendor and still migrate to another. What we have seen through the proliferation of latency monitoring systems across the trading world – whether DMA providers, market data providers or trading desks – is that the problems in managing latency often fall between the cracks. Most firms have a good handle on latency in their own environment because they have engineered it well, but when they connect to a counterparty, it gets tricky.
A trader who sees a slowdown in response time will want to understand why they have missed trades or why their fill rates are low, but there are multiple places where that latency could have occurred. One is the exchange matching engine, which in some respects is unavoidable: if there is considerable interest and activity in a symbol at the same time, those orders will have to queue in the matching engine purely as a result of market activity. The latency might also have occurred in the exchange gateway; it is common practice for exchanges to load-balance across multiple gateways to accommodate high volumes, and you might have hit a slow gateway. Or the service provider you connect through may have oversubscribed its network, leaving you caught in cross-traffic unrelated to trading. We have seen all of these things happen, so identifying where the latency is occurring requires a consistent set of time stamps across the architecture.
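Given such a consistent set of time stamps, attributing latency to each hop is simple differencing. The capture-point names and numbers below are hypothetical, chosen to mirror the path described above (trader, service provider, exchange gateway, matching engine).

```python
def hop_latencies(timestamps):
    """Given (capture_point, time_us) pairs recorded as an order
    passes through the architecture, attribute the end-to-end
    latency to each hop so slow segments stand out."""
    hops = []
    for (name_a, t_a), (name_b, t_b) in zip(timestamps, timestamps[1:]):
        hops.append((f"{name_a} -> {name_b}", t_b - t_a))
    return hops

# Hypothetical capture points on an order's path, in microseconds
ts = [("trader-out", 0), ("provider-in", 120),
      ("gateway-in", 2350), ("matching-engine", 2410)]
hops = hop_latencies(ts)
# Here the provider network hop dominates the end-to-end latency
```

The decomposition is only meaningful if all parties stamp against a shared, synchronised clock, which is precisely what an inter-party standard has to specify.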
Most exchanges already employ latency monitoring in their own environment; inter-party latency and the sharing of time stamps, while less important within the exchange itself, enable them to work with their members to identify sources of latency. The benefits unlocked through inter-party latency are somewhat biased towards the end traders, but they also extend to brokers and market data providers, who gain better-quality execution feeds and market data speeds, respectively.
For exchanges, latency transparency is becoming a standard requirement as latency has become a competitive differentiator. To the extent that exchanges are comfortable with their own infrastructure and are ready to compete on their latency, they will want to share their latency measurements with members. In my experience, venues and brokers are no longer as reluctant to share their latency figures as they once were.
Version 1.0 Rollout
Much of the work on Version 1.0 involved deciding how to produce a standard that, on the one hand, is simple enough to be easily implemented while, on the other, still performing in all the basic use cases. Version 1.0, due out in December 2011, is clean and simple and emphasizes the core capability of publishing time stamps. We have agreed on the technical scope, and it is now going through the formal review procedures required for standardization by FPL, including a public review. The other important step before the standard is real is to get two different implementations. A number of features, such as distribution through multicast and the ability to automatically group several measurements together across a trade, will be ready in a few months' time and will be included in the next version later next year.