Castles In The Sand

By Huw Gronow, Head of Dealing, Newton Investment Management

A standardised, consolidated transaction tape utility would provide consistent and complete liquidity data, improving forecasts of future outcomes.


Nearly 200 years ago, the botanist Robert Brown made the startling discovery that particles ejected from pollen grains in a drop of water moved around incessantly, a motion later explained by Albert Einstein as the result of collisions with the fast-moving water molecules themselves.

Brown’s discovery gave rise to the eponymous description of random motion that we know today. To the casual observer, a beaker of water remains a still object in equilibrium; yet closer inspection reveals that at the molecular (and even quantum) level there is, to put it in layman’s terms, a lot going on.

The study of this phenomenon gave rise to what we now describe as stochastic processes, which have found numerous applications in finance and offer a way of thinking about how securities markets work.
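
As a concrete illustration, here is a minimal sketch in Python of the most common financial extension of Brown’s observation: a security price simulated as geometric Brownian motion. All parameters are illustrative assumptions, not market data.

```python
import numpy as np

# A minimal sketch: simulating a security price as geometric Brownian
# motion, the classic finance extension of Brown's random walk.
# All parameters here are illustrative assumptions.
rng = np.random.default_rng(seed=42)

s0 = 100.0      # starting price
mu = 0.05       # assumed annual drift
sigma = 0.20    # assumed annual volatility
steps = 252     # one trading year of daily steps
dt = 1.0 / steps

# Each increment is an independent Gaussian "molecular collision".
shocks = rng.normal(0.0, np.sqrt(dt), steps)
log_returns = (mu - 0.5 * sigma**2) * dt + sigma * shocks
path = s0 * np.exp(np.cumsum(log_returns))

print(f"Simulated year-end price: {path[-1]:.2f}")
```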

As it is, our current equity market structure in Europe, and to a greater extent in the US, reflects this. The advent of ultra-low-latency access to markets, with the journey from order entry to arrival at the exchange matching engines now measured in nanoseconds, has developed alongside the new fragmented market structure. To retain a high probability of success, it is now necessary to atomise the intended trade into much smaller particles and direct them to many different venues simultaneously, with the result that the number of “messages” sent to the market significantly exceeds the desired intent to trade.
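
As a stylised illustration of that atomisation, the sketch below splits a parent order into child orders weighted across venue types. The venue names and weights are hypothetical, not any production routing logic.

```python
# A stylised sketch of atomising a parent order across venues.
# Venue names and fill-probability weights are hypothetical assumptions.
parent_qty = 50_000

venue_weights = {
    "ExchangeA": 0.40,   # primary lit exchange
    "MTF_B": 0.25,       # multilateral trading facility
    "MTF_C": 0.20,
    "SI_D": 0.15,        # systematic internaliser
}

child_orders = {
    venue: int(parent_qty * w) for venue, w in venue_weights.items()
}

# Note the message inflation: four child orders (plus any subsequent
# cancels and amendments) for a single intended trade.
for venue, qty in child_orders.items():
    print(f"route {qty:>6} shares -> {venue}")
```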

This inevitably means that the data seen at one level on the computer screen accounts for just a fraction of the activity that comprises the ecosystem of fast, interconnected exchanges, multilateral trading facilities (MTFs), systematic internalisers (SIs) and the rest.

Contributions and benefits

In Europe, whom the development of market structure benefits, in terms of advances in technology and regulation, is a matter of sometimes-heated debate. The issue most discussed concerns the short-term, ultra-low-latency proprietary trading participants, or high-frequency trading (HFT) firms as they are broadly labelled. These firms are seen either as generally good for the market, supplying liquidity as they claim, or as predatory, seeking to detect the signals of predictable trading strategies.

The reality is that all participants in the ecosystem make a contribution, of whatever value, and it is the job of regulators, as well as the responsibility of market participants themselves, to police behaviour that is illegal, or potentially so, in this environment.

The aim should be to eliminate any informational advantage given by the composition of the ecosystem itself, where those advantages are identified and considered detrimental to the role of capital markets – which is to allocate capital efficiently and fairly.

The exponential increase in market data, produced by ever-diminishing trade sizes and ever-increasing trading frequency, means that the equity market can arguably now be viewed in higher definition than 20 or even 10 years ago. It is unsurprising that interest from the academic world, among mathematicians, engineers and applied statisticians schooled in Markov, Monte Carlo, Wiener and so on, has grown substantially.

Balancing alternative trading approaches

The task of the institutional trader is to navigate between two difficult paths. One is to seek significant size in inventory and therefore, at least for the early part of the risk-transfer process of liquidity consumption, to eschew exposure to pre-trade transparent, displayed, “lit” environments, thus incurring possibly punitive transaction costs. The other is to participate actively in all parts of the ecosystem, thereby subjecting the portfolio to potentially deleterious levels of market impact costs.

At all times, the balance of these two approaches is determined by the urgency and progress of the trade.
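
A toy model can make that trade-off explicit: as an assumed urgency parameter rises, participation shifts from patient, non-displayed liquidity seeking towards lit venues, swapping timing risk for market impact. The cost coefficients below are invented for illustration, not calibrated estimates.

```python
# A toy model of the trade-off: higher urgency pushes participation
# toward lit venues (more market impact, less timing risk); lower
# urgency favours patient, non-displayed liquidity seeking.
# Coefficients are illustrative assumptions, not calibrated values.

def allocation(urgency: float) -> dict:
    """urgency in [0, 1]: 0 = fully patient, 1 = fully aggressive."""
    lit_share = urgency
    dark_share = 1.0 - urgency
    impact_cost = 25.0 * lit_share**2   # bps, rises convexly with urgency
    timing_risk = 30.0 * dark_share     # bps, falls as urgency rises
    return {
        "lit": lit_share,
        "dark": dark_share,
        "expected_cost_bps": impact_cost + timing_risk,
    }

for u in (0.2, 0.5, 0.8):
    print(u, allocation(u))
```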

The undertaking of large inventory transfer is therefore a complex one to pre-programme, and is better managed in vivo by skilled and experienced human traders than predictably and deterministically. It follows that the higher aim, and the opportunity provided by the recent changes in regulation in Europe, is to apply this learning to the contribution of transaction costs to the investment process and, in particular, to the efficiency of portfolio construction.

What tools are necessary to accomplish the task effectively? Aside from the required efficacy and efficiency of routing capabilities, and exposure and access to all desired execution venues, the ability to analyse, forecast and implement overall trading strategy is vital to delivering the best possible result.

The latest revisions to Europe’s Markets in Financial Instruments Directive (MiFID) and Regulation (MiFIR) have the central tenet of transparency coursing through the text, whether via the restriction on non-displayed trading activity below large-in-scale size, the share trading obligation, or the requirement to publish most trades as close to real time after the event as possible, among other stipulations. Whatever the debate about transparency, the impact of real-time disclosure of large trades, and the cost of capital incurred by broker-dealers who may provide liquidity at this scale, the central desire must be consistency in data labelling and disclosure.
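
To see why consistent labelling matters mechanically, consider a sketch that maps venue-specific trade flags onto a single shared vocabulary. The raw flags and the mapping are invented examples; real MiFIR trade-flag taxonomies are considerably richer.

```python
# Sketch: normalising venue-specific trade flags into one shared
# vocabulary. The raw flags and the mapping are invented examples.
FLAG_MAP = {
    ("VenueX", "DK"): "dark_lis",        # large-in-scale, non-displayed
    ("VenueY", "LRGS"): "dark_lis",
    ("VenueX", "AU"): "periodic_auction",
    ("VenueZ", "SI"): "si_trade",
}

def normalise(venue: str, raw_flag: str) -> str:
    return FLAG_MAP.get((venue, raw_flag), "unclassified")

trades = [("VenueX", "DK"), ("VenueY", "LRGS"), ("VenueZ", "SI")]
for venue, flag in trades:
    print(venue, flag, "->", normalise(venue, flag))
```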


Accessible liquidity

At present, much debate centres on what constitutes “accessible liquidity”: what one may define as the ability of any market participant to take part in a trade at any given moment. This is easy to delineate for the order books of a public exchange or an MTF; it is arguably less so for a periodic auction where “broker preferencing” may be a feature, or for an SI source of liquidity, to take just two of myriad examples.

And therein lies the issue. Most, if not all, measures of the ease of investment in a security have liquidity as a major factor. If that liquidity is not clearly and consistently defined, even after the event, the inputs into the liquidity forecast become unsound to a greater or lesser extent, however marginally.
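
A sketch of how that unsoundness propagates: a liquidity forecast that sums “accessible” volume depends entirely on which categories are counted, so two defensible classification choices yield two different model inputs. All figures here are invented.

```python
# Sketch: how classification choices change a liquidity forecast input.
# Volumes and category choices are invented for illustration.
daily_volume = {
    "lit_order_book": 600_000,
    "periodic_auction": 150_000,   # accessible? depends on preferencing
    "si_trade": 200_000,           # accessible? bilateral by nature
    "dark_lis": 250_000,
}

inclusive = sum(daily_volume.values())
strict = daily_volume["lit_order_book"] + daily_volume["dark_lis"]

print(f"inclusive ADV input: {inclusive}")   # 1,200,000
print(f"strict ADV input:    {strict}")      # 850,000
# Two defensible inputs for the same security; any impact model
# downstream inherits the inconsistency.
```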

When one then attempts to forecast the market impact associated with the investment decision, the portfolio construction assumptions may suffer. The highest aim for the next revision of the legislation, unless an enterprising entity takes the opportunity beforehand, is to grasp the chance to fill this gap by mandating a standardised, consolidated transaction “tape” utility.

This is clearly within the bounds of possibility for equities, but may be some years away for other asset classes. It is a truism that any attempt to model or forecast future outcomes based on inconsistent data and classification amounts to an approximation, however narrow the distribution of those outcomes.
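
At a minimum, such a utility would standardise a common record per trade. A hedged sketch of what one normalised tape entry might carry follows; the field names and example values are assumptions for illustration, not a proposed standard.

```python
from dataclasses import dataclass

# A hedged sketch of a normalised consolidated-tape record. Field
# names, types and the example values are assumptions for
# illustration, not a proposed regulatory standard.
@dataclass(frozen=True)
class TapeEntry:
    isin: str            # instrument identifier
    venue: str           # standardised venue code
    price: float
    quantity: int
    timestamp_ns: int    # single synchronised clock
    liquidity_flag: str  # one shared, unambiguous vocabulary

entry = TapeEntry("GB00B0000000", "XLON", 101.25, 5_000,
                  1_536_760_000_000_000_000, "lit_order_book")
print(entry)
```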

Incorporating this inconsistency into the important task of integrated efficient portfolio construction for the benefit of end-investors and their returns runs the risk of building castles in the sand.
