Getting a Good Trade: Quality Assurance Testing For Brokers In The Modern Regulatory Environment

By Daniel Pense, a 20-year industry veteran of electronic trading technology who has held various roles at vendor and broker dealer firms including Credit Suisse, UBS, Donaldson, Lufkin & Jenrette, and Morgan Stanley.
Suppose the marketing pitch from your trading desk to its institutional customers succeeds, and they agree to try out your new algorithmic offering and make it part of their trading strategy next quarter. Now that you have the commitment for the trade, it is the job of the bank’s IT department to make sure that once the order gets in the door, it passes cleanly through the chain of systems that places it at market, manages the life cycle of the order successfully, and returns every execution intact, with the detail requisite to satisfy the client. With the increasing complexity and nuance of how markets allow orders to approach liquidity, and the lengthening chain of systems that enable them to engage in this dance, successfully completing a trade has become more complex than ever before. When the scrutiny and governance detail of increased regulation come to bear, the requirement for precision in managing that complexity raises the difficulty to a new level.

It has been said that mastery of technology is the difference between success and failure for a trading firm, but it is not mere skill at the trading process that counts; the management of technical complexity also comes into play. All the good ideas and implementation behind the latest algorithmic product offering may get front-page press, but change management, design that accommodates instrumentation and tooling, and yes, plain old-fashioned good QA testing make all the difference in how a firm will do over time.
Customisability
Broker dealers face a set of problems specific to their niche. Generally they are expected to provide accommodative technical solutions that allow their institutional customers to interact with their set of service providers as uniformly as possible. Each institution, of course, imposes requirements on all of its service providers, and those requirements vary from institution to institution. Markets may impose some of the same requirements for uniformity, but each of them also has its own flavours of trading requirements and offerings. This means that broker dealers must build systems with a high degree of flexibility and customisability, at least at the boundaries, and they need to be able to manage effectively the configuration complexity that comes with that flexibility. This typically requires customised tooling.
As systems evolve over time, there needs to be an effective way to test not only the base scenarios, but also how the systems perform with various customisations applied. In addition, the test process must include the instrumentation and tooling needed to assure that these customisations remain functional with the deployment of each new build. When this requirement is multiplied across the various systems used by each trading desk, the task of merely maintaining system integrity becomes daunting.
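To make this concrete, a regression suite might run the same base order scenario under each supported customisation, so that every new build is checked against the default behaviour and the configured variants alike. The sketch below is a minimal illustration using pytest; the client profiles, configuration keys and validate_order helper are assumptions made for the example, not any particular firm’s tooling.

```python
# Hypothetical sketch: run the same base order scenario under several
# per-client customisations, so each build is regression-tested against
# both the default behaviour and the configured variants.
import pytest

# Assumed per-client overrides of the default order-handling configuration.
CLIENT_PROFILES = {
    "default":  {},
    "client_a": {"max_order_qty": 50_000, "allow_short_sell": False},
    "client_b": {"max_order_qty": 250_000, "custom_tags": {9001: "DARK_ONLY"}},
}

def build_config(overrides):
    """Merge client-specific overrides onto the base configuration."""
    base = {"max_order_qty": 100_000, "allow_short_sell": True, "custom_tags": {}}
    return {**base, **overrides}

def validate_order(order, config):
    """Stand-in for the system under test: apply the configured order checks."""
    if order["qty"] > config["max_order_qty"]:
        return "REJECT"
    if order["side"] == "SELL_SHORT" and not config["allow_short_sell"]:
        return "REJECT"
    return "ACCEPT"

@pytest.mark.parametrize("profile", CLIENT_PROFILES)
def test_base_scenario_under_each_customisation(profile):
    config = build_config(CLIENT_PROFILES[profile])
    order = {"qty": 10_000, "side": "BUY"}
    # The base scenario should pass regardless of which customisation is applied.
    assert validate_order(order, config) == "ACCEPT"
```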
Put another way, good QA testing should always cover both functional and non-functional specifications, and when the latter is neglected in favour of the former, the manageability of the plant is compromised. When an outage occurs it will find the weakest link, and from a business-relationship perspective it may not matter whether the problem lies in the transaction cost analysis (TCA) of the algorithm or simply in how long it takes to get back an execution from a broken trade. Anything that hurts the customer’s trading experience has the potential to increase their cost and can damage the relationship. Technology is only ever perceived to be as good as the relationship it facilitates, and that perception depends as much on consistency and manageability as on the latest trading fad.
Firms that consciously consolidate portions of their technology do better at managing the consistency of their delivery and simplify the task of quality assurance. By centralising certain aspects of the technology, such as common data stores across desks or even asset classes, a tight internal data model (even if it is mostly centred on a common internal FIX spec), and paradigms for resiliency and redundancy enforced across the plant, firms can reuse the same QA techniques and even components.
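One way to picture the payoff of a tight internal data model is a single normalised order representation that every desk’s adapter maps into, so that the same QA assertions apply regardless of which venue or desk produced the order. The sketch below is illustrative only; the field names loosely mirror common FIX tags rather than any real internal spec.

```python
# Illustrative sketch of a common internal order model shared across desks.
# Field names loosely mirror FIX tags (ClOrdID=11, Side=54, OrdType=40, etc.);
# the mapping function stands in for a desk- or venue-specific adapter.
from dataclasses import dataclass
from enum import Enum

class Side(Enum):
    BUY = "1"
    SELL = "2"

class OrdType(Enum):
    MARKET = "1"
    LIMIT = "2"

@dataclass(frozen=True)
class InternalOrder:
    cl_ord_id: str
    symbol: str
    side: Side
    ord_type: OrdType
    qty: int
    limit_px: float | None = None

def from_fix_fields(fields: dict[int, str]) -> InternalOrder:
    """Map a parsed FIX NewOrderSingle (tag -> value) onto the internal model."""
    return InternalOrder(
        cl_ord_id=fields[11],
        symbol=fields[55],
        side=Side(fields[54]),
        ord_type=OrdType(fields[40]),
        qty=int(fields[38]),
        limit_px=float(fields[44]) if 44 in fields else None,
    )
```

Because every desk’s harness speaks the same internal model, the same assertion helpers can be reused no matter which adapter produced the order.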
Changing regulatory regime
The ongoing intensification of the regulatory regime poses additional challenges for quality assurance. In many cases these regulations do not mandate specific technical implementations, as the systems that manage trading are too diverse for that level of detail in the regulation itself. Broker dealers are left to interpret the regulation and translate it into specific technical implementations in their systems. Where there is room for interpretation, a conservative approach is usually taken, because no one wants to develop a paradigm that later leads to a failure in a regulatory audit. But once the specific technical methodology for addressing the regulation is decided, the QA regime must expand to include use cases that exercise those specific controls.
A case in point is the Market Access rule (SEC Rule 15c3-5), instituted a few years back, which leaves the specifics of the risk control procedures to the individual bank. Each must determine how to create controls around and within its systems to meet the requirement.
The technical implementations I have seen addressing this are varied, but all involve counterparty limit management and some supervisory override capability. In terms of QA testing, the approach is to add cases to the functional test suite that breach limits and trigger supervisory review.
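A test case of that kind might look something like the sketch below: a hypothetical pre-trade risk gate with a per-counterparty notional limit, where a breach is held for supervisory review instead of being routed to market. The class, limit values and status strings are assumptions made for the sake of the example.

```python
# Hypothetical pre-trade risk gate of the kind used for market-access controls:
# orders breaching a counterparty's notional limit are held for supervisory
# review instead of being routed to market.
class RiskGate:
    def __init__(self, limits):
        self.limits = limits          # counterparty -> max notional per order
        self.pending_review = []      # orders awaiting supervisory override

    def check(self, counterparty, qty, price):
        notional = qty * price
        if notional > self.limits.get(counterparty, 0):
            self.pending_review.append((counterparty, notional))
            return "HELD_FOR_REVIEW"
        return "ROUTED"

def test_limit_breach_triggers_supervisory_review():
    gate = RiskGate({"CLIENT_A": 1_000_000})
    # Within limit: the order goes straight to market.
    assert gate.check("CLIENT_A", qty=1_000, price=100.0) == "ROUTED"
    # Breach: the order is held and appears in the supervisory review queue.
    assert gate.check("CLIENT_A", qty=50_000, price=100.0) == "HELD_FOR_REVIEW"
    assert len(gate.pending_review) == 1
```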
In other cases the regulation is more specific about the trading behaviour it enforces but leaves the implementation details to the broker dealer. Limit Up/Limit Down controls, for example, have specific price band percentages to adhere to. The mathematics is set, but each system must manage these bands within the constraints of how it handles market data. Again this will typically be addressed by additional use cases in the test bed, but here the test harness will need to include simulated market data and may exercise the instrumentation as well as the software under test.
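For instance, the harness might feed a simulated reference price into the band calculation and assert that prices outside the computed bands are flagged. The sketch below uses an illustrative flat five percent band rather than the full tiered schedule, and the function names are assumptions.

```python
# Illustrative limit up/limit down style check driven by simulated market data.
# A flat 5% band is used here for simplicity; the real schedule is tiered by
# security and price level.
def price_bands(reference_price, band_pct=0.05):
    """Compute the lower/upper price bands around a reference price."""
    return (reference_price * (1 - band_pct), reference_price * (1 + band_pct))

def within_bands(order_price, reference_price, band_pct=0.05):
    lower, upper = price_bands(reference_price, band_pct)
    return lower <= order_price <= upper

def test_prices_outside_band_are_flagged():
    simulated_reference = 100.00   # from the simulated market data feed
    assert within_bands(103.00, simulated_reference)        # inside the band
    assert not within_bands(106.00, simulated_reference)    # limit-up breach
    assert not within_bands(94.00, simulated_reference)     # limit-down breach
```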
In some cases the regulation is very technically specific. MiFID II, for example, requires clock synchronisation across the general trading plant, a requirement that is intended to live with the trade and transcends the systems it traverses, whether institution, broker or market. To implement it, clock synchronisation within fairly tight tolerances (100 microseconds in some cases) must not only be established in every system the trade touches; that tolerance must also be maintained. It is the latter requirement that is the most challenging, as anyone familiar with managing clock synchronicity in trading systems will attest.
While many tools exist to facilitate clock synchronisation, in practice keeping drift within tight tolerances across many platforms (which may have different mechanisms for synchronisation) is logistically challenging. It is likely that new monitoring specific to this function will need to be added across many banks’ trading plants.
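Such monitoring might reduce, in outline, to something like the sketch below: each host reports its measured offset from the reference clock, and any offset beyond the tolerance raises an alert. The offset source is deliberately left abstract; in practice it would come from PTP or similar daemon statistics, and the host names and figures shown are placeholders.

```python
# Hypothetical drift monitor: each host reports its measured offset from the
# reference clock (e.g. taken from PTP daemon statistics); offsets beyond the
# tolerance are flagged so the breach can be investigated and evidenced.
TOLERANCE_US = 100  # MiFID II-style tolerance, in microseconds

def check_drift(host_offsets_us):
    """Return the hosts whose clock offset exceeds the allowed tolerance."""
    return {
        host: offset_us
        for host, offset_us in host_offsets_us.items()
        if abs(offset_us) > TOLERANCE_US
    }

if __name__ == "__main__":
    # Placeholder readings; a real monitor would poll these continuously
    # and persist them for audit purposes.
    readings = {"oms-01": 12.5, "smart-router-02": -48.0, "gateway-07": 173.9}
    for host, offset in check_drift(readings).items():
        print(f"ALERT: {host} clock offset {offset:+.1f}us exceeds {TOLERANCE_US}us")
```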
Quality assurance of this kind is essentially non-functional testing, but with a monitoring system of the sort just described it can perhaps be reduced to functional testing, albeit in a different domain. Vended solutions to this problem may exist as well, and as the regulation rolls forward more will probably appear, though of course at an added cost to managing the trading plant.
Standardisation
Where it is possible to implement, standardisation generally reduces the cost and increases the reproducibility of the QA program. Generic software test harnesses created within a bank often improve the efficiency of the QA process, sometimes to a significant degree. However, their efficacy generally stops at the walls of the organisation, and since the whole purpose of electronic trading is messaging between counterparties, QA programs built around industry-accepted standards are also necessary.
The FIX protocol has been the standard-bearer in this realm and is the most successful of all cross-party protocol initiatives. The cost savings the industry has realised from the wide adoption of FIX are so substantial that they would be difficult to calculate. The desire of industry participants to deliver a cutting-edge offering is always there, however, and this drive for competitive edge runs counter to the spirit of standardisation. The proliferation of binary protocols, and of customisations within the FIX protocol itself, has increased the complexity of the trading process and the technology behind it, even while improving things like trading latency and optionality within trading vehicles.
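As a small example of standards-based QA tooling, the sketch below parses a raw tag=value FIX message, recomputes its checksum and confirms that a handful of required header tags are present. The required-tag list is an illustrative subset, not a full session- or application-level validation.

```python
# Illustrative FIX validator: parse a tag=value message (SOH delimited),
# recompute CheckSum(10) and confirm a few required header tags exist.
SOH = "\x01"

def parse_fix(raw: str) -> dict[int, str]:
    pairs = [f.split("=", 1) for f in raw.strip(SOH).split(SOH)]
    return {int(tag): value for tag, value in pairs}

def checksum_ok(raw: str) -> bool:
    # CheckSum is computed over every byte up to and including the SOH
    # that precedes the "10=" field, modulo 256.
    body, _, trailer = raw.rpartition(f"{SOH}10=")
    declared = int(trailer.rstrip(SOH))
    return sum((body + SOH).encode("ascii")) % 256 == declared

def with_checksum(partial: str) -> str:
    """Append a correctly computed CheckSum(10) to a partial message ending in SOH."""
    return f"{partial}10={sum(partial.encode('ascii')) % 256:03d}{SOH}"

REQUIRED_TAGS = {8, 9, 35, 49, 56, 34, 52, 10}  # illustrative subset

def validate(raw: str) -> bool:
    fields = parse_fix(raw)
    return checksum_ok(raw) and REQUIRED_TAGS.issubset(fields)
```

A harness can append a correct checksum with the with_checksum helper, assert that the message validates, and then layer counterparty-specific dialect checks on top of the same parser.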
For those who wish to succeed in this environment, the QA process and tooling must keep pace. Broker dealers generally implement a mix of vended and proprietary solutions to manage their systems, and those who build or buy the strongest offerings in this space reap the rewards of that investment. The challenges of increasing regulation and growing system complexity are not likely to ease any time soon, but attention to designing and maintaining a good QA profile across the technology plant will improve cost efficiency and competitiveness over the long term.