TCA Q&A: Dave Cushing, Clearpool

In June, Clearpool Group appointed trading and market strategies expert and transaction cost analysis (TCA) pioneer Dave Cushing as an advisor. Markets Media recently caught up with Cushing to discuss the state of TCA.

Can you provide an overview of TCA, from past to present?

Dave Cushing, Clearpool

For better or for worse, I’ve been around since essentially the inception of TCA in the mid-1980s. I have seen it evolve from small-scale datasets, with not-always-reliable timestamps and limited ability to create unique benchmarks using tick data and other data sources around that execution data. Now, in our age of big data, atomic clocks, cloud, AI and everything else, there has been an explosion in the ability to do more sophisticated work.
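As a concrete, purely illustrative example of the kind of benchmark Cushing describes, the sketch below computes an arrival-price slippage figure for a single parent order. The field names and structure are assumptions for illustration, not any firm's methodology.

```python
# Illustrative only: arrival-price (implementation-shortfall-style) slippage
# for one parent order, measured against the midpoint quote at order arrival.
# All field names are hypothetical.

from dataclasses import dataclass

@dataclass
class Fill:
    qty: int       # shares filled
    price: float   # execution price

def arrival_slippage_bps(fills: list[Fill], arrival_mid: float, side: str) -> float:
    """Share-weighted slippage in basis points versus the arrival midpoint.
    Positive values mean the order paid up relative to the benchmark."""
    filled_qty = sum(f.qty for f in fills)
    if filled_qty == 0:
        return 0.0
    avg_px = sum(f.qty * f.price for f in fills) / filled_qty
    sign = 1.0 if side == "buy" else -1.0   # buys are hurt by rising prices
    return sign * (avg_px - arrival_mid) / arrival_mid * 1e4

# A buy order filled in two pieces against a 50.00 arrival midpoint
fills = [Fill(600, 50.02), Fill(400, 50.05)]
print(round(arrival_slippage_bps(fills, 50.00, "buy"), 2))  # 6.4 bps
```

A real benchmark would source the arrival midpoint from a tick database, handle cancels and partial fills, and compare against multiple reference prices, which is where the noisy, incomplete data Cushing mentions next becomes the hard part.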

But it’s still very difficult to do high-quality TCA, because the data is noisy, because there are many factors outside the trader’s control, and because datasets can be incomplete. So even though we have much better tools at our disposal today, in some ways the challenges that have been part of TCA since the beginning are still with us.

What’s helping TCA now is that there is a regulatory imperative to invest more in this area, and firms need to eke out every last basis point of alpha. So tools are improving, datasets are larger and cleaner, timestamps are better, and computing power is cheaper. Because of these surrounding factors, I think we’ll start to see the actual practice improve in quality and sophistication.

In a way, people are the limiting factor in TCA. I’m hoping to see an industry awakening to the greater performance that a well-crafted TCA approach can bring. There is a lot more potential to be unlocked as people become more curious and more systematic about how they approach it. So as the data, the visualization, the processes, and the automation around TCA continue to improve, people will come along with it.

What are you hearing from buy-side and sell-side market participants regarding TCA?

A point of focus is having data be both accurate and rich, and then being able to conveniently access the insights contained in that data.

The state of the art is evolving here, and I think there’s going to be a lot of value placed on being able to attack this on the scale it needs to be attacked. But there’s also a counterbalancing need to be accessible and actionable. You can spend a lot of money building a repository and building analytics that sit on that repository, but if you don’t know where, how and when to apply that information, its value will be underutilized. By the same token, you can build analytics, but if you don’t have the data to go with them, then you have an incomplete picture.

It’s essential to have both of those things: have the underlying infrastructure be both accurate and complete, and have the right tools sitting on top of that. People approach this from a lot of different perspectives, and you want to be able to reach as broad a swath of people as possible. So you have to be really thoughtful about what the interface looks like, how it functions, where it functions, and how easy it is to customize. These are all really important questions, and we’re trying to find that sweet spot via Clearpool’s Venue Analysis and other trading tools.

What attracted you to the position at Clearpool?

Clearpool has a state-of-the-art infrastructure, and because they run so much volume through their platform, they have what’s needed to produce meaningful analytics at scale. Those are a couple of the things that attracted me to partnering with Clearpool as an advisor. Many places don’t have enough data, or they don’t have enough development resources or compute power for TCA. Clearpool has all of that in place, and the tools built on top of it are very visual and very intuitive. They make it easy to ask scientifically grounded questions about the choices traders are making, and the consequences of those choices in terms of cost and execution quality. Bringing those analytics to the point of sale is really powerful.

In my role, I want to help Clearpool shape the form, the timing, and the delivery method of those analytics to make it easier for a trader or an analyst to drill into the data. What’s needed is insight — insight into venue selection, insight into order types, insight into the settings for those order types. From there the trader or analyst can begin to improve the foundation, and then elevate those improvements to ultimately be able to match the order type to the tools in an optimal way.
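To make the venue and order-type insight described above more concrete, the sketch below aggregates hypothetical fill records by venue and order type and compares share-weighted slippage. The column names and the pandas approach are illustrative assumptions, not a description of Clearpool's Venue Analysis.

```python
# Hypothetical sketch: group fills by (venue, order_type) and compare
# share-weighted average slippage across buckets.

import pandas as pd

fills = pd.DataFrame({
    "venue":        ["EXCH_A", "EXCH_A", "DARK_B", "DARK_B", "EXCH_C"],
    "order_type":   ["midpoint", "limit", "midpoint", "midpoint", "limit"],
    "qty":          [500, 300, 700, 200, 400],
    "slippage_bps": [1.2, 3.4, -0.5, 0.8, 2.1],   # per fill, vs. arrival mid
})

# Share-weighted average slippage per (venue, order type) bucket
summary = (
    fills.assign(wcost=fills["qty"] * fills["slippage_bps"])
         .groupby(["venue", "order_type"])
         .agg(shares=("qty", "sum"), wcost=("wcost", "sum"))
)
summary["avg_slippage_bps"] = summary["wcost"] / summary["shares"]
print(summary[["shares", "avg_slippage_bps"]].sort_values("avg_slippage_bps"))
```

The same breakdown can be extended to order-type settings (limit offsets, minimum fill sizes, and so on), which is the "settings for those order types" dimension mentioned above.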

What is the future of TCA?

There’s going to be a more sophisticated repository for data, and there’s going to be more control over that repository. Firms spend a lot of money trying to get data right. I think in a few years we’re going to see some robust solutions that offer complete and accurate data, delivered in a way that the firms who generated that data don’t lose control of it. Control is going to be a big theme, as people are going to insist on having more of it.

We’ll also see the sophistication of analytics improve, as well as the ability to differentiate between things that don’t matter and things that do.

If we’re looking three-plus years out, I think things like AI will play a big role, because it’s well-suited to TCA. These are huge datasets with lots of different ways to look at them, and people need all the help they can get to look at that data most productively. AI won’t necessarily give you the answers, but it will help people zero in on what data matters most.
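One hedged illustration of what "zeroing in on what data matters" might look like in practice: fit a simple model of per-order slippage and rank which inputs explain it best. The feature set, the synthetic data and the use of a random forest are all assumptions for illustration, not a prescription for how AI will be applied.

```python
# Toy example: rank candidate drivers of slippage by model importance.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 0.2, n),   # participation rate
    rng.uniform(1, 60, n),    # quoted spread, in bps
    rng.normal(0, 1, n),      # short-term momentum signal
    rng.uniform(0, 1, n),     # time of day (fraction of session)
])
# Synthetic slippage driven mostly by spread and participation, plus noise
y = 0.3 * X[:, 1] + 20 * X[:, 0] + rng.normal(0, 1, n)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
for name, imp in zip(["participation", "spread_bps", "momentum", "time_of_day"],
                     model.feature_importances_):
    print(f"{name:15s} {imp:.2f}")
```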

Today it’s more of a one-way process: you place the order, you get the execution, you analyze the execution and you measure the quality. But the future will be about the ability to take that output and cycle it back to the front end of the process, so you can complete the loop, and more actionable insights will come out of the data.
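A toy sketch of what "completing the loop" could mean operationally: post-trade venue cost measurements feed back into pre-trade routing weights for the next order. The scoring rule and names below are purely illustrative assumptions, not any vendor's logic.

```python
# Feedback-loop sketch: nudge tomorrow's routing mix away from venues
# that showed higher measured cost today.

def update_routing_weights(weights: dict[str, float],
                           venue_cost_bps: dict[str, float],
                           learning_rate: float = 0.1) -> dict[str, float]:
    """Blend current weights toward a target favoring lower-cost venues."""
    scores = {v: 1.0 / (1.0 + max(cost, 0.0)) for v, cost in venue_cost_bps.items()}
    total = sum(scores.values())
    target = {v: s / total for v, s in scores.items()}
    return {v: (1 - learning_rate) * weights.get(v, 0.0) + learning_rate * target[v]
            for v in target}

weights  = {"EXCH_A": 0.4, "DARK_B": 0.4, "EXCH_C": 0.2}   # today's routing mix
measured = {"EXCH_A": 2.5, "DARK_B": 0.4, "EXCH_C": 1.8}   # post-trade cost, bps
print(update_routing_weights(weights, measured))             # tomorrow's mix
```

The learning rate keeps the loop from overreacting to one day of noisy measurements, which echoes the earlier point that the data is noisy and many factors sit outside the trader's control.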
