Mat Gulley, Global Head of Trading, Franklin Templeton Investments, is changing how his firm embraces alpha, and is making the PM, research desk, and trader a more integrated unit.
‘Alphatizing’ the technology
Today, it is hard to separate trading from technology. An effective trading team requires an equally effective technology platform, one that provides information, speed, flexibility and analysis. Over the last decade Franklin Templeton has put significant effort into four strategic, technology-focused projects. All four projects, we believe, add targeted value to our process and have allowed us to rethink how we are involved in the overall investment process, interact with PMs, and handle and analyse each individual trade.
Our marquee project is the Investment Dashboard. This proprietary platform brings together all of our internal data in a user-friendly, graphically robust front-end interface, allowing the trader to instantly understand portfolio positioning and helping to generate focused PM/Analyst/Trader dialogue and targeted investment ideas. We wanted the data to be easily accessible, giving our traders additional insight into all investment products broken down by portfolio manager, portfolio, strategy, geographical region or investment idea. Because of our product breadth and global reach, having well-organised and understandable data is paramount. A clear picture of our data gives each trader a greater opportunity to present value-added ideas that can be accretive to the investment process.
The second project is an analytics platform for screening ideas and viewing intraday liquidity and trends. We started working with an outside vendor three to four years ago, which has led us to deeper aggregate market analysis on a pre-trade basis, as well as position screening on numerous metrics such as implied volatility, CDS changes or insider activity. With this tool, we are able to analyse these metrics across all our securities in order to recognise divergent patterns. We manage many portfolios and invest in thousands of securities in over 80 countries, so it is necessary to get timely pre-trade stock snapshots, understand valuation momentum, and have targeted market-factor analysis to guide and focus the trader’s attention. The question we have always struggled with has been, “How do you synthesise all the information about a stock and market in order to have the best conversation possible with a portfolio manager?” Ultimately, this has led us to more intraday trading analysis and the ability to think about daily liquidity patterns and a trade’s relative trading profile. A system like this helps us identify anomalies in the market and better understand liquidity and intraday momentum. This in turn drives interaction and dialogue that we believe help us make better trading decisions.
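The kind of cross-sectional screen described here can be sketched as a simple divergence test: flag any security whose latest metric reading (implied volatility, CDS spread, etc.) sits far outside its own recent history. The function name, data shape and two-standard-deviation threshold below are illustrative assumptions, not Franklin Templeton's actual methodology.

```python
from statistics import mean, stdev

def divergence_screen(history, threshold=2.0):
    """history: {ticker: [prior metric readings..., latest reading]}.
    Returns tickers whose latest reading lies more than `threshold`
    standard deviations from the mean of the prior readings."""
    flagged = []
    for ticker, series in history.items():
        *prior, latest = series
        if len(prior) < 2:
            continue  # not enough history to judge divergence
        mu, sigma = mean(prior), stdev(prior)
        if sigma > 0 and abs(latest - mu) / sigma > threshold:
            flagged.append(ticker)
    return flagged

# Hypothetical implied-volatility series for two tickers
implied_vol = {
    "AAA": [0.20, 0.21, 0.19, 0.20, 0.45],  # sudden spike -> flagged
    "BBB": [0.30, 0.31, 0.29, 0.30, 0.31],  # in line -> ignored
}
print(divergence_screen(implied_vol))  # -> ['AAA']
```

Run across thousands of securities and several metrics at once, a screen like this focuses the trader's attention on the handful of names behaving unusually that day.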
After restructuring our internal data and implementing an analytics process around our holdings and orders, we decided we needed to address transparency across our electronic execution world. About four years ago, when algorithms were really gaining traction and dark pools and execution venues seemed to be multiplying at an exponential rate, the leaders within our trading group got together and discussed the lack of trading transparency. The question that kept being asked was, “How do we know what is happening, and how do we find out?” We understood it was important for us to be able to get this intraday trade data, synthesise it and then analyse it. About two and a half years later, we had the data. We now require any broker whose algorithm, methodology or market access points we use for execution to provide a certain level of intraday trade-reporting transparency (through the FIX Protocol), along with complete ping, venue and time-stamp data in a daily file. We then implemented a complex event processing (CEP) engine, which allows us to quickly analyse this information on a dynamic and historical basis: every child order, route and algorithm. So, finally, this year we were able to have insight into “all” levels of executions and order routing; a look into the world of electronic trading we had never had access to. The light had finally been turned on.
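A minimal sketch of the per-venue analysis that broker-supplied child-order data makes possible: once every route and fill is captured, fill rates by venue fall out of a simple aggregation. The record fields, venue names and algorithm labels below are assumptions for illustration; real FIX drop-copy data carries far more detail, and a production CEP engine would process it as a live stream rather than a list.

```python
from collections import defaultdict

# Hypothetical child-order records: (algorithm, venue, shares_sent, shares_filled)
child_orders = [
    ("VWAP", "DARK-1", 500, 0),
    ("VWAP", "LIT-A",  300, 300),
    ("POV",  "DARK-1", 800, 100),
    ("POV",  "LIT-B",  200, 200),
]

# Aggregate sent and filled shares per venue
stats = defaultdict(lambda: {"sent": 0, "filled": 0})
for algo, venue, sent, filled in child_orders:
    stats[venue]["sent"] += sent
    stats[venue]["filled"] += filled

for venue, s in sorted(stats.items()):
    rate = s["filled"] / s["sent"]
    print(f"{venue}: sent={s['sent']} filled={s['filled']} fill_rate={rate:.1%}")
```

The same aggregation keyed by algorithm, or by (algorithm, venue) pair, supports the historical "best venue / best strategy" questions discussed below.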
We have a tremendous number of algorithms and a tremendous amount of data, and now we are able to use that data to better understand our process. I am now able to tell you, for example, that in using 55 algorithmic strategies to execute about $20 billion in US capital, we sent out over 9 million child orders equalling around 21 billion shares, with over 8 million child orders receiving no fills; the executed fill rate was 2.24% while the order multiplier was 20.37x. We feel this level of transparency is critical in the electronic age, helping us manage our trading, understand strategies and add value to the investment process. The questions we look to answer daily are, “What are the historically best venues and best strategies, and how do we then produce risk-based alerts and risk-based matrices around this data that give us some indication of how to optimise our computer-based executions?” For example, we have discovered algorithms that will work well in one environment, but not in another. Likewise, we have discovered algorithms that work well for a period of time and then see their performance regularly trail off; some of our analysis has suggested a three-month half-life for the algorithms we use. In some cases we have lowered the volatility of a trader’s blend of algorithms while also improving its performance significantly. We like to refer to this as creating a better “Algorithm Information Ratio.” These value-added adjustments are hard to capture without the proper technology. The goal was to transform and analyse all of the data we were capturing through event processing, and we are now well on the way to meeting it.
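One plausible reading of the aggregate figures quoted above can be sketched as arithmetic: fill rate as executed shares over child-order shares sent, order multiplier as child-order shares sent per parent share traded, and the "Algorithm Information Ratio" by analogy with a portfolio information ratio (mean performance of the algorithm blend divided by its volatility). These formulas are our interpretation for illustration, not a published Franklin Templeton methodology, and the input numbers are hypothetical.

```python
from statistics import mean, stdev

def fill_rate(shares_filled, shares_sent):
    # portion of child-order shares exposed to the market that actually filled
    return shares_filled / shares_sent

def order_multiplier(shares_sent, parent_shares):
    # child-order shares sent per parent share ultimately traded
    return shares_sent / parent_shares

def algo_information_ratio(daily_perf_bps):
    # mean performance of the algorithm blend (e.g. bps vs. arrival price)
    # divided by the volatility of that performance
    return mean(daily_perf_bps) / stdev(daily_perf_bps)

# Hypothetical inputs chosen to land near the figures quoted in the text
print(f"fill rate: {fill_rate(470_000_000, 21_000_000_000):.2%}")
print(f"order multiplier: {order_multiplier(21_000_000_000, 1_031_000_000):.2f}x")

blend = [3.0, -1.0, 2.5, 0.5, 1.0]  # hypothetical daily bps vs. benchmark
print(f"AIR: {algo_information_ratio(blend):.2f}")
```

Under this reading, raising the Algorithm Information Ratio means either improving the blend's average performance or cutting its day-to-day volatility, which matches the blend adjustments described above.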