In a world where signals and noise dominate the minds of investors, relying on the uncompromising clarity of data to find a path can be the most complicated part of asset allocation.
Alan Marantz, managing director at Fort L.P., said change needed to be at the core of an investor’s process.
“As humans, we’re very much influenced by what’s just happened to us and by the most adverse things that have happened to us,” Marantz said.
Bayesian statistics, loosely speaking, posits that what is happening in the current environment has relevance to what is likely to unfold in the near future.
“By taking in data and analysing what’s happening over recent times, we can use that to optimise our risk allocation,” Marantz said.
Fort tends to pull data over a three-year window and weights the most recent data more heavily.
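Fort does not spell out its exact weighting scheme, but a minimal sketch of what recency-weighted statistics over a trailing three-year window could look like is below; the window length in trading days, the half-life and the simulated returns are illustrative assumptions, not Fort’s parameters.

```python
import numpy as np

def recency_weighted_stats(returns, window=756, half_life=126):
    """Weight the trailing ~3-year window so recent observations count more.

    window: roughly three years of daily returns (illustrative assumption)
    half_life: weight halves every ~6 months of trading days (illustrative assumption)
    """
    recent = np.asarray(returns[-window:], dtype=float)
    n = len(recent)
    # Newest observation (last element) gets weight 1, older ones decay toward zero.
    decay = 0.5 ** (np.arange(n)[::-1] / half_life)
    weights = decay / decay.sum()
    mean = np.sum(weights * recent)
    vol = np.sqrt(np.sum(weights * (recent - mean) ** 2))
    return mean, vol

# Usage on simulated daily returns.
rng = np.random.default_rng(0)
mu, sigma = recency_weighted_stats(rng.normal(0.0003, 0.01, size=1000))
print(f"recency-weighted daily mean {mu:.5f}, daily vol {sigma:.5f}")
```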
When looking at how quantitative investors can leverage data, Marantz’s strategy spans three main dimensions: first, how risk is allocated across holding periods – a more classic risk allocation view; second, how fast or slow the models should be; and third, which model types work best or worst in any given environment.
“This kind of statistical learning forms the basis of our process and is 100 per cent reliant on data,” Marantz said.
To illustrate how these dimensions play out across different time periods, Marantz points to the period between the Plaza Accord in 1985 – in which the G5 nations agreed to depreciate the US dollar against the Japanese yen and the German Deutsche Mark – and the early 2000s.
Global interest rate differentials offered foreign exchange investors tremendous returns over that stretch, whereas today a much more synchronised global economy and more correlated currencies mean the opportunities have dwindled.
“If you look at and measure, on a trailing basis, the risk-adjusted returns investors received over those periods of time,” Marantz said, “to grade them, you’d have given them a higher grade in the early days and a very low grade now.”
This historical data supports relative grading between asset classes and allows investors to drive a much better risk allocation process.
“That way you’re much more reliant on data than on human decision making because we’re very easily influenced,” Marantz said.
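The article does not define the grading metric, but a simple, hypothetical version would rank asset classes by a trailing risk-adjusted return (a Sharpe-style ratio); the asset-class names, window length and simulated data below are illustrative only.

```python
import numpy as np

def trailing_risk_adjusted_return(returns, window=756):
    """Annualised mean return over annualised volatility, on the trailing window."""
    r = np.asarray(returns[-window:], dtype=float)
    return (r.mean() * 252) / (r.std(ddof=1) * np.sqrt(252))

def grade_asset_classes(return_streams):
    """Rank asset classes from best to worst trailing risk-adjusted return."""
    scores = {name: trailing_risk_adjusted_return(r) for name, r in return_streams.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Simulated return streams stand in for real asset-class histories.
rng = np.random.default_rng(1)
streams = {
    "fx_carry": rng.normal(0.0001, 0.006, 1500),
    "rates": rng.normal(0.0003, 0.005, 1500),
    "equity_index": rng.normal(0.0004, 0.011, 1500),
}
for name, score in grade_asset_classes(streams):
    print(f"{name}: trailing risk-adjusted return {score:.2f}")
```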
Building a quantitative system leads investors down one of two paths, he said.
The first is the path of risk parity, which posits that the world is random and that investors want an equal risk-adjusted distribution across a wide range of instruments and strategies.
The second is to embrace the idea that the world is evolving, that different opportunities exist within certain periods or cycles, and that the system should be adjusted to each new cycle.
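As a rough illustration of the two paths – not of Fort’s actual models – the sketch below contrasts plain inverse-volatility “risk parity” weights with weights tilted toward strategies that have graded well recently; the volatilities, scores and tilt parameter are all made up for the example.

```python
import numpy as np

def risk_parity_weights(vols):
    """Path one: treat the world as random and equalise risk via inverse volatility."""
    inv = 1.0 / np.asarray(vols, dtype=float)
    return inv / inv.sum()

def regime_tilted_weights(vols, trailing_scores, tilt=1.0):
    """Path two: start from risk parity, then tilt toward strategies that have
    graded well in the recent period. `tilt` sets how far from parity to move."""
    base = risk_parity_weights(vols)
    scores = np.asarray(trailing_scores, dtype=float)
    tilted = base * np.exp(tilt * (scores - scores.mean()))
    return tilted / tilted.sum()

vols = [0.06, 0.10, 0.15]    # illustrative strategy volatilities
scores = [0.2, 0.9, -0.3]    # illustrative trailing risk-adjusted grades
print("risk parity:", np.round(risk_parity_weights(vols), 3))
print("regime tilt:", np.round(regime_tilted_weights(vols, scores), 3))
```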
Marantz points to a strategy Fort has employed for 18 years, which models currency risk exposure to short-term interest rates.
Between 2005 and 2008, the model’s risk allocation hit a very low point – below 5 per cent – before rising between 2007 and 2010.
“When we went through the global financial crisis, and central banks were cutting rates quite substantially, our models began to make substantially more money, relative to other instruments in this asset class,” Marantz said.
“So the risk steadily adjusted upwards to almost 30 per cent, which is a 6 times change in exposure.”
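Fort does not describe the update rule behind that shift, but one simple way to picture a risk weight drifting from below 5 per cent toward roughly 30 per cent is a smoothed update that nudges the allocation toward a data-implied target each period, within a floor and a cap; every number below is illustrative.

```python
def update_risk_allocation(current, target, floor=0.05, cap=0.35, step=0.3):
    """Move the current risk weight part of the way toward the data-implied target,
    so exposure adjusts steadily rather than jumping."""
    proposed = current + step * (target - current)
    return min(max(proposed, floor), cap)

# Illustrative path: targets implied by improving relative performance over several periods.
allocation = 0.05
for target in [0.05, 0.10, 0.20, 0.30, 0.30, 0.30]:
    allocation = update_risk_allocation(allocation, target)
    print(f"risk allocation: {allocation:.1%}")
```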
Building a system that can adapt to regime changes or shocks is the most powerful tool, Marantz said, so investors can extract persistent inefficiencies across a trend.
“But rather than a momentum style, this Bayesian component allows us to actually get in front of the trend and build a position as prices are falling, so we can sell as prices are rising.”