From satellite data measuring the shadows cast by oil tankers to predict supply imbalances, to statistics that capture hidden patterns in daily stock movements and form the basis for short-horizon trading forecasts, quantitative equity strategies incorporate some highly novel data sources and innovative approaches.

Some believe that overcrowding and underperformance in quantitative strategies cloud their future. But Philip Kearns, a managing director at the global investment and technology firm the D. E. Shaw group, argues that the foundations of quantitative investing, often referred to simply as "quant" by institutional investors, remain sound.

Quant approaches to investing offer a framework for using statistical data to validate hypotheses explaining the performance of stocks, sectors, or specific factors, Kearns said during Deconstructing Equity Returns: Separating the Opportunities from the Risks, a virtual roundtable hosted by Investment Magazine and sponsored by the D. E. Shaw group.

Quant equities represents the D. E. Shaw group’s longest-running strategy, Kearns said. This area includes shorter- and longer-horizon forecasts that are constructed and validated using “a ton of data” about the movements of thousands of stocks.

Despite that intensive use of data, the quant research process is often analogous to the work of a discretionary portfolio manager in one important respect, he said, in that it starts with the formulation of a hypothesis about why a given stock could outperform or underperform its peers.

Much of the performance of stocks can be attributed to a range of well-understood exposures, trends, and other factors. The unexplained frontier beyond those factors can be described as "idiosyncratic" phenomena, and it constitutes a "very fertile area" for formulating and testing hypothesis-driven quant investing approaches, Kearns said.


Once a hypothesis has been conceptualised, it is validated against data on thousands of stocks globally. Failure to disprove the hypothesis “increases your confidence that the effect will be there going forward,” Kearns said.

Rigorous testing

Still, the real world is complicated, and so is setting up a successful quant strategy. Despite enormous leaps in the computing power and amount of data available for testing quant-based hypotheses, significant barriers remain for investors interested in this space, Kearns said.

Investors must manage risk and achieve sufficient conviction in the strategy to withstand inevitable periods of underperformance. They also need to invest in the expertise required to properly manage data and ward off the myriad ways in which it can be polluted or skewed. Signals that initially look convincing can turn out to be mere data aberrations.

“Having a very well-resourced data collection and scrubbing effort is incredibly important, and I think it is a bit of a barrier to entry,” Kearns said.

The testing of hypotheses must be extremely thorough. Even setting the t-statistic threshold for establishing statistical significance is no simple task, Kearns explained, as the appropriate threshold may vary across different hypotheses.

Testing validity “is a function of effectively how many different variants of things you’ve tried, and how many times this has been tried in the past, and potentially even how many times it’s been tried outside of your institution by other people who have written or talked about it,” Kearns said.
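That idea, that the significance bar should rise with the number of variants tried, is a standard multiple-testing problem. The sketch below is purely illustrative (plain Python with hypothetical numbers, not a description of any firm's actual methodology); it uses a simple Bonferroni correction to show how the required t-statistic grows as more signal variants are tested:

```python
from statistics import NormalDist

def significance_threshold(alpha: float, n_variants: int) -> float:
    """Two-sided t-stat threshold (normal approximation) after a
    Bonferroni correction for the number of signal variants tried."""
    adjusted_alpha = alpha / n_variants          # stricter per-test level
    return NormalDist().inv_cdf(1 - adjusted_alpha / 2)

# A single test at alpha = 5% needs |t| > ~1.96 ...
print(round(significance_threshold(0.05, 1), 2))    # 1.96
# ... but after trying 100 variants, the bar rises to ~3.48.
print(round(significance_threshold(0.05, 100), 2))  # 3.48
```

Bonferroni is only the bluntest such adjustment; the broader point stands regardless of method: a t-statistic that looks convincing in isolation may be unremarkable once the full search history, inside and outside the institution, is accounted for.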

Once trading commences, investors must continually assess a forecast, Kearns said. The effectiveness of some forecasts tends to decay as others discover them or structural changes take hold in markets.

Staying ahead of the crowd

Leslie Mao, director of investments and head of equities at Willis Towers Watson, said he viewed the market’s use of quants as a kind of “arms race” where players rush to gain access to the next model and add it to their stockpile of ammunition in an effort to stay ahead of peers.

Some funds take a more cautious, “old-school” approach when evaluating alternative datasets, while others are more adventurous, he said.

Ronan McCabe, head of portfolio management for Mercer in the Pacific, and CIO for its New Zealand business, said even compared to a decade ago, “the efficacy of a lot of factors is probably gone now” because of crowding.

“There’s definitely a proprietary edge,” McCabe said. “It’s their own IP, but it’s that ability to delve a bit deeper. There is still definitely a lot of edge out there.”

Linda Trusler, head of investment strategy at Legal Super, noted that the macroeconomic environment may change, affecting factors that tend to outperform only in certain environments.

Kearns noted that longer-horizon models typically require 20 or more years of data, and over such a span, there is risk that some condition material to the thesis could change, such as shifts in central banks’ monetary policy approach. Dealing with these issues is indeed a challenge, he said.

“You have to try, at the forecast proposition stage, at the forecast research stage, to immunise yourself, to the extent you can, to those sorts of changing macro environments,” Kearns said.

Investors also should develop risk models that are “as adaptive as possible to the macroeconomic environment,” he said. While a forecast will remain fixed, the risk models that are overlaid can be more creative and adaptive, which helps investors “avoid taking too much exposure to risks that are a function of the macroeconomic environment that you’re experiencing at the time.”

Maintaining a competitive edge involves hiring people from a range of fields, Kearns said. He noted that the D. E. Shaw group has attracted astrophysicists, computational scientists, and other specialists to its quant groups, many of whom had little to no experience in finance when hired and required several years of training.

Chris Tse, investment manager of listed shares at SunSuper, said expanding exposure to quantitative strategies at the fund involved improving risk analysis by onboarding systems internally, while still relying on external managers to implement the strategies.

Onboarding newer, more sophisticated quants is a significant undertaking, owing to challenges around hiring the right skillsets, Tse said.

“It’s really bringing those managers into one portfolio and recognising the risk and the different characteristics that they bring to a portfolio,” Tse said.

Idiosyncratic all the rage

Roundtable participants noted an ongoing shift towards idiosyncratic gains and away from factor investing in the quant world. Fraser Jennison, senior investment analyst at UniSuper within the global strategies and quantitative methods team, said he is keen to ensure tracking error across the fund’s equities book consists largely of idiosyncratic exposure.

A top-down process generates thematic insights into things like macro demographics, and these are translated into risk factors in the portfolio. But the fund doesn’t “want it to the point that it dilutes significantly that idiosyncratic element,” Jennison said.

David Bell, executive director at the Conexus Institute, asked whether any of the panelists still saw a role for factors, noting the progression towards greater idiosyncratic focus across the industry despite some quant equity managers having “factors form the basis of what they do in a pretty low-cost way.”

Robert Graham-Smith, senior investment manager at Colonial First State, said managers began crowding into more generic factors as assets in the space grew, contributing to several periods of severe underperformance across a range of alternative risk premia products. Even managers using more sophisticated methods were sideswiped, he said.

“I don’t know whether that was just a byproduct of deleveraging, or other things going on in the market, but even areas you thought were not that adjacent to the mainstream crowd did get impacted.”

Charles Hyde, head of asset allocation for the Guardians of New Zealand Superannuation, said his fund maintains a substantial allocation to factors, with 20 percent of its portfolio in a global equity factor strategy that invests only in developed markets.

The fund began with a “large, purely passive broad market equities exposure,” and the factor strategy “lowered slightly the risk in our portfolio and offered the promise of a higher return,” Hyde said.

The Guardians have “shied away from the very bespoke implementation of factors” offered by some managers, he said, opting for a “very vanilla implementation”.

Hyde elaborated: “How do you get confidence that this bespoke strategy that they’re proposing, which has very little transparency to it — it’s not something that’s been published on, it’s proprietary — how do you get confidence that thing’s going to deliver for you on a net-of-fee basis?”

Changing market behaviour

Roundtable chair Amanda White, director of institutional content at Conexus Financial, asked if the behaviour of markets has changed as a result of the rise of quantitative investing, and whether generic risk management products and techniques are keeping up with the changes.

Kearns acknowledged he has seen more “crowdedness in certain areas” of quant, and also greater leverage in certain areas and stocks. The pace of trading has increased, particularly with respect to some very short-horizon trades or more technical signals, he said.

Trusler said the expansion of retail investing and the rise of algorithmic trading have skewed market expectations. Actions by the U.S. Fed, for example, have given retail investors greater confidence to take on risk.

“I would expand this discussion from saying that there is a change in market behaviour that’s limited just to quant to one that encompasses essentially the whole gamut of markets and investing as well,” Trusler said. “I think, in some ways, the riskiness of risky assets has diminished somewhat, and I think that causes quite a blur between what are considered risky assets and risk-less or ‘risk less’ assets at this point in time.”

McCabe asked if the flow of money into passive investments in recent years had distorted signals and processes, and how investors should navigate that.

Kearns said long-biased active managers tend to have a small cap bias, and this created a statistically significant effect when growing amounts of capital began shifting into passive.

“If they’re a long-only portfolio manager, they are very likely going to have a higher small cap exposure than the S&P does, and so when those portfolios are liquidated and moved to the S&P, they’re selling small caps, and they’re buying large caps,” Kearns said.

Other factors didn’t appear to be affected by the broad shift to passive, but he admitted he was “perfectly willing to believe there may be something there we haven’t uncovered yet, because it’s the elephant in the room, it’s the most massive flow out there.”

Quant and ESG

Kearns said quant is well-suited for implementing ESG tilts because it ensures the approach is data-driven. That said, obtaining fully useful data remains a key problem in the implementation of ESG because datasets often lack depth, with some “decent causal datasets” only stretching back to 2013. There is also little consistency across data providers.

Efforts are under way to deploy sophisticated techniques involving machine learning and artificial intelligence to backfill datasets that lack historical depth, he said, but doing so may prove difficult.

While taking an activist approach may be outside the reach of quants, the quant world does offer solutions for investors wanting, say, to construct a portfolio with half the carbon footprint of the MSCI World Index, he said.
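As a rough illustration of that kind of constrained construction (hypothetical weights and carbon intensities; a real implementation would minimise tracking error against the index subject to the constraint), the sketch below tilts benchmark weights away from high-carbon names until the portfolio's weighted-average carbon intensity is at most half the benchmark's:

```python
def halve_carbon_footprint(weights, carbon, max_iter=1000):
    """Tilt benchmark weights away from high-carbon stocks until the
    portfolio's weighted-average carbon intensity is at most half the
    benchmark's. Illustrative only, not a production method."""
    footprint = lambda w: sum(wi * ci for wi, ci in zip(w, carbon))
    target = 0.5 * footprint(weights)
    tilted = list(weights)
    for _ in range(max_iter):
        if footprint(tilted) <= target:
            break
        # Shrink each weight in proportion to its carbon intensity...
        tilted = [wi / (1.0 + 0.1 * ci) for wi, ci in zip(tilted, carbon)]
        # ...then renormalise so the weights still sum to one.
        total = sum(tilted)
        tilted = [wi / total for wi in tilted]
    return tilted

# Hypothetical benchmark: three stocks, the first far more carbon-intensive.
bench = [0.4, 0.3, 0.3]
intensity = [10.0, 2.0, 1.0]   # e.g. tonnes CO2e per $m revenue
low_carbon = halve_carbon_footprint(bench, intensity)
```

A production version would more likely be cast as a quadratic programme, minimising tracking error to the index under the carbon constraint, but the tilt conveys the basic mechanics of a data-driven ESG overlay.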

The Guardians have built a climate benchmark into their reference portfolio, which Hyde acknowledged had “added a lot of complexity to that portfolio.” While the fund believes strongly in incorporating ESG principles, and that this is an “unstoppable train” investors need to board, Hyde said he is concerned it does complicate an investor’s mandate.

“From where I sit in asset allocation, a big consideration is if we greatly concentrate our actual portfolio, at least the passive equity portion of our actual portfolio, as a result of ESG screening, and we don’t do it to the reference portfolio, it creates a lot of active risk that needs to be accounted for,” Hyde said. “How does that flow through to impact return expectations? Is there return attached to that active risk that you’re creating?”

He has yet to see evidence that portfolio performance can be improved by implementing ESG strategies, he said, acknowledging there could be a longer-term trend that isn’t visible yet.

Jennison said he’s somewhat skeptical of the ex ante loadings to ESG scores made available by third-party data vendors and prefers UniSuper’s internal stock-by-stock scoring approach. That makes it challenging for his organisation to work with external quant managers in the ESG space.

“At this stage, given that data quality…we’ve probably got the most value from our internal sort of deeper efforts on a stock-by-stock basis.”

Kearns concluded the meeting by acknowledging some disillusionment within the investment community over the performance of quant, particularly in alternative risk premia. But while quant is not the only way to access returns in the idiosyncratic and factor space, he said it is an efficient way to do so, and it can generate attractive returns when done properly.

“We’re always evolving,” Kearns said. “We see challenges down the road, there’s no doubt about it, some of which we don’t know about, but I’m pretty hopeful and pretty optimistic about the future of quant equities.”

This article was sponsored by the D. E. Shaw group and does not convey investment advice or an offer of any type with respect to any securities or other financial products. No assurances can be given that any aims, assumptions, and/or expectations expressed or implied in this article were or will be realized, or that the activities described in this article have continued or will continue at all or in the same manner as described.
