Scenario analysis can be a useful tool in determining investment strategy, but it is not easy. It certainly involves more than good and bad perturbations around a base case. Excellent scenario analysis requires deep thinking and imagination to create a set of plausible futures. Each scenario does not have to be likely, but if it is possible, then a great deal can be learned from exploring the narrative.

In my 2011 article, "Imagining futures: using scenario analysis to build robust investment strategy", I discuss the approach that Peter Schwartz describes in his book, The Art of the Long View: Planning for the Future in an Uncertain World.

Schwartz believes that scenario analysis comes to life if you take the time to develop a rich narrative, and likens this to writing a novel: create a plot, locate it in time and space, and even add characters. In other words, tell a story. In so doing you can tease out aspects of a potential future that would remain hidden under a more analytical approach. As an analytical person, I found this notion both confronting and liberating.


More recently I came across the psychological evidence for such an approach in Daniel Kahneman's book, Thinking, Fast and Slow. Kahneman is one of the founders of behavioural finance, but this new work is not about finance. Rather, it explores the notion that the human brain has two thinking systems:

System one operates automatically and quickly, with little or no effort and no sense of voluntary control. System two allocates attention to the effortful mental activities that demand it, including complex computations.

Understanding how these two systems operate has many applications for investors; the book is also a great read.


Storytelling and abstraction

Kahneman supports Schwartz's suggestion that it is through storytelling that you can get the best out of people when building scenarios. He provides evidence that people are very poor at using abstract statistics to make correct probabilistic judgements. However, people perform better when statistics are presented with some form of causality.

Kahneman provides this problem: A taxi was involved in a hit-and-run accident at night. Two taxi companies, Green and Blue, operate in the city. You are given the following data: 85 per cent of taxis in the city are Green and 15 per cent are Blue.


A witness identified the taxi as Blue. The court tested the reliability of the witness under the circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colours 80 per cent of the time and failed 20 per cent of the time.

What is the probability that the cab involved in the accident was Blue rather than Green?

Most people answered 80 per cent. They ignored the fact that there are more Green taxis than Blue, and focused only on what the witness saw. In fact, this is a standard Bayesian-inference problem. The true probability is about 41 per cent.
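The 41 per cent figure follows directly from Bayes' rule: weigh the chance that the witness correctly saw a Blue taxi against the chance that the witness mistook one of the far more numerous Green taxis. A short sketch of the calculation, using the numbers given in the problem:

```python
# Bayes' rule for the taxi problem:
# P(Blue | witness says Blue) =
#     P(says Blue | Blue) * P(Blue)
#     / [P(says Blue | Blue) * P(Blue) + P(says Blue | Green) * P(Green)]

p_blue = 0.15      # base rate: 15 per cent of taxis are Blue
p_green = 0.85     # base rate: 85 per cent are Green
p_correct = 0.80   # the witness identifies each colour correctly 80 per cent of the time

# Witness says Blue AND the cab really was Blue
true_blue = p_correct * p_blue
# Witness says Blue but the cab was actually Green (a 20 per cent error)
mistaken_green = (1 - p_correct) * p_green

posterior = true_blue / (true_blue + mistaken_green)
print(round(posterior, 3))  # 0.414, i.e. about 41 per cent
```

The base rate drags the answer well below the intuitive 80 per cent: most taxis are Green, so even a reliable witness who says "Blue" is wrong surprisingly often.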

This would not be particularly surprising if Kahneman’s subjects were drawn at random from the population. However, the results were more or less the same when the problem was posed to psychology students, who had received proper training in statistics and probability.


This got Kahneman wondering if there was more going on than just ignorance – even highly trained humans do not think like Mr Spock. It turns out that when confronted with a problem framed as a mix of abstract facts and a story, the story dominates thinking. Abstract facts struggle to get the attention they may deserve.

This was demonstrated when the problem was reframed with causal information, providing the following data: the two companies operate the same number of taxis, but Green taxis are involved in 85 per cent of accidents. The information about the witness is as in the previous version.

These two problems are the same from a mathematical perspective and yet the people who were posed the second problem gave much better estimates.

Kahneman suggests the reason for the poor performance of subjects faced with the first question is that people ignore bland (but essential) statistical information (such as the relative number of taxis). There is no interesting causal link with the accident, so they overlook it.

But when this information is presented as a story, such as the image of reckless drivers in Green taxis, people factor it into their deliberations. Even though the witness saw Blue, they discount this evidence because they know Blue taxi drivers tend to be safer than their peers in Green taxis.

Lessons for asset allocation

The implications for investment strategy are stark. We know that it is not good enough to advise investment committees that one investment strategy has a higher Sharpe ratio than another. We rightly criticise the Sharpe ratio for condensing all notions of risk into the second moment of the distribution of returns, that is, volatility. This ignores the shape of the return distribution; in particular, the reality that extreme events occur more frequently than would be expected if volatility completely described risk.
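The point can be sketched numerically. The snippet below simulates two hypothetical return series (all numbers are illustrative assumptions, not market data): one normally distributed, one drawn from a fat-tailed Student-t distribution but rescaled to the same mean and volatility. Their Sharpe ratios are essentially identical, yet their tail risk is very different.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Two hypothetical monthly return series with the same mean and volatility.
# The second is drawn from a Student-t distribution, giving it fat tails.
normal_returns = rng.normal(loc=0.01, scale=0.04, size=n)
t_draws = rng.standard_t(df=6, size=n)
fat_tailed = 0.01 + 0.04 * t_draws / t_draws.std()

def sharpe(r, risk_free=0.0):
    # Mean excess return divided by volatility (the second moment only)
    return (r.mean() - risk_free) / r.std()

def excess_kurtosis(r):
    # Fourth standardised moment minus 3; roughly zero for a normal distribution
    z = (r - r.mean()) / r.std()
    return (z ** 4).mean() - 3.0

# Near-identical Sharpe ratios...
print(sharpe(normal_returns), sharpe(fat_tailed))
# ...but markedly different tail behaviour
print(excess_kurtosis(normal_returns), excess_kurtosis(fat_tailed))
```

A committee shown only the two Sharpe ratios would see two interchangeable strategies; the kurtosis figures reveal that one is far more exposed to extreme outcomes.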

The implications of Kahneman’s observations suggest it may be hard to correctly interpret the results of such analysis if the narrative behind each scenario is not brought to the fore.

This challenged the way that I conceived scenario analysis to be used. Before reading Kahneman, I used scenario analysis as a much richer and more relevant way of devising assumptions to feed into asset-allocation modelling. But of course, the asset-allocation model is not human – it does not understand narratives. Instead it needs narratives to be summarised into a set of numbers. It then uses these numbers to find the investment strategy that best meets an investor's objectives and risk preferences.

While such an approach remains valid, it is not the end of the story. The narratives in the scenarios need to be resurrected to bring the recommended investment strategy to life. It is insufficient to say something such as, "This strategy provides the highest mean return subject to the probability of a negative return being less than X per cent".

That may be how the investment objective was specified, but it does not mean very much to most people. It is difficult to really understand what this means without imagining some sort of plausible chain of causality that would lead to extreme outcomes (positive and negative). This is where the narratives behind the scenarios become useful.

For example, if the investment objective is specified as above, then use the set of scenarios that falls in the left-hand tail to spin a yarn to the investment committee that allows them to imagine how a bad outcome could occur. This should bring the investment strategy to life.

But if you left it there, you would scare the horses and the communication would not be balanced. To complete the story, equal time should be spent regaling the committee with a story drawn from scenarios on the right-hand side of the distribution, that is, plausible futures that are benign.

And to finish, you should return to a handful of the most likely scenarios and outline a plot for each.

Probability distributions are just too hard to internalise, but bringing them to life with plausible and balanced narratives should help.

This is well summarised by the following statement attributed to psychologists Richard Nisbett and Eugene Borgida:

Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.

As asset allocators, we must not only do the best analysis possible, but we must communicate the results in a way that the non-analytical members of the investment committee can understand – spin a good yarn. But remember, it is just as easy to use this power for good as it is for evil.

Chris Condon is principal of Chris Condon Financial Services. 
