by futurist Richard Worzel, C.F.A.
Why is risk something we want to avoid? I suspect that any attempt to answer this question will deal with the downside or cost of risk, and that’s where any assessment of risk should start. Indeed, my definition of risk is just that: Risk is the cost of being wrong.
Let me provide a simple illustration of this. Imagine that you have a thick wooden board, 15 feet long and 2 feet wide, laid out on the grass, and that your task is to walk across it without stepping off. It’s a simple task, and most people would have little difficulty doing it. The risk involved is non-existent.
Now imagine walking across the same board lodged solidly and securely across a 1,000-foot-deep chasm on a still day. Suddenly, this physically simple task takes on a much higher risk profile, yet the only thing that has changed is the penalty for failure. Whereas in the first case there was no penalty, or cost, of failure, in the second the penalty would be catastrophic.
I developed this definition of risk when I worked as a research analyst in the stock market in the 1970s because I strongly disagreed with the then-emerging definition that risk equates to volatility. As someone with a quantitative background (I have degrees in math and computer science), this struck me as simplistic and suspiciously convenient: volatility is easy to measure, which makes it attractive as a measure of risk. That doesn’t make it right; it just makes it convenient.
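To make the contrast concrete, here is a minimal sketch in Python using two hypothetical assets whose yearly returns I have simply assumed for illustration: one bounces around a great deal but never loses much in any single year, while the other is placid until it suffers one catastrophic year.

```python
import statistics

# Purely illustrative, assumed yearly returns -- not real market data.
asset_a = [0.25, -0.20, 0.28, -0.22, 0.26, -0.19]   # big swings, modest worst loss
asset_b = [0.02,  0.02, 0.02,  0.02, 0.02, -0.45]   # quiet, then one disastrous year

def volatility(returns):
    """Standard deviation of returns: the 'convenient' measure of risk."""
    return statistics.stdev(returns)

def worst_case(returns):
    """Worst single-period loss: a crude proxy for the cost of being wrong."""
    return min(returns)

for name, returns in (("A", asset_a), ("B", asset_b)):
    print(f"Asset {name}: volatility = {volatility(returns):.2f}, "
          f"worst year = {worst_case(returns):+.2f}")

# With these assumed numbers, Asset A is 'riskier' by volatility (~0.26 vs ~0.19),
# yet Asset B carries by far the larger cost of being wrong (-45% vs -22%).
```

The plank over the chasm makes the same point without the arithmetic: the wobble is identical in both cases; only the cost of stepping off changes.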
How a futurist’s view differs from the conventional business view
But if risk is the cost of being wrong, we need to ask: wrong about what? That’s where a futurist’s view of risk diverges most widely from the conventional business view, because most business people have, in my opinion, too narrow a view of where they could be wrong, and therefore of the risks they are running. Let me provide a recent case study, based on a blog I wrote about the European economic and financial crisis (‘A Spectre Is Haunting Europe’, June 15, 2012).
The European crisis is, as I’ve described elsewhere, the world’s most boring disaster. It has been making scary headlines for almost three years now, and promises to continue to do so for years yet to come. It’s reached the stage where no one wants to hear anything more about it. It has exhausted our patience and our interest.
Because the crisis has dragged on for so long, the conventional wisdom was, and remains, that Europe will continue to muddle through. Oh, no doubt there will be more scary headlines, more late-night emergency meetings, more last-minute patched-up solutions, but really, there’s too much at stake for the EU countries to let something truly nasty happen. Yes, it’s true that Greece is functionally bankrupt, and yes, Spain has taken on perhaps too many of the debts of its banking system, but Germany and France will find a way of working this out, even if they have to throw too much money at it. Europe will muddle through.
This is the conventional wisdom, and so far it’s proven to be right. But let's reassess the situation from the point of view that risk is the cost of being wrong.
A case study: The costs of being wrong about Europe
First, where might we be wrong? Well, even though the Euro countries have – so far – managed to avoid a calamity, that doesn’t automatically mean that they will be able to do so over and over again in the future. As they say in the investment business: ‘Past results are no guarantee of future performance.’ Someone could goof, or someone might push someone else too hard toward the brink, and then bad things will happen. Or the countries, central bankers, and governments involved may not actually be able to solve the problems, no matter how hard they try. They may simply be postponing the day of reckoning, hoping that some miracle will occur. I believe this latter point is the case. Let’s examine why.
This series of crises is actually one underlying crisis that keeps re-emerging and shredding the carefully stage-managed fixes that the players keep throwing up as solutions. The underlying cause of this crisis is that the people of Europe are too old, their governments have promised too many people too many goodies for too long, and now the bills are starting to come due and the governments can’t afford to pay. It doesn’t help that governments like Greece’s have used the Euro to borrow on other people’s (read: Germany’s) credit cards, but that’s only a flourish on top of the much larger, underlying problem. Pension and health care entitlements, for both civil servants and the general public, are compound interest problems: the longer you let them grow, the more expensive the solutions become. By the time people are actually retiring, there are no good solutions left, only varying degrees of bad solutions, ranging from unsatisfactory to downright disastrous.
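As a rough illustration of why delay is so expensive, here is a minimal sketch with assumed numbers: a hypothetical shortfall of 100 compounding at 5% a year, neither figure taken from any actual European balance sheet.

```python
# Assumed figures for illustration only: a hypothetical unfunded liability of 100
# (billions, say) that compounds at 5% a year if nothing is done about it.
shortfall = 100.0
growth_rate = 0.05

for years_delayed in (0, 10, 20, 30):
    future_shortfall = shortfall * (1 + growth_rate) ** years_delayed
    print(f"Act after {years_delayed:2d} years: the bill has grown to about {future_shortfall:5.0f}")

# Waiting 10 years raises the bill to ~163, 20 years to ~265, and 30 years to ~432:
# by the time people are actually retiring, only bad options remain.
```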
With that in mind, can Europe make its population younger? No. Can European governments turn back the clock and increase the savings for their pension and health care needs? No. Can they get investors to cancel their debts so they don’t have to pay them back? Well, yes, sort of: they can force investors to accept losses (indeed, Greece has already done this once), but if those losses get to be too big, then investors will balk, demand payment and, worse, start to question the creditworthiness of the entire EU. If that happens, it kicks the situation up to an entirely new level of risk.
My point is that most pundits miss this central issue in their analysis of the European situation. They focus on the financial aspects, and then, thinking it’s only a money problem, decide it can be fixed if Germany throws enough money at it. I don’t believe there’s enough money in all of Europe to solve this problem, so it must eventually blow up, no matter how clever, hardworking, or willing to fix things the players are.
A natural – and dangerous – tendency
This is typical of a conventional business view of things: What is the immediate problem, and how can it be fixed easily and quickly? A futurist view is broader, more encompassing, and does not assume that problems can be fixed at all, let alone quickly and easily. The reason this difference exists is that futurists know there is a natural and universal tendency to foresee the future that you want to have happen. Business people don’t want uncertainty, they don’t want unsettled financial markets, and they definitely don’t want bad economic times. As a result, they have a natural tendency to foresee a muddle-through solution when that may not be the most likely result. Indeed, there is a range of possible alternatives that might emerge in Europe, and almost all of them are worse than the more-of-the-same, muddle-through scenario that markets are expecting, and that corporate managers are planning for – or not planning for.
So, in assessing a scenario that projects some form of muddle-through solution, you should also ask: What else might happen if we’re wrong about Europe muddling through? I’ll come back to the concept of alternative scenarios a little later.
Next, let’s turn to the cost part of the equation. A Greek default, either with or without an exit from the Euro, could very well precipitate a domino effect that could bring down all of Europe, and the global economy with it. In the short-run, it would almost certainly precipitate a run on Spanish bonds, and re-ignite concerns about Italian debt. But even worse, it could well start a run on the banking system, starting with Greek banks, as Greek citizens and outside investors seek to get their money out of Greece before the country reverts to the drachma, and the drachma plummets in value. And bank runs are like avalanches: they start small, but build very quickly, and can wipe out everything in their path.
A bank run in Greece could quickly become a bank run on all European banks, particularly the German banks that hold so much Greek debt. And as Euro credits were called into doubt, investors would begin to wonder how much European debt North American banks held, and whether they were secure. Hence, a very European crisis could quickly develop into a much wider crisis, much as the mortgage securitization troubles of Wall Street and the housing problems of Main Street led to the pivotal bankruptcy of Lehman Brothers in September of 2008, which, in turn, led to a global financial crisis and recession. Panics, once started, don’t always need a reason to spread, which is why they’re called panics.
So the cost of being wrong about Europe muddling through could be comparable to the financial and economic ramifications of the Panic of 2008 and the Great Recession – but this time, we might not get off so easily. And that raises another aspect of this risk: the costs of being wrong about this are high enough that it’s worth thinking about.
But this is not an article specifically about the risks involved in the European crisis; it’s about a futurist’s approach to risk management, so let me circle back to that.
How is futurist thinking different from conventional business thinking?
Joseph Coates has been a consulting futurist for decades, and some time ago came up with a list of about 20 differences between conventional business thinking and futures thinking. Among the ones that I think are key to risk management are:
• Futures thinking has a broad, cross-disciplinary awareness of what’s going on in the world instead of focusing narrowly on just one industry, or just one economy. Connections between apparently unconnected events are sought and studied in a systematic way.
• Business thinking assumes that change will be gradual, continuous, and generally positive, whereas futures thinking deliberately considers and seeks to identify possible black swan events and significant discontinuities in future developments, in addition to tracking the more obvious trends.
• Business leaders discourage pessimistic thoughts or ideas, preferring to ‘accentuate the positive.’ Futures thinking deliberately speaks about the unpopular, the unthinkable, and the unspeakable, as well as overlooked and unexpected positives.
• Business thinking has a relatively short time horizon, usually no more than 3-5 years. Futures thinking considers much longer horizons and the consequences and implications of long-term trends, such as demographics, compounding liabilities, or technology, then relates them back to near-term objectives.
• Businesses usually focus on the best current idea. Futures thinking explores a wide range of possibilities to elicit multitudes of fresh ideas, not all of which will be immediately obvious as winners but many of which may yield important, cumulative, long-term results.
• Finally, businesses usually follow mainstream thinking, considering, projecting, or forecasting a single future based on conventional wisdom (also called ‘group think’), whereas futurists deliberately consider a range of possible futures, knowing that the future is inherently unpredictable.
How do you deal with an inherently unpredictable future?
Let me pursue this last point in particular: the future is inherently unpredictable. It is true that we can often identify trends and events that do come to pass, and it is true that there are times when we can anticipate developments well enough to take advantage of them – or to avoid being hurt by them. Clearly this is desirable, and we should seek to do so wherever possible.
But it is also clear that there are events we miss, and even in those events we do anticipate, there are implications or downstream consequences that we haven’t considered, or that develop in ways we hadn’t quite anticipated. Indeed, I often start my public presentations with a list of events of the past 10+ years that caught us largely by surprise, but which had important implications, both for business, and the society within which we live. Included in this list are:
* The earthquake and tsunami that devastated Japan in March of 2011, knocked out a third of Japan’s electric generating capacity, and put a dent in global economic growth.
* The financial panic of 2008 and the Great Recession that followed. A number of groups and individuals correctly forecast both of these, including The Economist newsmagazine, but the panic in the marketplace and the economic fallout clearly demonstrated that the large majority of people affected didn’t see them coming.
* Hurricane Katrina, which devastated New Orleans and the American Gulf Coast, knocked the American economy on its heels, and punctured the American administration’s smug demeanor.
* The SARS epidemic of 2003 that flattened the tourism industry in most of Canada, including places that never had a single case of the disease.
* And, of course, the terrorist attacks of 9/11, which rocked American society and the developed world’s confidence in its own safety, imposed dramatic new costs on air travel, and brought draconian infringements on personal privacy.
Each of these events can be thought of as black swans (or wild cards, as futurists call them). And there are two particularly interesting things about this list. First, every single one of these events had been forecast in general terms by competent individuals who tried to warn the relevant authorities. And in every single case, those warnings were pooh-poohed or ignored as far-fetched or alarmist by such authorities. (Indeed, in the case of the 9/11 attacks, the World Trade Towers had actually been bombed – unsuccessfully – in 1993, but that attack was negligently dismissed as unimportant by the authorities.)
This illustrates the second important point about this list: it clearly illustrates that we have a gross lack of imagination about the future. We expect our future to be well-behaved and, yes, predictable, deliberately ignoring the harsh reality that the future is unlikely to be either of these things.
Futurists know that the future will catch us by surprise. I tell my clients that it’s not a question of if you are caught by surprise; that’s a virtual certainty. What is important is how quickly you recover from being surprised, and how constructively you respond to unexpected developments.
And because futurists know that the future will be surprising, we don’t try to make one perfect prediction of the future. Instead, we prepare and consider a range of possible futures, then develop contingency plans to deal with the most important ones. Such alternative futures are called scenarios, and the process of developing and preparing for them is called scenario planning.
When I work with a new client, and get to this point in the discussion, I usually get a knee-jerk response: ‘Oh yeah, we know all about scenario planning!’
‘Excellent,’ I reply. ‘So do you do it?’
‘Well, no, but we know all about it.’
Trust me: if you don’t practice scenario planning, you don’t really know anything about it. It’s not rocket science, and the concepts are relatively commonsensical, but if you haven’t actually used them, then you really don’t know what they can do for you, or how powerful they are, even if you’ve read books or articles that discuss scenario planning. And in the context of defining and managing risk, scenarios are invaluable. Indeed, they fit perfectly with many other techniques in the risk management field.
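To suggest how this connects back to risk as the cost of being wrong, here is a minimal sketch of one way a scenario exercise might be boiled down; the scenario names, likelihoods, and cost figures are hypothetical placeholders, not estimates I am actually making.

```python
# Hypothetical scenarios with assumed likelihoods and assumed costs of being wrong.
# None of these numbers are real estimates; they only illustrate the ranking idea.
scenarios = [
    # (scenario, rough likelihood, estimated cost to the business if it happens)
    ("Europe muddles through",      0.50,   5),
    ("Greek default, contained",    0.25,  40),
    ("Bank runs spread across EU",  0.15, 200),
    ("Global credit freeze",        0.10, 500),
]

# Rank contingency-planning effort by expected cost (likelihood x cost),
# not by which scenario is most likely: here, the 'unlikely' ones dominate.
for name, likelihood, cost in sorted(scenarios, key=lambda s: s[1] * s[2], reverse=True):
    print(f"{name:28s} likelihood={likelihood:.2f} cost={cost:3d} expected={likelihood * cost:6.1f}")
```

Even with made-up numbers, the exercise forces the question the conventional view skips: what else might happen, and what would it cost us if it did?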
You can’t protect against risks you don’t know you’re running
Recall that my definition is that risk is the cost of being wrong, and I commented that you had to consider what you might be wrong about. Scenario planning dramatically broadens your horizons, and involves you in thinking about, and planning for, contingencies that you would otherwise not have considered. By so doing, it alerts you to things you might be wrong about that you had never even imagined might be relevant, like the state of Greek banks, or the yield on Spanish bonds, or the security of Japanese nuclear reactors, or how much fissile material Iran has produced, or how severe the flu season was in Australia during their recent winter/our summer, or why manufacturing might be migrating back to America from China, or what’s not happening in the school systems in Canada, or the effects of automation on employment and consumer spending, or the tax implications of the rising costs of health care, or … but you get the picture.
Being wrong doesn’t just refer to judgments you’ve made about things you’ve studied; it also refers to things that will affect you, positively as well as negatively, that you aren’t aware of, or aren’t paying attention to.
You can’t protect against risks you don’t know you’re running.
So let me end with two things. First, let me suggest that you consider the future in a much broader, more systematic manner. Everyone pays lip-service to the future, and how important it is, but very few organizations actively study it, and even fewer prepare for it in a systematic way.
If actions speak louder than words, then by their actions, most organizations seem to be saying that the future isn’t really important at all, or that they are so much smarter than everyone else in the world that they are the only ones that have a perfect prediction of what’s going to happen.
Neither of these approaches takes risk management seriously. They smack of the old joke of the policeman who sees a drunk crawling around in the gutter late one night, and asks him what he’s doing. ‘I’m looking for my keys,’ slurs the drunk.
‘Well, where did you lose them?’ asks the cop.
‘Back there in that dark alley.’
‘So why are you looking here?’ asks the cop.
‘Because the light’s better,’ replies the drunk.
Companies that look for risk only in convenient, narrowly defined fields, and only within the context of the future they expect to happen, are not considering the cost of being wrong. They are fooling themselves that they are practicing risk management when in fact they are running severe risks they know nothing about.
Risk management is a serious business, but you’re only taking it seriously if you’re actually considering the true risks you are running, and not just the ones you find easy or convenient to think about. The future is more complicated than we might wish it to be, but we don’t get a choice about what happens, only about how we respond to it. It’s your choice. It’s your future. And if I can help you in preparing for it, please contact me.
© Copyright, IF Research, November 2012.