John Adams | May 2007, Public Risk Forum, PRIMO
In popular imagination, rocket science is the totemic example of scientific complexity. As Britain’s leading academic expert on risk, I will argue here that risk management is in fact much more complex. To put it another way: the scientist studying turbulence has the easier task, because “the clouds do not react to what the weatherman or physicist says about them”.
The risk manager must, however, deal not only with risk perceived through science, but also with virtual risk – risks where the science is inconclusive and people are thus “liberated to argue from, and act upon, pre-established beliefs, convictions, prejudices and superstitions.”
The affluent world is drowning in risk assessments. Almost everyone now has a “duty of care” to identify formally all possible risks to themselves, or that they might impose on others, and to demonstrate that they have taken all reasonable steps to “control” them. It is not clear that those imposing this duty of care appreciate the magnitude and difficulty of the task they have set.
In 2004 I participated in a conference on terrorism, the World Federation of Scientists’ International Seminar on Terrorism in Erice, Sicily. Most of the other participants were eminent scientists, and I found myself in a workshop entitled Cross-disciplinary challenges to the quantification of risk. Lord Kelvin famously said:
“Anything that exists, exists in some quantity and can therefore be measured.”
This dictum sits challengingly alongside that of another famous scientist, Peter Medawar, who observed:
“If politics is the art of the possible, research is the art of the soluble. Both are immensely practical minded affairs. Good scientists study the most important problems they think they can solve [my emphasis]. It is, after all, their professional business to solve problems, not merely to grapple with them.”
Terrorism undoubtedly exists, and some of its consequences can be quantified. One can count the numbers killed and injured. With the help of insurance companies one can have a stab at the monetary value of property destroyed and, for those with business continuity insurance, the value of business lost. But what units of measurement might be invoked to calculate the impact of the terror that pervades and distorts the daily life of someone living in Chechnya, or Palestine, or Darfur or …. ? Or the loss of civil liberties resulting from the anti-terrorism measures now being imposed around the world?
The problem becomes more difficult when one moves on to the challenge of quantifying the risk of terrorism. Risk is a word that refers to the future. It has no objective existence. The future exists only in the imagination. There are some risks for which science can provide useful guidance to the imagination. The risk that the sun will not rise tomorrow can be assigned a very low probability by science. And actuarial science can estimate with a high degree of confidence that the number of people killed in road accidents in Britain next year will be 3500, plus or minus a hundred or so. But these are predictions, not facts. Such predictions rest on assumptions: that tomorrow will be like yesterday; that next year will be like last year; that future events can be foretold by reading the runes of the past. Sadly, the history of prediction contains many failures – from those of stock market tipsters to those of volcanologists seeking to predict eruptions, earthquakes and tsunamis.
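The actuarial reasoning behind an estimate like “3500, plus or minus a hundred or so” can be sketched in a few lines. This is an illustrative assumption on my part, not the actuaries’ actual model: if annual road deaths behave like a Poisson count with a stable long-run mean, the spread of next year’s figure follows from that mean alone.

```python
import math

# Illustrative sketch (assumed model, not the actuaries' own):
# for a Poisson count the variance equals the mean, so a stable
# long-run average implies the likely spread of next year's total.
mean_deaths = 3500                    # assumed long-run annual average
std_dev = math.sqrt(mean_deaths)      # roughly 59 for a Poisson count

# Approximate 95% prediction interval via the normal approximation.
low = mean_deaths - 1.96 * std_dev
high = mean_deaths + 1.96 * std_dev
print(f"Expected deaths next year: {mean_deaths} "
      f"(95% interval roughly {low:.0f} to {high:.0f})")
```

On these assumptions the interval comes out a little tighter than “plus or minus a hundred”; the point stands either way that the prediction is only as good as the assumption that next year resembles last year.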
Risk: a booming term
Risk management has become a booming term. Every day politicians, managers and experts invoke it, most recently in connection with the financial crisis. And every day it is misused and applied out of context – much as happened with the term ‘sustainable’, which has acquired thousands upon thousands of interpretations. So let us be careful with the use of “risk” and “risk management”.
When virtual risks – sometimes called unconfirmed hypotheses – get mistaken for risks about which science has clear and useful advice to offer, much confusion results. This is the circle in which one finds problems characterised by complexity and uncertainty.
Type “risk” into an Internet search engine and you will get over 100 million hits. You need sample only a small fraction to discover many unnecessary, and often acrimonious, arguments. Risk is a word that means different things to different people.
It is a word that engenders a sense of urgency because it alludes to the probability of adverse, sometimes catastrophic, outcomes. Much of the acrimonious urgency, or the urgent acrimony, that one uncovers searching for “risk” on Google, stems from a lack of agreement about the meaning of the word. People are using the same word, to refer to different things, and shouting past each other. Figure 1 is proffered in the hope of clearing away some unnecessary arguments.
“Types” of people (related to risk)
The less conclusive the science, the more influential become the filters through which risks and rewards are perceived. This typology captures commonly encountered types:
- The egalitarian is fearful and risk averse – if you can’t prove it’s safe assume it’s dangerous and invoke the precautionary principle.
- The individualist is an optimist and pragmatist – if you can’t prove it’s dangerous assume it’s safe – and tends to focus on the rewards of risk.
- The fatalist ducks if he sees something about to hit him – otherwise que sera sera.
- Institutional risk managers are professional hierarchists. They commission more research to find the right answer, and are very uncomfortable in the presence of virtual risk.

Fig. 1. Three types of risk.
Types of risk
It is important to be clear about the type of risk you are dealing with. Directly perceptible risks are dealt with instinctively and intuitively. Virtual risks are culturally constructed – when the science is inconclusive people are liberated to argue from pre-established beliefs, convictions and prejudices.
Risk management is a balancing act. Institutional risk management commonly ignores this fact. The job specification of most institutional risk managers obliges them to deal only with the bottom loop – judgements about safety ought not, they insist, to be “corrupted” by concerns about rewards.
A one-sided concern for reducing accidents without considering the opportunity costs of so doing fosters excessive risk aversion – worthwhile activities with very small risks are inhibited or banned. Conversely, the pursuit of the rewards of risk to the neglect of social and environmental “externalities” can also produce undesirable outcomes. Some seek to distinguish between “objective” or “actual” risk and “perceived” risk. But risk is a word that refers to a future that exists only in the imagination – all risk is perceived.

Fig. 2. Three types of risk (re-draw). An attempt to highlight the strict limits to the ability of science to foretell the future.
The three types of risk (see Figure 2) may be considered as follows:
- Directly perceived risks (including many operational risks) are dealt with using judgement – a combination of instinct, intuition and experience. One does not undertake a formal, probabilistic, risk assessment before crossing the road. Crossing the road in the presence of traffic involves prediction based on judgement. One must judge vehicle speeds, the gaps in traffic, one’s walking speed, and hope one gets it right, as most of us do most of the time.
- Risk perceived through science. Most of the published literature on risk management falls into this category. Here one finds not only biological scientists in lab coats peering through microscopes, but physicists, chemists, engineers, doctors, statisticians, actuaries, epidemiologists and numerous other categories of scientist who have helped us to see risks that are invisible to the naked eye. Collectively they have improved enormously our ability to manage risk – as evidenced by the huge increase in average life spans that has coincided with the rise of science and technology. This realm of quantified risk assessment is often, but not always, successful. However objective in appearance, assessments in this circle rest ultimately on subjective assumptions.
- Virtual risk. Where the science is inconclusive we are thrown back on judgement. These risks are culturally constructed – when the science is inconclusive people are liberated to argue from, and act upon, pre-established beliefs, convictions, prejudices and superstitions. Such risks may or may not be real but they have real consequences. In the presence of virtual risk what we believe depends on whom we believe, and whom we believe depends on whom we trust. Here we encounter arguments about values, the nature of nature, standards of proof, the precautionary principle and the role of regulation. In this circle, as with directly perceptible risk, we are thrown back on judgement.
A participant at the conference on terrorism was one of the world’s foremost experts on turbulence, notoriously the most intractable problem in science. In the mythology of physics Werner Heisenberg is reported as saying:
“When I meet God, I am going to ask him two questions: Why relativity? And why turbulence? I really believe he will have an answer for the first.”
I would trust the physicist I met at the conference to tell me the truth about turbulence, so far as he knew it. But the problems he is studying are simple compared to those of the risk manager, because the clouds do not react to what the weatherman or physicist says about them.
Risk management: where are the keys?
We are all risk managers. Whether buying a house, crossing the road, or considering whether or not to have our child vaccinated, our decisions will be influenced by our judgement about the behaviour of others, and theirs by their judgements about what we might do.
The world of the risk manager is infinitely reflexive. In seeking to manage the risks in our lives we are confronted by a form of turbulence unknown to natural science, in which every particle is trying to second guess the behaviour of every other.
Will the vendor accept less in a falling market? Will the approaching car yield the right of way? Will enough other parents opt for vaccination so that my child can enjoy the benefits of herd immunity while avoiding the risks of vaccination? And, increasingly, if things go wrong, who might sue me? Or whom can I sue? The risk manager is dealing with particles with attitude.
It is wise to be alert to the strict limits of natural science in the face of such turbulence. We are in danger of becoming the drunk looking for his keys, not in the dark where he dropped them, but under the lamp post where there was light by which to see. This caution prompted the re-drawing of Figure 1 as Figure 2, an attempt to highlight the strict limits to the ability of science to foretell the future.
“As the likelihood of physical harm has decreased the fear, and sometimes the likelihood, of being sued has increased”
In the area lit by the lamp of science one finds risk management problems that are potentially soluble by science. Such problems are capable of clear definition, relating cause to effect, and are characterized by identifiable statistical regularities.
On the margins of this area one finds problems framed as hypotheses and methods of reasoning, such as Bayesian statistics, which guide the collection and analysis of further evidence. As the light grows dimmer the ratio of speculation to evidence increases. In the outer darkness lurk unknown unknowns. Here lie problems with which, to use Medawar’s word, we are destined to “grapple”.
As the light of science has burned brighter most of the world has become healthier and wealthier and two significant changes have occurred in the way in which we grapple with risk. We have become increasingly worried about more trivial risks, and the legal and regulatory environments in which we all must operate as individual risk managers have become more turbulent.
Perhaps the clearest demonstration of this can be found in the increase in the premiums that doctors must pay for insurance, and the way this varies according to the type of medicine practiced. The Medical Protection Society of Ireland has four categories of risk: low, medium, high and obstetricians. Between 1991 and 2000 the premium charged to those in the low category increased by 360 percent to €9,854, and that charged to obstetricians increased by 560 percent to €54,567.
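The scale of those increases is easier to grasp by working the percentages backwards. A back-of-envelope check, assuming “increased by 360 percent” means the 2000 premium is 4.6 times the 1991 premium (the function name is mine, for illustration):

```python
# Back-of-envelope check of the premium figures in the text,
# assuming "increased by N percent" means a factor of (1 + N/100).
def implied_1991_premium(premium_2000: float, pct_increase: float) -> float:
    """Work backwards from the 2000 premium to the implied 1991 premium."""
    return premium_2000 / (1 + pct_increase / 100)

low_risk = implied_1991_premium(9854, 360)      # low-risk category
obstetrics = implied_1991_premium(54567, 560)   # obstetricians
print(round(low_risk), round(obstetrics))       # roughly 2142 and 8268
```

On that reading, a low-risk doctor’s premium rose from around €2,100 to nearly €10,000 in a decade, and an obstetrician’s from around €8,300 to over €54,000.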
Measured in terms of its impact on perinatal mortality rates, obstetrics and gynecology can claim a major share of the credit for the huge increases in average life expectancy over the last 150 years.
This most successful medical discipline is now the most sued – so successful that almost every unsuccessful outcome now becomes a litigious opportunity. I don’t know of any risk assessment that predicted that.
There is a distinction, frequently insisted upon in the literature on risk management, between “hazard” and “risk”. A hazard is defined as something that could lead to harm, and a risk as the product of the probability of that harm and its magnitude; risk in this literature is hazard with numbers attached. So, relating this terminology to Figures 1 and 2, it can be seen that risk can be placed in the circle “perceived through science” while the other two circles represent different types of hazard.
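The textbook definition – risk as probability times magnitude – can be written out directly. The numbers below are illustrative assumptions of mine, not figures from the literature; the point they make is that this arithmetic treats a frequent small harm and a rare large one as equivalent, which is precisely what the unaided judgement of the other two circles does not do.

```python
# Sketch of the textbook definition: risk = probability x magnitude.
# The example values are illustrative, not taken from the article.
def risk(probability: float, magnitude: float) -> float:
    """Expected harm: the probability of the harm times its magnitude."""
    return probability * magnitude

# A frequent minor harm and a rare major one carry equal "risk":
frequent_minor = risk(0.1, 10)      # 10% chance of losing 10 units
rare_major = risk(0.001, 1000)      # 0.1% chance of losing 1000 units
print(frequent_minor, rare_major)   # both come out at 1.0
```

Attaching numbers in this way is only possible in the circle lit by science; in the other two circles the probabilities, the magnitudes, or both, are matters of judgement.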
Typing “hazard management” into Google at the time of writing yielded 70,000 hits; “risk management” 12 million. The number of potential harms in life to which useful numbers can be attached is tiny compared to the number through which we must navigate using unquantified judgement. The Kelvinist, rocket-science approach to virtual risks, with its emphasis on the quantitatively soluble, threatens to divert attention from larger, more complicated, more urgent problems with which we ought to be grappling.
Bibliography
This article is a contraction of the articles “Three Risk Framing Devices”, published in the magazine PRIMO Risk Management & Governance in 2009, and “Risk Management: It’s Not Rocket Science – It’s Much More Complicated”, published in Public Risk Forum, Edition May 2007, both by John Adams, Professor Emeritus.
For inspiration and information, please visit Risk in a Hypermobile World, the blog of John Adams.
Some references
Making God laugh: a risk management tutorial
7/7: What Kills You Matters – Not Numbers, Times Higher, 29 July 2005
Risk – available from Amazon.
Up-to-date preface: Deus e Brasileiro
This publication is part of the web-book Public Risk Canon