Life is risky and tends to end in death, which makes it easy to become paranoid—about the food you eat, the air you breathe or the strangers you walk past in the street. But what should we really fear, whether as individuals or collectively?
After a year of lockdowns and millions of deaths, this question is very much in the air as countries weigh up the risks of new strains of the coronavirus and the potential harms from further lockdowns. Evolution made us highly attuned to risks such as spiders or snakes. But we’re not so good at handling the risks of modern life, and our governments are sometimes just as incompetent.
The surprising claim of social science is that what we fear—and how much we fear it—is in part a matter of choice. Our ancestors were terrified of going to hell or being cursed. Many of us are probably much more worried about being given cancer by a fizzy drink or the whole world going up in smoke because of climate change.
Research shows that we exaggerate fears of things that are dramatic, immediate and easy to visualize, and where we don’t have any control—like terrorism or air crashes. But we generally underestimate risks that are slow and invisible (like climate change) or where we think we are in control, like driving, which may actually be much more dangerous (in the US, about one in a hundred people is likely to die in a car crash, and even more now die of opioid overdoses).
Media coverage feeds these imbalances. The BSE “mad cow disease” scare peaked in the 1990s and led to the culling of millions of cows at a cost of around £4 billion. Yet the number of people who died in Europe from it during that decade (about 150) was roughly equal to the number who died from drinking scented lamp oils over the same period.
The great anthropologist Mary Douglas showed half a century ago—and counter-intuitively—that much of our perception of risk is socially determined. How we see things like nuclear power or GM crops reflects our broader world views—a web of beliefs about hierarchy, individual control, or the authority of experts—as much as objective facts. This has been very visible in perceptions of vaccine risk, particularly during the rapid rollout of coronavirus vaccines, and it also explains why just giving people the facts can have little impact.
These cultural dispositions also affect our views of positive risk, like our willingness to invest pension money in the stock market or to try out new cuisines. Here there are important geographical differences: contrast the US approach to venture capital and start-ups, comfortable with risk and failure, with the more cautious European stance, which favors the “precautionary principle” and tends to be suspicious of new technologies like artificial intelligence.
But the patterns are also political and cut across national cultures. One of the striking findings of recent work on authoritarian politics, and the motivations of followers of figures like Donald Trump or Matteo Salvini, is that they hate complexity and distrust novelty of all kinds.
Becoming aware of the social and personal construction of risk may help us make wiser choices, as does some grasp of probability theory, which can help us distinguish the risk of dying from a slip in the bath (quite high) from the risk of dying in a train crash (very low). But addressing the deeper causes of distrust may have more impact on people’s willingness to act on health risks than just providing more information.
For example, although the UK has, overall, pushed ahead with mass vaccination with less hesitancy than many European countries—judging the benefits to greatly outweigh the potential and only partly known side-effects—significant minorities remain unconvinced.
But what of our collective approach to risk? Are governments any better at handling the risks we all share?
I became involved in this question 20 years ago. In 2000, a strike by fuel drivers almost brought the UK to a standstill. In the new age of just-in-time production, in which companies streamline their systems to increase efficiency and cut waste by holding only as much inventory as they need, it became clear that stocks of fuel for hospitals and supermarkets would barely last a couple of days. Luckily the strike was quickly settled, but not long after there was a severe outbreak of foot and mouth disease in Britain’s livestock. The country’s systems for handling risk were not up to scratch.
I was then running the UK government’s Strategy Unit, which was asked to look at what needed to be done. We examined how big companies and other governments handled risk; tried to make sense of what had and hadn’t worked in the past; and made recommendations which were quickly put into effect.
The main conclusion was that since all organizations struggle to understand or prepare for risks, particularly high-impact, low-probability ones, they need to find systematic ways to counter complacency. One strand was about helping government scan for potentially big risks: from pandemics to financial crises, and from attacks on critical infrastructure to extreme weather events. Another prepared decision makers using simulations, scenarios and models. A third involved creating a central Civil Contingencies Unit, networked into local government, to cope with the worst crises. And a fourth led to investment in longer-term risks, like the increased risk of flooding on the country’s east coast because of climate change.
For a decade these generally worked well. But it’s no secret that most of this apparatus crumbled during the 2010s thanks to the effects of austerity, Brexit and political distraction, leading to disaster in 2020.
But now that we have experienced the risk—a pandemic—that always came top of the lists, there’s a serious danger of the wrong lessons being learned.
Wrong lessons as well as right
One is about predictability. In the 1970s a top civil servant was asked to compile a list of several hundred of the top risks facing the UK, a huge exercise. But none of the risks on the list happened, though quite similar ones did. It follows that rather than trying to anticipate every risk, you should cultivate resilience and adaptability so that when crises hit you can respond fast and flexibly. Unfortunately, we’re now likely to see streams of books, blessed with hindsight, showing that COVID should have been predicted and putting pressure on governments to invest heavily in trying to predict the exact form of the next crisis.
A second, related, wrong lesson could be a swing to excessive paranoia. It’s right to criticize decision makers for not taking risks seriously (like the UK’s prime minister Boris Johnson apparently seeing COVID as “a scare story”). But just as often they’re punished for the opposite, like the French minister who spent heavily in response to H1N1 in 2009—stockpiling 2 billion masks—and was then denounced for wasteful overreaction. The same commentators who have enjoyed lambasting politicians and scientists for underestimating COVID will have equal fun lambasting their successors for hysterical overreaction. Hindsight is a wonderful thing.