Who made more accurate predictions about the course of the COVID-19 pandemic—experts or the public? A study from the University of Cambridge has found that experts such as epidemiologists and statisticians made far more accurate predictions than the public, but both groups substantially underestimated the true extent of the pandemic.
Researchers from the Winton Centre for Risk and Evidence Communication surveyed 140 UK experts and 2,086 UK laypersons in April 2020, asking them to make four quantitative predictions about the impact of COVID-19 by the end of 2020. Participants were also asked to indicate confidence in their predictions by providing lower and upper bounds between which they were 75% sure the true answer would fall; for example, a participant might say they were 75% sure that the total number of infections would be between 300,000 and 800,000.
The results, published in the journal PLOS ONE, demonstrate the difficulty in predicting the course of the pandemic, especially in its early days. While only 44% of predictions from the expert group fell within their own 75% confidence ranges, the non-expert group fared far worse, with only 12% of predictions falling within their ranges. Even when the non-expert group was restricted to those with high numeracy scores, only 16% of predictions fell within the ranges of values that they were 75% sure would contain the true outcomes.
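To make the calibration measure concrete: a prediction counts as a "hit" if the true outcome falls between the participant's stated lower and upper bounds. The minimal Python sketch below (illustrative only, using made-up intervals rather than data from the study) shows how such a hit rate could be computed; a well-calibrated respondent would expect roughly 75% of their 75% intervals to contain the true value.

```python
# Illustrative sketch, not the study's analysis code: scoring how often
# elicited 75% intervals actually contain the true outcome.

def hit_rate(intervals, true_value):
    """Fraction of (lower, upper) intervals that contain true_value."""
    hits = sum(1 for low, high in intervals if low <= true_value <= high)
    return hits / len(intervals)

# Hypothetical intervals for UK COVID-19 deaths by the end of 2020,
# scored against the official figure of 75,346.
example_intervals = [(20_000, 60_000), (50_000, 100_000), (10_000, 30_000)]
print(f"Hit rate: {hit_rate(example_intervals, 75_346):.0%}")  # prints "Hit rate: 33%"
```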
“Experts perhaps didn’t predict as accurately as we hoped they might, but the fact that they were far more accurate than the non-expert group reminds us that they have expertise that’s worth listening to,” said Dr. Gabriel Recchia from the Winton Centre for Risk and Evidence Communication, the paper’s lead author. “Predicting the course of a brand-new disease like COVID-19 just a few months after it had first been identified is incredibly difficult, but the important thing is for experts to be able to acknowledge uncertainty and adapt their predictions as more data become available.”
Throughout the COVID-19 pandemic, social and traditional media have disseminated predictions from experts and non-experts about its expected magnitude.
Expert opinion is undoubtedly important in informing and advising those making individual and policy-level decisions. However, as the quality of expert intuition can vary drastically depending on the field of expertise and the type of judgment required, it is important to conduct domain-specific research to establish how good expert predictions really are, particularly in cases where they have the potential to shape public opinion or government policy.
“People mean different things by ‘expert’: these are not necessarily people working on COVID-19 or developing the models to inform the response,” said Recchia. “Many of the people approached to provide comment or make predictions have relevant expertise, but not necessarily the most relevant.” Recchia noted that early in the COVID-19 pandemic, clinicians, epidemiologists, statisticians, and other individuals seen as experts by the media and the general public were frequently asked to give off-the-cuff answers to questions about how bad the pandemic might get. “We wanted to test how accurate some of these predictions from people with this kind of expertise were, and importantly, see how they compared to the public.”
For the survey, participants were asked to predict how many people living in their country would have died and would have been infected by the end of 2020; they were also asked to predict infection fatality rates both for their country and worldwide.
Both the expert group and the non-expert group underestimated the total number of deaths and infections in the UK. The official UK death toll on 31 December 2020 was 75,346; the expert group's median prediction was 30,000 deaths, while the non-expert group's median was 25,000.
For infection fatality rates, the median expert prediction was that 10 out of every 1,000 people infected with the virus worldwide would die from it, and 9.5 out of every 1,000 infected in the UK. The median non-expert responses to the same questions were 50 out of 1,000 and 40 out of 1,000. The real infection fatality rate at the end of 2020, as best the researchers could determine given that the true number of infections remains difficult to estimate, was closer to 4.55 out of 1,000 worldwide and 11.8 out of 1,000 in the UK.
“There’s a temptation to look at any result that says experts are less accurate than we might hope and say we shouldn’t listen to them, but the fact that non-experts did so much worse shows that it remains important to listen to experts, as long as we keep in mind that what happens in the real world can surprise you,” said Recchia.