Risky Business: Beliefs, values and the sifting of facts
How our shared identities shape the values we hold, which in turn determine the way we evaluate risk
The UK government has today lifted a wide range of COVID safety measures, and yet, according to data recently released by Ipsos, many people do not support the change. So how did the government come to be at such odds with wider public opinion?
Is it that the government is simply better at understanding risk than the general public? This perspective is somewhat undermined when organisations such as the WHO warn countries to lift their COVID-19 restrictions slowly so as “not to lose the gains that [they] have made.” Different groups appear to be evaluating risk differently, despite the ‘facts’ being widely shared.
There is something of a disconnect in the assessment of risk between politicians (and a substantial minority of the British public) on one side, and a much larger proportion of the general public, along with a number of experts and public health bodies, on the other. Regardless of our position on the policy itself, how can we best understand this difference of perspective, and what does it tell us about the ways humans evaluate risk?
How we evaluate risk
It is now well understood that the way humans go about evaluating risk is not simply a matter of weighing up the alternatives and then making the ‘right’ judgement call. Instead, a wide variety of influences come into play: policy about risk is never a simple translation of ‘scientific fact’, as a wide set of social, wellbeing and economic considerations inevitably enter the picture.
Even so, after so long living with COVID, with hundreds of scientific advisors in place and a great deal of public debate, we might expect broad agreement between government policy and public appetite for which safety measures to enact. So why do we not have this?
Perhaps a clue to the discrepancy came when Boris Johnson suggested that we can now use our own “personal responsibility” as a means of navigating risk relating to COVID. The notion that we are all individually responsible for the choices we make, and for our subsequent success or failure, is very much a core value for those holding right-of-centre political views. Those to the left will typically reference more communal (and less individualistic) structural factors when forming policy.
The role of cultural values in shaping our beliefs
So do the values we hold influence the way in which we evaluate risk? Dan Kahan, Professor of Law and Professor of Psychology at Yale Law School, certainly considers this to be so. He makes a case that the cultural values we hold define our social identities, which in turn shape our beliefs about disputed matters of fact (e.g., whether humans are responsible for climate change; whether the death penalty prevents murder).
This helps to explain why groups with different cultural outlooks (such as left- or right-of-centre political orientations) disagree about important societal issues. On this account, disagreement is not due to people failing to understand the science, or even to their lacking relevant information. Instead, according to Kahan, disagreement arises because “people endorse whichever position reinforces their connection to others with whom they share important ties”.
Kahan suggests this is a form of motivated reasoning: advocating beliefs that run counter to the sentiment of one’s group could threaten one’s position within it, so people may be motivated to “protect” their cultural identities.
Values and science
But is it this straightforward? It is tempting to think that science offers us objective facts, and that it is our values that determine which of these we look at and how we use them. But the notion that science is an activity undertaken without recourse to any human values has long been punctured by the likes of Bruno Latour, who points out that scientific knowledge does not operate in a vacuum but is the product of particular people, in a particular setting, at a particular time. Our values seep through everything we do, and while science has tools and processes to keep them at arm’s length, it is not immune.
The point of this diversion into the philosophy of science is to counter the notion that differences in judgement about risk are purely a deficit arising from motivated reasoning. Instead we can start to see that the values we hold are not something entirely separate from scientific facts. It is hard to consider facts, and what to do about them, without reference to values of some kind.
So does that mean that, when we are trying to decide on the right thing to do, we are forever trapped inside our own values, destined always to disagree with each other along the lines of the groups we belong to?
Not so, we would argue, for two reasons. First, while we recognise that scientific facts are socially constructed, and often contested, it does not mean we each have a licence to cherry-pick facts to our own liking. As Silvio Waisbord put it: “Sure, we should approach facts and expertise critically, contest self-appointed arbiters of truth, believe truth is misty and manifold, conceive truth-telling as a complex process, be sensitive to multiple perspectives, and doubt any confident claims to truth. Such sensibility, however, is completely different from the conviction that facts and rigor do not matter, that all truth-telling is wrong, and that subjective beliefs are sufficient proof of reality”. And Bruno Latour himself has called for the return of some of the authority of scientific expertise. For more discussion of this issue, see here.
Second, the general public does, fortunately, seem to evaluate scientific facts in a way that looks beyond its own values. Research by Sander van der Linden on climate change found that when the general public starts to sense a ‘scientific consensus’ on an issue, this acts as a ‘Gateway’ to changing beliefs about that issue, which in turn influences behaviour.
So whilst values have a role to play in the way we construct and maintain our beliefs about risk, the Gateway Belief Model suggests that we are not trapped, solely motivated to “protect” our cultural identities.
In conclusion
Whether you agree or disagree with the UK Government’s approach, the policy decision does perhaps offer some insight into the way that evaluating risk is not a simple, computational process. Instead we engage in a complex interplay between values, beliefs and ‘facts’ that is not always easy for an observer to tease out.
For our purposes, we can see how our beliefs about risk have a strong social dimension, tied to shared identities and their associated values. So yet again we observe that our behaviour and decisions about risk reference the ‘we’ rather than purely the ‘me’, and as such we need to ensure that behavioural science theory, thinking and tools properly reflect this.