Susceptibility to misinformation as a feature not a bug
We need to go beyond apparent failures of human cognition to explain misinformation and look instead to the social nature of belief
Understanding how we form beliefs is critical if we are to understand the reasons for misinformation and how to address it. Last week we challenged the notion of motivated reasoning – the idea that our beliefs are simply a by-product of our political perspective. Instead, we suggest that our beliefs have a significant social element – and while this is critical for human flourishing, the very same mechanism can lead us astray.
Neil Levy sets out the way in which humans live in a ‘cumulative culture’, in which our collective, shared knowledge has a kind of ‘ratchet effect’. This means we do not each have to learn everything anew; instead, our collective knowledge becomes a shared platform on which others can build. For example, we do not need to learn about vaccines from scratch each time: we have a collective understanding of what they are and how they work. This means that when a vaccine becomes available for a new condition, we can consider this particular application rather than having to be educated about vaccines from first principles.
The adaptive benefits of social cultural cognition
It is this shared mechanism that allows humans to achieve far more than any individual, or indeed any single generation, could alone. Levy sets out the way that the evolution of cultural knowledge means we are able to detect signal in noise even when the degree of noise exceeds the capability of our individual cognition to analyse. By this we mean that when the relationship between an action and its effects is slow to become apparent and probabilistic, then, as individuals, we find it difficult to tease out the relationship.
One example of this is the length of time it took to establish the impact of tobacco on health; for many years, people challenged any link between smoking and cancer, as the public was more focused on prominent cases of people who had lived to old age despite being heavy smokers. Of course, science provides tools for identifying signal in the noisy association between variables, such as that between smoking and cancer. Without these tools, our own individual cognition is unreliable. But cultural cognition often succeeds in identifying the signal amid the noise without the need for statistical tools.
Levy gives the example of traditional Arctic life, which requires knowledge of how to make particular clothing, build and use tools for hunting, make snow houses, build fires and so on. The know-how needed for any one of these elements is hard to acquire, and each is of little use by itself. Acquiring and using this sort of knowledge is built up over generations, to the extent that it becomes part of what we do to live, manage risk and flourish as humans. We could apply the same principles to our day-to-day lives – education, social skills, employment capabilities, eating and exercising well and so on are things that we can of course build on and look for help with, but at the same time they are an ingrained part of how we have learned to live well.
What we know is not always explicit
Of course, this deep social knowledge and these practices may well be partially opaque to those who inherit them. When things suddenly change we can see this a little more clearly: COVID has perhaps shown us how many things we do that we do not examine particularly deeply, such as hand shaking. The advice to avoid hand shaking suddenly allowed us to see the complex set of human needs that are met through this simple act – needs so embedded that we simply do not see them in everyday life.
Acquiring this social knowledge is often not explicitly taught – we simply take it on trust: this is how things are done, so this is how I will do it. We acquire the norms, customs and conventions of those around us. Levy points out that the conformist bias helps us to acquire local conventions and norms, while the prestige bias leads us to imitate particularly successful individuals. But rather than this ‘social referencing’ being a reflection of our inherent irrationality or some kind of deficit, it is in fact a highly adaptive means by which we acquire all-important shared knowledge.
And this is not something that we necessarily accept unthinkingly. On the contrary, we are selective in who we look to, filtering out information from out-group members when there is evidence that it conflicts with the majority view, and when it comes from those known to be unreliable or untrustworthy. This is intelligent, rational behaviour rather than unreflective and automatic.
Snowballing nature of belief change
This social referencing means that changes of mind can ‘snowball’, as shifts in belief can create additional shifts, resulting in rapid change. This is in no small part because we do not, and cannot hope to, have access to ‘first order representations’. The vast majority of the population accept that vaccines can prevent illness but have little understanding of how they actually do this. Similarly, many of us “believe in” evolution but have little understanding of the theory or the mechanisms involved. But as Levy points out, this indistinctness is not a bug but a feature: a deep understanding of the science is very hard to achieve, so holding indistinct concepts allows us to defer to the experts.
This has also been called the ‘paradox of the psychosocial’: we think of people as suggestible if they accept knowledge from others as true despite a lack of evidence. Suggestible people attract our disapproval because, instead of living in the real world, they are overly influenced by the things other people say, do and want. But on the other hand, this very same capacity is something to be celebrated, given it is what makes learning, affection, socialization and social cohesion possible at all.
This outsourcing of the detail allows us to operate effectively through the use of indistinct meta-beliefs, building on the work of others in our community and wider society. But it also creates a vulnerability. Once others know the cues we use to determine how to respond to the information we receive, we are open to having our environment ‘epistemically polluted’ as others knowingly or unwittingly use these cues for influence.
There are many ways in which our ‘epistemic’ (knowledge) environment may be polluted: supposedly neutral think tanks that are in fact funded by particular interest groups; predatory publishers who publish low-quality scientific research for a fee; and, as we have flagged previously, issues knowingly framed in the media in ways that represent the interests of only a few. All of these can influence what we believe.
We can of course go further than this to examine the way in which ‘information warfare’ takes place, with purposeful attempts to shape agendas and undermine confidence in the other side’s narrative. We can see this playing out in many areas, not least in the current evolving situation in Ukraine.
Understanding the social nature of knowledge looks critical if we are to understand how to navigate misinformation. Much of the focus to date has been on examining the capability of the individual – trying to find a deficit in their cognition or a fault in their motivations. If this is the assessment, then the solutions are all about finding ways to mitigate these flaws so that we can be more vigilant and better ‘trained’.
However, while this may be part of the solution, there is a mounting body of evidence that points away from the individual specifically and towards the way in which we navigate the wider shared knowledge community in which we live (see here for more background).
With the social nature of knowledge as a key diagnosis, the prescription for a ‘cure’ surely points towards collective action: encouraging an open and considered dialogue on topics that are contested. We will explore this in greater detail in next week’s posting.
Get a weekly dose of behavioural science that dissects the hot issues of the day by subscribing to Frontline BeSci