
THE PSYCHOLOGY BEHIND MID

On the previous pages, we learned what MID is and why it is such a complex problem. We learned that intrinsic factors (characteristics of the person who believes the false information) play a big role in this complexity, because they make the reasons for believing a piece of information different for everybody. Yet even though all believers are individuals with different characteristics and beliefs, there is one comprehensive intrinsic factor that we all share: human psychology. The ways in which we think make us vulnerable to fake news. If we understand these cognitive processes, we can understand what makes false (medical) information so attractive and why we believe it. Hopefully, this will make us more aware of our own weaknesses when judging whether information is reliable. On this page, we explain a few important aspects of the psychology behind medical information disorder.


1: STORYTELLING

Storytelling is part of human nature. We have been natural storytellers since the beginning of our existence, and this was beneficial for our survival: early humans could quickly spread their experiences and useful life lessons, so that others could learn from them. It allowed us to solve complex problems together; this is called collective learning. However, storytelling is a double-edged sword: it also creates opportunities for false theories and information to spread.


2: CONFIRMATION BIAS

Confirmation bias can be regarded as a sort of prejudice. We tend to believe information that is in line with our pre-existing beliefs, expectations, or assumptions, and to reject information that is not. In this way, information can be misinterpreted or even selectively sought out to confirm our beliefs. For example, someone might tell you that they have never been vaccinated but have also never been sick. A person who already believes that vaccines do not work will likely perceive this anecdote as proof that their belief is correct. This example also illustrates survivorship bias, in which we generalize from successes (here, an unvaccinated individual not getting sick) while ignoring failures (all the unvaccinated individuals who did get sick). One cause of confirmation bias is that it takes less effort not to question our beliefs and assumptions. By only seeking out information we agree with, we are confronted less with other perspectives and opinions. Our own beliefs are thus echoed back to us, creating a cognitive echo chamber. Within echo chambers, opinions become more extreme and people with different views are often demonized. Beyond confirmation bias, there are many more cognitive biases.


3: MOTIVATED REASONING

This process is similar to confirmation bias: people use their reasoning to confirm their existing beliefs and favour arguments for conclusions they want to reach. Normally, when reasoning, we try to determine what is true. The difference between the two is that motivated reasoning is a deliberate reasoning process that nevertheless produces misbeliefs, whereas confirmation bias could be seen as irrational or lazy thinking.


4: COGNITIVE DISSONANCE

When you encounter information that contradicts your beliefs, it can give you a negative feeling: you might have been wrong. Ideally, you would then do some more research to find out what to believe. However, this negative feeling (the dissonance) gives people a tendency to reject the new information, even when it is credible. In this way, people stick to their initial beliefs.


5: COGNITIVE MISERLINESS

Part of what makes our brains so efficient is that we can use simple shortcuts to solve problems, so that not everything requires a lot of mental effort. As a consequence, we do not have to think hard about every single thing. However, this makes us a bit lazy as well: we sometimes apply this simple way of thinking to problems that require more mental effort, such as judging whether information we encounter online is reliable. This makes us less sceptical and more vulnerable to misinformation. For example, when we do not make the effort to critically evaluate new information, we are more prone to believe statements that merely seem profound (because of, say, the use of sophisticated-sounding words), even if they have little to no actual content.


Cognitive miserliness is related to the idea that we have two ways of thinking:

  1. System 1 - Fast thinking: this is an automatic, simple, and effortless way of thinking. With fast thinking, important details can be missed, and we assume that information that is easy to process is true - even when it is not.

  2. System 2 - Slow thinking: this is a more analytic way of thinking that requires more effort. 


6: FLUENCY

In the point about cognitive miserliness, we explained that there are two systems of thinking and that we prefer to use the first, which requires the least mental effort. Fluency is how easily we process information. If something is easy to process, it feels right, and so we believe it is true. This is also why repetition plays an important role (see availability bias): when something is more familiar, it is more fluent - easier to process - and thus easier to believe. Humans are already more likely to accept information they encounter than to reject it (this is called the default of acceptance). Familiar information that intuitively feels right makes acceptance even more likely. Especially with medical dis- and malinformation, which are created intentionally, deliberate repetition is a way to convince you to believe the information.


7: HEURISTICS - JUDGEMENT INDICATORS

Heuristics are indicators we use to make quick judgements, so that we do not have to do extensive research ourselves. They also explain why social media creates such a fertile environment for the spread of false medical information. When a person we trust posts something, we use that trust as an indicator of how trustworthy the information is. However, this is not a reliable guide to the truth and can lead you to believe things that are not true. On social media, you generally follow friends, family, or celebrities you like - people you trust or look up to. This creates a social media bubble in which you use biased indicators to validate information.


8: SOCIAL INFLUENCE

Similar to heuristics as judgement indicators, the influence of others on social media can subconsciously (but also consciously) make us believe misinformation. Subconsciously, seeing opinions and information posted by our friends online can make us think and feel the way they do. Generally, we only follow people we personally like or find interesting, which makes the information we receive through social media likely to be one-sided. Consciously, it may drive us to accept their theories as well, because we do not like to call our friends out: we want to avoid social rejection, and this creates social pressure. We might furthermore start to overestimate the number of people who agree with our stance, a phenomenon called perceived social consensus. Sometimes we perceive corrections of false information as someone telling us what to think. This may cause us to reject the correction (and endorse the false information even more strongly), a reaction known as social reactance.


9: AVAILABILITY BIAS

Fake information spreads widely. Sometimes we encounter the same piece of information several times a day. When we see something more often, that information becomes more easily available in our minds, so we can recall it a lot quicker. In other words: we remember the information better, which makes it feel more familiar and credible. Availability bias is also the motive behind many political campaigns and advertisements.


10: EMOTIONS

As humans, we have a lot of emotions that we routinely rely on to make decisions. When emotions enter our reasoning, they make us more vulnerable to false information. Thinking becomes more intuitive, and rational thought is easily mixed up with or influenced by our current feelings and past experiences, leaving little room for critical thought. Fake news often plays on these emotions by evoking anger or fear, and its creators often use a scapegoat to create division between two parties. A good example of how emotions influence critical thinking is the COVID-19 pandemic: even though the government reassured us that there were enough supplies, people still hoarded toilet paper because they were scared and because many others were doing the same. Thinking on an emotional basis makes us more susceptible to MID.


11: FALSE MEMORY

Sometimes, our brains can even create false memories. This is also related to confirmation bias - the tendency to believe information that is in line with our own beliefs or hypotheses. False memories are memories of events that did not happen in the way we remember them, even though we experience them as vividly as real memories. They are sometimes created to support our own beliefs and assumptions. Information presented after an event can also interfere with our memory of it, making the memory less accurate. And when we are presented with an engaging story from an untrustworthy source, we sometimes forget where we got the information. The consequence of this sleeper effect is that the (false) information is then likely accepted as truth. False memories also occur when real memories fade and our brains supplement them with false information.


12: THE DUNNING-KRUGER EFFECT

Humans are generally not good at evaluating the extent of their own knowledge. One of the most prominent examples of this is the Dunning-Kruger effect, which describes how people with the least knowledge often overestimate themselves, while true experts underestimate their expertise. For example, researchers found that people who knew little about the causes of autism and the safety of vaccines were overconfident in their beliefs, thinking they knew just as much as or more than doctors and scientists. In the words of Confucius: "Real knowledge is to know the extent of one's ignorance."

 

IN SUMMARY

On the right, we have summarized some of the most important information about the psychology behind MID in an infographic.

You can use this summary if you want to quickly look back on the information, for instance when you want to counteract the psychological flaws we all have that make us more susceptible to MID.

Click the button below to learn about 'The spread' of MID.


WANT TO KNOW MORE?

Find out more about the psychology behind medical information disorder and what makes it attractive by reading these interesting articles we have selected just for you:


WHO FALLS FOR FAKE NEWS? THE ROLES OF BULLSHIT RECEPTIVITY, OVERCLAIMING, FAMILIARITY, AND ANALYTIC THINKING

BEYOND MISINFORMATION: UNDERSTANDING AND COPING WITH THE “POST-TRUTH” ERA

MISINFORMATION AND ITS CORRECTION: CONTINUED INFLUENCE AND SUCCESSFUL DEBIASING

TUTORIALS

FIGHTING FAKE NEWS: A ROLE FOR COMPUTATIONAL SOCIAL SCIENCE IN THE FIGHT AGAINST DIGITAL MISINFORMATION