
Information “blindness”: Having high confidence in a belief means our brains do not process contradictory information

Researchers from the Metacognition Team have found that people are selective about the information they process, depending on whether it confirms or contradicts a prior belief, in a new study published in Nature Communications.

The study shows that when people are highly confident in a particular decision, they selectively integrate information that confirms their decision, yet they do not process information which contradicts it. This biased intake of information might lead to inaccurate and skewed perspectives – a process highly relevant for many societal issues such as political or scientific debate.

“We were interested in the cognitive and neural mechanisms causing people to ignore information that contradicts their beliefs, a phenomenon known as confirmation bias. For example, climate change sceptics might ignore scientific evidence that indicates the existence of global warming. While psychologists have long known about this bias, the underlying mechanisms were not yet understood,” said lead author Max Rollwage (PhD candidate at the Wellcome Centre for Human Neuroimaging and Max Planck UCL Centre for Computational Psychiatry & Ageing Research). “Our study found that our brains become blind to contrary evidence when we are highly confident, which might explain why we don’t change our minds in light of new information.”

For the study, 75 participants completed a simple task: they judged whether a cloud of dots was moving to the left or to the right on a computer screen, and then rated their confidence (how certain they were in their response). After this initial decision, they were shown the moving dots again and asked to make a final decision. The information was made even clearer the second time and could help participants change their mind if they had initially made a mistake. However, when people were confident in their initial decision, they rarely used this information to correct their errors.

Twenty-five of the participants also completed the experiment in a magnetoencephalography (MEG) scanner, allowing the researchers to monitor their brain activity as they processed the leftward and rightward motion of the dots. Based on this brain activity, the researchers evaluated the degree to which participants processed the newly presented information. When people were not very confident in their initial choice, they integrated the new evidence accurately. However, when participants were highly confident in their initial choice, their brains were practically blind to information that contradicted their decision but remained sensitive to information that confirmed it.
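The pattern described above can be illustrated with a toy computational model (a minimal sketch for intuition only, not the authors' actual analysis or a model from the paper): post-decision evidence that contradicts the initial choice is attenuated in proportion to confidence, while confirmatory evidence is integrated in full.

```python
def integrate_post_decision_evidence(initial_choice, confidence, new_evidence,
                                     gate_strength=1.0):
    """Toy model of confidence-gated evidence integration.

    initial_choice: +1 ("right") or -1 ("left")
    confidence: 0..1, certainty in the initial choice
    new_evidence: signed motion samples; a sample whose sign matches
        initial_choice confirms the choice, an opposite sign contradicts it
    Returns the integrated post-decision evidence total.
    """
    total = 0.0
    for sample in new_evidence:
        confirms = (sample * initial_choice) > 0
        # Confirmatory samples are integrated in full; contradictory
        # samples are down-weighted in proportion to confidence --
        # the hypothetical "blindness" at high confidence.
        weight = 1.0 if confirms else 1.0 - gate_strength * confidence
        total += weight * sample
    return total

# Mostly contradictory evidence (negative samples vs. an initial "right" choice):
evidence = [-0.4, -0.5, 0.1, -0.6]
low_conf = integrate_post_decision_evidence(+1, confidence=0.1,
                                            new_evidence=evidence)
high_conf = integrate_post_decision_evidence(+1, confidence=0.9,
                                             new_evidence=evidence)
# At high confidence the contradictory samples barely register, so the
# integrated total stays much closer to the initial choice than at low
# confidence, where the same evidence would prompt a change of mind.
```

In this sketch, a low-confidence observer ends up with strongly negative integrated evidence and would reverse the initial decision, whereas a high-confidence observer, seeing identical dots, does not.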

“Confirmation bias is often investigated in scenarios that involve complex decisions about issues such as politics. However, the complexity of such opinions makes it difficult to disentangle the various factors contributing to the bias, such as wanting to maintain self-consistency with our friends or social group. By using simple perceptual tasks, we were able to minimize such motivational or social influences and pin down drivers of altered evidence processing that contribute to confirmation bias,” said senior author Dr Steve Fleming (Wellcome Centre for Human Neuroimaging, Max Planck UCL Centre for Computational Psychiatry & Ageing Research and the Department of Experimental Psychology).

Moreover, because the neural pathways involved in making a perceptual decision are well understood in such simple tasks, researchers can monitor the relevant brain processes directly. The researchers highlight that understanding the mechanism that causes confirmation bias may help in developing interventions to reduce people’s blindness to contradictory information.

“These results are especially exciting to me, as a detailed understanding of the neural mechanisms behind confirmation bias opens up opportunities for developing evidence-based interventions. For instance, the role of inaccurate confidence in promoting confirmation bias indicates that training people to boost their self-awareness may help them to make better decisions,” said Max Rollwage.


The full paper is published in Nature Communications: Confidence drives a neural confirmation bias.