From evolution to climate change to the Holocaust, there are always those who deny claims despite overwhelming evidence. What drives these people? Psychologist and professional skeptic Michael Shermer’s new book ‘The Believing Brain’ describes the mental mechanisms at work and paints a picture of our alarmingly primitive reasoning capacity.
Crackpot conspiracy theorists and religious fanatics are not the only groups that deny certain facts in a way that astonishes the average person. Republican American presidential candidate Rick Perry, for instance, denied in interviews earlier this year that evolution and man-made global warming had been convincingly demonstrated by science. What makes people believe, or rather not believe, despite all the evidence stacked against them?
Beliefs before explanations
The American psychologist Michael Shermer has made it his business to figure out how people come to believe things and where their reasoning process goes haywire. As the founding publisher of Skeptic Magazine, editor of skeptic.com and author of eleven books on the subject, Shermer is a leading figure in the skeptic community and a professional debunker. When the US media need a rational voice against pseudoscience, the paranormal or the supernatural, they call Shermer to have him explain that the latest alien abduction might also be attributed to hallucinations, sleep anomalies or hypnosis.
His latest book, ‘The Believing Brain’, is a fascinating synthesis of 30 years of research on the subject. Shermer’s conclusion about our belief-forming machinery is disturbing. Most beliefs are not formed by carefully evaluating the evidence for or against a particular claim. Instead, they are snap decisions made for psychological, emotional and social reasons in the context of an environment created by family, friends, colleagues, culture and society at large. Only after the belief is formed do people try to rationalize it and subconsciously seek out confirmatory evidence which, once found, reinforces the belief in a positive feedback loop.
American physiologist Mark Hoofnagle, one of the originators of the concept of ‘denialism’ and blogger on denialist anti-science tactics, finds this to be a plausible process. He adds: “At the basis of almost all denialism is some ideology that overrides people’s rational mind. Most people are probably irrational about one thing or another. It’s not a liberal or conservative thing; all sides have something that is threatening to them.” Following this line of thought you can, for instance, imagine people so blinded by religious ideology that they take its scripture literally, leading them to deny that the earth is round, that it’s older than 6,000 years or that evolution is true.
Seeking patterns and agents
How did we end up with such a flawed belief system? Shermer argues that one ancient brain process at work here is our tendency to find patterns everywhere we look. This tendency has been useful since the early days in our ancestral environment (the African savannah) where, for instance, quickly establishing the pattern ‘rustle in the grass means dangerous predator’ could save your life. Of course, sometimes a pattern is false and the rustle in the grass is just the wind. Shermer makes the case that the cost of missing a real pattern (failing to notice the predator, in this case) often greatly outweighs the cost of believing a false one (thinking it’s a predator when it’s only the wind). A brain tuned to that asymmetry, in turn, easily ends up believing false patterns.
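Shermer’s cost asymmetry can be made concrete with a toy expected-cost calculation. The numbers below are hypothetical, chosen only to illustrate the logic of the argument, not taken from the book:

```python
# Why a brain biased toward false positives can beat a perfect skeptic
# when a miss is far more costly than a false alarm (illustrative numbers).

P_PREDATOR = 0.01          # say 1 in 100 rustles is actually a predator
COST_MISS = 1000           # ignoring a real predator may be fatal
COST_FALSE_ALARM = 1       # fleeing from the wind wastes a little energy

# Policy A: believe the pattern and treat every rustle as a predator.
# You only ever pay the (small) false-alarm cost.
cost_believe = (1 - P_PREDATOR) * COST_FALSE_ALARM

# Policy B: dismiss the pattern and treat every rustle as wind.
# You only pay when the rustle really was a predator.
cost_dismiss = P_PREDATOR * COST_MISS

print(f"always flee:  expected cost {cost_believe:.2f} per rustle")
print(f"never flee:   expected cost {cost_dismiss:.2f} per rustle")
# Even though 99% of the alarms are false, always fleeing is roughly
# ten times cheaper on average, so believing the pattern pays off.
```

Under these assumptions the jumpy, pattern-believing policy costs about 0.99 per rustle against 10 for the skeptical one, which is the evolutionary pressure Shermer points to.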
Another characteristic of our brain is that, once we have established a pattern, we tend to infuse it with meaning, intention and agency. So in the example above, when we are dealing with a predator, we correctly assume that we are dealing with an intentional agent instead of an inanimate force like the wind. Shermer suspects this tendency is related to the fact that people have a ‘theory of mind’, the capacity to be aware of mental states like desires and intentions in both ourselves and others. Problems arise, of course, when we assume agency where there actually is none, for instance when we take the wind to be an angry higher power instead of plain physics. In fact, most patterns in the world lack agents and are governed by bottom-up causal laws and randomness. Assuming agency in those cases has led to practices like shamanism, animism and magical thinking in the past, and to religion, superstition and New Age spiritualism today.
To make matters worse, once committed to a belief, it is extremely hard to change your mind. Shermer identified no fewer than 39 cognitive biases that make us stick to our guns (see Bias Bonanza below). The most important of them, he argues, is the confirmation bias: our tendency to seek confirmatory evidence in support of our already existing beliefs and to ignore or reinterpret disconfirming evidence. This effect has been found in many studies, including one where participants had to assess somebody’s personality after reading a (fictional) profile of that person; their assessments ended up strikingly similar to the profile. In another study involving a murder trial, participants did not evaluate the evidence first, as one might expect, but quickly concocted a narrative in their mind about what happened, then rifled through the evidence and picked out what most closely fit the story.
[box style=”note”]
Michael Shermer describes an impressive number of cognitive biases that lead our brains to construct false beliefs and stick to them. Here is a small selection:
Confirmation bias: Tendency to seek and find confirmatory evidence in support of already existing beliefs and to ignore or reinterpret disconfirming evidence.
Hindsight bias: Tendency to reconstruct the past to fit with present knowledge.
Self-justification bias: Tendency to rationalize decisions after the fact, to convince ourselves that what we did was the best thing we could have done.
Attribution bias: Tendency to attribute our own beliefs and actions to different causes than those of others.
Sunk-cost bias: Tendency to believe in something because of the investment already made in that belief.
Status quo bias: Tendency to opt for whatever it is we are used to, that is, the status quo.
Bias blind spot: Tendency to recognize the power of cognitive biases in other people while being blind to their influence on our own beliefs.[/box]
An especially revealing study was a neuro-imaging experiment done by American psychologist Drew Westen during the 2004 American presidential election. Westen found that both Republicans and Democrats were much more critical of the candidate of the opposite party when confronted with contradictory statements made by both candidates. Strikingly, the brain areas most active in this process were not those involved with reasoning but those associated with emotions and conflict resolution. Once the participants had arrived at a conclusion that made them emotionally comfortable, the brain’s reward area became active. Shermer concludes that instead of rationally evaluating a candidate’s position on an issue, the participants had an emotional reaction to conflicting data and got neuro-chemically rewarded after rationalizing the conflicting data away.
The reluctance to change one’s mind could ultimately be, once again, a legacy of our evolutionary past. Shermer argues that our tribal tendencies lead us to form coalitions with like-minded members of our group and to demonize others who hold differing beliefs. This effect may have supported group cohesion and thereby promoted the group’s survival. Furthermore, our faulty reasoning may stem from what Shermer calls folk numeracy: our natural tendency to misperceive probabilities, to think anecdotally instead of statistically, and to focus on short-term trends and small-number runs (e.g. we notice a short stretch of cool days and ignore the long-term global warming trend). When roaming the African savannah in the past, this way of thinking was probably adequate for survival, but in the modern world it can fall painfully short.
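The cool-days example is easy to demonstrate with a small simulation. The trend and noise figures below are invented for illustration and are not real climate data; the point is only that when year-to-year noise dwarfs a slow trend, short runs routinely point the wrong way:

```python
import random

random.seed(42)

YEARS = 100
TREND = 0.02   # hypothetical slow warming per year
NOISE = 0.5    # year-to-year weather noise, much larger than the trend

# A century of temperatures: a slow upward trend buried in noise
temps = [TREND * year + random.gauss(0, NOISE) for year in range(YEARS)]

# Long-term, statistical view: compare first and last 25-year averages
early = sum(temps[:25]) / 25
late = sum(temps[-25:]) / 25

# Short-term, anecdotal view: how many 5-year spans end cooler than they began?
windows = range(YEARS - 5)
cooling_windows = sum(1 for i in windows if temps[i + 5] < temps[i])

print(f"warming over the century: {late - early:.2f} degrees")
print(f"5-year spans that look like cooling: {cooling_windows} of {len(windows)}")
```

Averaged over decades the warming is unmistakable, yet a large fraction of the short windows happen to slope downward, which is exactly the kind of run folk numeracy latches onto.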
Science as antidote
You might wonder how we can avoid all of these irrational belief pitfalls. According to Shermer, the best tool we have is science. Before accepting a claim, the scientific process requires an impressive number of checks and balances: control groups, double-blind tests, replication by independent labs and peer-reviewed publication. In addition, science has a built-in self-correcting mechanism whereby, eventually, after enough data comes in, the truth will come out.
All the more worrisome, then, that according to a 2002 survey by the National Science Foundation, 70% of Americans do not understand the scientific process (defined in the survey as grasping probability, the experimental method and hypothesis testing). To tackle this problem, Shermer recommends better communication about science in the media, and especially explaining how science works rather than only what science knows.
Mark Hoofnagle adds that conspiracy theories are often an important element of denialism because, in order to deny well-proven facts, you have to assume a huge number of people are lying. He writes that pointing out the absurdity of these theories can also be a successful strategy for convincing some deniers that they are wrong.
Unfortunately, as we have seen, many of our deeply held beliefs turn out to be immune to direct educational tools, especially for those who are not ready to hear contradictory evidence. The pope won’t become an atheist anytime soon, and conservatives suddenly turning into liberals, or vice versa, are rare. Shermer concludes that belief change ultimately comes from a combination of personal psychological readiness and a deeper shift in the underlying social and cultural zeitgeist, which is affected in part by education but is more the product of harder-to-define political, economic, religious and social changes. In other words, it can take a lifetime for someone to change their mind, if they change it at all.
Shermer, M. (2011). The Believing Brain: From Ghosts and Gods to Politics and Conspiracies. How We Construct Beliefs and Reinforce Them as Truths. New York: Times Books. ISBN: 978-0-8050-9125-0
Westen, D., Blagov, P., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election. Journal of Cognitive Neuroscience, 18 (11), 1947-1958 DOI: 10.1162/jocn.2006.18.11.1947
-BY MARC SMEEHUIJZEN