
Why We Deny

From evolution to climate change to the Holocaust, there are always those who deny claims despite overwhelming evidence. What drives these people? Psychologist and professional skeptic Michael Shermer’s new book ‘The Believing Brain’ describes the mental mechanisms at work and paints a picture of our alarmingly primitive reasoning capacity.

Crackpot conspiracy theorists and religious fanatics are not the only groups that deny facts in a way that astonishes the average person. Republican American presidential candidate Rick Perry, for instance, denied in interviews earlier this year that evolution and man-made global warming had been convincingly demonstrated by science. What makes people believe, or should we say not believe, despite all the evidence stacked against them?

Beliefs before explanations

The American psychologist Michael Shermer has made it his business to figure out how people come to believe things and where their reasoning process goes haywire. As the founding publisher of Skeptic Magazine, editor of skeptic.com and author of eleven books on the subject, Shermer is a leading figure of the skeptic community and a professional debunker. When the US media need a rational voice against pseudoscience, the paranormal or the supernatural, they call Shermer to have him explain that the latest alien abduction might also be attributed to hallucinations, sleep anomalies or hypnosis.

His latest book, ‘The Believing Brain’, is a fascinating synthesis of 30 years of research on the subject. Shermer’s conclusion about our belief-forming machinery is disturbing. Most beliefs are not formed by carefully evaluating the evidence for or against a particular claim. Instead, they are snap decisions made for psychological, emotional and social reasons in the context of an environment created by family, friends, colleagues, culture and society at large. Only after the belief is formed do people try to rationalize it and subconsciously seek out confirmatory evidence which, once found, reinforces the belief in a positive feedback loop.

American physiologist Mark Hoofnagle, one of the originators of the concept of ‘denialism’ and a blogger on denialist anti-science tactics, finds this to be a plausible process. He adds: “At the basis of almost all denialism is some ideology that overrides people’s rational mind. Most people are probably irrational about one thing or another. It’s not a liberal or conservative thing; all sides have something that is threatening to them.” Following this line of thought you can, for instance, imagine people so blinded by religious ideology that they take scripture literally, leading them to deny that the earth is round, that it’s older than 6,000 years or that evolution is true.

Seeking patterns and agents

How did we end up with such a flawed belief system? Shermer argues that one ancient brain process at work here is our tendency to find patterns everywhere we look. This tendency has been useful since the early days in our ancestral environment (the African savannah) where, for instance, quickly establishing the pattern ‘rustle in the grass means dangerous predator’ could save your life. Of course, sometimes a pattern is false and the rustle in the grass is just the wind. Shermer makes the case that the costs of missing a real pattern (overlooking the presence of a predator in this case) often greatly outweigh the costs of believing a false pattern (thinking it’s a predator while it’s only the wind). This asymmetry, in turn, makes us prone to believing false patterns.
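Shermer’s cost-asymmetry argument can be made concrete with a quick expected-cost sketch. This is an illustration of the logic, not something taken from the book, and the numbers are invented:

```python
# Illustrative sketch of the cost asymmetry in pattern detection.
# All costs and probabilities here are made-up numbers.

def should_assume_predator(p_predator, cost_miss, cost_false_alarm):
    """Flee if the expected cost of ignoring the rustle exceeds
    the expected cost of a needless flight."""
    expected_cost_ignore = p_predator * cost_miss
    expected_cost_flee = (1 - p_predator) * cost_false_alarm
    return expected_cost_ignore > expected_cost_flee

# Even when a predator is very unlikely (1%), the asymmetric costs
# (being eaten: ~1000 units; a wasted sprint: ~1 unit) favor fleeing:
print(should_assume_predator(0.01, cost_miss=1000, cost_false_alarm=1))  # True
```

Under these toy numbers, assuming a predator is the rational default even at low probabilities, which is exactly why a brain tuned this way ends up endorsing many false patterns.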

Another characteristic of our brain is that, once we have established a pattern, we tend to infuse it with meaning, intention and agency. So in the example above, when we are dealing with a predator, we correctly assume that we are dealing with an intentional agent instead of an inanimate force like the wind. Shermer suspects this tendency is related to the fact that people have a ‘theory of mind’, or the capacity to be aware of mental states like desires and intentions in both ourselves and others. Problems arise, of course, when we assume agency where there actually is none, for instance when dealing with the wind, thinking it is an angry higher power instead of plain physics. In fact, most patterns in the world lack agents and are governed by bottom-up causal laws and randomness. Assuming agency in those cases has led to practices like shamanism, animism and magical thinking in the past, and to religion, superstition and New Age spiritualism today.

Brain biases

To make matters worse, once committed to a belief, it is extremely hard to change your mind. Shermer identified no fewer than 39 cognitive biases that make us stick to our guns (see Bias Bonanza below). The most important of all of them, he argues, is the confirmation bias: our tendency to seek confirmatory evidence in support of our already existing beliefs and to ignore or reinterpret disconfirming evidence. This effect has been found in many studies, including one where participants assessed somebody’s personality after reading a (fictional) profile of that person; their assessments turned out strikingly similar to the profile. In another study involving a murder trial, participants did not evaluate the evidence first, as one might expect, but quickly concocted a narrative in their mind about what happened, then rifled through the evidence and picked out what most closely fit the story.

[box style=”note”]

Bias Bonanza

Michael Shermer describes an impressive number of cognitive biases leading our brains to construct false beliefs and stick to them. Here is a small selection:

Confirmation bias: Tendency to seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirming evidence

Hindsight bias: Tendency to reconstruct the past to fit with present knowledge

Self-justification bias: Tendency to rationalize decisions after the fact to convince ourselves that what we did was the best thing we could have done

Attribution bias: Tendency to attribute different causes to our own beliefs and actions than to those of others

Sunk-cost bias: Tendency to believe in something because of the investment already made into that belief

Status quo bias: Tendency to opt for whatever it is we are used to, that is, the status quo

Bias blind spot: Tendency to recognize the power of cognitive biases in other people but to be blind to their influence upon our own beliefs

[/box]


An especially revealing study was a neuro-imaging experiment done by American psychologist Drew Westen during the 2004 American presidential election. Westen found that both Republicans and Democrats were much more critical of the candidate of the opposite party when confronted with contradictory statements made by both candidates. Strikingly, the brain areas most active in this process were not those involved with reasoning but those associated with emotions and conflict resolution. Once the participants had arrived at a conclusion that made them emotionally comfortable, the brain’s reward area became active. Shermer concludes that instead of rationally evaluating a candidate’s position on an issue, the participants had an emotional reaction to conflicting data and got neuro-chemically rewarded after rationalizing the conflicting data away.

The reluctance to change one’s mind could ultimately be, once again, a legacy from our evolutionary past. Shermer argues that our tribal tendencies lead us to form coalitions with like-minded members of our group and to demonize others who hold differing beliefs. This effect could have supported group cohesion and thereby promoted the group’s survival. Furthermore, our faulty reasoning process could have to do with what Shermer calls ‘folk numeracy’: our natural tendency to misperceive probabilities, to think anecdotally instead of statistically, and to focus on short-term trends and small-number runs (e.g. we notice a short stretch of cool days and ignore the long-term global warming trend). When roaming the African savannah in the past, this way of thinking was probably adequate for survival, but in the modern world it can fall painfully short.
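The folk-numeracy point about short runs can be illustrated with a small simulation. This sketch is not from the book; it generates an invented decade of daily temperatures with a slow warming trend plus noise, then counts how often a week comes out cooler than the one before:

```python
# Toy illustration of "folk numeracy": within a long-term warming
# trend, short cool runs still occur constantly, and anecdotal
# thinking latches onto them. All numbers here are made up.
import random

random.seed(42)
days = 3650            # ten years of daily temperatures
trend = 0.001          # slow warming per day
temps = [20 + trend * d + random.gauss(0, 2) for d in range(days)]

# Long-term view: the last year is warmer than the first on average.
first_year = sum(temps[:365]) / 365
last_year = sum(temps[-365:]) / 365

# Short-term view: count week-long stretches cooler than the week before.
cooler_weeks = sum(
    1
    for i in range(7, days - 7, 7)
    if sum(temps[i:i + 7]) < sum(temps[i - 7:i])
)

print(f"first-year mean {first_year:.2f}, last-year mean {last_year:.2f}")
print(f"{cooler_weeks} weeks were cooler than the week before")
```

Despite an unambiguous warming trend over the decade, roughly half of all weeks come out cooler than the previous one: plenty of anecdotes for anyone looking to confirm that nothing is warming.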

Science as antidote

You might wonder how we can avoid all of these irrational belief pitfalls. According to Shermer, the best tool we have is science. Before accepting a claim, the scientific process requires an impressive number of checks and balances like control groups, double-blind tests, replication by independent labs and peer-reviewed publication. In addition, science has a built-in self-correcting mechanism where, eventually, after enough data comes in, the truth will come out.

All the more worrisome, then, that according to a 2002 survey by the National Science Foundation, 70% of Americans do not understand the scientific process (defined in the survey as grasping probability, the experimental method and hypothesis testing). To tackle this problem, Shermer recommends better communication about science in the media, and especially explaining how science works rather than only what science knows.

Mark Hoofnagle adds that conspiracy theories are often an important element of denialism because, in order to deny well-proven facts, you have to assume a huge number of people are lying. He writes that pointing out the absurdity of these theories can also be a successful strategy for convincing some deniers that they are wrong.

Unfortunately, as we have seen, most of our deeply held beliefs are immune to attack by direct educational tools, especially for those who are not ready to hear contradictory evidence. The pope won’t become an atheist anytime soon, and conservatives suddenly turning into liberals, or vice versa, are rare. Shermer concludes that belief change ultimately comes from a combination of personal psychological readiness and a deeper social and cultural shift in the underlying zeitgeist, which is affected in part by education but is more the product of harder-to-define political, economic, religious, and social changes. In other words, it can take a lifetime for people to change their minds, if they ever do.

Shermer, M. (2011). The Believing Brain: From Ghosts and Gods to Politics and Conspiracies. How We Construct Beliefs and Reinforce Them as Truths. New York: Times Books. ISBN: 978-0-8050-9125-0

Westen, D., Blagov, P., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election. Journal of Cognitive Neuroscience, 18 (11), 1947-1958 DOI: 10.1162/jocn.2006.18.11.1947

