Everyone Believes Weird Things

It’s a vaccine that needs daily boosters.

In an interview with Stephen Colbert, Neil deGrasse Tyson made an interesting comment that has since gone viral as an internet meme. Colbert asked Tyson if science was a way of approaching the world, and he responded,

It’s not only a way of approaching the world, it’s a way of equipping yourself to interpret what happens in front of you. I think of science – the methods and tools that enable it – as kind of like a utility belt that you walk around with. …Science literacy is a vaccine against charlatans of the world that would exploit your ignorance.

This last remark merits consideration both because of its huge popularity and because of how manifestly untrue it is. We should be wary of this kind of self-congratulatory attitude, because it blinds intelligent people to the possibility–or rather, the inevitability–that they could be wrong. Smartness is not an inoculation against magical thinking.

My attention was first drawn to this fact by none other than Tyson himself, at the first Beyond Belief conference in 2006. Tyson gave a talk called “The Perimeter of Ignorance,” and in it he made several very interesting observations. The first came from a passage at the very end of Isaac Newton’s Principia, in which Newton, perhaps the greatest scientific mind in history, invokes an intelligent design argument to explain how all the planets could be set in motion, all travelling in the same direction, all on the plane of the ecliptic, without either crashing into the sun or flying off into deep space. His laws of motion brilliantly explained everything else about the orbits of the planets, but he had reached the perimeter of his ignorance.

This is Isaac Newton invoking intelligent design, at the limits of his knowledge. And I want to put on the table the fact that you have school systems wanting to put intelligent design into the classroom, but you also have the most brilliant people who have ever walked this earth doing the same thing. So it’s a deeper challenge than merely educating the public.

Tyson carries the point further:

There’s 90 percent of the American public that believes in a personal God that responds to their prayers. Then you ask, what is that percentage for scientists? Averaged over disciplines it’s about 40 percent. Then say, how about the elite scientists, members of the National Academy of Sciences? An article on those data, recently in Nature, said “Eighty-five percent of the [NAS] reject a personal God.” …[But], you know, that’s not the story there. They missed the story. What that article should have said is, “how come this number isn’t zero?”

My esteemed colleague here, Professor Krauss, says all we need to do is make a scientifically literate public. Well, when you do, how can they do better than the scientists themselves? That’s kind of unrealistic, I think. So there’s something else going on that nobody seems to be talking about, so that, yes, as you become more scientific, religiosity drops off… but not [to] zero. So it’s not that 85 percent reject, it’s that 15 percent of the most brilliant minds the nation has accepts it.

When 1 in 7 of the best scientists in the world believe in a personal God who listens to prayers and does miracles, this should give us pause. Education and intelligence are clearly not sufficient to immunize a person against nonscientific beliefs. Some will object to using religiosity as a measure of irrationality, but I’m only using Tyson’s own argument here. If you don’t like the example of religion, pick any belief–political, scientific, economic–that you consider to be irrational, and you will find substantial numbers of smart, educated, wealthy people who believe it.

The cognitive missteps, biases, and errors in reasoning that cause people to believe irrational things are not the result of simple ignorance. Things like confirmation bias, where people seek out information that confirms beliefs they already have and reject or ignore facts that don’t fit their preconceptions, are deeply ingrained, universal, chronic problems of the human mind. They cannot be wholly cured by education, and even people educated about cognitive biases and trained to correct for these errors are still subject to making them.

In his book Why People Believe Weird Things, Michael Shermer concludes with a section on why smart people believe weird things. Smart people, he says, believe strange things for the same reasons everyone else does, but with the added complication that, “Smart people… are skilled at defending beliefs they arrived at for non-smart reasons.” In other words, they make the same mistakes as everyone else, but because they are smart, they are better at rationalizing their beliefs.

We are only partially rational, capable both of making logical arguments and of succumbing to irrational delusions. The real danger comes when people do both at the same time, making valid arguments from false premises–premises they are unwilling to examine. Overconfidence and a veneer of plausibility can convince people of almost anything, if their premises are sufficiently flawed. And, in the words of Voltaire, “Those who can make you believe absurdities can make you commit atrocities.”

It’s almost impossible to predict the liabilities of dogma, beliefs we accept without criticism or in the very teeth of reasonable doubt. The belief that there are souls in 32-cell human embryos does not seem, a priori, to be a dangerously irrational belief, but a plausible chain of reasoning could lead from this idea to the murder of scientists and doctors conducting stem cell research. Are anti-abortion fanatics crazy? To the contrary, they are making rational decisions in light of certain irrational beliefs, and while it would be comforting to dismiss them as nuts, they are as sane as you or I.

Mohamed Atta, Ph.D.

Consider the phenomenon of Islamic suicide terrorism. The typical explanation for men like the 9/11 hijackers is that they are either evil, in some cartoonish sense of hating good and loving destruction, or that they are poor and ignorant, or that they’re simply “crazy.” The overwhelming response of Western liberals to the chronic problem of suicide terrorism is to insist that it is the result of poor education, colonialism, or poverty in the Muslim world.

I’m often tempted to wonder how many more architects and engineers need to hit the wall at 500 miles an hour before we admit that this is not merely a problem of economics or education. How many more middle class college graduates and practicing neurosurgeons need to blow themselves up on buses and trains before we accept that there is more to the issue than lack of opportunity? It is entirely possible for a human mind to maintain the cognitive dissonance necessary to be both a scientist and a terrorist. The 19 hijackers were not insane, they were not idiots, and they were not even suicidal: they were behaving exactly as you would expect them to, given a specific set of beliefs and circumstances.

In a two-hour round table discussion titled “The Four Horsemen,” author Sam Harris hammers this point home:

It’s true to say that you can go through the curriculum of becoming a scientist and never have your faith explicitly challenged, because it’s taboo to do so. We have engineers who can build nuclear bombs in the Muslim world, who still think it’s plausible metaphysics that you can get to paradise and get seventy-two virgins, and we have people like Francis Collins who think that on Sunday you can kneel down in the dewy grass and give yourself to Jesus because you are in the presence of a frozen waterfall, and on Monday you can be a physician-geneticist.

The issue that Harris doesn’t address is that you can make the leap from frozen waterfall to belief in Jesus and still be an excellent scientist–even the head of the Human Genome Project or the National Institutes of Health. It’s entirely possible to live your life with two sets of books, using both reason and a hodgepodge of other mental heuristics to form your beliefs, and it’s even possible that a certain level of cognitive dissonance could be necessary just to get up in the morning. Confirmation bias is one cognitive trick where you selectively accumulate evidence and arguments in favor of what you already believe, while ignoring mountains of disconfirmatory data, thus giving a superficially plausible exterior to a silly set of ideas. Creationists, for example, have literally gotten confirmation bias down to a science: creation science.

The fundamental issue we have to deal with is that it is not only possible but inevitable that we will live our lives and form our beliefs in these incompatible ways. It’s not merely people on the fringes of society, the people who believe especially “weird” things, like religious fundamentalists or conspiracy theorists: it’s all of us, including skeptics. That is the reason it is so troubling to see this meme travelling around the skeptic community. It’s our own special brand of confirmation bias, congratulating ourselves for being impervious to false beliefs.

Our status as skeptics, scientists, rationalists, atheists, or whatever sort of scientifically literate people we hold ourselves to be, does not vaccinate us against bad ideas or illogical arguments. By imagining that we are immune to the type of irrational thinking we criticize in others, we leave ourselves especially vulnerable to the follies of human conceit. We don’t know, nor do we spend enough time meditating on, which of our own treasured beliefs will be shown to be unreasonable or unfounded.

Shermer:

It is a given assumption in the skeptical movement—elevated to a maxim really—that intelligence and education serve as an impenetrable prophylactic against the flim flam that we assume the unintelligent and uneducated masses swallow with credulity. Indeed, at the Skeptics Society we invest considerable resources in educational materials distributed to schools and the media under the assumption that this will make a difference in our struggle against pseudoscience and superstition. These efforts do make a difference, particularly for those who are aware of the phenomena we study but have not heard a scientific explanation for them. But are the cognitive elite protected against the nonsense that passes for sense in our culture? Is flapdoodle the fodder for only fools? The answer is no.

 …Most of us most of the time come to our beliefs for a variety of reasons having little to do with empirical evidence and logical reasoning (that, presumably, smart people are better at employing). Rather, such variables as genetic predispositions, parental predilections, sibling influences, peer pressures, educational experiences, and life impressions all shape the personality preferences and emotional inclinations that, in conjunction with numerous social and cultural influences, lead us to make certain belief choices. Rarely do any of us sit down before a table of facts, weigh them pro and con, and choose the most logical and rational belief, regardless of what we previously believed. Instead, the facts of the world come to us through the colored filters of the theories, hypotheses, hunches, biases, and prejudices we have accumulated through our lifetime. We then sort through the body of data and select those most confirming what we already believe, and ignore or rationalize away those that are disconfirming.

All of us do this, of course, but smart people are better at it through both talent and training.

Skepticism is an attitude and a philosophy, not a belief. It is questioning every assumption and taking nothing for granted. We can set up mechanisms like peer review to try to check our cognitive biases and intellectual prejudices, but these only serve to enhance the credibility of the collective body of knowledge we call science; we generally do not subject our own beliefs and decisions to peer review and public criticism. We cannot live our lives as scientists, so instead we use a series of mental shortcuts that work more or less well at getting us through the day. Because we cannot be sure of our beliefs, we must be especially attentive to the idea that our convictions are less rational than we suppose. The more important and far-reaching the belief, the more scrutiny we should give it. It is this humility that matters most, and that can only be learned–not taught.

Daniel Bier

Daniel Bier is the founder and editor-at-large of The Skeptical Libertarian. He writes on issues relating to science, skepticism, and economic freedom, focusing on the role of evolution in social and economic development.