Is intelligence a tool for propaganda or truth-seeking?: An excerpt from “The Intelligence Trap”

07 July, 2019

In his debut book, The Intelligence Trap, the author David Robson explains why smart people are vulnerable to foolish thinking. Robson writes in the introduction to the book that before beginning his career as a science journalist, he thought that intelligence was synonymous with good thinking. But later, he found serious problems with this premise and arrived at the conclusion that “general intelligence fails to protect us from various cognitive errors.”

In the following excerpt from the book, Robson argues that “smart people do not apply their superior intelligence fairly, but instead use it ‘opportunistically’ to promote their own interests and protect the beliefs that are most important to their identities.” He substantiates this by narrating an anecdote about Arthur Conan Doyle, the creator of the fictional character Sherlock Holmes, and by citing research demonstrating that smart people merely use their intelligence to rationalise their opinions. “… an intelligent person with an inaccurate belief system may become more ignorant after having heard the actual facts,” he writes. “Intelligence can be a tool for propaganda rather than truth-seeking.”

Consider how Conan Doyle was once infamously fooled by two schoolgirls. In 1917—a few years before he met Houdini—16-year-old Elsie Wright and nine-year-old Frances Griffiths claimed to have photographed a population of fairies frolicking around a stream in Cottingley, West Yorkshire. Through a contact at the local Theosophical Society, the pictures eventually landed in Conan Doyle’s hands.

Many of his acquaintances were highly sceptical, but he fell for the girls’ story hook, line and sinker. “It is hard for the mind to grasp what the ultimate results may be if we have actually proved the existence upon the surface of this planet of a population which may be as numerous as the human race,” he wrote in The Coming of Fairies. In reality, they were cardboard cut-outs, taken from Princess Mary’s Gift Book—a volume that had also included some of Conan Doyle’s own writing.

What’s fascinating is not so much the fact that he fell for the fairies in the first place, but the extraordinary lengths that he went to in order to explain away any doubts. If you look at the photographs carefully, you can even see hatpins holding one of the cut-outs together. But where others saw pins, he saw the gnome’s belly button—proof that fairies are linked to their mothers in the womb with an umbilical cord. Conan Doyle even tried to draw on modern scientific discoveries to explain the fairies’ existence, turning to electromagnetic theory to claim that they were “constructed in material which threw out shorter or longer vibrations,” rendering them invisible to humans.

As Ray Hyman, a professor of psychology at the University of Oregon, puts it: “Conan Doyle used his intelligence and cleverness to dismiss all counter-arguments . . . [He] was able to use his smartness to outsmart himself.”

The use of System 2 “slow thinking” to rationalise our beliefs even when they are wrong leads us to the most important and pervasive form of the intelligence trap, one with many disastrous consequences; it can explain not only the foolish ideas of people such as Conan Doyle, but also the huge divides in political opinion about issues such as gun crime and climate change.

So what’s the scientific evidence?

The first clues came from a series of classic studies from the 1970s and 1980s, when David Perkins of Harvard University asked students to consider a series of topical questions, such as: “Would a nuclear disarmament treaty reduce the likelihood of world war?” A truly rational thinker should consider both sides of the argument, but Perkins found that more intelligent students were no more likely to consider any alternative points of view. Someone in favour of nuclear disarmament, for instance, might not explore the issue of trust: whether we could be sure that all countries would honour the agreement. Instead, they had simply used their abstract reasoning skills and factual knowledge to offer more elaborate justifications of their own point of view.

This tendency is sometimes called the confirmation bias, though several psychologists—including Perkins—prefer to use the more general term “myside bias” to describe the many different kinds of tactics we may use to support our viewpoint and diminish alternative opinions. Even student lawyers, who are explicitly trained to consider the other side of a legal dispute, performed very poorly.

Perkins later considered this to be one of his most important discoveries. “Thinking about the other side of the case is a perfect example of a good reasoning practice,” he said. “Why, then, do student lawyers with high IQs and training in reasoning that includes anticipating the arguments of the opposition prove to be as subject to confirmation bias or myside bias, as it has been called, as anyone else? To ask such a question is to raise fundamental issues about conceptions of intelligence.”

Later studies have only replicated this finding, and this one-sided way of thinking appears to be a particular problem for the issues that speak to our sense of identity. Scientists today use the term “motivated reasoning” to describe this kind of emotionally charged, self-protective use of our minds. Besides the myside or confirmation bias that Perkins examined (where we preferentially seek and remember the information that confirms our view), motivated reasoning may also take the form of a disconfirmation bias—a kind of preferential scepticism that tears down alternative arguments. And, together, they can lead us to become more and more entrenched in our opinions.

Consider an experiment by Dan Kahan at Yale Law School, which examined attitudes to gun control. He told his participants that a local government was trying to decide whether to ban firearms in public—and that it was unsure whether this would increase or decrease crime rates. So it had collected data on cities with and without these bans, recording how crime had changed in each over one year.

Kahan also gave his participants a standard numeracy test, and questioned them on their political beliefs.

Try it for yourself. Given such data, do the bans work?

Kahan had deliberately engineered the numbers to be deceptive at first glance, suggesting a huge decrease in crime in the cities carrying the ban. To get to the correct answer, you need to compare the ratios, which show that around 25 percent of the cities with the ban had witnessed an increase in crime, compared with 16 percent of those without a ban. The ban did not work, in other words.
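Kahan’s table is not reproduced in this excerpt, but the logic of the comparison is easy to sketch. The counts below are purely hypothetical, invented only so that the proportions come out near the 25 and 16 percent figures quoted above; they are not the actual study data.

```python
# Illustrative counts only -- the excerpt does not reproduce Kahan's table,
# so these numbers are hypothetical, chosen so the proportions land near the
# "around 25 percent" and "16 percent" figures quoted in the text.

ban = {"crime_increased": 75, "crime_decreased": 225}     # cities with the ban
no_ban = {"crime_increased": 16, "crime_decreased": 84}   # cities without the ban

def share_with_increase(group: dict) -> float:
    """Fraction of cities in a group where crime rose over the year."""
    total = group["crime_increased"] + group["crime_decreased"]
    return group["crime_increased"] / total

p_ban = share_with_increase(ban)        # 75 / 300 = 0.25
p_no_ban = share_with_increase(no_ban)  # 16 / 100 = 0.16

print(f"Crime rose in {p_ban:.0%} of cities with the ban")
print(f"Crime rose in {p_no_ban:.0%} of cities without the ban")

# The first-glance trap: the raw count of cities where crime fell looks far
# better for the ban (225 vs 84) only because that group is larger here.
# Comparing proportions shows crime was MORE likely to rise under the ban.
print("Ban looks effective?", p_ban < p_no_ban)  # False
```

With numbers like these, the absolute tally of cities where crime fell looks far better for the ban, which is exactly the quick, intuitive reading that a deceptive-at-first-glance design exploits; only the proportions give the right answer.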

As you might hope, the more numerate participants were more likely to come to that conclusion—but only if they were conservative, Republican voters who were already more likely to oppose gun control. If they were liberal, Democrat voters, participants skipped the explicit calculation and were more likely to go with their (incorrect) initial hunch that the ban had worked, no matter how intelligent they were.

In the name of fairness, Kahan also conducted the same experiment with the figures reversed, so that the data supported the ban. This time, it was the numerate liberals who came to the right answer—and the numerate conservatives who were more likely to be wrong. Overall, the most numerate participants were around 45 percent more likely to read the data correctly if it conformed to their expectations.

The upshot, according to Kahan and other scientists studying motivated reasoning, is that smart people do not apply their superior intelligence fairly, but instead use it “opportunistically” to promote their own interests and protect the beliefs that are most important to their identities. Intelligence can be a tool for propaganda rather than truth-seeking.

It’s a powerful finding, capable of explaining the enormous polarisation on issues such as climate change. The scientific consensus is that carbon emissions from human sources are leading to global warming, and people with liberal politics are more likely to accept this message if they have better numeracy skills and basic scientific knowledge. That makes sense, since these people should also be more likely to understand the evidence. But among free-market capitalists, the opposite is true: the more scientifically literate and numerate they are, the more likely they are to reject the scientific consensus and to believe that claims of climate change have been exaggerated.

The same polarisation can be seen for people’s views on vaccination, fracking and evolution. In each case, greater education and intelligence simply helps people to justify the beliefs that match their political, social or religious identity. (To be absolutely clear, overwhelming evidence shows that vaccines are safe and effective, carbon emissions are changing the climate and evolution is true.)

There is even some evidence that, thanks to motivated reasoning, exposure to the opposite point of view may actually backfire; not only do people reject the counter-arguments, but their own views become even more deeply entrenched as a result. In other words, an intelligent person with an inaccurate belief system may become more ignorant after having heard the actual facts. We could see this with Republicans’ opinions about Obamacare in 2009 and 2010: people with greater intelligence were more likely to believe claims that the new system would bring about Orwellian “death panels” to decide who lived and died, and their views were only reinforced when they were presented with evidence that was meant to debunk the myths.

Kahan’s research has primarily examined the role of motivated reasoning in political decision-making—where there may be no right or wrong answer—but he says it may stretch to other forms of belief. He points to a study by Jonathan Koehler, then at the University of Texas at Austin, who presented parapsychologists and sceptical scientists with data on two (fictional) experiments into extrasensory perception.

The participants should have objectively measured the quality of the papers and the experimental design. But Koehler found that they often came to very different conclusions, depending on whether the results of the studies agreed or disagreed with their own beliefs in the paranormal.

This is an edited excerpt from David Robson’s book, The Intelligence Trap: Why Smart People Make Stupid Mistakes—and How to Make Wiser Decisions, published by Hachette India.