Biased but Brilliant


By Cordelia Fine


How’s this for a cynical view of science? “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Scientific truth, according to this view, is established less by the noble use of reason than by the stubborn exertion of will. One hopes that the Nobel Prize-winning physicist Max Planck, the author of the quotation above, was writing in an unusually dark moment.

And yet a large body of psychological data supports Planck’s view: we humans quickly develop an irrational loyalty to our beliefs, and work hard to find evidence that supports those opinions and to discredit, discount or avoid information that does not. In a classic psychology experiment, people for and against the death penalty were asked to evaluate the different research designs of two studies of its deterrent effect on crime. One study showed that the death penalty was an effective deterrent; the other showed that it was not. Which of the two research designs the participants deemed the most scientifically valid depended mostly on whether the study supported their views on the death penalty.

In the laboratory, this is labeled confirmation bias; observed in the real world, it’s known as pigheadedness.

Scientists are not immune. In another experiment, psychologists were asked to review a paper submitted for journal publication in their field. They rated the paper’s methodology, data presentation and scientific contribution significantly more favorably when the paper happened to offer results consistent with their own theoretical stance. Identical research methods prompted a very different response in those whose scientific opinion was challenged.

This is a worry. Doesn’t the ideal of scientific reasoning call for pure, dispassionate curiosity? Doesn’t it positively shun the ego-driven desire to prevail over our critics and the prejudicial urge to support our social values (like opposition to the death penalty)?

Perhaps not. Some academics have recently suggested that a scientist’s pigheadedness and social prejudices can peacefully coexist with — and may even facilitate — the pursuit of scientific knowledge.

Let’s take pigheadedness first. In a much discussed article this year in Behavioral and Brain Sciences, the cognitive scientists Hugo Mercier and Dan Sperber argue that our reasoning skills are really not as dismal as they seem. They don’t deny that irrationalities like the confirmation bias are common. Instead, they suggest that we stop thinking of the primary function of reasoning as being to improve knowledge and make better decisions. Reasoning, they claim, is for winning arguments. And an irrational tendency like pigheadedness can be quite an asset in an argumentative context. A engages with B and proposes X. B disagrees and counters with Y. Reverse roles, repeat as desired — and what in the old days we might have mistaken for an exercise in stubbornness turns out instead to be a highly efficient “division of cognitive labor” with A specializing in the pros, B in the cons.

It’s salvation of a kind: our apparently irrational quirks start to make sense when we think of reasoning as serving the purpose of persuading others to accept our point of view. And as a positive side effect, these heated social interactions, when they occur within a scientific community, can lead to the discovery of the truth.
