I know expressing exasperation doesn’t really accomplish much but:
Please stop the nonsense on our “defective brains.”
Frankly, I don’t know why journalists write, much less why newspapers and newsmagazines continue to publish, the same breathless, “OMG! Scientists have determined we’re stupid!!!” story over & over & over.
Maybe it is because they assume readers are stupid and will find the same simplistic rendering of social psychology research entertaining over & over & over.
Or maybe the writers who keep recycling this comic book account of decision science can’t grasp the grownup version of why people become culturally polarized on risk and related facts—although, honestly, it’s really not that complicated!
Look: the source of persistent controversy over risks and related facts of policy significance is our polluted science communication environment, not any defects in our rationality.
People need to (and do) accept as known by science much much much more than they could possibly understand through personal observation and study. They do this by integrating themselves into social networks—groups of people linked by cultural affinity—that reliably orient their members toward collective knowledge of consequence to their personal and collective well-being.
The networks we rely on are numerous and diverse—because we live in a pluralistic society (as a result, in fact, of the same norms and institutions that make a liberal market society the political regime most congenial to the flourishing of scientific inquiry). But ordinarily those networks converge on what’s collectively known; cultural affinity groups that failed to reliably steer their members toward the best available evidence on how to survive and live well would themselves die out.
Polarization occurs only when risks or other facts that admit of scientific inquiry become entangled in antagonistic cultural meanings. In that situation, positions on these issues will come to be understood as markers of loyalty to opposing groups. The psychic pressure to protect their standing in groups that confer immense material and emotional benefits on them will then motivate individuals to persist in beliefs that signify their group commitments.
They’ll do that in part by dismissing as noncredible or otherwise rationalizing away evidence that threatens to drive a wedge between them and their peers. Indeed, the most scientifically literate and analytically adept members of these groups will do this with the greatest consistency and success.
Once factual issues come to bear antagonistic cultural meanings, it is perfectly rational for an individual to use his or her intelligence this way: being “wrong” on the science of a societal risk like climate change or nuclear power won’t affect the level of risk that person (or anyone else that person cares about) faces, because nothing that person does as consumer, voter, public-discussion participant, etc., will be consequential enough to matter. Being on the wrong side of the issue within his or her cultural group, in contrast, could spell disaster for that person in everyday life.
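To put that cost-benefit logic in stylized form (a back-of-the-envelope sketch in my own notation, not a formal model drawn from any particular paper): let p be the probability that any one person’s vote, purchase, or argument changes the collective outcome on the risk in question; B the benefit to that person of getting the collective outcome right; and S the personal cost of being out of step with his or her cultural group. Holding the group-congruent belief is individually rational whenever

\[
p \cdot B \;<\; S,
\]

and since p is effectively zero for any single consumer or voter, the inequality holds for virtually any nonzero S. Perfectly rational individuals; perfectly dismal collective outcome.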
So, in that unfortunate situation, the better our “brains” work, the more polarized we’ll be. (BTW, what does it add to these boring, formulaic “boy, are humans dumb!” stories to say “scientists have discovered that our brains are responsible for our inability to agree on facts!!”? Where else could cognition be occurring? Our feet?!)
The number of issues that have that character, though, is minuscule in comparison to the number that don’t. Where one stands on pasteurized milk, fluoridated water, high-voltage transmission lines, “mad cow disease,” use of microwave ovens, exposure to Freon gas from refrigerators, treatment of bacterial diseases with antibiotics, the inoculation of children against Hepatitis B, etc., etc., isn’t viewed as a badge of group loyalty and commitment by the affinity groups most people belong to. Hence, there’s no meaningful amount of cultural polarization on these issues, at least in the US (meaning the pathologies are local: in Europe there might be cultural dispute over some of these issues and not over some of the ones that divide people here).
The entanglement of facts that admit of scientific investigation (e.g., “carbon emissions are heating the planet”; “deep geologic isolation of nuclear wastes is safe”) with antagonistic meanings occurs through a mixture of influences, including strategic behavior, poor institutional design, and sheer misadventure. In no such case was the problem inevitable; indeed, in most, the entanglement could easily have been avoided.
These antagonistic meanings, then, are a kind of pollution in the science communication environment. They disable the normal and normally reliable faculties of rational discernment by which ordinary individuals recognize what is collectively known.
One of the central missions of the science of science communication in a liberal democratic state is to protect the science communication environment from such contamination, and to develop means for detoxifying that environment when preventive or protective measures fail.
This is the account that is best supported by decision science.
And if you can’t figure out how to make that into an interesting story, then you are falling short of the craft norms of science journalism, whose skilled practitioners continuously enrich human experience by figuring out how to make the wonder of what’s known to science known by ordinary, intelligent, curious people.