A thoughtful correspondent writes:
I am a physician . . . I was reading an article on Vox debunking the theory that more information makes people smarter. This article referenced your study concluding that those with the most scientific literacy and technical reasoning ability were less likely to be concerned about climate change and the safety of nuclear energy.
I read the paper which shows this quite nicely.
I am confused about the conclusions. I scored a perfect score on the science literacy test and on a technical reasoning test as well. I do not believe climate change is a settled science and I believe nuclear power is the safest form of reliable energy available.
The conclusion that I am biased by my scientific knowledge is strange.
In medical experiments data are scientifically gathered and tabulated. Conclusions are used as a way to explain the data. Could an alternate conclusion be reached that scientific and reasonable people downplay the danger of climate change and nuclear power precisely because we are well informed and able to reason logically? It seems just as likely a conclusion as the one you reached, yet it was never discussed.
My response:
Thanks for these thoughtful reflections. They deserve a reciprocally reflective and earnest response.
1st, I don’t think the methods we use are useful for explaining individuals. In the study you described, the analyses identify, in large samples, patterns that furnish more support than one would otherwise have for the inference that some group-related influence or dynamic is at work that helps to explain variance in general.
One can then do additional studies, experimental in nature (like this & this), that try to furnish even more support for the inference, or less, since a valid study has to be in a position to do either.
But once one has done that, all one has is an explanation for some portion of the variance in groups of people. One doesn’t have an explanation of all the variance (the practical & not merely “statistical” significance of which is what a reflective person must assess). One doesn’t have an instrument that “diagnoses” or tells one why any particular individual believes what he or she does.
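To make the statistical point concrete, here is a minimal simulation sketch (in Python). Everything in it is invented for illustration — the variable names, the effect size, the data — none of it is drawn from any actual study. It shows how a group-identity × comprehension interaction can “explain variance” in the aggregate while saying almost nothing about any one person:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulated data, purely illustrative: a binary group identity and a
# "science comprehension" score on a 0-1 scale, plus unmodeled noise.
group = rng.integers(0, 2, n)          # hypothetical cultural group (0 or 1)
score = rng.uniform(0, 1, n)           # hypothetical comprehension score

# Risk perception: the group x score interaction drives polarization, but
# most of the variance is individual-level noise the model never sees.
risk = 0.6 * (2 * group - 1) * score + rng.normal(0, 1, n)

# Fit an ordinary least-squares model with the interaction term.
X = np.column_stack([np.ones(n), group, score, (2 * group - 1) * score])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
resid = risk - X @ beta

print(f"variance explained (R^2): {1 - resid.var() / risk.var():.2f}")  # ~0.11
print(f"residual SD for any one individual: {resid.std():.2f}")         # ~1.00
```

A modest R² like that is a real pattern at the level of groups; as a “diagnosis” of why any particular individual believes what he or she does, it is nearly worthless.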
And most important of all, you don’t have a basis for saying that anyone on any of the issues one is studying is “right” or “wrong”: to figure that out, do a valid study on the issue on which people like this disagree; then do another & another & another & another. And compare your results w/ others doing the same thing.
2d, I don’t believe the dynamic we are looking at is a “bias” per se. Things are more complicated than that, at least for me!
I’m inclined to think that the dynamics that we observe generating polarization in our studies are the very ones that normally enable people to figure out what is known by science.
They are also the very same processes that enable people to effectively use information for another of their aims, which is to form stances and positions on issues that evince commitments they care about and that connect them to others. That is a matter that is cognitively demanding as well, & of course one that most people, even ones who don’t get a perfect score on “science comprehension” tests, possess the reasoning proficiency to perform.
What to make of the situations, then, in which that same form of reasoning generates states of polarization on facts that admit of empirical inquiry is a challenging issue, conceptually, morally & psychologically. This is very perplexing to me!
I suspect sometimes it reflects a kind of interference between, or confounding of, mental operations that serve one purpose and those that serve another: that, in effect, the “science communication environment” has become degraded by conflicts between the stake people have in knowing what’s known & being who they are.
At other times, it might simply be that nothing is amiss from the point of view of the people who are polarized; they are simply treating being who they are as the thing that matters most for them in processing information on the issue in question. . . .
3d, notwithstanding all this, I don’t think our studies admit of your “alternate conclusion”: that “scientific and reasonable people downplay the danger of climate change and nuclear power precisely because we are well informed and able to reason logically.”
The reason is that that’s not what the data show. They show that those highest in one or another measure of science comprehension are the most polarized on a small subset of risk issues, including climate change.
That doesn’t tell us which side is “right” & which “wrong.”
But it tells us that we can’t rely on what would otherwise be a sensible heuristic — that the answer individuals with those proficiencies are converging on is most likely the right answer. Because again, those very people aren’t converging; on the contrary, they are the most polarized.
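The same invented setup from the earlier sketch shows why the heuristic breaks down (again, a hedged illustration on simulated data, not our actual measures): split people by comprehension score, & the gap between the groups widens instead of closing.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Same illustrative setup as before: the gap between the two hypothetical
# groups grows with the comprehension score.
group = rng.integers(0, 2, n)
score = rng.uniform(0, 1, n)
risk = 0.6 * (2 * group - 1) * score + rng.normal(0, 1, n)

# Mean between-group gap in risk perception, by comprehension quartile.
edges = np.quantile(score, [0.25, 0.5, 0.75])
quartile = np.digitize(score, edges)
for q in range(4):
    m = quartile == q
    gap = risk[m & (group == 1)].mean() - risk[m & (group == 0)].mean()
    print(f"quartile {q + 1}: between-group gap = {gap:+.2f}")

# The gap climbs from roughly +0.15 in the bottom quartile to roughly +1.05
# in the top one: the most proficient reasoners diverge the most, so
# "follow the most science-literate" singles out no answer at all.
```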
Many people write to me suggesting that an “alternative explanation” for our data is that “their side” is right.
About 50% of the time they are part of the group that is “climate skeptical” & the other half of the time the one that is “climate nonskeptical” (I have no idea what terms I’m supposed to be using for these groups at this point; if they hold a convention and vote on a preferred label, I will abide by their decision!).
I tell them every time that this can’t actually be what the data are showing, for all the reasons I’ve just spelled out.
Some fraction (only a small one, sadly) says, “ah, yes, I see.”
I can’t draw any inferences, as I said, about the relationship between their “worldviews” & how they are thinking.
I have no information about their scores on “science comprehension” or “critical reasoning” tests.
But at that point I can draw an inference about their intellectual character: that they possess the virtue of being able and willing to recognize complexity.