Political psychologist Brendan Nyhan and his collaborators Jason Reifler & Peter Ubel just published a really cool paper in Medical Care entitled “The Hazards of Correcting Myths About Health Care Reform.” It shows just how astonishingly resistant the disease of ideologically motivated reasoning is to treatment with accurate information. And like all really good studies, it raises some really interesting questions.
NRU conducted an experiment on the effect of correcting factually erroneous information that originated from a partisan source. Two groups of subjects received a news article reporting false assertions by Sarah Palin about the role of “death panels” in the Obamacare national health plan. One group additionally received a news story reporting that “nonpartisan health care experts have concluded that Palin was wrong.” NRU then compared the perceptions of the two groups.
Well, one thing they found is that the more subjects liked Palin, the more likely they were to believe her bogus “death panel” claims. Sure, not a big surprise.
They also found that the impact of being shown the “correction” was conditional on how much subjects liked Palin: the more they liked her, the less they credited the correction. Cool, but again not startling.
What was mind-blowing, however, was the interaction of these effects with political knowledge. As subjects became more pro-Palin in their feelings, high political knowledge subjects did not merely discount the “correction” by a larger amount than low political knowledge ones. Being exposed to the “nonpartisan experts say Palin wrong” message actually made high-knowledge subjects with pro-Palin sentiments credit her initially false statements even more strongly than their counterparts in the “uncorrected” or control condition!
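To make the shape of that three-way interaction concrete, here is a minimal simulation sketch in Python. Every coefficient and variable name is invented purely for illustration (this is not NRU’s model or their effect sizes); the sketch preserves only the qualitative pattern just described.

```python
# Hypothetical sketch of the NRU-style "backfire" pattern.
# All numbers are invented for exposition, not estimates from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
favorability = rng.uniform(0, 1, n)        # 0 = dislikes Palin, 1 = likes her
high_knowledge = rng.integers(0, 2, n)     # 0 = low political knowledge, 1 = high
corrected = rng.integers(0, 2, n)          # 0 = control, 1 = saw the correction

# Baseline belief in the "death panel" claim (0-1): liking Palin raises it.
baseline = 0.2 + 0.5 * favorability

# Low-knowledge subjects: the correction reduces belief, but the reduction
# shrinks as favorability rises (they discount the correction).
low_k = -0.15 * (1.0 - favorability)

# High-knowledge subjects: the correction's effect flips sign among strong
# Palin supporters -- it *increases* belief in the debunked claim.
high_k = -0.15 + 0.40 * favorability

effect = np.where(high_knowledge == 1, high_k, low_k)
belief = np.clip(baseline + corrected * effect + rng.normal(0, 0.05, n), 0, 1)

# In the high-knowledge, high-favorability cell, the corrected group ends up
# believing the claim MORE than the uncorrected control group.
cell = (high_knowledge == 1) & (favorability > 0.8)
print("control:  ", round(belief[cell & (corrected == 0)].mean(), 3))
print("corrected:", round(belief[cell & (corrected == 1)].mean(), 3))
```

Running it prints a higher mean belief for the corrected group in that cell, which is the qualitative signature of the backfire result.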
The most straightforward interpretation is that for people who have the sort of disposition that “high political knowledge” measures, the “fact check”-style correction itself operated as a cue that the truth of Palin’s statements was a matter of partisan significance, thereby generating unconscious motivation in them to view her statements as true.
That’s singularly awful.
There was already plenty of reason to believe that just bombarding people with more and more “sound information” doesn’t neutralize polarization on culturally charged issues like climate change, gun control, nuclear power, etc.
There was also plenty of reason to think that individuals who are high in political knowledge are especially likely to display motivated reasoning and thus to be especially resistant to a simple “sound information” bombardment strategy.
But what NRU show is that things have become so bad in our polarized society that trying to correct partisan-motivated misperceptions of facts can actually make things worse! Responding to partisan misinformation with truth is akin to trying to douse a grease fire with water!
But really, I’d say that the experiment shows only how bad things can potentially get.
First, the NRU experimental design, like all experimental designs, is a model of real-world dynamics. I’d say the real-world setting it is modeling is one in which an issue is exquisitely fraught; Palin & Obamacare are each flammable enough on their own, so when you mix them together you’ve created an atmosphere just a match strike away from an immense combustion of ideologically motivated reasoning.
Still, there is plenty of reason to believe that there are conditions, issues, etc. like that in the world. So the NRU model gives us reason to be very wary of rushing around trying to expose “lies” as a strategy for correcting misinformation. At least sometimes, the study cautions, you could be playing right into the misinformer’s hands.
Actually, I think this is the scenario on the minds of those who’ve reacted negatively to the proposed use of climate change “truth squads”: SWAT teams of expert scientists who would be deployed to slap down every misrepresentation of climate science. The NRU study gives more reason to believe that those who didn’t like this proposal were right: such a device would likely only amplify the signal on which polarization feeds.
Second, interpreting NRU depends in part on what “political knowledge” is actually measuring.
“Political knowledge,” measured essentially with a civics quiz, is well known to amplify partisanship.
But why exactly?
The usual explanation is that people who are “high” in political knowledge literally just know more and hence assign political significance to information in a more accurate and reliable way. This by itself doesn’t sound so bad. People’s political views should reflect their values, and if getting the right fit requires information, then the “high” political knowledge individuals are engaged in better reasoning. Low-knowledge people bumble along and thus form incoherent views.
But that doesn’t seem satisfying when one examines how political knowledge can amplify motivated reasoning. When people engage in ideologically motivated reasoning, they give information the effect that gratifies their values independently of whether doing so generates accurate beliefs. Why would knowing more about political issues make people reason in this biased way?
Another explanation would be that “political knowledge” is actually measuring the disposition to define oneself in partisan terms. In that case, it would make sense to think of high knowledge as diagnostic or predictive of vulnerability to ideologically motivated reasoning. People with strong partisan identities are the ones who experience strong unconscious motivation to use what they know in a way that reinforces conclusions that are ideologically congenial.
Moreover, in that case, being low in “political knowledge” arguably makes one a better civic reasoner. Because one doesn’t define oneself so centrally with respect to one’s ideology or party membership, one gives information weight that is more reliably tied to its connection to truth. Indeed, in NRU the “low knowledge” subjects seemed to be responding to “corrections” of misinformation in a normatively more desirable way, assuming what we desire is the reliable recognition and open-minded consideration of valid evidence.
I would say that the “partisan identity” interpretation of political knowledge is almost certainly correct, but that the “knows more, reasons better” interpretation is likely correct too. The theoretical framework that informs cultural cognition asserts that it is rational for people to process politically charged information in a manner that reliably connects their beliefs to those that predominate in their group, because the cost of being “out of synch” on a contentious matter is likely to be much higher than the cost of being “wrong”—something that on most political issues is costless to individuals, given how little impact their personal beliefs have on policymaking. If so, we should expect people who “know more” and “reason better” to be more reliable in “figuring out” what the political significance of information is—and thus more likely to display motivated reasoning.
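A back-of-the-envelope way to state that tradeoff (the formalization and symbols are mine, not anything that appears in the NRU paper or the cultural cognition literature):

```latex
% Hypothetical formalization; the symbols are mine, not from any cited paper.
% c_group : social cost of being "out of synch" with one's cultural group
% c_wrong : personal cost of holding a factually mistaken policy belief
% p       : probability that one's own belief affects the policy outcome
%
% Identity-protective belief formation is individually rational whenever
%
%     c_group > p * c_wrong
%
% and since p is effectively zero for any single citizen, the inequality
% holds almost regardless of how large c_wrong is.
\[
  c_{\mathrm{group}} > p \cdot c_{\mathrm{wrong}},
  \qquad p \approx 0 \implies \text{conforming beliefs are individually rational.}
\]
```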
In support of this, I’d cite two CCP studies. The first showed that individuals who have higher levels of science comprehension are more likely to polarize on climate change. The second showed that individuals who are higher in “cognitive reflection,” as measured by the Cognitive Reflection Test (CRT), display an even greater tendency to engage in culturally or ideologically motivated reasoning when evaluating information.
These studies belie any interpretation of NRU on which the “low knowledge” subjects are reasoning in a higher-quality way simply because they are not displaying motivated cognition. In truth, higher-quality reasoning makes motivated reasoning worse.
Because it is rational for people to fit their perceptions of risk and other policy-consequential facts to their identities (indeed, because this is integral to their capacity to participate in collective knowledge), the way to avert political conflict over policy-relevant science isn’t to flood the political landscape with “information.” It is to protect the science communication environment from the antagonistic social meanings that are the source of the conflict between the individual interest people have in forming and expressing commitment to particular cultural groups and the collective interest the members of all such groups have in converging on the best available evidence of how to secure their common ends.
What gives me pause, though, is an amazingly good book that I happen to be reading right now: The Ambivalent Partisan by Lavine, Johnston & Steenbergen. LJS report empirical results identifying a class of people who don’t define themselves in strongly partisan terms, who engage in high-quality reasoning (heuristic and systematic) when examining policy-relevant evidence, and who are largely immune to motivated reasoning.
That would make these ambivalent partisans models of civic virtue in the Liberal Republic of Science. I suppose it would mean too that we ought to go on a crash program to study these people and see if we could concoct a vaccine, or perhaps a genetic modification procedure, to inculcate these dispositions in others. And more seriously still (to me at least!), such findings might suggest that I need to completely rethink my understanding of cultural cognition as integral to rational engagement with information at an individual level. . . . I will give a fuller report on LJS in due course.
I can report for now, though, that NRU & LJS have both enhanced my knowledge and made me more confused about things I thought I was figuring out.
Important contributions to scholarly conversation tend to have exactly that effect!
References
Delli Carpini, M.X. & Keeter, S. What Americans Know About Politics and Why It Matters. (Yale University Press, New Haven; 1996).
Hovland, C.I. & Weiss, W. The Influence of Source Credibility on Communication Effectiveness. Public Opinion Quarterly 15, 635-650 (1951-52).
Kahan, D. Fixing the Communications Failure. Nature 463, 296-297 (2010).
Kahan, D. Ideology, Cognitive Reflection, and Motivated Cognition, CCP Working Paper No. 107 (Nov. 29, 2012).
Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).
Kahan, D.M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L.L., Braman, D. & Mandel, G. The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change 2, 732-735 (2012).
Lavine, H., Johnston, C.D. & Steenbergen, M.R. The Ambivalent Partisan: How Critical Loyalty Promotes Democracy. (Oxford University Press, New York, NY; 2012).
Nyhan, B., Reifler, J. & Ubel, P.A. The Hazards of Correcting Myths About Health Care Reform. Medical Care (published ahead of print), doi: 10.1097/MLR.0b013e318279486b.
Zaller, J.R. The Nature and Origins of Mass Opinion. (Cambridge Univ. Press, Cambridge, England; 1992).