Gave a lecture/workshop today at Cambridge. It was advertised as a session on the CCP working paper, “Motivated Numeracy and Enlightened Self-Government.” It was—but I added some context and motivation. Outline of what I remember saying below & slides here. Lots of great questions & comments after—on issues from the influence of cultural cognition on scientists to the relative potential impact of fear & curiosity in fortifying critical reasoning dispositions!
I. What’s the point? The “Motivated Numeracy” study is the latest (more or less) installment in a series intended to make sense of and maybe help solve the science communication problem. The “science communication problem” refers to the failure of valid, compelling, and widely accessible scientific evidence to dispel public controversy over risks and other policy-relevant facts. Climate change is a salient instance of the problem but is not the only one. The conflict between public and expert views on the safety of nuclear power once attracted nearly as much attention. There are other contemporary instances of the science communication problem, too, including the controversy over mandatory HPV vaccination in the US and GM foods in Europe (but actually not in the US).
II. Two theories. What accounts for the science communication problem? One explanation, the “public irrationality thesis” (PIT), attributes public controversy over climate change and other societal risks to the public’s limited capacity to comprehend science. On this account, the problem is only in part one of a “knowledge deficit”; more important is a deficit in critical reasoning. Members of the public rely excessively on largely unconscious, heuristic-driven forms of information processing and thus overestimate more emotionally compelling dangers—such as terrorism—relative to less evocative ones like climate change, which the conscious, analytic modes of risk analysis used by experts show to be even more consequential. Informed by Kahneman’s “system 1/system 2” conception of dual-process reasoning, PIT is more or less the dominant account in popular and academic commentary.
Another account of the science communication problem is the “cultural cognition thesis” (CCT). Cultural cognition involves the tendency of individuals to conform their perceptions of risk and other policy-relevant facts to the positions that are dominant in the affinity groups that play a central role in organizing their day-to-day lives. As a species of motivated reasoning, CCT is distinguished by its use of Mary Douglas’s “cultural worldview” framework to specify the core commitments of the affinity groups that shape information processing. It is distinguished from other conceptions of the “cultural theory of risk” by its attempt to root the influence that group commitments of this sort exert on risk perceptions in cognitive mechanisms that admit of empirical investigation by the methods featured in social psychology and related disciplines.
III. Three studies. Motivated Numeracy describes the third in a series of studies dedicated to investigating the relationship between PIT and CCT. The first study, an observational one that examined the climate-change risk perceptions of a large nationally representative sample, made two findings at odds with PIT.
The first finding had to do with the impact of science comprehension on the perceived risk of climate change. If, as PIT asserts, the reason that the average member of the public is less concerned with climate change risks than he or she should be is that he or she lacks the capacity to make sense of scientific evidence, then one would expect people to become more concerned about climate change as their science literacy and quantitative reasoning abilities increase. But this isn’t so: the study found that the impact of these attributes on perceived climate-change risk was close to zero for the sample as a whole.
The second finding contrary to PIT had to do with the relationship between science comprehension and cultural cognition. PIT views cultural cognition as just another heuristic substitute for the capacity to understand and give proper effect to scientific evidence of risk: those who can are reliably guided by the best available evidence; those who can’t must go with their gut, which is filled with crap like “what do people like me believe?” If this position is correct, one would expect the risk perceptions of culturally diverse individuals to be progressively less correlated with their groups’ positions and more correlated across groups as their science comprehension capacity increases.
But not so. On the contrary, cultural polarization, the first study found, increases as science comprehension does.
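The shape of this finding can be sketched with a toy model (all numbers below are invented for illustration and are not the study’s data): if the influence of cultural worldview on risk perception grows with science comprehension, the gap between opposing groups widens rather than shrinks.

```python
# Toy illustration (invented numbers) of the interaction the first study
# found: the gap between culturally opposed groups' climate-risk
# perceptions widens as science comprehension increases.

def perceived_risk(worldview, comprehension):
    """worldview: +1 or -1 (opposing cultural groups);
    comprehension: 0.0 (low) to 1.0 (high) science-comprehension score."""
    baseline = 5.0  # midpoint of a hypothetical 0-10 risk scale
    # Worldview's pull on perceived risk grows with comprehension.
    return baseline + worldview * (1 + 4 * comprehension)

gap_low  = perceived_risk(+1, 0.0) - perceived_risk(-1, 0.0)  # 2.0
gap_high = perceived_risk(+1, 1.0) - perceived_risk(-1, 1.0)  # 10.0
# Polarization grows with comprehension -- the opposite of what PIT predicts.
```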
Why? The CCT explanation is that individuals are using their knowledge of and capacity to reason about scientific evidence to form and persist in beliefs that reflect their group identities.
The second study used experimental methods to test this hypothesis. The study found, consistent with CCT, that individuals who display the strongest disposition for cognitive reflection—a habit of mind associated with conscious, effortful system 2 reasoning—are more likely to discern the ideological implications of conceptually complicated information and selectively credit or reject it depending on its congeniality to their cultural outlooks.
The third and final study—the one whose results are reported in “Motivated Numeracy”—likewise used an experimental design to assess whether individuals can be expected to use their critical reasoning dispositions in a manner that promotes identity-congruent rather than truth-congruent beliefs. The study examined the interaction of right-left ideology (an alternative way to measure the group affinities that generate cultural cognition) with numeracy, a quantitative reasoning capacity associated with “system 2” information processing.
Subjects were instructed to examine a problem known to measure their vulnerability to a defective heuristic alternative to the proper assessment of covariance. The problem involved assessing whether the results of an experiment supported or refuted a hypothesis. For subjects in the “control group,” this problem was styled as one involving the effectiveness of a new skin-rash treatment. As expected, only the most highly numerate subjects were likely to interpret the experimental data correctly.
Another version of the problem was styled as an experiment involving the effectiveness of a ban on carrying concealed weapons. In this condition, high-numerate subjects again did much better than low-numerate ones, but only when the data, properly construed, generated an ideologically congenial result. When the data, properly construed, supported an ideologically uncongenial result, high-numerate subjects latched onto the incorrect but ideologically satisfying heuristic alternative to the logical analysis required to solve the problem correctly.
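A covariance problem of this kind can be sketched as follows (the cell counts are invented for illustration; they are not the study’s). The defective heuristic compares raw counts in the “got better” column; the correct analysis compares the rate of improvement across rows:

```python
# Illustrative 2x2 covariance-detection problem of the kind described
# above. Cell counts are made up for illustration, not the study's data:
#
#                 rash got better   rash got worse
#   used cream          223               75
#   no cream            107               21

better_treat, worse_treat = 223, 75
better_control, worse_control = 107, 21

# Defective heuristic: compare raw "got better" counts (223 > 107),
# which makes the cream look effective.
heuristic_says_works = better_treat > better_control  # True

# Correct analysis: compare the *rate* of improvement in each row.
rate_treat = better_treat / (better_treat + worse_treat)          # ~0.75
rate_control = better_control / (better_control + worse_control)  # ~0.84

cream_works = rate_treat > rate_control  # False: untreated did better
```

With these numbers the heuristic and the logically correct answer point in opposite directions, which is what makes the item diagnostic of whether a subject actually performs the ratio comparison.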
Because high-numeracy subjects used their quantitative reasoning powers selectively to credit evidence that low-numeracy subjects could not reliably interpret, high-numeracy subjects ended up more likely on average to disagree with one another than low-numeracy ones. The impact of science comprehension in magnifying cultural polarization on climate change is consistent with exactly this pattern of ideologically opportunistic critical reasoning.
IV. One synthesis. The studies investigating the interaction of PIT and CCT support (provisionally, as always!) a cluster of interrelated descriptive, normative, and prescriptive conclusions.
A. The tragedy of the science communication commons. The science communication problem is a result not of too little rationality but rather too much. Because the beliefs and actions of any ordinary individual member of the public can’t affect climate change, neither she nor anyone she cares about will be put at risk if she makes a mistake in interpreting the best available evidence. But if such a person forms a position that is out of keeping with the dominant one in her affinity group, the consequences—in estrangement from those she depends on for support—can be extremely detrimental. It thus is individually rational for individuals to attend to information on societal risks in a manner that more reliably connects their beliefs to those shared by others with their defining outlooks than to the best available evidence. The more proficient they are in reasoning about scientific evidence, moreover, the more successful they’ll be in forming and persisting in such beliefs.
Such behavior, however, is collectively irrational. If all individuals pursue it simultaneously, they will not converge, or will not converge as quickly as they should, on valid evidence essential to their welfare. Yet this predictable consequence will not change the psychic incentive that any individual faces to form group- rather than truth-convergent beliefs.
The science communication problem thus involves a distinctive form of collective action problem—a tragedy of the science communications commons.
B. Pathological meanings. The signature attributes of the science communication problem—cultural polarization magnified by science comprehension—are not normal. The number of risk perceptions and like beliefs that display this pattern relative to the number that do not is tiny. On issues from fluoridation of water to the safety of medical X-rays, the most science-comprehending individuals do converge on the best available evidence, pulling along those who share their cultural outlooks. This process of knowledge transmission breaks down only when positions on disputed issues become symbols of membership in and loyalty to competing groups—at which point the stake ordinary individuals have in forming group-convergent beliefs will systematically dominate the stake they have in forming truth-convergent ones.
This sort of entanglement of risk perceptions and culturally antagonistic meanings is a pathology—both in the sense of being harmful and in the sense of being unusual or opposed to the normal, healthy functioning of collective belief formation.
C. “Scicomm environment protection” as a public good. The health of a democratic society depends on the quality of the science communication environment just as the health of its members depends on the quality of the natural one. Antagonistic cultural meanings are a form of pollution in the science communication environment that disables the exercise of the rational faculties that ordinary citizens normally and reliably use to discern what’s known to science. Protecting the science communication environment from this toxin is a public good essential to enlightened self-government.
By using reason, we can protect reason from the distinctive threats that the science communication problem comprises.