Weekend update: “Culture is prior to fact” & what that implies about resolving political conflict over risk

The idea that cultural cognition and related dynamics are peculiar to “unsettled” issues, or ones where the scientific evidence is not yet “clearly established,” is a recurring theme.  For some reason, the recent “What exactly is going on in their heads?” post has stimulated many commentators — in the discussion thread & in correspondence — to advance this claim.  In fact, that view is at odds with the central tenet of cultural cognition as a research program.

The cultural cognition thesis asserts that “culture is prior to fact” in a cognitive sense: the capacity of individuals to recognize the validity of evidence on risks and like policy-relevant facts depends on cognitive faculties that are themselves oriented by cultural affiliations. Because it is cultural norms and practices that certify evidence as having the qualities that entitle it to be credited under science’s own criteria of valid proof, ordinary members of the public won’t be able to recognize that scientific evidence is “clear” or “settled” unless doing so is compatible with their cultural identities.

Below I reproduce one relatively early formulation of this position. It is from Kahan, D.M. & Braman, D., Cultural Cognition of Public Policy, Yale J. L. & Pub. Pol’y 24, 147-170 (2006).

In this essay, Don “Shotgun” Braman & I characterize the “cultural cognition thesis” as a “conjecture.” I am happy to have it continue to be characterized as such. Indeed, I would prefer that it forever be referred to as “conjectural,” no matter how much evidence is adduced to support it, rather than as “proven” or “established” or the like. That way of talking reflects a vulgar heuristic substitute for science’s own way of knowing, which treats every current best understanding as provisional, subject to modification and even rejection in light of additional evidence.

But in fact, since this essay was published, the Cultural Cognition Project has conducted numerous experiments that support the “cultural cognition thesis.” These experiments present evidence on mechanisms of cognition whose operation implies that “clear” or valid evidence can be recognized as such only when assent to it affirms rather than denigrates perceivers’ cultural identities. Such mechanisms include (1) culturally biased search and assimilation; (2) cultural source credibility; (3) the cultural availability effect; and (4) culturally motivated System 2 reasoning.

As the excerpt emphasizes (and as is documented in its many footnotes, which are not reproduced here), all of these involve extensions of well-established psychological dynamics. The nerve of the cultural cognition research program has been simply to demonstrate important interactions between known cognitive mechanisms and cultural outlooks, a process that we hypothesize accounts for persistent political conflict over risk and other policy-relevant facts that admit of scientific investigation.

Knowing what I (provisionally) do now, there are collateral elements of the account below that I would qualify or possibly even disavow! I’m sure I’ll continue to discover holes, gaps, and false starts in the future, too, and I look forward to that.

V. FROM HEURISTIC TO BIAS

Public disagreement about the consequences of law is not just a puzzle to be explained but a problem to be solved. The prospects for enlightened democratic decisionmaking obviously depend on some reliable mechanism for resolving such disputes and resolving them accurately. Because such disagreements turn on empirical claims that admit of scientific investigation, the conventional prescription is the pursuit and dissemination of scientifically sound information.

The hope that democracy can be enlightened in such a straightforward manner, however, turns out to be an idle one. Like most heuristics, cultural cognition is also a bias. By virtue of the power that cultural cognition exerts over belief formation, public dispute can be expected to persist on questions like the deterrent effect of capital punishment, the danger posed by global warming, the utility or futility of gun control, and the like, even after the truth of the matter has been conclusively established.

Imagine—very counterfactually—that all citizens are perfect Bayesians. That is, whenever they are apprised of reliable information, they readily update their prior factual beliefs in a manner that appropriately integrates this new information with all existing information at their disposal.

Even under these circumstances, conclusive discovery of the truth is no guarantee that citizens will converge on true beliefs about the consequences of contested public policies. For while Bayesianism tells individuals what to do with relevant and reliable information, it doesn’t tell them when they should regard information as relevant and reliable. Individuals can be expected to give dispositive empirical information the weight that it is due in a rational-decisionmaking calculus only if they recognize sound information when they see it.
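To put the Bayesian point in formal terms (a minimal sketch in standard notation, not anything drawn from the essay itself): let H be a contested factual hypothesis and E a new piece of evidence. Bayes’s theorem in odds form says

\[
\frac{P(H \mid E)}{P(\neg H \mid E)}
= \frac{P(E \mid H)}{P(E \mid \neg H)} \cdot \frac{P(H)}{P(\neg H)},
\]

that is, posterior odds equal the likelihood ratio times the prior odds. The theorem fixes the posterior once the likelihood ratio is given; but the likelihood ratio itself, which expresses how diagnostic and hence how reliable the evidence is judged to be, is an input the theorem does not supply. Two citizens who assign opposite likelihood ratios to the same study will update in opposite directions, each in perfect conformity with Bayes’s theorem. Everything thus turns on whether citizens will recognize sound information when they see it.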

The phenomenon of cultural cognition suggests they won’t. The same psychological and social processes that induce individuals to form factual beliefs consistent with their cultural orientation will also prevent them from perceiving contrary empirical data to be credible. Cognitive-dissonance avoidance will steel individuals to resist empirical data that either threatens practices they revere or bolsters ones they despise, particularly when accepting such data would force them to disagree with individuals they respect. The cultural judgments embedded in affect will speak more authoritatively than contrary data as individuals gauge what practices are dangerous and what practices are not. And the culturally partisan foundation of trust will make them dismiss contrary data as unreliable if they perceive that it originates from persons who don’t harbor their own cultural commitments.

This picture is borne out by additional well-established psychological and social mechanisms. One constraint on the disposition of individuals to accept empirical evidence that contradicts their culturally conditioned beliefs is the phenomenon of biased assimilation. This phenomenon refers to the tendency of individuals to condition their acceptance of new information as reliable based on its conformity to their prior beliefs. This disposition to reject empirical data that contradict one’s prior belief (for example, that the death penalty does or doesn’t deter crime) is likely to be especially pronounced when that belief is strongly connected to an individual’s cultural identity, for then the forces of cognitive dissonance avoidance that explain biased assimilation are likely to be most strongly aroused.
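Biased assimilation is simple enough to caricature in a few lines of code. The toy model below is purely illustrative, and its update rule and trust weights are arbitrary assumptions rather than estimates from any study: two agents receive an identical, perfectly balanced stream of pro and con studies, but each discounts studies that contradict its current belief.

# Stylized toy model of biased assimilation (illustrative only).
# Two agents with opposed priors see the same balanced evidence,
# but each discounts studies uncongenial to its current belief.

import random

def update(belief, study_supports_claim,
           trust_congenial=0.9, trust_hostile=0.3):
    """Crude pseudo-Bayesian update: a study's weight depends on
    whether it is congenial to the agent's current belief."""
    congenial = ((study_supports_claim and belief > 0.5) or
                 (not study_supports_claim and belief < 0.5))
    weight = trust_congenial if congenial else trust_hostile
    direction = 1 if study_supports_claim else -1
    return min(1.0, max(0.0, belief + direction * weight * 0.05))

random.seed(1)
evidence = [True, False] * 25      # perfectly balanced evidence
random.shuffle(evidence)

skeptic, believer = 0.3, 0.7       # opposed priors
for study in evidence:
    skeptic = update(skeptic, study)
    believer = update(believer, study)

print(f"skeptic:  {skeptic:.2f}")  # drifts toward 0.0
print(f"believer: {believer:.2f}") # drifts toward 1.0

Although the evidence stream is identical and evenly split, the skeptic ends near zero and the believer near one: the asymmetry in trust, not the evidence, does the work. That is the signature of biased assimilation, and it is what makes “more information” an unreliable cure.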

Two additional mechanisms reinforce the tendency to see new information as unreliable when it challenges a culturally congenial belief. The first is naïve realism. This phenomenon refers to the disposition of individuals to view the factual beliefs that predominate in their own cultural group as the product of “objective” assessment, and to attribute the contrary factual beliefs of their cultural and ideological adversaries to the biasing influence of their worldviews. Under these conditions, evidence of the truth will never travel across the boundary line that separates a factually enlightened cultural group from a factually benighted one. Indeed, far from being admitted entry, the truth will be held up at the border precisely because it originates from an alien cultural source.

The second mechanism that constrains societal transmission of truth—reactive devaluation—is the tendency of individuals who belong to a group to dismiss the persuasiveness of evidence proffered by their adversaries in settings of intergroup conflict.

We have been focusing on the impact of cultural cognition as a bias in the public’s recognition of empirically sound information. But it would be a mistake to infer that any immunity of social and natural scientists to such bias improves the prospects that the truth, once discovered, will penetrate public debate.

This would be a mistake, first, because scientists aren’t immune to the dynamics we have identified. Like everyone else, scientists (quite understandably, even rationally) rely heavily on their priors when evaluating the reliability of new information. In one ingenious study, for example, scientists were asked to judge the experimental and statistical methods of what was represented to be a real study of the phenomenon of ESP. Those who received the version of the fictitious study that found evidence of ESP rated the methods to be low in quality, whereas those who received the version that found no evidence of ESP rated the methods to be high in quality, even though the methods were in fact independent of the conclusion. Other studies showing that cultural worldviews explain variance in risk perceptions not just among lay persons but also among scientists who specialize in risk evaluation fortify the conclusion that for scientists, too, cultural cognition operates as an information-processing filter.

But second and more important, any special resistance scientists might have to the biasing effect of cultural cognition is beside the point. The issue is whether the discovery and dissemination of empirically sound information can, on its own, be expected to protect democratic policymaking from the distorting effect of culturally polarized beliefs among citizens and their representatives.

Again (for the umpteenth time), ordinary citizens aren’t in a position to determine for themselves whether this or that scientific study of the impact of gun control laws, of the deterrent effect of the death penalty, of the threat posed by global warming, et cetera, is sound. Scientific consensus, when it exists, determines beliefs in society at large only by virtue of social norms and practices that endow scientists with deference-compelling authority on the issues to which they speak. When they address matters that have no particular cultural valence within the group-grid matrix—What are the relative water-repellent qualities of different synthetic fabrics? Has Fermat’s Last Theorem been proved?—the operation of these norms and practices is unremarkable and essentially invisible.

But when scientists speak to policy issues that are culturally disputed, then their truth-certifying credentials are necessarily put on trial. For many citizens, men and women in white lab coats speak with less authority than (mostly) men and women in black frocks. And even those who believe the scientists will still have to choose which scientists to believe. The laws of probability, not to mention the professional incentives toward contrarianism, assure that even in the face of widespread professional consensus there will be outliers. Citizens (again!) lack the capacity to decide for themselves whose work has more merit. They have no choice but to defer to those whom they trust to tell them which scientists to believe. And the people they trust are inevitably the ones whose cultural values they share, and who are inclined to credit or dismiss scientific evidence based on its conformity to their cultural priors.

These arguments are necessarily interpretative and conjectural. But in the spirit of (casual) empirical verification, we invite those who are skeptical to perform this thought experiment. Ask yourself whether you think there is any credible scientific ground for believing that global warming is/isn’t a serious threat; that the death penalty does/doesn’t deter; that gun control does/doesn’t reduce violent crime; that abortion is/isn’t safer than childbirth. If you believe the truth has been established on any one of these issues, ask yourself why it hasn’t dispelled public disagreement. If you catch yourself speculating about the possible hidden cognitive motivations the disbelievers might have by virtue of their cultural commitments, you may proceed to the next Part of this Essay (although not until you’ve reflected on why you think you know the truth and whether your cultural commitments might have anything to do with that belief).  If, in contrast, you are tempted to answer, “Because the information isn’t accessible to members of the public,” then please go back to the beginning of this Essay and start over.

  1. OVERCOMING CULTURAL BIAS: IDENTITY AFFIRMATION

Nothing in our account implies either that there is no truth of the matter on disputed empirical policy issues or that the public cannot be made receptive to that truth. Like at least some other cognitive biases, cultural cognition can be counteracted. . . .
