A complete and accurate account of how everything works

Okay, not really. But in a sense it's better than that: a simple model that is closer to being true than the most likely alternative model a lot of people probably have in mind when they try to make sense of public risk perceptions.

Above is a diagram that I created in response to a friend's question about how cultural cognition relates to Kahneman's system 1/system 2 (or "fast"/"slow") dual-process reasoning framework.

Start at the bottom: exposure to information determines perception of risk.

Okay, but how is information taken in or assessed?

Well, move up to the top & you see Kahneman's two systems. No. 1 is largely unconscious and emotional; it is the source of myriad biases. No. 2 is conscious, reflective, algorithmic. It double-checks System 1's assessments and thus corrects its errors, assuming one has the cognitive capacity and time needed to bring it to bear. The arrows from these influences intersect the one running from information to risk perception to signify that Systems 1 & 2 determine the impact that information has.

But there has to be something more going on. We know that some people react one way & some another to one and the same piece of evidence or information about climate change, guns, nuclear power, etc. And we know, too, that the reason isn't that some use "fast" system 1 and others "slow" system 2 to make sense of such information; people who are able and disposed to resort to conscious, analytical assessment of information are in fact even more polarized than those who reason mainly with their gut.

The necessary additional piece of the model is supplied by cultural worldviews, which you encounter if you now move down a level. The arrows originating in "cultural worldviews" & intersecting those that run from "system 1" and "system 2" to "risk information" indicate that worldviews interact with those modes of reasoning. Worldviews don't operate as a supplementary or alternative influence on risk perception; rather, they determine the valence of the influence of the various forms of cognition that systems 1 & 2 comprise.
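To make that structure concrete, here is a minimal toy sketch in Python. Everything in it (the function name, the parameter scales, the linear form) is my own illustrative assumption, not the diagram's or the study's actual model; the point is just that worldview sets the sign, or valence, of what each system extracts from a given piece of information.

```python
# A toy formalization of the diagram (illustrative assumptions throughout,
# not the model from the study): cultural worldview sets the valence (sign)
# of what each system takes from a piece of risk information.

def perceived_risk(information: float, worldview: float, numeracy: float) -> float:
    """Toy model.

    information: strength of the risk signal in the message (0 to 1)
    worldview:   -1 (one cultural group) to +1 (the opposing group)
    numeracy:    0 to 1, disposition/capacity to engage system 2
    """
    valence = worldview                          # cultural meaning sets the sign
    system1 = valence * information              # fast, affect-driven appraisal
    system2 = valence * information * numeracy   # slow appraisal: same valence, more leverage
    return system1 + system2
```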

Whether that valence is positive or negative depends on the cultural meaning of the information.

"Cultural meaning" is the narrative congeniality or uncongeniality of the information: its disappointment or gratification of the expectations & hopes that a person with a particular worldview has about the best way of life.

Kahneman had this in mind, essentially, when, in his Sackler Lecture, he assimilated cultural cognition into system 1. System 1 is driven by emotional association. Those emotional associations are likely to be determined by moral evaluations of putative risk sources (nuclear power plants, say, or HPV vaccines). Because such evaluations vary across groups, members of those groups react differently to the same information (some concluding "high risk," others "low"). Hence, Kahneman reasoned, cultural cognition is bound up with heuristic reasoning: it interacts with it and determines its valence.

The study we published recently in Nature Climate Change, though, adds the arrow that starts in cultural worldviews & intersects the path between system 2 & information. We found that individuals disposed to use system 2 are more polarized, because (we surmise; we are doing experiments to test this conjecture further) they opportunistically use their higher-quality reasoning faculties (better math skills, superior comprehension of statistics & the like) to fit the evidence to the narrative that matches their cultural worldview.
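The toy function above makes this result easy to see in caricature: because numeracy amplifies an appraisal whose direction worldview has already fixed, the gap between opposing groups grows as the disposition to use system 2 rises. (Again, the numbers are illustrative assumptions, not the study's data.)

```python
# Using perceived_risk() from the sketch above: the gap between opposing
# worldview groups widens as numeracy rises -- polarization increasing with
# system 2 disposition, in caricature.

info = 0.8  # one fixed piece of risk information shown to everyone

for numeracy in (0.0, 0.5, 1.0):
    group_a = perceived_risk(info, worldview=+1.0, numeracy=numeracy)
    group_b = perceived_risk(info, worldview=-1.0, numeracy=numeracy)
    print(f"numeracy={numeracy:.1f}  gap={group_a - group_b:.2f}")

# numeracy=0.0  gap=1.60
# numeracy=0.5  gap=2.40
# numeracy=1.0  gap=3.20
```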

By the way, I stuck an arrow with an uncertain origin to the left of "risk information" to indicate that information need not be viewed as exogenous, that is, unrelated to the other elements of the model. There are lots of influences on information exposure, obviously, but cultural worldviews are an important one of them! People seek out, and are otherwise more likely to be exposed to, information that is congenial to their cultural outlooks; this reinforces the tendency toward cultural polarization on issues that become infused with antagonistic cultural meanings.
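That feedback can be sketched too. In the hypothetical simulation below, the exposure probabilities are pure assumptions chosen to illustrate the mechanism: agents who preferentially encounter congenial items drift steadily apart even though both start from the same neutral belief.

```python
import random

# Hypothetical selective-exposure loop for the endogenous-information arrow
# (all parameters are illustrative assumptions).

random.seed(0)

def drift(worldview: float, steps: int = 1000) -> float:
    belief = 0.0
    for _ in range(steps):
        item = random.choice((-1.0, 1.0))        # risk-dismissive vs. risk-affirming item
        congenial = item * worldview > 0
        p_exposure = 0.8 if congenial else 0.2   # assumed selective-exposure bias
        if random.random() < p_exposure:
            belief += 0.01 * item                # small update toward the item seen
    return belief

# Opposing worldviews drift toward roughly +3 and -3 over 1,000 steps,
# starting from the same neutral belief of 0.
print(drift(+1.0), drift(-1.0))
```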

This representation of the mechanisms of risk perception helps to show not only how things work but also how they might be made to work better. Just saturating people with information won't promote convergence on the best available evidence. Even if one crafts one's message to anticipate the distinctive operation of Systems 1 & 2 on information processing, people with diverse cultural outlooks will still draw opposing inferences from that information (case in point: the competing inferences people with opposing cultural worldviews draw about climate change when they reflect on recent local weather …).

Or at least they will if the information on some issue like climate change, the HPV vaccine, gun possession or the like continues to convey antagonistic cultural meanings to such individuals. To promote open-minded engagement and preempt cultural polarization, risk communication must not only be fitted to popular information-processing styles but also framed in a manner that conveys congenial cultural meanings to all its recipients.

How does one accomplish that? That is the point of the “2 channel strategy” of science communication that we conceptualize and test in Geoengineering and the Science Communication Environment: A Cross-Cultural Experiment, Cultural Cognition Working Paper No. 92.
