Cultural Cognition Dictionary (or Glossary, whatever)

Note: This document is under construction. New terms will be added intermittently during periods in which there is nothing else to do or in which there is something else to do and hence an opportunity to engage in creative procrastination.

Last updated: Nov. 3, 2018

Affect heuristic. Refers to the impact that affect (positive or negative feelings) has on information processing about risk. The cultural cognition thesis posits that cultural outlooks determine the valence of such feelings, which can be treated as mediating the impact of cultural worldviews on risk perceptions and related facts. [Sources: Slovic et al., Risk Analysis, 24, 311-322 (2004); Peters & Slovic, J. Applied Social Psy., 16, 1427-1453 (1996); Peters, Burraston & Mertz, Risk Analysis, 18, 715-27 (1998); Poortinga & Pidgeon, Risk Analysis, 25, 199-209 (2005). Date added: Jan. 7, 2018.]

Asymmetry vs. symmetry theses. Refers to competing positions on whether the forms of biased information processing that subvert open-minded engagement with political information are concentrated more heavily among conservatives than among liberals (the “asymmetry” thesis) or are instead distributed evenly across the left-right political spectrum (the “symmetry” thesis). Discussed ad nauseam in the CCP blog. Proponents of one position or the other tend to ignore MS2R, which implies, paradoxically, that if open-mindedness and reflection are distributed asymmetrically along the left-right political spectrum, those who possess a higher degree of these traits are the most likely to evince politically motivated reasoning. [Added: April 28, 2018.]

Bounded rationality thesis (“BRT”). Espoused most influentially by Daniel Kahneman, this theory identifies over-reliance on heuristic reasoning as the source of various observed deficiencies in human reasoning under conditions of uncertainty (the availability effect, probability neglect, hindsight bias, hyperbolic discounting, the sunk-cost fallacy, etc.). BRT does not, however, appear to be the source of cultural polarization over societal risks. On the contrary, such polarization has in various studies been shown to be greatest among the individuals most disposed to resist the errors associated with heuristic information processing. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge Handbook of Thinking and Reasoning (pp. 267-293), Cambridge University Press (2005); Kahneman, Slovic & Tversky, Judgment Under Uncertainty: Heuristics and Biases (Cambridge University Press 1982). Added: Jan. 12, 2018.]

Cognitive dualism. Refers to the capacity of individuals to hold opposing beliefs about the same fact, each suited to a distinct role or identity the person occupies. Thus a science-trained professional might “believe in” human evolution when he or she is engaged in professional tasks that depend on the truth of that theory, yet still disbelieve in human evolution when acting as a member of a religious community, in which such disbelief enables him or her to experience and signal membership in, and loyalty to, that community. Farmers, too, have been observed to “disbelieve in” human-caused climate change when acting as members of their cultural communities, but to “believe in” it when endorsing farming practices that anticipate human-caused climate change. [Sources: Everhart & Hameed, Evolution: Education and Outreach, 6(1), 1-8 (2013); Prokopy, Morton et al., Climatic Change, 117, 943-50 (2014); Cultural Cognition blog passim. Added: Jan. 4, 2018.]

Cognitive illiberalism. Refers to a tendency to selectively impute cognizable secular harms to behavior that generates non-cognizable sectarian harms. Such a tendency is unconscious and hence invisible to the actor whose information-processing capabilities have been infected by it. Indeed, the bias that cognitive illiberalism comprises can subvert a decisionmaker’s conscious, genuine intent to exercise legal authority consistent with liberal ideals. Featured in the NY Times Magazine’s “Year in Ideas,” 1999. [Sources: Kahan, Hoffman & Braman, Harv. L. Rev., 122, 837-906 (2009); Kahan, Hoffman, Braman, Evans & Rachlinski, Stan. L. Rev., 64, 851-906 (2012). Added: Dec. 26, 2017.]

Cognitive Reflection Test (“CRT”). A three-item assessment of the capacity and disposition to override judgments founded on intuition. Regarded as the best measure of a person’s propensity to over-rely on heuristic, System 1 information processing as opposed to conscious, analytical System 2 information processing. [Sources: Frederick, J. Econ. Persp., 19, 25-42 (2005); Kahneman & Frederick in The Cambridge Handbook of Thinking and Reasoning (pp. 267-293) (2005). Date added: Feb. 7, 2018.]
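
By way of illustration only (not from the cited sources), the CRT’s simple scoring convention can be expressed in a few lines of Python; the correct answers and intuitive “lures” are the widely reported ones from Frederick (2005):

```python
# Minimal sketch: scoring the three-item CRT.
# Each item has a reflective (correct) answer and a tempting intuitive "lure";
# the CRT score is simply the number of correct answers (0-3).

CRT_ANSWERS = {
    "bat_and_ball": {"correct": 0.05, "lure": 0.10},  # answer in dollars
    "widget_machines": {"correct": 5, "lure": 100},   # answer in minutes
    "lily_pads": {"correct": 47, "lure": 24},         # answer in days
}

def score_crt(responses):
    """Count the reflective (System 2) answers among the three responses."""
    return sum(
        1 for item, answer in responses.items()
        if answer == CRT_ANSWERS[item]["correct"]
    )

# A respondent who falls for two of the three intuitive lures scores 1:
print(score_crt({"bat_and_ball": 0.10, "widget_machines": 5, "lily_pads": 24}))
```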

Cognitively illiberal state. Refers to a liberal political regime pervaded—and hence subverted—by institutions and laws that reflect the unconscious tendency of legal and political decisionmakers to impute secular harms to behavior that imposes only sectarian ones. [Source: Kahan, Stanford L. Rev., 60, 115-64 (2007). Added: Dec. 26, 2017.]

Conflict entrepreneurs. Individuals or groups that profit from filling public discourse with antagonistic memes, thereby entangling diverse cultural identities with opposing positions on some science issue. The benefit conflict entrepreneurs derive—greater monetary contributions to the advocacy groups they head; the opportunity to collect large speaking fees; remunerative deals for popular books; high status within their cultural communities—doesn’t depend on whether their behavior genuinely promotes the cause they purport to be advancing. On the contrary, they profit most in an atmosphere pervaded by cultural recrimination and contempt, one in which democratic convergence on valid science is decidedly unlikely to occur. Their conduct contributes to that state. [Sources: Kahan, Scheufele & Jamieson, Oxford Handbook on the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017); Cultural Cognition blog, passim. Date added: Jan. 7, 2018.]

Cross-cultural cultural cognition (“C4”). Describes the use of the Cultural Cognition Worldview Scales to assess risk perceptions outside of the U.S. So far, the scales have been used in at least five other nations (England, Switzerland, Australia, Norway and Slovakia). [CCP Blog, passim. Added: Jan. 12, 2018.]

Cultural Cognition Project (“CCP”). A group of scholars interested in studying how cultural values shape public risk perceptions and related policy beliefs. Project members use the methods of various disciplines — including social psychology, anthropology, communications, and political science — to chart the impact of values on risk perceptions and to identify the mechanisms through which this effect operates. [Source: CCP internet site. Added: Jan. 6, 2018.]

Cultural cognition thesis. The conjecture that culture is prior to fact in debates over contested societal risks and related facts. Culture is prior not just in the normative sense that cultural values guide action conditional on beliefs about states of affairs; it is also prior in the positive sense that cultural commitments, through a variety of mechanisms, shape what individuals believe the relevant facts to be. [Source: Kahan, Slovic, Braman & Gastil, Harvard Law Review, 119, 1071-1109 (2006), p. 1083. Date added: Dec. 23, 2017.]

Cultural Cognition Worldview scales. Scales that reflect two continuous, cross-cutting preferences—“hierarchy” versus “egalitarianism,” and “individualism” versus “communitarianism”—for the ordering of social relations. The combinations of orientations formed by the intersection of the scales are archetypes of the group affinities that inform the cultural cognition thesis. As such, the scales enable empirical testing of the predictions associated with the cultural cognition thesis. [Source: Kahan in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (Hillerbrand et al. eds.), 325-60 (2012). Added: Jan. 15, 2018.]
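
For concreteness, here is a minimal sketch (hypothetical variable names; the scales themselves are continuous) of how scores on the two scales can be crossed to yield the four archetypal worldview groups:

```python
# Minimal sketch: crossing the two continuous worldview scales to form the
# four archetypal groups. The zero cut-points stand in for sample medians.

def worldview_quadrant(hierarchy: float, individualism: float) -> str:
    """Map scores on the two scales to one of four archetypal labels."""
    h = "Hierarchical" if hierarchy > 0 else "Egalitarian"
    i = "Individualist" if individualism > 0 else "Communitarian"
    return f"{h} {i}"

print(worldview_quadrant(0.8, 0.5))    # -> Hierarchical Individualist
print(worldview_quadrant(-0.4, -1.2))  # -> Egalitarian Communitarian
```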

Disentanglement principle. Shorthand for a normative practice, derived from empirical findings, that supports the self-conscious presentation of scientific information in a manner that effectively severs diverse cultural identities from positions on contested science issues. The effective use of the disentanglement principle has been credited with the successful teaching of evolutionary theory to secondary school students. It is also the basis for science communication in Southeast Florida, where community engagement with climate change science draws together groups and communities that hold opposing beliefs about human-caused climate change. [Sources: Lawson & Worsnop, Journal of Research in Science Teaching, 29, 143-66 (1992); Kahan, Advances in Pol. Psych., 36, 1-43 (2015). Added: Jan. 4, 2018.]

Dual process theory/theories. A set of decisionmaking frameworks that posit two discrete modes of information processing: one (often referred to as “System 1”) that is rapid, intuitive, and emotion-pervaded; and another (often referred to as “System 2”) that is deliberate, self-conscious, and analytical. [Sources: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016); Kahneman, American Economic Review, 93(5), 1449-1475 (2003); Kahneman & Frederick in Morrison (Ed.), The Cambridge Handbook of Thinking and Reasoning (pp. 267-293), Cambridge University Press (2005); Stanovich & West, Behavioral and Brain Sciences, 23(5), 645-665 (2000). Added: Jan. 12, 2018.]

Expressive rationality. Refers to the tendency of individuals to (unconsciously) engage in the forms of information processing that signify their membership in, and loyalty to, important, identity-defining affinity groups. Among opposing groups, expressive rationality produces not convergence but political polarization on the best available scientific evidence. Despite the harm it causes, this form of reasoning has been found to intensify rather than dissipate as members of the public attain greater proficiency in the critical reasoning skills recognized as most essential for making sense of scientific evidence (e.g., cognitive reflection, actively open-minded thinking, and numeracy). [Sources: Kahan, Peters, et al., Nature Climate Change, 2, 732-35, p. 734 (2012); Kahan, Behavioral & Brain Sci., 40, 26-28 (2016); Stanovich, Thinking & Reasoning, 19, 1-26 (2013). Added: Dec. 27, 2017.]

From mouth of the scientist to ear of the citizen. A fallacious view that treats the words scientists utter as a causal influence on the formation and reform of public opinion on controversial forms of science. The better view recognizes that what science knows is transmitted from scientists to the public via dense, overlapping networks of intermediaries, which include not just the media but (more decisively) individuals’ peers, whose words and actions vouch for the science (or not) through their own use (or non-use) of scientific insights. Where there is a science communication problem, then, its source is the corruption of these intermediary networks, not any problem with how scientists themselves talk. [Source: Kahan in Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele (2017). Added: Jan. 19, 2018.]

Identity-protective reasoning. The tendency of individuals to selectively credit and dismiss factual assertions in a manner that reflects and reinforces their cultural commitments, thereby expressing affective orientations that secure their own status within cultural groups. [Source: Kahan, Slovic et al., J. Empirical Legal Studies, 4, 465-505 (2007). Added: Dec. 23, 2017.]

Industrial strength risk perception measure (“ISRPM”). A seven- or ten-point Likert measure that assesses respondents’ perception of the degree of risk an activity or state of affairs poses to society. At least where respondents have some degree of familiarity with the putative risk source, the ISRPM will tend to correlate very strongly (r ≈ 0.8) with any more specific factual evaluation of the risk source. This property of the ISRPM, which likely reflects the item’s discernment of respondents’ affective orientation, makes it valid (and economical) to use the ISRPM alone to measure public risk perceptions. [Sources: Kahan, Advances in Pol. Psych., 36, 1-43 (2015); Cultural Cognition blog passim. Added: Jan. 12, 2018.]
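
A minimal simulation sketch (fabricated data, not CCP’s) can illustrate the logic of the claim: if the ISRPM and a more specific risk item both express a single latent affective orientation toward the risk source, a correlation near r ≈ 0.8 emerges naturally:

```python
# Minimal sketch with simulated data: two Likert items driven by one latent
# affective orientation correlate strongly, as the ISRPM entry describes.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
affect = rng.normal(size=n)  # latent affective orientation toward the risk source

def item(loading=1.5, noise=0.6):
    """Generate a seven-point (1-7) Likert response from the latent affect."""
    raw = 4.0 + loading * affect + rng.normal(scale=noise, size=n)
    return np.clip(np.round(raw), 1, 7)

isrpm, specific = item(), item()
print(f"r = {np.corrcoef(isrpm, specific)[0, 1]:.2f}")  # comes out near 0.8
```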

Knowledge deficit. A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with facts as the cause of the public’s failure to converge on the best available scientific evidence on human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously) that simple communication of the best available evidence will dispel public conflict over facts. [Added: Dec. 19, 2017.]

Knowledge deficit². A theory (either explicit or implicit, conscious or unconscious) that treats simple unfamiliarity with the “knowledge deficit fallacy” as the cause of science communicators’ failure to converge on the best available scientific evidence on how to communicate human-caused climate change, human evolution, the safety of nuclear power generation, etc. The theory also assumes (explicitly or implicitly, consciously or unconsciously–but always naively) that simple communication of the best available evidence on science communication will dispel science communicators’ reliance on the knowledge deficit theory. [Added: Dec. 19, 2017.]

Macro & micro science communication. An expository device that can be used to dispel the fallacy that science communication is only one thing–e.g., guidelines for scientists seeking to communicate with the public, or newspaper writers trying to excite interest in new discoveries. Under the “M&m” (or “m&M”–whichever is more convenient) framework, micro communication informs individuals of scientific insights essential to personal decisionmaking; macro communication addresses the social and cognitive influences that operate on institutional or collective decisionmaking. This account of the diversity of science communication contexts is obviously false, in the sense of being incomplete. But it excites conjectures that can profitably be investigated empirically at the same time that it rebuffs mistakes about science communication that are even more divorced from the truth. [Sources: Fischhoff & Scheufele, PNAS, 110, 14031-14032 (2013); CCP Blog. Added: March 25, 2018.]

Mobility and stability hypotheses. Competing conjectures about how individuals’ perceptions of risk and related facts can be expected to behave across different settings (e.g., the workplace vs. the home). The “stability hypothesis” predicts that “individuals will seek to homogenize their experience of social structure in different areas of their lives” in a manner that reflects their static cultural worldviews. The “mobility hypothesis,” in contrast, holds that individuals can be expected to form differing perceptions of risk as they move across social contexts, which themselves are understood to embody distinct, and often opposing, cultural worldviews: “according to this view, individuals may flit like butterflies from context to context, changing the nature of their arguments as they do so.” [Source: Rayner, Cultural Theory and Risk Analysis, in Social Theories of Risk (Krimsky & Golding eds.), 83-115 (1992), pp. 105-06.]

Motivated numeracy. The effect of numeracy (an aptitude for reasoning well with numbers) in magnifying culturally polarized states of opinion on socially disputed risks. This tendency reflects a particular instance of MS2R. [Source: Kahan, Peters, et al., Behav. Public Pol’y, 1, 54-86 (2017). Added: Jan. 7, 2018.]

Motivated reasoning. A form of unconscious information processing characterized by the selective crediting and discrediting of evidence in patterns that advance some goal or interest independent of the apprehension of truth. Cultural cognition—the biased assessment of evidence protective of one’s status in identity-defining affinity groups—is one form of motivated reasoning. But there are many others, including the self-serving apprehension of one’s own abilities and inattention to evidence of one’s own mortality. Accordingly, cultural cognition should not be equated with motivated reasoning but rather be treated as a species of it. [Sources: Kunda, Psychological Bulletin, 108, 480-498 (1990); Kahan, Harv. L. Rev., 126, 1-77 (2011), pp. 19-26. Added: Jan. 15, 2018.]

Motivated System 2 reasoning (“MS2R”). A summary of the empirical research finding that as individuals’ cognitive proficiencies (measured by a wide variety of critical-thinking assessments, including the Cognitive Reflection Test, Numeracy, Actively Open-minded Thinking, and the Ordinary Science Intelligence assessment) increase, so does their tendency to display identity-protective reasoning in their perception of relevant facts. “System 2” refers to the deliberate, self-conscious, analytical mode of information processing identified by dual process theories, which distinguish such reasoning from the rapid, intuitive, and emotion-laden form of information processing designated “System 1.” [Sources: Kahan, Landrum et al., Advances in Pol. Psych., 38, 179-199, pp. 181-182 (2017); Cultural Cognition Blog, passim. Added: Dec. 23, 2017.]

Numeracy. An assessment that measures the aptitude to reason well with quantitative information and draw appropriate inferences therefrom. [Source: Peters et al., Psych. Sci., 17, 407-13 (2006). Date added: Feb. 7, 2018.]

Ordinary science intelligence and OSI_2.0. A science-comprehension aptitude. Measured by “OSI_2.0,” the underlying reasoning disposition consists of the ability to recognize and apply scientific evidence relevant to the decisions that an ordinary member of the public makes in his or her capacity as a consumer, an employee, a democratic citizen, and the like. [Source: Kahan, Journal of Risk Research, 14, 147-74 (2011). Added: Jan. 6, 2018.]

Pattern recognition. A cognitive dynamic in which a person recognizes some object or state of affairs by matching it (preconsciously) to a rapidly conjured set of prototypes acquired through experience. [Source: Margolis, H. (1987), Patterns, Thinking, and Cognition (Univ. Chicago Press). Date added: Jan. 29, 2018.]

Politically motivated reasoning paradigm (“PMRP”) and the PMRP design. A model of the tendency of individuals of diverse identities to polarize when exposed to evidence on a disputed policy-relevant science issue. Starting with a truth-seeking Bayesian model of information processing, the model focuses on the disposition of individuals of diverse identities to attribute opposing likelihood ratios to the same evidence; this mechanism assures that such individuals will not converge but instead become more sharply divided as they process information. The “PMRP design” refers to study designs suited to observing this dynamic if it in fact exists. [Source: Kahan, Emerging Trends in the Social and Behavioral Sciences (2016). Added: Jan. 8, 2018.]
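
To make the Bayesian mechanics concrete, here is a minimal sketch (illustrative numbers, not drawn from the source) of how identical priors combined with opposing likelihood ratios yield polarization rather than convergence:

```python
# Minimal sketch: in odds form, Bayes's rule is
#   posterior_odds = prior_odds * likelihood_ratio.
# Two groups share a prior but read the same evidence oppositely.

def posterior_prob(prior_odds: float, likelihood_ratio: float) -> float:
    odds = prior_odds * likelihood_ratio
    return odds / (1 + odds)

prior_odds = 1.0        # both groups start at p = 0.5
lr_a, lr_b = 3.0, 1/3   # group A credits the evidence; group B discredits it

print(f"Group A: {posterior_prob(prior_odds, lr_a):.2f}")  # -> 0.75
print(f"Group B: {posterior_prob(prior_odds, lr_b):.2f}")  # -> 0.25
```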

Professional judgment. Domain-specific “habits of mind” (most likely specialized forms of pattern recognition) that guide domain experts (e.g., judges). [Source: Margolis, H. (1996), Dealing with Risk: Why the Public and the Experts Disagree on Environmental Issues (University of Chicago Press).]

Rope-a-dope. A tactic of science miscommunication whereby a conflict entrepreneur baits science communicators into fighting him or her in a conspicuous forum. The strength of the arguments advanced by the antagonists, the conflict entrepreneur realizes, is largely irrelevant. What matters is the appearance of a social controversy, which cues onlookers to connect the competing positions with membership in, and loyalty to, their cultural groups. Falling for this gambit marks science communicators as the miscommunicator’s “dope.” [Source: Cultural Cognition Project blog, passim. Added: Jan. 19, 2018.]

Science curiosity, Science Curiosity Scale. Science curiosity is a “general disposition, variable in intensity across persons, that reflects the motivation to seek out and consume scientific information for personal pleasure” (Kahan et al., Advances in Pol. Psych., 38, 179-199, p. 180). The disposition is measured with the Science Curiosity Scale (id., pp. 182-86). [Added: Nov. 3, 2018.]

Science of science communication. A research program that uses science’s own signature methods of disciplined observation and valid causal inference to understand and manage the processes by which citizens come to know what is known by science. [Sources: Oxford Handbook of the Science of Science Communication, eds. K.H. Jamieson, D.M. Kahan & D. Scheufele, passim; Kahan, J. Sci. Comm., 14(3) (2015). Added: Jan. 19, 2018.]

Science communication environment and science communication environment “pollution.” To flourish, individuals and groups need to make use of more scientific insight than they have either the time or capacity to verify. Rather than become scientific experts on myriad topics, then, individuals become experts at recognizing valid scientific information and distinguishing it from invalid counterfeits of the same. The myriad cues and related influences that individuals use to engage in this form of recognition constitute their science communication environment. Dynamics that interfere with or corrupt these cues and influences (e.g., toxic memes and politically motivated reasoning) can be viewed as science-communication-environment “pollution.” [Sources: Kahan in Oxford Handbook of the Science of Science Communication (eds. Jamieson, Kahan & Scheufele), pp. 35-50 (2017); Kahan, Science, 342, 53-54 (2013). Added: Jan. 8, 2018.]

Secular and sectarian harms. Opposing theories of harm, which bear different relationships to political liberalism. “Secular harms” comprise set-backs to interests that are independent of assent to any culturally partisan conception of the best way to live. Principal examples include damage to individuals’ physical security and impediments to their apprehension of collective knowledge. Precisely because such harms can be experienced universally by citizens of diverse cultural identities, protecting citizens from such set-backs is a legitimate end for law in a liberal state. “Sectarian harms,” in contrast, comprise set-backs to interests that depend on assent to a partisan conception of the best way to live. A principal example is the offense that individuals experience when they are exposed to behavior that expresses commitments to values alien to theirs. Precisely because such harms depend on—cannot be defined independently of—adherence to a particular conception of the best life, using state power to avert or remedy them is illegitimate in a liberal state. [Sources: Rawls, Political Liberalism 175, 217-18 (1993); J.S. Mill, On Liberty, ch. 1 (1859). Date added: Dec. 26, 2017.]

Situation sense. Karl Llewellyn’s term for domain-specific habits of mind, acquired through education and experience, that enable judges and lawyers to converge rapidly and reliably on case outcomes notwithstanding the indeterminacy of formal legal norms. [Sources: Llewellyn, K. (1989), The Case Law System in America (M. Ansaldi, Trans.); Kahan et al. (2017), Univ. Penn. L. Rev., 164, 349, 439. Date added: Jan. 29, 2018.]

Social proof. Refers to the information (of danger, of potential profit, etc.) conveyed by the conspicuous behavior of others whom one perceives to be informed and like-situated to oneself. [Source: Cialdini, Influence: The Psychology of Persuasion. Date added: March 5, 2018.]

Toxic memes. Recurring tropes and idioms whose propagation (usually at first by conflict entrepreneurs) fuses diverse cultural identities to opposing positions on some form of decision-relevant science. In the contaminated science communication environment that ensues, individuals relying on the opinion of their peers—generally a successful strategy for figuring out what science knows—polarize rather than converge on the best available evidence. [Sources: Kahan, Scheufele & Jamieson, Oxford Handbook on the Science of Science Communication, Introduction (2017); Kahan, Jamieson et al., J. Risk Res., 20, 1-40 (2017). Date added: Jan. 7, 2018.]

Tragedy of the science communications commons. A normative objection to expressive rationality. While it is often rational for an individual to engage in this form of reasoning, it is a disaster when all members of a culturally diverse democratic society do so at once, for in that case members of opposing cultural groups are unlikely to converge (or at least to converge as soon as they should) on what science has to say about the risks their society faces. This consequence of expressive rationality, however, does nothing to reduce the psychic incentives that make it rational for any particular member of the public to engage in identity-protective rather than truth-convergent forms of information processing. [Source: Kahan, Peters, et al., Nature Climate Change, 2, 732-35, p. 734 (2012). Added: Dec. 27, 2017.]
