Some experimental data on CRT, ideology, and motivated reasoning (probing Mooney’s Republican Brain)

This is about my zillionth post on the so-called “asymmetry thesis”—the idea that culturally or ideologically motivated reasoning is concentrated disproportionately at one end of the political spectrum, viz., the right.

But it is also my second post commenting specifically on Chris Mooney’s Republican Brain, which very elegantly and energetically defends the asymmetry thesis. As I said in the first, I disagree with CM’s thesis, but I really really like the book. Indeed, I like it precisely because the cogency, completeness, and intellectual openness of CM’s synthesis of the social science support for the asymmetry thesis helped me to crystallize the basis of my own dissatisfaction with that position and the evidence on which it rests.

I’m not trying to be cute here.

I believe in the Popperian idea that collective knowledge advances through the perpetual dialectic of conjecture and refutation. We learn things through the constant probing and prodding of empirically grounded claims that have themselves emerged from the same sort of challenging of earlier ones.

If this is how things work, then those who succeed in formulating a compelling claim in a manner that enables productive critical engagement create conditions conducive to learning for everyone. They enable those who disagree to more clearly explain why (or show why by collecting their own evidence). And in so doing, they assure those who agree with the claim that it will not evade the sort of persistent testing that is the only basis for their continuing assent to it.

A. Recapping my concern with the existing data

In the last post, I reduced my main reservations with the evidence for the asymmetry thesis to three:

First, I voiced uneasiness with the “quality of reasoning” measures that figure in many of the studies Republican Brain relies on to show conservatives are closed-minded or unreflective. Those that rely on dogmatic “personality” styles and on people’s own subjective characterizations of their “open-mindedness” or amenability to reasoning are inferior, in my view, to objective, performance-based reasoning measures, particularly Numeracy and the Cognitive Reflection Test (CRT), which have recently been shown to be much better predictors of vulnerability to one or another form of cognitive bias. CRT is the measure that figures in Kahneman’s justly famous “fast/slow,” “System 1/System 2” dual-process theory.

Second, and even more fundamentally, I noted that there’s little evidence that any sort of quality-of-reasoning measure helps to identify vulnerability to motivated cognition—the tendency to unconsciously fit one’s assessment of evidence to some goal or interest extrinsic to forming an accurate belief. Indeed, I pointed out that there is evidence that the people highest in CRT and numeracy are more disposed to display ideologically motivated cognition. Mooney believes—and I agree—that ideologically motivated reasoning is at the root of disputes like climate change. But if the disposition to engage in higher-quality, reflective reasoning doesn’t immunize people from motivated reasoning, then one can’t infer anything about disputes like climate change from studies that correlate that disposition with ideology.

Third, we should be relying instead on experiments that test for motivated reasoning directly. I suggested that many experiments that purport to find evidence of motivated reasoning aren’t well designed. They measure only whether people furnished with arguments change their minds; that’s consistent with unbiased as well as biased assessments of the evidence at hand. To be valid proof of motivated reasoning, studies must manipulate the ideological motivation subjects have for crediting one and the same piece of evidence. Studies that do this show that conservatives and liberals both opportunistically adjust their weighting of evidence conditional on its support for ideologically satisfying conclusions.

B. Some more data for consideration

Okay. Now I will present some evidence from a study that I designed with all three of these points—ones, again, that Mooney’s book convinced me are the nub of the matter—in mind.

That study tests three hypotheses:

(1) that there isn’t a meaningful connection between ideology and the disposition to use higher level, systematic cognition (“System 2” reasoning, in Kahneman’s terms) or open-mindedness, as measured by CRT;

(2) that a properly designed study will show that liberals as well as conservatives are prone to motivated reasoning on one particular form of policy-relevant scientific evidence: studies purporting to find that quality-of-reasoning measures show that those on one or the other side of the climate-change debate are “closed-minded” and unreflective; and

(3) that a disposition to engage in higher-level cognition (as measured by CRT) doesn’t counteract but in fact magnifies ideologically motivated cognition.

1. Relationship of CRT to ideology

This study involved a diverse national sample of U.S. adults (N = 1,750). I collected data on various demographic characteristics, including the subjects’ self-reported ideology and political-party allegiance. And I had the subjects complete the CRT.
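For readers who haven’t encountered it, the CRT is just a three-item battery of word problems, each with an intuitively appealing but wrong answer; the score is simply the number answered correctly. Here is a minimal scoring sketch in Python (illustrative only, not the study’s instrument code), using the classic three items and their answer key.

```python
# Minimal sketch of CRT scoring (illustrative; not the study's code).
# Each of the classic three items has an intuitive-but-wrong answer;
# the score is the count of correct responses (0-3).

CRT_ANSWER_KEY = {
    "bat_and_ball": 0.05,   # intuitive (wrong) answer: 0.10 dollars
    "widget_machines": 5,   # intuitive (wrong) answer: 100 minutes
    "lily_pads": 47,        # intuitive (wrong) answer: 24 days
}

def score_crt(responses: dict) -> int:
    """Return the number of CRT items (0-3) answered correctly."""
    return sum(
        1 for item, correct in CRT_ANSWER_KEY.items()
        if responses.get(item) == correct
    )

# Example: a respondent who gives the intuitive answer to every item scores 0,
# which, as noted below, is what a majority of the sample did.
score_crt({"bat_and_ball": 0.10, "widget_machines": 100, "lily_pads": 24})  # -> 0
```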

I’ve actually done this before, finding only tiny and inconclusive correlations between ideology, culture, and party affiliation, on the one hand, and CRT, on the other.

The same was true this time. Consistent with the first hypothesis, there was no meaningful correlation between CRT and either liberal-conservative ideology (measured with a standard 5-point scale) or cultural individualism (measured with our CC worldview scales).

There were weak correlations between CRT and both cultural hierarchy and political party affiliation. But the direction of the effects was contrary to the Republican Brain hypothesis.

That is, both hierarchy (as measured with the CC scale) and being a Republican (as measured by a standard 7-point partisan-identification measure) predicted higher levels of reflectiveness and analytical thinking as measured by CRT.
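For concreteness, the kind of zero-order correlation check reported here can be run along the following lines; the DataFrame and column names (crt, ideology, party_id, hierarchy) are hypothetical stand-ins for the study’s actual measures, not its real coding.

```python
# Sketch of the correlations discussed above (hypothetical column names).
#   crt       : CRT score, 0-3
#   ideology  : 5-point liberal-conservative self-placement
#   party_id  : 7-point partisan identification (strong Democrat ... strong Republican)
#   hierarchy : cultural hierarchy worldview scale score
import pandas as pd

def crt_correlations(df: pd.DataFrame) -> pd.Series:
    """Spearman correlations of CRT with the ideology/identity measures."""
    return df[["ideology", "party_id", "hierarchy"]].corrwith(
        df["crt"], method="spearman"
    )
```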

But the effects, as I mentioned (and as in the past), were minuscule. I’ve set to the left the results of an ordered logistic regression that predicts how likely someone who identifies as a “Democrat” or a “Republican” (2 & 6 on the 7-point scale), respectively, is to answer 0, 1, 2, or all 3 CRT questions correctly (you can click here to see the regression outputs). For comparison, I’ve also included such models for being religious as opposed to nonreligious and being female as opposed to male, both of which (here & here, e.g.) are known to be associated with lower CRT scores and both of which have bigger effects than does party affiliation.
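For those who want the mechanics, here is a hedged sketch of what an ordered logistic regression of this kind looks like in statsmodels; again, the variable names are hypothetical stand-ins rather than the study’s actual coding.

```python
# Sketch of an ordered logit predicting CRT score (0-3) from party identification,
# religiosity, and gender (hypothetical column names; illustrative only).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_crt_ordered_logit(df: pd.DataFrame):
    """Fit an ordered logit of CRT score (0-3) on identity measures."""
    crt = df["crt"].astype(pd.CategoricalDtype([0, 1, 2, 3], ordered=True))
    model = OrderedModel(
        crt,                                          # ordinal outcome: 0, 1, 2, or 3 correct
        df[["party_id", "religiosity", "female"]],    # predictors
        distr="logit",
    )
    return model.fit(method="bfgs", disp=False)

# Predicted probabilities of answering 0/1/2/3 items correctly for a "Democrat" (2)
# vs. a "Republican" (6) on the 7-point scale can then be generated with .predict().
```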

Hard to believe that the trivial difference between Republicans and Democrats on CRT could explain much of anything, much less the intense conflicts we see over policy-relevant science in our society.

2. Ideologically motivated reasoning—and its bearing on the asymmetry thesis!

The study also had an experimental component.

The subjects were divided into three groups or experimental “conditions.”  In all of them, subjects indicated whether they agreed or disagreed–and how strongly (on a six-point scale)–with the statement:

I think the word-problem test I just took [i.e., the CRT test] supplies good evidence of how reflective and open-minded someone is.

But before they did, they received background information that varied between the experimental conditions.

In the “skeptics-biased” condition, subjects were advised:

Some psychologists believe the questions you have just answered measure how reflective and open-minded someone is.

In one recent study, a researcher found that people who accept evidence of climate change tend to get more answers correct than those who reject evidence of climate change. If the test is a valid way to measure open-mindedness, that finding would imply that those who believe climate change is happening are more open-minded than those who are skeptical that climate change is happening.

In contrast, in the “nonskeptics-biased” condition, subjects were advised:

Some psychologists believe the questions you have just answered measure how reflective and open-minded someone is.

In one recent study, a researcher found that people who reject evidence of climate change tend to get more answers correct than those who accept evidence of climate change. If the test is a valid way to measure open-mindedness, that finding would imply that those who are skeptical that climate change is happening are more open-minded than those who believe that climate change is happening.

Finally, in the “control” condition, subjects read simply that “[s]ome psychologists believe the questions you have just answered measure how reflective and open-minded someone is” before they indicated whether they themselves agreed that the test was a valid measure of such a disposition.

You can probably see where I’m going with this.

All the subjects are indicating whether they believe the CRT test is a valid measure of reflection and open-mindedness and all are being given the same evidence that it is—namely, that “[s]ome psychologists believe” that that’s what it does.

Two-thirds of them are also being told, of course, that people who take one position on climate change did better on the test than people who take the other. Why should that make any difference? That’s just a result (like the findings of correlations between ideology and quality-of-reasoning measures in the studies described in Republican Brain); it’s not evidence one way or the other on whether the test is valid.

However, this additional information does either threaten or affirm the identities of the subjects to the extent that they (like most people) have a stake in believing that people who share their values are smart, open-minded people who form the “right view” on important and contentious political issues. Identity-protection is an established basis for motivated cognition—indeed, the primary one, various studies have concluded, for disputes that seem to divide groups on political grounds.

We didn’t ask subjects whether they believed that climate change was real or a serious threat or anything.  But, again, we did measure their political ideologies and political party allegiances (their cultural worldviews, too, but I’m going to focus on political measures, since that’s what most of the researchers featured in Republican Brain focus on).

Accordingly, if people tend to agree that the CRT “supplies good evidence of how reflective and open-minded someone is” when the test is represented as showing that people who hold the position associated with their political identity are “open-minded” and “reflective,” but disagree when it is represented as showing that such people are “biased,” that would be strong evidence of motivated cognition. They would then be assigning weight to one and the same piece of evidence conditional on the perceived ideological congeniality of the conclusion it supports.

To analyze the results, I used a regression model that allowed me to assess simultaneously the influence of ideology and political party affiliation, the experimental group the subjects were in, and the subjects’ own CRT scores.
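As a concrete illustration (not the actual analysis code), a model of this kind can be specified with the statsmodels formula interface along these lines; the outcome is collapsed to a simple agree/disagree indicator for brevity (the study used a six-point scale), and all names are hypothetical.

```python
# Sketch of a model assessing condition, political identity, and CRT jointly
# (hypothetical names; simplified relative to the actual analysis).
#   agree         : 1 if the subject agreed the CRT is a valid measure, else 0
#   condition     : "control", "skeptic_biased", or "nonskeptic_biased"
#   conserv_repub : ideology/party composite, higher = more conservative/Republican
#   crt           : CRT score (0-3)
import statsmodels.formula.api as smf

def fit_validity_model(df):
    # The three-way interaction lets the ideological effect of each experimental
    # condition vary with the subject's own disposition to use System 2 reasoning.
    formula = "agree ~ C(condition) * conserv_repub * crt"
    return smf.logit(formula, data=df).fit(disp=False)
```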

These figures (which are derived from the regression output that you can also find here) illustrate the results. On the left, you see the likelihood that someone who is either a “liberal Democrat” or a “conservative Republican” and who is “low” in CRT (someone who got 0 answers correct—as was true for 60% of the sample; most people aren’t inclined to use System 2 reasoning, so that’s what you’d expect) would “agree” the CRT is a valid test of reflective and open-minded thinking in the three conditions.

Not surprisingly, there’s not any real disagreement in the control condition. But in the “skeptic biased” condition—in which subjects were told that those who don’t accept evidence of climate change tended to score low—low CRT liberal Democrats were much more likely to “agree” than were low CRT conservative Republicans. That’s a motivated reasoning effect.

Interestingly, there was no ideological division among low CRT subjects in the “nonskeptic biased” condition—the one in which subjects were told that those who “accept” evidence of climate change do worse.

But there was plenty of ideological disagreement in the “nonskeptic biased” condition among subjects who scored higher in CRT! There was only about a 25% likelihood that a liberal Democrat who was “high” in CRT (I simulated 1.6 answers correct—“87th percentile,” or +1 SD—for graphic expositional purposes) would agree that the CRT was valid if told that the test predicted “closed-mindedness” among those who “accept evidence” of climate change. There was a bit higher than a 50% chance, though, that a “high” CRT conservative Republican would.
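To make the “simulated” values concrete: predicted probabilities like those plotted in the figures can be generated from a fitted model of the kind sketched above roughly as follows (the values and names remain hypothetical).

```python
# Sketch: predicted probability of agreeing the CRT is valid, for a liberal
# Democrat vs. a conservative Republican, at "low" (0 correct) and "high"
# (+1 SD, about 1.6 correct) CRT, in each experimental condition.
from itertools import product
import pandas as pd

def predicted_agreement(result) -> pd.DataFrame:
    grid = pd.DataFrame(
        list(product(
            ["control", "skeptic_biased", "nonskeptic_biased"],  # condition
            [-1.0, 1.0],   # liberal Democrat vs. conservative Republican (standardized)
            [0.0, 1.6],    # "low" vs. "high" CRT
        )),
        columns=["condition", "conserv_repub", "crt"],
    )
    grid["p_agree"] = result.predict(grid)   # uses the formula from the earlier sketch
    return grid
```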

The positions of subjects like these flipped around in the “skeptic biased” condition.  That’s motivated reasoning.

It’s also motivated reasoning that grows stronger as subjects become more disposed to use systematic or System 2 reasoning, as measured by CRT.

That’s evidence consistent with hypotheses two and three.

The result is also consistent with the finding from the CCP Nature Climate Change study, which found that those who are high in science literacy and numeracy (a component of which is CRT) are the most culturally polarized on both climate change and nuclear power.  The basic idea behind the hypothesis is that in a “toxic science communication climate”—one in which positions on issues of fact become symbols of group identity—everyone has a psychic incentive to fit evidence to their group commitments. Those who are high in science literacy and technical reasoning ability are able to use those skills to get an even better fit. . . .

None of this, moreover, is consistent with the sort of evidence that drives the asymmetry thesis:

(1) There’s not a meaningful correlation here between partisan identity and one super solid measure of higher level cognitive reasoning.

(2) What’s more, higher-level reasoning doesn’t mitigate motivated reasoning. On the contrary, it aggravates it. So if motivated reasoning is the source of political conflict on policy-relevant science (a proposition that is assumed, basically, by proponents of the asymmetry thesis), then whatever correlation might exist between low-level cognitive reasoning capacity and conservatism can’t be the source of such conflict.

(3) In a valid experimental design, there’s motivated reasoning all around—not just on the part of Republicans.

But is the level of motivated reasoning in this experiment genuinely “symmetrical” with respect to Democrats and Republicans? Is the effect “uniform” across the ideological spectrum?

Frankly, I’m not sure that that question matters. There’s enough motivated reasoning across the ideological spectrum (and cultural spectra)—this study and others suggest—for everyone to be troubled and worried.

But the data do still have something to say about this issue. Indeed, they enable me to say something directly about it, because there are enough data to employ the right sorts of statistical tests (ones that involve fitting “curvilinear” or polynomial models rather than linear ones to the data).
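One natural way to implement a curvilinear test of this sort (not necessarily the exact specification used here) is to add a quadratic term for the ideology/party composite and let it interact with the experimental condition, then ask whether the quadratic terms improve on the linear model. The sketch below is illustrative only, reusing the hypothetical names from the earlier snippets.

```python
# Sketch of a curvilinear (quadratic) specification for probing asymmetry.
# A reliable quadratic-by-condition interaction would indicate that the
# motivated-reasoning effect is not uniform across the ideological spectrum.
import statsmodels.formula.api as smf

def fit_curvilinear_model(df):
    formula = "agree ~ C(condition) * (conserv_repub + I(conserv_repub ** 2)) * crt"
    return smf.logit(formula, data=df).fit(disp=False)
```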

But I’ve said enough for now, don’t you think?

I’ll discuss that another time (soon, I promise).

Post 1 & Post 3 in this “series”
