Guest post: Some weird things in measuring belief in human-caused climate change

From an honest-to-god real expert: a guest post by Matt Motta, a postdoctoral fellow associated with the Cultural Cognition Project and the Annenberg Public Policy Center. Matt discusses his recent paper, An Experimental Examination of Measurement Disparities in Public Climate Change Beliefs.

Do Americans Really Believe in Human-Caused Climate Change?

Matt Motta (@matt_motta)

Do most Americans believe that climate change is caused by human activities? And what should we make of recent reports (e.g., Van Boven & Sherman 2018) suggesting that self-identified Republicans largely believe in climate change?

Surprisingly, given the impressive amount of public opinion research focused on assessing public attitudes about climate change (see Capstick et al., 2015 for an excellent review), the number of Americans (and especially Republicans) who believe that climate change is human-caused is actually a source of popular and academic disagreement.

For example, scholars at the Pew Research Center have found that less than half of all Americans, and less than a quarter of Republicans, believe that climate change is caused by human activity (Funk & Kennedy 2016). In contrast, a team of academic researchers recently penned an op-ed in the New York Times (Van Boven & Sherman 2018; based on Van Boven, Ehret, & Sherman 2018) suggesting that most Americans, and even most Republicans, believe in climate change, including the possibility that it is human-caused.

In a working paper, my coauthors (Daniel Chapman, Dominik Stecula, Kathryn Haglin and Dan Kahan) and I offer a novel framework for making sense of why researchers disagree about the number of Americans (and especially Republicans) who believe in human-caused climate change. We argue that commonplace and seemingly minor decisions scholars make when asking the public questions about anthropogenic climate change can have a major impact on the proportion of the public who appears to believe in it.

Specifically, we focus on three common methodological choices researchers must make when asking these questions. First, scholars must decide whether they want to offer “discrete choice” or Likert-style response options. Discrete-choice responses force respondents to choose between alternative stances; e.g., whether climate change is human-caused or caused by natural factors. Likert-style response formats instead ask respondents to assess their level of agreement or disagreement with a particular statement; e.g., whether one agrees or disagrees that climate change is human-caused.

Likert-style responses can be subject to “acquiescence bias,” which occurs when respondents simply agree with statements, potentially to avoid thinking carefully about the question being asked. Discrete-choice response formats can reduce acquiescence bias, but allow for less granularity in expressing opinions about an issue. Whereas the Pew study mentioned earlier made use of discrete-choice response options, the aforementioned op-ed made use of Likert-style responses (and found comparatively higher levels of belief in anthropogenic climate change).

Second, researchers must choose whether to offer a hard or a soft “don’t know” (DK) response option. Hard DK options expressly give respondents the opportunity to report that they do not know how they feel about a certain question. Soft DK options, on the other hand, allow respondents to skip a question, but do not expressly advertise their ability to not answer it.

Hard DKs have the benefit of giving those who truly have no opinion about a particular prompt the opportunity to say so, rather than either guessing randomly or (especially with Likert-style questions) simply agreeing with the prompt. However, expressly offering a DK option runs the risk that respondents will simply indicate that they “don’t know” rather than engage more effortfully with the survey. Again drawing on the two examples described earlier, the comparatively pessimistic Pew study offered respondents a hard DK, whereas the work summarized in the New York Times op-ed did not.

Third, researchers have the ability to offer text that provides basic background information about complex concepts, including (potentially) anthropogenic climate change. This approach has the benefit of ensuring that respondents share a common level of understanding about an issue before answering questions about it. However, scholars must choose the words in these short “explainers” very carefully, as information presented there may influence how respondents interpret the question.

For example, the research summarized in the New York Times op-ed described climate change as being caused by “increasing concentrations of greenhouse gasses.” Although this text does not attribute greenhouse gas emissions to any particular human source, it is important to keep in mind that skeptics may see climate change as the result of factors having nothing to do with gas emissions (e.g., that the sun itself is responsible for increased temperatures). Consequently, this text could lead respondents toward providing an answer that better matches scientific consensus on anthropogenic climate change.

We test the impact of these three decisions on the measurement of anthropogenic climate change attitudes in a large, demographically diverse online survey of American adults (N = 7,019). Respondents were randomly assigned to answer one of eight questions about their belief in anthropogenic climate change, each representing a unique combination of the methodological decisions described above, holding all other factors constant.
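To make the design concrete, here is a minimal sketch in Python of how eight conditions arise from fully crossing the three binary design choices. The labels are illustrative placeholders, not the paper’s actual condition coding:

```python
import itertools
import random

# The three binary methodological choices discussed above.
# Labels here are illustrative, not the paper's exact coding.
response_formats = ["discrete_choice", "likert"]
dk_options = ["hard_dk", "soft_dk"]
explainers = ["no_explainer", "explainer"]

# Crossing the three binary factors yields the eight experimental conditions.
conditions = list(itertools.product(response_formats, dk_options, explainers))
assert len(conditions) == 8

def assign_condition(rng):
    """Randomly assign a respondent to one of the eight conditions."""
    return rng.choice(conditions)

rng = random.Random(0)
print(assign_condition(rng))  # e.g., ('likert', 'soft_dk', 'explainer')
```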

The results are summarized in the figure below. Hollow circles mark the proportion of respondents in each condition who report believing in human-caused climate change, with 95% confidence intervals extending outward from each one. The left-hand pane plots these quantities for the full sample, and the right-hand pane does the same for self-identified Republicans only. The elements varied in each experimental condition are listed in the text just below the figure.
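As a rough illustration of the quantities plotted (not the authors’ actual analysis code), the proportion of believers in a condition and its normal-approximation (Wald) 95% confidence interval can be computed like this, using hypothetical counts:

```python
import math

def proportion_ci(believers, total, z=1.96):
    """Share of believers in a condition, with a Wald 95% confidence interval."""
    p = believers / total
    se = math.sqrt(p * (1 - p) / total)
    return p, (p - z * se, p + z * se)

# Hypothetical counts for a single condition (illustrative only;
# 7,019 respondents split across 8 conditions is roughly 877 each).
p, (lo, hi) = proportion_ci(believers=440, total=880)
print(f"{p:.1%} [{lo:.1%}, {hi:.1%}]")  # 50.0% [46.7%, 53.3%]
```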

Generally, the results suggest that minor differences in how we ask questions about anthropogenic climate change can increase the number of Americans (especially Republicans) who appear to believe in it. For example, Likert-style response options (conditions 5–8) always produce higher estimates of belief among Americans and Republicans than discrete-choice questions (conditions 1–4).

At times, these differences are quite dramatic. For example, Condition 1 mimics the way Pew (i.e., Funk & Kennedy 2016) asks questions about anthropogenic climate change, using discrete-choice questions that offer a hard DK option and no “explainer” text. This method suggests that 50% of Americans, and just 29% of Republicans, believe that climate change is caused by human activities.

Condition 8, on the other hand, mimics the method used in the research reported in the aforementioned op-ed, featuring Likert-style response options, text explaining that climate change is caused by the greenhouse effect, and no explicit DK option. In sharp contrast, this method finds that 71% of Americans and 61% of Republicans believe that climate change is human-caused. This means that the methods used in Condition 8 more than double the number of Republicans who appear to believe in human-caused climate change.

We think that these results offer readers a useful framework for making sense of public opinion about anthropogenic climate change. Our research urges readers to pay careful attention to the way in which public opinion researchers ask questions about anthropogenic climate change, and to consider how those decisions might increase (or decrease) the number of Americans who appear to believe in it. Of course, we do not propose a single measurement strategy as a “gold standard” for assessing opinion about anthropogenic climate change. Instead, we hope that these results can help readers become better consumers of public opinion research about climate change.

References

Capstick, S., Whitmarsh, L., Poortinga, W., Pidgeon, N., & Upham, P. (2015). International trends in public perceptions of climate change over the past quarter century. Wiley Interdisciplinary Reviews: Climate Change, 6(1), 35-61.

Ehret, P. J., Van Boven, L., & Sherman, D. K. (2018). Partisan barriers to bipartisanship: Understanding climate policy polarization. Social Psychological and Personality Science, 1948550618758709.

Funk, C., & Kennedy, B. (2016, October 4). The politics of climate. Pew Research Center. Retrieved from http://www.pewinternet.org/2016/10/04/the-politics-of-climate/

Van Boven, L., & Sherman, D. K. (2018, July 28). Actually, Republicans do believe in climate change. New York Times.

Van Boven, L., Ehret, P. J., & Sherman, D. K. (2018). Psychological barriers to bipartisan public support for climate policy. Perspectives on Psychological Science, 13(4), 492-507.
