Pew Research Center, which to my mind is the best outfit that regularly performs U.S. public opinion surveys (the GSS & NES are the best longitudinal data sets for scholarly research; that’s a different matter), issued a super topical report finding that a “majority” — 56% — of the U.S. general public deems it “acceptable” (41% “unacceptable”) for the “NSA [to be] getting secret court orders to track calls of millions of Americans to investigate terrorism.”
Polls like this — ones that purport to characterize what the public “thinks” about one or another hotly debated national policy issue — are done all the time.
It’s my impression — from observing how the surveys are covered in the media and blogosphere — that people who closely follow public affairs regard these polls as filled with meaning (people who don’t closely follow public affairs are unlikely to notice the polls or express views about them). These highly engaged people infer that such surveys indicate how people all around them are reacting to significant and controversial policy issues. They also assume that the public sentiment such surveys purport to measure is itself likely to be of consequence in shaping the positions that political actors in a democracy take on those policies.
Those understandings of what such polls mean strike me as naive.
The vast majority of the people being polled (assuming they are indeed representative of the U.S. population; in Pew’s case, I’m sure they are, but that clearly isn’t so for a variety of other polling operations, particularly ones that use unstratified samples recruited in haphazard ways; studies based on Mechanical Turk workers, for example) have never heard of the policy in question. They’ve never given it a moment’s thought. Their answers are pretty much random — or at best a noisy indicator of partisan affiliation, if they are able to grasp what the partisan significance of the issue is (most people aren’t very partisan and can’t reliably grasp the partisan significance of issues that aren’t high-profile, perennial ones, like gun control or climate change).
There’s a vast literature on this in political science, and it consistently shows that the vast majority of the U.S. public has precious little knowledge of even the most basic political matters. (Pew — which usually doesn’t do tabloid-style “issue du jour” polling but rather really interesting studies of what the public actually knows — regularly issues surveys that measure public knowledge of politics, too.)
To illustrate, here’s something from the survey I featured in yesterday’s post. The survey was performed on a nationally representative on-line sample, assembled by YouGov with recruitment and stratification methods that have been validated in a variety of ways and generate results that Nate Silver gives 2 (+/- 0.07) thumbs up to.
In the survey, I measured the “political knowledge” of the subjects, using a battery of questions that political scientists typically use to assess how civically engaged & aware people are.
One of the items asks:
How long is the term of office for a United States Senator? Is it
(a) two years
(b) four years
(c) five years or
(d) six years?
Here are the results:

[Figure: distribution of responses to the Senate-term question.]
Got that? Only about 50% of the U.S. population says “6 yrs” is the term of a U.S. Senator (a result very much in keeping with what surveys asking this question generally report).
How should we feel about half the population not knowing the answer to this question?
Well, before you answer, realize that less than 50% actually know the answer.
If the survey respondents here had been blindly guessing, 25% would have said 6 yrs. So we can be confident the proportion who picked 6 yrs because they knew that was the right answer was less than 50% (how much less? I’m sure there’s a mathematically tractable way to form a reasonable estimate — anyone want to tell us what it is and what figure applying it yields here?).
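For what it’s worth, here is one standard back-of-the-envelope answer (my own sketch, not anything in Pew’s or YouGov’s methodology): assume everyone who doesn’t know the answer guesses uniformly at random among the four options, so the observed correct rate p equals k + (1 − k)/4, where k is the fraction who truly know, and solve for k.

```python
# Minimal sketch of the classic "correction for guessing," under the
# stylized assumption that everyone who doesn't know the answer
# guesses uniformly at random among the four response options.

def fraction_who_know(observed_correct: float, n_options: int = 4) -> float:
    """Solve observed = k + (1 - k)/n_options for k, the fraction who truly know."""
    chance = 1.0 / n_options
    return (observed_correct - chance) / (1.0 - chance)

print(fraction_who_know(0.50))  # 0.333... -- i.e., only about 1 in 3 truly know
```

Applied to the roughly 50% observed here, that yields about one in three, and even that figure leans on the uniform-guessing assumption; if non-knowers gravitate toward a plausible-sounding wrong answer like “four years,” the true fraction could differ.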
And now just answer this question: why on earth would anyone think that even a tiny fraction of a sample in which fewer than half the members know something as basic as the length of a U.S. Senator’s term (and in which only 1/3 can name their congressional Representative, and only 1/4 can name both of their Senators…) has ever heard of the “NSA’s phone tracking” policy before being asked about it by the pollster?
Or to put it another way: when advised that “x% of the American public believes y about policy z,” why should we think we are learning anything more informative than what a pollster discovered from the opinion-survey equivalent of tossing thousands and thousands of coins in the air and carefully recording which sides they landed on?
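To see why the coin-toss analogy isn’t just rhetorical, here is a toy simulation (mine, not anything drawn from Pew’s data): give 1,500 hypothetical “respondents” a yes/no policy question that they answer by coin flip, and you still get a precise-looking topline.

```python
import random

# Toy illustration: 1,500 "respondents" answer a yes/no policy question
# by coin flip. The resulting percentage looks like a real finding,
# complete with a small margin of error, yet measures nothing.
random.seed(1)
n = 1500
yes = sum(random.random() < 0.5 for _ in range(n))
pct = 100 * yes / n
moe = 100 * 1.96 * (0.25 / n) ** 0.5  # 95% margin of error at p = 0.5
print(f"{pct:.1f}% 'find the policy acceptable' (+/- {moe:.1f} points)")
```

The tight margin of error certifies only that the sample mirrors the population; it says nothing about whether the “opinion” being sampled exists in anyone’s head.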
Yes, Ms./Mr. NYT editorial writer. And that’s the same majority (pretty much the same percentage, in fact) that is “untroubled” by the risks associated with nanotechnology, which 80% of the U.S. population has never heard of.
It’s actually still possible to say serious things about “threats to our Democracy” if one bases one’s opinions on a genuine understanding of mass public opinion and how it shapes political decision-making.
Indeed, whether one manages to say anything meaningful when one relies on cartoonish pictures of those things instead is at best a coin toss.