Wild wild horses couldn’t drag me away: four “principles” for science communication and policymaking

Was invited to give a presentation on “effective science communication” for the National Academy of Sciences/National Research Council committee charged with preparing a report on wild horse & burro population management.

I happily accepted, for two reasons.

First, it really heartens and thrills me that the NAS gets the importance of integrating science and policymaking, on the one hand, with the science of science communication on the other. Indeed, as the NAS’s upcoming Sackler Colloquium on the Science of Science Communication attests, NAS is leading the way here. 

Second, it only took me about 5 minutes of conversation with Kara Laney, the NAS Program Officer who is organizing the NRC committee’s investigation of wild horse population management, to persuade me that the science communication dimension of this issue is fascinating. The day I spent at the committee’s meeting yesterday corroborated that judgment.

Not knowing anything about the specifics of wild-horse population management (aside from what everyone picks up just from personal experience & anecdote, etc), I confined myself to addressing research on the “science communication problem” — the failure of ample and widely disseminated science to quiet public dispute over policy-relevant facts that admit of scientific investigation. Like debates over climate change, HPV vaccination, nuclear power, etc., the dispute over wild-horse management falls squarely into that category.

After summarizing some illustrative findings (e.g., on the biasing impact of cultural outlooks on perceptions of scientific consensus; click on image for slides), I offered “four principles”:

First, science communication is a science.

Seems obvious–especially after someone walks you through 3 or 4 experiments — but in fact, the assumption that sound science communicates itself is the origin of messes like the one over climate change. As I said, NAS is now committed to remedying the destructive consequences of this attitude, but one can’t overemphasize how foolish it is to invest so much in policy-relevant science and then adopt a wholly ad hoc anti-scientific stance toward the dissemination of it.

Second, “science communication” is not one thing; it’s 5 (± 2).

Until recent times, those who thought systematically about science communication were interested either in helping scientists learn to speak in terms intelligible to curious members of the public or in training science journalists to understand and accurately decipher scientists’ unintelligible pronouncements.

These are important things. But the idea that inarticulate scientists or bad journalists caused the climate change controversy, say, or that making scientists or journalists better communicators will solve that or other problems involving science and democratic decisionmaking is actually a remnant of the unscientific conception of science communication– a vestige, really, of the idea that “facts speak for themselves,” just so long as they are idiomatic, grammatical, etc.

As I explained in my talk, the disputes over climate change, the HPV vaccine, nuclear power, and gun control are not a consequence of a lack of clarity in science or a lack of science comprehension on the part of ordinary citizens.

The source of those controversies is a form of pollution in the science communication environment: antagonistic social meanings that get attached to facts and that interfere with the normally reliable capacity of ordinary people to figure out what’s known (usually by identifying who knows what about what).

Detoxifying the science communication environment and protecting it from becoming contaminated in the first place is thus another kind of “science communication,” one that has very little to do with helping scientists learn to avoid professional jargon when they give interviews to journalists, who themselves have been taught how to satisfy the interest that curious citizens have to participate in the thrill and wonder of our collective intelligence.

Those two kinds of science communication, moreover, are different from the sort that an expert like a doctor or a financial planner has to engage in to help individuals make good decisions about their own lives. The emerging scientific insights on graphic presentation of data etc. also won’t help fix problems like the one over climate change.

Still another form of science communication is the sort that is necessary to enable policymakers to make reliable and informed decisions under conditions of uncertainty. The NAS is taking the lead on this too — and isn’t laboring under the misimpression that what causes climate change controversy is the “same thing” that has made judges accept fingerprints and other bogus forms of forensic proof.

Finally, there is stakeholder science communication — the transmission of knowledge to ordinary citizens who are intimately affected by and who have (or are at least entitled to have) a say in collective decisionmaking. That’s mainly what the decisionmaking process surrounding the wild-horse population is about. There are scientific insights there, too– ones having very little to do with graphic presentation of data or with good writing skills or with the sort of pollution problem that is responsible for climate change.

Third, “don’t ask what science communication can do for you; ask what you can do for science communication.”

Having just told the committee that their “science communication problem” is one distinct from four others, I anticipated what I was sure would be their next question: “so what do we do?”

Not surprisingly, that’s what practical people assigned to communicate always ask when they are engaging scholars who use scientific methods to study science communication. They want some “practical” advice–directions, instructions, guidelines.

My answer is that they actually shouldn’t be asking me or any other science-communication researcher for “how to” advice. And that they should be really really really suspicious of any social scientist who purports to give it to them; odds are that person has no idea what he or she is talking about.

Those who study science communication scientifically know something important and consequential, I’m convinced, about general dynamics of risk perception and science communication. But we know that only because we have investigated these matters in controlled laboratory environments– ones that abstract from real-world details that defy experimental control and confound interpretation of observations.

Studies, in other words, are models. They enable insight that one couldn’t reliably extract from the cacophony of real-world influences. Those insights, moreover, have very important real-world implications once extracted. But they do not themselves generate real-world communication materials.

The social scientists who don’t admit this usually end up offering banalities, like “Know your audience.”

That sort of advice is based on real, and really important, psychological research. But it’s pretty close to empty precisely because it’s (completely) devoid of any knowledge of the particulars of the communication context at hand (like what characteristics genuinely define the “audience” that is to be known, and what there actually is to “know” about it).

The practical communicators — the ones asking to be told what to do — are the people who have that knowledge. So they are the ones who have to use judgment to translate the general insights into real-world communication materials.

Experimentalists are not furnishing communicators with “shovel ready” construction plans. Rather they are supplying the communicators with reliable maps that tell them where they should dig and build through their own practical experimentation.

Once that process of experimental adaptation starts, moreover, the social scientist should then again do what she knows how to do: measure things.

She should be on hand to collect data and find out which sorts of real-world applications of knowledge extracted in the lab are actually working and which ones aren’t. She can then share that new knowledge with more people who have practical knowledge about other settings that demand intelligent science communication — and the process can be repeated.

And so forth and so on. Until what comes out is not a “how to” pamphlet but a genuine, evolving repository filled with vivid case studies, protocols, data collection and analysis tools and the like.

If you ask me for a facile check list of do’s & don’ts, I won’t give it to you.

Instead, I’ll stick a baton of reliable information in your hand, so you run the next lap in the advancement of our knowledge of how to communicate science in a democracy. I’ll even time you!

Fourth, science communication is a public good.

Clean air and water confer benefits independent of individuals’ contributions to them. Indeed, individuals’ personal contributions to clean air and water tend not to benefit them at all — it’s what others, en masse, are doing that determines whether the air and water are clean.

Same thing with the science communication environment. We all benefit when ordinary citizens form accurate judgments about what the best evidence is on issues like climate change. Accordingly, we all benefit when we live in an information environment free of toxic social meanings. But the judgments any ordinary person forms, and the behavior he or she engages in that amplifies or mutes toxic meanings — those have zero impact on him or her.

As a result, he or she and every other individual like him or her won’t have sufficient incentive to contribute. There has to be collective provisioning of such goods.

We need government policy for protection of the science communication environment every bit as much as we need it to protect the physical environment.

There’s an important role for key entities in civil society too — like universities and foundations.

NAS is modeling the active, collective provisioning of this good.  Many others must now follow its lead!
