Against Goldilocks "Theorizing" (on Climate Change Risk Perception & Anything Else)

We are often told that "dire news" on climate change provokes dissonance-driven resistance.

Yet many commentators who credit this account also warn us not to raise public hopes by even engaging in research on -- much less discussion of -- the feasibility of geoengineering. These analysts worry that any intimation that there's a technological "fix" for global warming will lull the public into a sense of false security, dissipating political resolve to clamp down on CO2 emissions.

So one might infer that what's needed is a "Goldilocks strategy" of science communication -- one that conveys neither too much alarm nor too little but instead evokes just the right mix of fear and hope to coax the democratic process into rational engagement with the facts.

Or one might infer that what's needed is a better theory -- or simply a real theory -- of public opinion on climate change.

Here's a possibility: individuals form perceptions of risk that reflect their cultural commitments.

Here's what that theory implies about "dire" and "hopeful" information on climate change: what impact it has will be conditional on what response -- fear or hope, reasoned consideration or dismissiveness -- best expresses the particular cultural commitments individuals happen to have.

And finally, here's some evidence from an actual empirical study (conducted with both US & UK samples) designed to test this conjecture:

  • When individuals are furnished with a "dire" message -- that substantial reductions in CO2 emissions are essential to avert catastrophic effects for the environment and human well-being -- they don't react uniformly. Hierarchical individualists, who have strong pro-commerce and pro-technology values, become more dismissive of scientific evidence relating to climate change. Egalitarian communitarians, however, who view commerce and industry as sources of unjust social disparities, react to the same information by crediting that evidence even more forcefully.
     
  • Likewise, individuals don't react uniformly when furnished with "hopeful" information about the contribution that geoengineering might make to mitigating the consequences of climate change. Egalitarian communitarians -- the ones who ordinarily are most worried -- become less inclined to credit scientific evidence that climate change is a serious problem. The normally skeptical hierarchical individualists, in contrast, respond to the same information about geoengineering by crediting such evidence more.

Am I saying that this account is conclusively established & unassailably right, that everything else one might say in addition or instead is wrong, and that therefore this, that, or the other thing ineluctably follows about what to do and how to do it? No, at least not at the moment.

The only point, for now, is about Goldilocks. When you see her, watch out.

Decision science has supplied us with a rich inventory of mechanisms. Afforded complete freedom to pick and choose among them, any analyst with even a modicum of imagination can explain pretty much any observed pattern in risk perception and thus invest whatever communication strategy strikes his or her fancy with a patina of "empirical" support.

One way to avoid being taken in by this type of faux explanation is to be very skeptical about Goldilocks. Her appearance -- the need to engage in ad hoc "fine tuning" to fit a theory to seemingly disparate observations -- is usually a sign that someone doesn't actually have a valid theory and is instead abusing decision science by mining it for tropes to construct just-so stories motivated (consciously or otherwise) by some extrinsic commitment.

The account I gave of how members of the public react to information about climate change risks didn't involve adjusting one dial up and another down to try to account for multiple offsetting effects.

That's because on this account there really aren't offsetting effects here. There's only one effect: the crediting of information in proportion to its congeniality to cultural predispositions.

The account is open to empirical challenge, certainly.  But that's exactly the problem with Goldilocks theorizing: with it anything can be explained, and thus no conclusion deduced from it can be refuted.
