February 11, 2012

John Cook

Green Army: Persons of Interest





The Expert Consensus on Climate Change


In recent years, two studies have measured the level of agreement on human-caused warming in the scientific community. …
The first analysis of this type was by Naomi Oreskes, who in 2004 analysed publications in the Web of Science between 1993 and 2003 matching the search term 'global climate change'.
She found that out of 928 papers, none rejected the consensus position that humans are causing global warming.
Our paper builds upon this research.

We expanded the search to cover the 21 years from 1991 to 2011.
In addition to 'global climate change' papers, we also included papers matching the term 'global warming'.
This expanded the number of papers to over 12,000. …
Each abstract was [then] classified according to whether it explicitly or implicitly endorsed or rejected human-caused global warming, or whether it took no position on the cause of warming.

Out of the 12,000 papers, we identified just over 4,000 stating a position on human-caused global warming.
Among these 4,000 papers, 97.1% endorsed the consensus position that humans are causing global warming.

In the second phase of our study, we asked the scientists who authored the studies to rate their own papers.
1,200 scientists responded to our invitation, and just over 2,000 papers in total received a self-rating.
Among the papers that were self-rated as stating a position on human-caused warming, 97.2% endorsed the consensus. …
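As a back-of-envelope check of the headline percentage, a minimal sketch; the counts below are assumed approximations of the published figures, not values quoted in this summary:

```python
# Back-of-envelope check of the consensus percentage.
# The counts are assumed approximations of the published
# figures, not values quoted in this summary.
endorse = 3896    # abstracts endorsing human-caused warming (assumed)
reject = 78       # abstracts rejecting it (assumed)
uncertain = 40    # abstracts uncertain about the cause (assumed)

with_position = endorse + reject + uncertain
consensus = 100 * endorse / with_position
print(f"{with_position} abstracts stating a position; "
      f"{consensus:.1f}% endorse the consensus")
```

With these assumed counts, the sketch reproduces the figures above: just over 4,000 abstracts stating a position, of which 97.1% endorse the consensus.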



However, there is a significant gap between public perception and the actual 97% consensus.
When a US representative sample was asked how many scientists agree that humans are causing global warming, the average answer was around 50%. …




When people correctly understand that scientists agree on human-caused global warming, they're more likely to support policy that mitigates climate change.
This consensus gap is directly linked to a lack of public support for climate action.
This underscores the importance of clearly communicating the consensus and closing the consensus gap.

(John Cook, Video Abstract, Environmental Research Letters, 2013)

Would you like to know more?



John Cook


Cognitive Scientist, Climate Change Communication Research Hub, Monash University.

  • Inoculate against misinformation, ABC Sunday Extra, 30 May 2021.
  • Why Curry, McIntyre, and Co are Still Wrong about IPCC Climate Model Accuracy, Skeptical Science, 4 October 2013.
  • Quantifying the consensus on anthropogenic global warming in the scientific literature, Environmental Research Letters, 15 May 2013.
    DOI: 10.1088/1748-9326/8/2/024024.
    John Cook, Dana Nuccitelli, Sarah Green, Mark Richardson, Bärbel Winkler, Rob Painting, Robert Way, Peter Jacobs & Andrew Skuce.
  • The Debunking Handbook, Version 2, Skeptical Science, 23 January 2012.
    John Cook.
    Stephan Lewandowsky (1958): School of Psychology, University of Western Australia.

    Debunking the first myth about debunking


    Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct.
    To avoid these "backfire effects", an effective debunking requires three major elements.
    • First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar.
    • Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false.
    • Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.

    [Democratic societies depend] on accurate information. …

    A common misconception [is] that public misperceptions are due to a lack of knowledge and[, therefore,] the solution is more information [— the so-called] “information deficit model”. …

    [To be effective] communicators need to understand
    • how people process information,
    • how they modify their existing knowledge and
    • how worldviews affect their ability to think rationally.
    It’s not just what people think that matters, but how they think.

    [“Misinformation” refers] to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place.
    We are concerned with the cognitive processes that govern how people process corrections to information they have already acquired …
    [If] you find out that something you believe is wrong, how do you update your knowledge and memory? …

    [In] a 1994 experiment … people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect.
    Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story. …

    The evidence indicates that no matter how vigorously and repeatedly we correct the misinformation … the influence remains detectable.

    [Indeed, attempts to debunk] a myth can actually strengthen it in people’s minds.
    Several [such] “backfire effects” have been observed, arising
    • from making myths more familiar,
    • from providing too many arguments, or
    • from providing evidence that threatens one’s worldview.
    (p 1)


    The Familiarity Backfire Effect


    [People] were shown a flyer that debunked common myths about flu vaccines.
    Afterwards, they were asked to separate the myths from the facts.
    When asked immediately after reading the flyer, people successfully identified the myths.
    However, when queried 30 minutes after reading the flyer, some people actually scored worse than before reading it.
    The debunking reinforced the myths.

    Ideally, avoid mentioning the myth altogether while correcting it.
    When seeking to counter misinformation, the best approach is to focus on the facts you wish to communicate.

    [If] not mentioning the myth [is] not a practical option …
    Your debunking should begin with emphasis on the facts, not the myth.
    Your goal is to increase people’s familiarity with the facts.
    (p 2)


    The Overkill Backfire Effect


    When … refuting misinformation, less can be more.
    Generating three arguments … can be more successful in reducing misperceptions than generating twelve arguments, which can end up reinforcing the initial misperception.

    [This] occurs because processing many arguments takes more effort than just considering a few.
    A simple myth is more cognitively attractive than an over-complicated correction.
    • [Keep your content] easy to process [by using] simple language, short sentences, subheadings and paragraphs.
    • Avoid dramatic language and derogatory comments [ie, s]tick to the facts.
    • [Use graphics wherever possible …]
    • End on a strong and simple message …
    Writing at a simple level runs the risk of sacrificing the complexities and nuances[; hence,] Skeptical Science [publishes] rebuttals at several levels.
    Basic versions are written using short, plain English text and simplified graphics.
    Intermediate and Advanced versions are also available, written with more technical language and detailed explanations.
    (p 3)


    Filling the gap with an alternative explanation


    [In] an experiment in which people read a fictitious account of a warehouse fire [mention] was made of paint and gas cans along with explosions.
    Later … it was clarified that paint and cans were not present at the fire.
    Even when people remembered and accepted this correction, they still cited the paint or cans when asked questions about the fire.
    When asked,
    “Why do you think there was so much smoke?”,
    people routinely invoked the oil paint despite having just acknowledged it as not being present.
    [When an alternative explanation involving lighter fluid and accelerant was provided, people were less likely to cite the paint and gas cans when queried about the fire.]

    [People] prefer an incorrect model over an incomplete model.
    In the absence of a better explanation, they opt for the wrong explanation. …
    The most effective way to reduce the effect of misinformation is to provide an alternative explanation …

    [Similarly,] in fictional murder trials [accusing] an alternative suspect greatly reduced the number of guilty verdicts … compared to defences that [only] explained why the defendant wasn’t guilty. …
    When you debunk a myth, you create a gap in the person’s mind.
    To be effective, your debunking must fill that gap.

    [The] most effective [rebuttal structure combines] an alternative explanation [with] an explicit warning [given before mentioning the myth]. …

    When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation.
    Graphics provide more clarity and less opportunity for misinterpretation.
    When self-identified Republicans were surveyed about their global warming beliefs, a significantly greater number accepted global warming when shown a graph of temperature trends compared to those who were given a written description.
    (p 4)


    The Worldview Backfire Effect


    For those who are strongly fixed in their views, being confronted with counter-arguments can cause their views to be strengthened.

    One cognitive process that contributes to this effect is Confirmation Bias …
    In one experiment, people were offered information on … gun control … labelled by its source … eg, the National Rifle Association vs. Citizens Against Handguns …
    [When] presented with a balanced set of facts, [the subjects reinforced] their pre-existing views by gravitating towards information they already agreed with.
    The polarisation was greatest among those with strongly held views.

    [When presented] with arguments that run counter to their worldview … the cognitive process that comes to the fore is Disconfirmation Bias … where people spend significantly more time and thought actively arguing against opposing arguments.

    [When] Republicans who believed Saddam Hussein was linked to the 9/11 terrorist attacks were provided with evidence that there was no link between the two [only] 2% of participants changed their mind (although interestingly, 14% denied that they believed the link in the first place).
    The vast majority clung to the link between Iraq and 9/11 …
    The most common response was attitude bolstering — bringing supporting facts to mind while ignoring any contrary facts [— resulting in a] strengthening [of] erroneous belief. …

    [This] suggests that
    • outreach should be directed towards the undecided majority rather than the unswayable minority.
    • [messages] be presented in ways that reduce the usual psychological resistance.
    • For example,
      • when worldview-threatening messages are coupled with … self-affirmation [people are more receptive to messages that might otherwise threaten their worldviews.]
        [This] can be achieved by asking people to write a few sentences about a time when they felt good about themselves because they acted on a value that was important to them. …
        [This] “self-affirmation effect” is strongest among those whose ideology was central to their sense of self-worth.
      • [“Framing” the information] in a way that is [least] threatening …
        For example, Republicans are far more likely to accept an otherwise identical charge as a “carbon offset” than as a “tax”, whereas the wording has little effect on Democrats or Independents — because their values are not challenged by the word “tax”.
    (p 4)


    Anatomy of an effective debunking


    Core facts


    [A] refutation should emphasise the facts, not the myth:
    [97 out of 100 climate experts agree humans are causing global warming.
    Several independent surveys find 97% of climate scientists who are actively publishing peer-reviewed climate research agree that humans are causing global warming.]
    Present only key facts to avoid an Overkill Backfire Effect.


    Explicit warnings


    [Before] any mention of a myth, text or visual cues should warn that the upcoming information is false:
    [However, movements that deny a scientific consensus have always sought to cast doubt on the fact that a consensus exists.
    One technique is the use of fake experts, citing scientists who have little to no expertise in the particular field of science.
    For example, the OISM Petition Project claims 31,000 scientists disagree with the scientific consensus on global warming.]

    Alternative explanation


    [Any] gaps left by the debunking need to be filled:
    [However, around 99.9% of the scientists listed in the Petition Project are not climate scientists.
    The petition is open to anyone with a Bachelor of Science or higher and includes medical doctors, mechanical engineers and computer scientists.]
    This may be achieved by providing an alternative causal explanation for why the myth is wrong and, optionally, why the misinformers promoted the myth in the first place.


    Graphics


    [Core] facts should be displayed graphically if possible.
    (p 6)
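    The four elements above (core facts, explicit warning, the myth itself, alternative explanation) can be sketched as a simple template; the class and field names below are illustrative assumptions, not taken from the Handbook:

```python
# Minimal sketch of the Handbook's recommended rebuttal structure.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Rebuttal:
    core_fact: str    # lead with the fact, not the myth
    warning: str      # explicit cue that what follows is false
    myth: str         # mention the myth once, briefly
    alternative: str  # fill the gap the debunking creates

    def render(self) -> str:
        # Order matters: fact first, warning before the myth,
        # alternative explanation last.
        return (f"{self.core_fact}\n"
                f"{self.warning} {self.myth}\n"
                f"{self.alternative}")

example = Rebuttal(
    core_fact="97% of publishing climate scientists agree humans "
              "are causing global warming.",
    warning="A common but false claim:",
    myth="the OISM petition shows 31,000 scientists disagree.",
    alternative="Around 99.9% of the petition's signatories are "
                "not climate scientists.",
)
print(example.render())
```

    A graphic of the core fact would accompany the rendered text where possible, per the Graphics element above.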

    Would you like to know more?

  • An analysis of climate change denial, ABC Science Show, 14 May 2011.
  • Climate Change Denial: Heads in the Sand, Earthscan, London, 2011.
    Hadyn Washington: Environmental Scientist.
    John Cook.

    The Five Types Of Climate Denial Argument


    Conspiracy Theories


    Fake Experts


    Impossible Expectations


    Misrepresentations and Logical Fallacies


    Cherry-picking


    Do We Let Denial Prosper?


    Psychological Types of Denial

    1. Literal denial — the assertion that something did not happen or is not true. …
    2. Interpretive denial — in which the facts themselves are not denied, but they are given a different interpretation. …
    3. Implicatory denial — where what is denied are the "psychological, political or moral implications. …
      [Knowledge] itself is not at issue, but doing the 'right' thing with the knowledge." …

    (p 98)

  • The Scientific Guide to Global Warming Skepticism, Skeptical Science, December 2010.

    [Human 'Fingerprints' of Climate Change

    1. Fossil fuel signature in the air and coral — declining C13:C12 ratios due to release of C12 from burning fossil fuels. (p 2)
    2. Less heat is escaping out to space — trapping of heat by greenhouse gases (GHGs) and low cloud (p 3).
    3. The ocean warming pattern — warming is greatest on the surface and declines with increasing depth (p 4).
    4. Nights warming faster than days — GHGs slow night time heat loss into space (p 5).
    5. Australian annual-average daily maximum temperatures have increased by 0.75 °C [and] overnight minimum temperatures have warmed by more than 1.1 °C since 1910.
      (CSIRO/BOM, 2012, p 2)
    6. More heat is returning to Earth — heat trapped by GHGs is reradiated back to Earth (p 7).
    7. Winter warming faster — GHGs slow winter heat loss (p 9).
    8. Cooling upper atmosphere — the stratosphere would be warming if increased solar irradiance were responsible; instead it is cooling (p 10).]

  • Climate Misinformers, Skeptical Science.