With the international political, financial and reputational stakes so high, it was only a matter of time before climate change appeared in the dock, handcuffed to its partner in prognostication, the dodgy discipline of extreme weather attribution.
Attribution, n., the art of evaluating the relative contributions of multiple causal factors to a change or an event, according to one’s prejudices.
To make sense of the climate change scene today, it is best to begin with the end game: the orthodoxy’s search for an argument, however abstruse, that will stand up in court. It needs one sufficiently “robust” to ensure that developed countries—still effectively on trial in the United Nations, where a protracted “loss and damage” claim awaits resolution—and fossil fuel companies are legally liable to pay multi-billion-dollar “climate reparations” to the alleged victims of “carbon pollution”, whether in the developing world or in the path of a natural disaster.
Indeed, the credibility of the “relatively young science” of extreme weather attribution, the legitimacy of its ambition to “tease out the influence of human-caused climate change from other factors”, the whole alarmist movement and the fate of the UN’s Green Climate Fund all crucially depend on delivering such a legal argument.
How did we get to this point? When the climate change meme was successfully planted in the collective mind a decade ago as the most serious existential threat facing humankind, the orthodoxy wanted it to stay there. A sense of public anxiety had to be maintained, despite the risk of apocalypse fatigue syndrome.
So it created an Attribution of Climate-related Events (ACE) initiative. The international research agenda gradually shifted to the tricky territory of extreme weather attribution.
ACE’s first workshop was held on January 26, 2009, in Boulder, Colorado, at the Pei-designed National Center for Atmospheric Research (NCAR) Mesa Lab. Attendees included Myles Allen (Oxford University), Martin Hoerling (NOAA, USA), Peter Stott (UK Met Office, Hadley Centre), Kevin Trenberth (NCAR) and David Karoly (University of Melbourne). Its objective was to:
develop a conceptual framework for attribution activities to be elevated in priority and visibility, leading to substantial increases in resources (funds, people, computers) and both a research activity and a framework for an “operational” activity, that sets forth a goal of providing a lot more concrete information in near real time about what has happened and why in weather and climate.
ACE later released a four-paragraph statement. Its mission would be: “to provide authoritative assessments of the causes of anomalous climate conditions and EWEs” (extreme weather events), presumably for government agencies and the Intergovernmental Panel on Climate Change’s 2013/2014 Fifth Assessment Report (AR5).
But just how “robust”—one of the orthodoxy’s favourite adjectives—was the climate modelling underpinning this grand design? How could it be sold to the public, given the challenging uncertainties? ACE participants agreed they would need “increased real-time numerical experimentation activity” and something else too: a narrative that would ensure public interest.
To succeed, everyone would have to sing from the same song-sheet. There would have to be consistent use of terminology and close collaborative teamwork “to maintain an authoritative voice when explaining complex multi-factorial events such as the recent Australian bushfires” (my italics).
via The Global Warming Policy Forum (GWPF)
March 31, 2018 at 09:46AM