No-one Does Wrong Quite Like Lewandowsky

 

Uncertainty as Knowledge

Let’s say, for the sake of argument, that you wake up one morning with a burning desire to learn everything there is to know about uncertainty and its relationship to knowledge. And let’s say, for the sake of extending this rhetorical ploy, that you want to make sure that your desire is fulfilled without fear of being misinformed. Wary of Wikipedia, you turn instead to the Philosophical Transactions of the Royal Society A – Mathematical, Physical and Engineering Sciences. To your delight, you find a special issue containing not one but a complete set of papers introduced under the title ‘Uncertainty as Knowledge’. Finally, let us say, for the sake of dramatic irony, that you have never heard of the author, Stephan Lewandowsky, and so there is nothing to suggest that you have done anything other than stumble upon a mother lode of expert wisdom.

Now, for the sake of the unexpected twist, let’s try repeating the story but, this time, replace the eager student with a hard-bitten sceptic who had recently written an article bemoaning Peter Gleick’s treatment of the subject of uncertainty, and who was just looking to see where else such misdirection could be found. Surely not within the pages of the Philosophical Transactions of the Royal Society? Well, let me tell you what I discovered so that you can judge for yourselves.

Experts on Parade

There are actually ten papers in the Royal Society’s set, not including the introductory paper written by Stephan Lewandowsky, Timothy Ballard and Richard D. Pancost. This means that there is far too much material for me to critique in just the one post. Consequently, I will concentrate here on the Lewandowsky et al introduction, together with the first of the papers it introduces – namely, a paper written by Mark Freeman et al, titled ‘Climate sensitivity uncertainty: when is good news bad?’ I may follow up by writing a review of the remaining nine papers but only if nothing more interesting distracts me in the coming few days.

In a nutshell, Lewandowsky’s introductory paper explains that, far from being an excuse to delay climate change action, uncertainty is the reason why one should act. The important relationship between uncertainty and knowledge is that the extent of the former provides the knowledge base upon which we can formulate our risk management planning. To put it in the authors’ own words:

“Growing uncertainty about the future therefore ironically imbues us with the knowledge of what we can do to escape that uncertain future.”

A number of arguments, taken from ‘physical, economic and social perspectives’, are offered to back up this assertion. However, it is the Freeman et al paper that takes pride of place:

“Several articles in this issue expand on the relationship between uncertainty and knowledge. Perhaps the most formal and counterintuitive treatment is provided by Freeman et al. [2,3] whose article extends an initial analysis provided by Lewandowsky et al. [5,6].”

What we are faced with, therefore, is a phalanx of experts, lined up and waiting to set the record straight by providing a comprehensive and multi-stranded thesis, starting with (but by no means limited to) a mathematical analysis of uncertainty which builds upon that performed by none other than the great man himself. And all of this brought to you by that most august of authorities: the Philosophical Transactions of the Royal Society.

Such a shame, therefore, that the thesis falls flat on its face in the opening two sentences.

We’ve Been Here Before

I think we can all agree that if you are going to embark upon a lengthy dissertation on such a philosophically difficult subject as uncertainty and its relationship to knowledge, you should help your readers on their way by starting out with a clear and explicit statement as to what uncertainty is and where it comes from. Specifically, one should point out that uncertainty has two basic foundations: variability as an inherent feature of nature, and incertitude resulting from gaps in knowledge. Here is what Lewandowsky et al came up with:

“This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system.”

Well, inherent variability certainly seems important to the authors, but what about incertitude resulting from gaps in knowledge? One might be forgiven for thinking that incertitude’s role as a foundation for uncertainty is covered by the first sentence since, after all, knowledge does get a mention. But don’t be fooled. It quickly transpires that the authors are not referring to incertitude as a source of uncertainty; rather, they go on to make the point that analysing the ‘scientific’ source of uncertainty (inherent variability leading to a range of possibilities) provides the knowledge required to justify action, i.e. analysis of variability is the road to certitude. That is their idea of the relationship between uncertainty and knowledge – uncertainty is ‘a source of actionable knowledge’. Indeed, the very idea that lack of knowledge can lead to an uncertainty that obstructs good decision-making is soon dismissed as a scurrilous ploy:

“Although the climate community has sought to develop ways of dealing with the various forms of uncertainty (e.g. [1,2]), uncertainty has often been highlighted in public debates to preclude or delay political action (e.g. [3,4]). Appeals to uncertainty are so pervasive in political and lobbying circles that they have attracted scholarly attention under the name ‘scientific certainty argumentation methods’, or ‘SCAMs’ for short [3].”

If one follows the links, one discovers that the so-called scholarly attention to which the authors refer is little more than the well-used strawman argument that sceptics misunderstand the scientific method by demanding the removal of all incertitude. I don’t intend chasing that squirrel here, because the real problem is that an anxiety to downplay the important role of incertitude in the decision-making process has caused individuals such as Lewandowsky to completely overlook its role in undermining any proposed analysis of inherent variability. This, indeed, is the same mistake that Peter Gleick makes when he talks of ‘scientific uncertainty’ as if it relates only to the quantification of a range of possibilities. It’s almost as if these individuals actually believe that ‘scientific uncertainty’ can be quantified without considering the weight and heterogeneity of evidence.

The classic expression of this basic mistake is to attempt to analyse the uncertainty regarding equilibrium climate sensitivity (ECS) as if it were simply an exercise in measurement theory, conveniently furnished with probability distributions for the purposes of statistical analysis. In point of fact, the frequency distribution of ECS values provided by the collected outputs of climate model ensembles (and proxy reconstructions, for that matter) is an expression of deterministic incertitude and not stochastic climate variability. Its shape is an artefact of the current state of knowledge and it hides all manner of selection effects and evidential issues. It certainly is not a measurement spread resulting from inherent natural variability. You cannot, therefore, treat it as a probability distribution, and to do so is a grave error. Everyone knows this, don’t they? So surely a gaffe such as this would not feature in the Philosophical Transactions of the Royal Society – would it?
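To make the point concrete, here is a minimal sketch in Python (using entirely made-up ECS values and an assumed quadratic damage function, neither of which appears in the papers under review) of why a statistic computed from a model ensemble is a census of the models we happen to have, rather than a measurement of the climate. Change the membership of the ensemble – a change in the state of knowledge, not in nature – and the ‘expected damage’ changes with it.

# Hypothetical ECS values (degrees C per doubling of CO2) from an imagined ensemble.
ensemble_a = [2.1, 2.4, 2.6, 2.8, 3.0, 3.1, 3.3, 3.6, 4.0, 4.5]

# The same exercise after a few high-sensitivity models join and one
# low-sensitivity model is dropped: a shift in the model population,
# not in the climate itself.
ensemble_b = [2.4, 2.6, 2.8, 3.0, 3.1, 3.3, 3.6, 4.0, 4.5, 4.9, 5.2]

def naive_expected_damage(ecs_values, damage=lambda s: s ** 2):
    # Treats the ensemble spread as if it were a probability distribution for
    # the true ECS and averages an assumed convex damage function over it.
    return sum(damage(s) for s in ecs_values) / len(ecs_values)

print(naive_expected_damage(ensemble_a))  # one answer...
print(naive_expected_damage(ensemble_b))  # ...a different answer, same climate

Nothing about the climate changes between the two print statements; only the roster of models does.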

Introducing the First Paper

Lewandowsky proudly points out that the first paper in the set (‘Climate sensitivity uncertainty: when is good news bad?’, by Freeman et al) builds upon a mathematical demonstration of his own, in which he shows that increased uncertainty regarding equilibrium climate sensitivity unavoidably implies increased risk, provided (as seems reasonable) that the posited damage function is convex. From this observation, Lewandowsky had concluded that uncertainty is not the sceptics’ friend. The Freeman et al paper extends this finding by further pointing out that, even when the increase in uncertainty involves only a lowering of the lower bound, the effect is still to increase the imperative for action. I won’t go into the details of the mathematics used to prove this point, other than to point out that theirs is basically an ordinal argument. It all looks very impressive and extends to a number of pages.

Nevertheless, there is one basic problem that both the Lewandowsky and Freeman et al papers share: they both misunderstand the foundation of the ECS uncertainty and so make the mistake of assuming that the ECS frequency distribution can be treated as a probability distribution. At the end of the day, none of the authors involved seems to be particularly concerned about the difference between aleatory and epistemic uncertainty, and no-one seems to appreciate the difficulties encountered when one uses the analytical methods devised for the former to analyse the latter. They are using the wrong mathematics and so are proving nothing – ordinal or otherwise. The result seems impressive but is based upon the false premise that the shape of the distribution accurately reflects the scale of the uncertainty.
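For readers who want to see the shape of the convexity argument described above, the following sketch in Python (with a hypothetical quadratic damage function and made-up Gaussian spreads for ECS, chosen purely for illustration) reproduces it in its own terms: widen the assumed probability distribution while holding its mean fixed and the expected damage rises, exactly as Jensen’s inequality dictates. Note what the sketch takes for granted, though – that the spread is a genuine probability distribution in the first place, which is precisely the premise in dispute.

import random

random.seed(0)

def expected_damage(samples, damage=lambda s: s ** 2):
    # Assumed convex (quadratic) damage function; any convex choice yields the
    # same ordinal conclusion, by Jensen's inequality.
    return sum(damage(s) for s in samples) / len(samples)

mean_ecs = 3.0  # hypothetical central estimate, degrees C per doubling of CO2
narrow = [random.gauss(mean_ecs, 0.5) for _ in range(100_000)]
wide = [random.gauss(mean_ecs, 1.5) for _ in range(100_000)]

print(expected_damage(narrow))  # smaller expected damage
print(expected_damage(wide))    # larger, despite the identical mean

The arithmetic is sound as far as it goes; the question is whether the distribution being widened was ever a probability distribution to begin with.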

You might think it harsh of me to suggest that Lewandowsky and his fellow contributors are guilty of an egregious error. After all, they are hardly alone in attempting to use frequentist statistical analysis to investigate incertitude. They are simply following a well-established tradition and extending it for all it is worth – which, to be honest, may not be so much. Furthermore, there are actually circumstances where the incertitude of the masses does act stochastically. Even so, there is no basis for assuming this applies in the case of the ECS frequency distribution. When dealing with evidential weight, consensus is a dodgy surrogate.

Traditional or not, the fact that so many have made the same mistake does not make it right (or perhaps it does, if you follow Lewandowsky’s logic). Lewandowsky et al only attract my opprobrium because they make the error whilst mocking those who decline to make it (SCAMs indeed!). It is in that sense that no-one does wrong quite like Lewandowsky.

Ignorance as Uncertainty

I’d like to say that things get better after the Freeman et al paper, but they don’t. Far from being an aberration in an otherwise excellent set of papers, Freeman et al rather sets the standard for those that follow. The headline thesis is clear enough: greater uncertainty implies a greater range of adverse possibilities and so contributes to the list of recognized perils. But nowhere within any of the papers is there a recognition that an increase in uncertainty may also undermine confidence in the range calculation, or even render meaningless the very idea of interpreting a frequency distribution as a probability curve. In fact, nowhere is there any serious discussion of epistemic issues such as evidential weight or the non-complementarity of evidence. Consequently, whilst I searched eagerly for any papers that covered the likes of fuzzy logic, imprecise probabilities, possibility theory, Dempster-Shafer theory, info-gap decision theory, Bayesian belief networks or causal inference, all I found in the set was a loosely connected bunch of papers telling me that the more uncertain I am, the more I know how frightened I should be. This was a thesis that worked very well for me when I was a child afraid of the dark. Now I like to think I am more enlightened, and I can appreciate that knowing what may possibly exist is a long way from knowing what necessarily exists. As a result, I am no longer a hostage to my imagination.
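By way of contrast, here is the sort of minimal sketch (in Python, with entirely hypothetical candidate distributions invented for this post) that an imprecise-probability treatment might start from: instead of committing to a single probability distribution for ECS, one carries a set of distributions that the evidence cannot yet discriminate between, and reports the lower and upper expected damage across that set. Nothing of the kind appears anywhere in the issue.

import random

random.seed(1)

def expected_damage(samples, damage=lambda s: s ** 2):
    # Assumed convex damage function, as before.
    return sum(damage(s) for s in samples) / len(samples)

# A hypothetical credal set: several candidate distributions for ECS that the
# current evidence cannot yet tell apart.
candidate_samples = [
    [random.gauss(2.5, 0.4) for _ in range(50_000)],
    [random.gauss(3.0, 0.8) for _ in range(50_000)],
    [random.lognormvariate(1.1, 0.3) for _ in range(50_000)],
]

damages = [expected_damage(s) for s in candidate_samples]
print("lower expected damage:", min(damages))
print("upper expected damage:", max(damages))

The decision-relevant quantity is then the whole interval, which widens as knowledge deteriorates – a rather different message from a single expected value presented as settled knowledge.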

For the moment, I’ll leave it there. As I have said, there is too much material in this particular issue of Philosophical Transactions of the Royal Society for me to review in one post. I may or may not resume the critique in a future article. But, if I do, you shouldn’t expect there to be any great surprises. Whilst all of this stuff goes under the banner of ‘Uncertainty as Knowledge’, it also suffers from a lack of knowledge of what uncertainty is.


via Climate Scepticism

https://ift.tt/3lvB3vm

August 29, 2020 at 09:39AM
