Month: September 2023

History Shows Today’s Ocean at Cool End of Range

You may have heard recent claims that the ocean is now “boiling”. Fortunately, a world expert in ocean heat uptake provides a deep dive into oceanic temperature history, putting that fear to rest.

Geoffrey Gebbie of Woods Hole Oceanographic Institution has published a highly informative study, Combining Modern and Paleoceanographic Perspectives on Ocean Heat Uptake, in Annual Review of Marine Science (2021).  H/T Kenneth Richard.  Below are the main findings, along with some excerpts in italics with my bolds, explaining some oceanography for the rest of us.

The large climatic shifts that started with the melting of the great ice sheets have involved significant ocean heat uptake that was sustained over centuries and millennia, and modern-ocean heat content changes are small by comparison.

Abstract

Monitoring Earth’s energy imbalance requires monitoring changes in the heat content of the ocean. Recent observational estimates indicate that ocean heat uptake is accelerating in the twenty-first century. Examination of estimates of ocean heat uptake over the industrial era, the Common Era of the last 2,000 years, and the period since the Last Glacial Maximum, 20,000 years ago, permits a wide perspective on modern-day warming rates. In addition, this longer-term focus illustrates how the dynamics of the deep ocean and the cryosphere were active in the past and are still active today. The large climatic shifts that started with the melting of the great ice sheets have involved significant ocean heat uptake that was sustained over centuries and millennia, and modern-ocean heat content changes are small by comparison.

Objective

This review seeks to put the most recent ocean heat uptake estimates of 0.5–0.7 W m−2 into the context of longer (multidecadal to millennial) timescales. Such timescales put a wider perspective on present-day heat uptake. In addition, the dynamics of these longer timescales may still have some expression today. This research direction leads to the long temperature time series of paleoceanographic proxies that predate the instrumental record. Ocean heat uptake over the last deglaciation (∼20,000–10,000 years ago) and the Common Era (previous two millennia) will serve as examples to explore the longer-timescale dynamics of ocean heat uptake.

Common Era Evolution of Mean Ocean Temperature

The Ocean2k global-mean SST compilation is derived from 57 marine proxy records that, in aggregate, show a statistically significant cooling trend from 700 to 1700 CE over the MCA–LIA transition. The data compilation contains a time series of 200-year averages that have been nondimensionalized. Here, we dimensionalize the values with the recommended values of McGregor et al. (2015) to obtain temperature anomalies, and the inferred global-mean surface cooling over the MCA–LIA transition is near the high end of the expected 0.4–0.6°C range (Figure 4a).

Figure 4 The Common Era. (a) The evolution of Ocean2k SST (blue circles, with σ/2 error bars) and mean ocean temperature, , as inferred from noble-gas measurements (red circles, with σ/2 error bars), the Gebbie & Huybers (2019) Common Era inversion (red line), and a power-law estimate (black line, with 2σ error shown in gray), referenced to global-mean SST in 1870. (b,c) Average ocean heat uptake over a running 50-year interval (panel b) and a 500-year interval (panel c) plotted from the Gebbie & Huybers (2019) inversion (red line) and a power-law estimate (black line, with 1σ error shown in gray). Heat uptake is expressed in terms of an equivalent planetary energy imbalance. Abbreviation: SST, sea-surface temperature.

One realization of the Common Era was produced by an inversion that attempted to reconstruct the three-dimensional evolution of oceanic temperature anomalies over the last 2,000 years (Gebbie & Huybers 2019). The inversion fits an empirical ocean circulation model to modern-day tracer observations, historical temperature observations from the HMS Challenger expedition of 1872–1876 (Murray 1895), and the global-mean Ocean2k SST. The resulting ocean temperature evolution is dominated by the propagation of surface climate anomalies from the MCA and LIA into the subsurface ocean, where the propagation is coherent for several centuries (red line in Figure 4a). Although the Gebbie & Huybers (2019) inversion was not constrained with oceanic power laws, the resulting mean ocean temperature is consistent with a power-law estimate over the Common Era.

Early-twenty-first-century SST may already be warmer than MCA SST, but it is less likely that modern mean ocean temperature has surpassed MCA values.

From the Gebbie & Huybers (2019) inversion, it was inferred that the MCA ocean stored 1,000 ZJ more than the ocean of the year 2000, and that the ∼500 ZJ of heat uptake during the modern warming era is just one-third of what is required to reach MCA levels. Amplification of the high-latitude SST signal relative to the global mean can produce a greater MCA–LIA mean ocean cooling, which explains the greater MCA heat content relative to the present day. When considering the range of Common Era scenarios consistent with a power law, however, some cases are admitted where the MCA and the present day have similar oceanic heat content.

Deep-Ocean Heat Uptake During Modern Warming

Figure 6 Ocean heat uptake below 2,000-m depth, in terms of a planetary energy imbalance, for 50-year averages given by Zanna et al. (2019) (blue line), Gebbie & Huybers (2019) (red line), and the power-law estimate from this review (black line, with 2σ error in gray). An observational estimate (purple, with 2σ error bar) for 1990–2010 is also included (Purkey & Johnson 2010).

The confidence in upper-ocean heat content during the modern warming era starkly contrasts with the remaining uncertainties in heat content below 2,000-m depth (Figure 6). Observational estimates have indicated a deep-ocean heat uptake of 68 ± 61 mW m−2 (2σ) when differencing hydrographic sections between 1990 and 2010 (Purkey & Johnson 2010, Desbruyères et al. 2017). Estimation of deep-ocean heat uptake over the entire instrumental era relies to a greater extent on circulation models. Simulations of modern warming that are initialized from equilibrium in 1870 suggest that heat penetrates downward (Gregory 2000) and that average deep-ocean heat uptake is small over 50-year time intervals (Zanna et al. 2019). These estimates would not capture ongoing trends from the earlier Common Era, if any existed. An inversion that accounts for the LIA found a deep-ocean heat loss of 80 mW m−2 early in the modern warming era (Gebbie & Huybers 2019), and our power-law estimate suggests that an even greater cooling is possible, although the uncertainties are large. These discrepancies highlight the ongoing effect that Common Era variability could play in the modern-day ocean. Unfortunately, recent observations do not appear to be sufficient to distinguish between these scenarios, as they all suggest a weak deep-ocean heat uptake in the early twenty-first century.
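As a rough sanity check on these units, a planetary energy imbalance quoted in mW m−2 can be turned into total zettajoules over a given window. This is a minimal sketch, not from the review itself: the 68 mW m−2 figure is the Purkey & Johnson estimate quoted above, while Earth's surface area and the seconds-per-year conversion are standard reference values assumed here.

```python
# Sketch: converting a planetary energy imbalance (W/m^2, applied over
# Earth's full surface, as in the review) into total heat in zettajoules.
# Surface area and seconds-per-year are standard values, not from the paper.

EARTH_SURFACE_M2 = 5.1e14       # ~5.1 x 10^14 m^2
SECONDS_PER_YEAR = 3.156e7

def imbalance_to_zettajoules(watts_per_m2: float, years: float) -> float:
    """Total heat (ZJ) accumulated by a constant planetary imbalance."""
    joules = watts_per_m2 * EARTH_SURFACE_M2 * years * SECONDS_PER_YEAR
    return joules / 1e21        # 1 ZJ = 10^21 J

# Deep-ocean uptake of 68 mW/m^2 sustained over the 1990-2010 window:
print(round(imbalance_to_zettajoules(0.068, 20), 1))   # roughly 22 ZJ
```

On these assumptions, two decades of the observed deep-ocean uptake amounts to only a couple of tens of zettajoules, small next to the ~500 ZJ of total modern-era uptake mentioned elsewhere in the review.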

Deep-ocean cooling could exist as the result of disequilibrium between the upper and deep ocean.

Oceanic disequilibrium exists at a range of spatial and temporal scales, from local, short-term variability to longer-term changes that are anticipated to generally have greater spatial extent. Oceanic disequilibrium has been anticipated as a result of the 1815 Tambora (Stenchikov et al. 2009) and 1883 Krakatoa (Gleckler et al. 2006) volcanic eruptions and their lingering effects on energy imbalance. More generally, ocean disequilibrium can result from the differing adjustment times of the interior ocean to surface forcing, where the deep-ocean response may take longer than 1,000 years (e.g., Wunsch & Heimbach 2008). Accordingly, some influence of changes in surface climate over the last millennium is potentially present today. The most isolated waters of the mid-depth Pacific, for example, should still be adjusting to the MCA–LIA transition. In this scenario, these deep waters are cooling, but they are anomalously warm due to the residual influence of the MCA. 

The degree to which the ocean’s long memory affects today’s ocean is uncertain due to difficulties in integrating state-of-the-art circulation models over the entire Common Era. An accurate assessment may also require a model that can skillfully predict ocean circulation changes in both the past and the future. The climate history of the Common Era should also be better constrained by recovering additional observations, such as historical subsurface temperature observations and paleoceanographic data. Proper inference of climate sensitivity depends on the past oceanic heat uptake, which this review suggests is tied to the long timescale of deep-ocean dynamics.

Do notice the scale on the left axis. As though we could measure the whole ocean (71% of Earth’s surface) to within 0.05°C. In reality it is a formula converting zettajoules into a temperature change.
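That conversion is simple to sketch. Assuming standard reference values for total ocean mass and seawater heat capacity (assumptions here, not figures from the review), the mean temperature change implied by a heat content change is:

```python
# Sketch of the zettajoule-to-temperature conversion the chart relies on.
# Ocean mass and seawater heat capacity are standard reference values
# (assumptions, not taken from the review).

OCEAN_MASS_KG = 1.4e21          # total ocean mass, ~1.4 x 10^21 kg
SEAWATER_CP = 3990.0            # specific heat of seawater, J/(kg K)

def zj_to_mean_temp_change(zettajoules: float) -> float:
    """Mean ocean temperature change (K) implied by a heat content change."""
    return zettajoules * 1e21 / (OCEAN_MASS_KG * SEAWATER_CP)

# The ~500 ZJ of modern-era uptake cited above corresponds to:
print(round(zj_to_mean_temp_change(500), 3))   # roughly 0.09 K
```

Which is exactly the point: hundreds of zettajoules translate to mean temperature changes of less than a tenth of a degree.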


via Science Matters

https://ift.tt/Lz8KvZ9

September 5, 2023 at 06:10PM

I Left Out the Full Truth to Get My Climate Change Paper Published

A very illuminating article was published today in The Free Press by a climate scientist, one could say a whistleblower.

The full article is well worth the read. It is a clear indictment of narrative enforcement.

The paper I just published—“Climate warming increases extreme daily wildfire growth risk in California”—focuses exclusively on how climate change has affected extreme wildfire behavior. I knew not to try to quantify key aspects other than climate change in my research because it would dilute the story that prestigious journals like Nature and its rival, Science, want to tell. 

This matters because it is critically important for scientists to be published in high-profile journals; in many ways, they are the gatekeepers for career success in academia. And the editors of these journals have made it abundantly clear, both by what they publish and what they reject, that they want climate papers that support certain preapproved narratives—even when those narratives come at the expense of broader knowledge for society. 

To put it bluntly, climate science has become less about understanding the complexities of the world and more about serving as a kind of Cassandra, urgently warning the public about the dangers of climate change. However understandable this instinct may be, it distorts a great deal of climate science research, misinforms the public, and most importantly, makes practical solutions more difficult to achieve. 

https://www.thefp.com/p/i-overhyped-climate-change-to-get-published

Patrick Brown goes into detail on how the scales are tipped to enforce the policy-relevant narrative, emphasis mine.

This type of framing, with the influence of climate change unrealistically considered in isolation, is the norm for high-profile research papers. For example, in another recent influential Nature paper, scientists calculated that the two largest climate change impacts on society are deaths related to extreme heat and damage to agriculture. However, the authors never mention that climate change is not the dominant driver for either one of these impacts: heat-related deaths have been declining, and crop yields have been increasing for decades despite climate change. To acknowledge this would imply that the world has succeeded in some areas despite climate change—which, the thinking goes, would undermine the motivation for emissions reductions. 

This leads to a second unspoken rule in writing a successful climate paper. The authors should ignore—or at least downplay—practical actions that can counter the impact of climate change. If deaths due to extreme heat are decreasing and crop yields are increasing, then it stands to reason that we can overcome some major negative effects of climate change. Shouldn’t we then study how we have been able to achieve success so that we can facilitate more of it? Of course we should. But studying solutions rather than focusing on problems is simply not going to rouse the public—or the press. Besides, many mainstream climate scientists tend to view the whole prospect of, say, using technology to adapt to climate change as wrongheaded; addressing emissions is the right approach. So the savvy researcher knows to stay away from practical solutions.

Here’s a third trick: be sure to focus on metrics that will generate the most eye-popping numbers. Our paper, for instance, could have focused on a simple, intuitive metric like the number of additional acres that burned or the increase in intensity of wildfires because of climate change. Instead, we followed the common practice of looking at the change in risk of an extreme event—in our case, the increased risk of wildfires burning more than 10,000 acres in a single day.

This is a far less intuitive metric that is more difficult to translate into actionable information. So why is this more complicated and less useful kind of metric so common? Because it generally produces larger factors of increase than other calculations. To wit: you get bigger numbers that justify the importance of your work, its rightful place in Nature or Science, and widespread media coverage. 

https://www.thefp.com/p/i-overhyped-climate-change-to-get-published

Brown does not pull punches.

To put it another way, I sacrificed contributing the most valuable knowledge for society in order for the research to be compatible with the confirmation bias of the editors and reviewers of the journals I was targeting. 

The full article is well worth reading at THE FREE PRESS

H/T Willie Soon, Cam_S, pat-from-kerbob, Duane T, a Climate Researcher who shall remain nameless, and I saw it on X first.

via Watts Up With That?

https://ift.tt/hxYAGyo

September 5, 2023 at 04:00PM

Pacific Typhoon Frequency Trending Down, Contradicting Earlier Climate Predictions

Typhoon Update: 

The number of Pacific typhoons has decreased over the past seven decades. That’s the trend according to the latest data from the Japan Meteorological Agency (JMA).

Today we look at the data from the Japan Meteorological Agency (JMA) for the number of typhoons formed in the Pacific in the month of August, now that the latest data are available:

Data source: JMA. Chart by Kirye

This past August saw 6 typhoons, and so over the past 70+ years the overall August trend in terms of frequency has been modestly downward.

Clearly the so-called “climate experts” have been wrong since they claimed tropical storms would intensify and become more frequent as a result of a warmer planet. That hasn’t been the case in the Pacific.

Annual trend also downward

Next we look again at the latest data from the Japan Meteorological Agency (JMA) for the number of typhoons formed annually in the Pacific since 1951.

Data source: JMA.

This is good news. All the talk about a climate crisis is fake and hype.


via NoTricksZone

https://ift.tt/dcpzU63

September 5, 2023 at 01:23PM

Debunking Oreskes

I doubt that there are many aficionados of this website who have not heard the name ‘Oreskes’. This is because Professor Naomi Oreskes is an academic who specialises in debunking climate change sceptics. Yes, she takes people like you and cruelly exposes your naïve acceptance of the supposed obfuscation, distortions and downright lies issued by Big Oil. You can read all about it in her infamous exposé, Merchants of Doubt, where she documents in forensic detail how Big Oil employed a panoply of propaganda techniques taken straight from the tobacco industry ‘playbook’. And one reason she is deemed qualified to do this is that she has a firm grounding in the history of statistics and how it should be employed.

Except the truth is that she has neither. In fact, such is her profound lack of understanding in those two vital areas, that one has to wonder why anyone listens to her at all. And it isn’t as though this critical shortcoming has hitherto been hidden from the public gaze. For many years she has been writing articles that others, who are far better qualified to comment, have been quick to destroy. And yet she is still here, exhibiting the same levels of pseudo-expertise that she claims exist only within the ranks of the climate change sceptics. How does that work?

Today, and purely for your entertainment1, I wish to take you back to 2015, to provide you with a prime example of her seriously flawed understanding of how statistics works. Furthermore, I will demonstrate to you how, even then, it wasn’t difficult to find people who were able to expose her junk wisdom. So prepare for a masterclass in debunking, not from me but from a certain Nathan Schachtman, Esq., PC, who for over 40 years has specialised in the application of statistics and causal analysis in order to address scientific and medical legal issues.

In typical fashion, Oreskes set out her intention to deprecate the climate change sceptic by giving her article2 the title, ‘Playing Dumb on Climate Change’. Nevertheless, the statistically trained lawyer, Schachtman, was having none of it and responded with his own article, ‘Playing Dumb on Statistical Significance’. So what is so dumb about the Oreskes article? I’ll let Schachtman explain:

Oreskes wants her readers to believe that those who are resisting her conclusions about climate change are hiding behind an unreasonably high burden of proof, which follows from the conventional standard of significance in significance probability. In presenting her argument, Oreskes consistently misrepresents the meaning of statistical significance and confidence intervals to be about the overall burden of proof for a scientific claim.

To illustrate Oreskes’ intentions, Schachtman provides the following quote from her article:

Typically, scientists apply a 95 percent confidence limit, meaning that they will accept a causal claim only if they can show that the odds of the relationship’s occurring by chance are no more than one in 20. But it also means that if there’s more than even a scant 5 percent possibility that an event occurred by chance, scientists will reject the causal claim. It’s like not gambling in Las Vegas even though you had a nearly 95 percent chance of winning.

And to explain why this misrepresents the meaning of statistical significance and confidence limits, he points out the following:

Although the confidence interval is related to the pre-specified Type I error rate, alpha, and so a conventional alpha of 5% does lead to a coefficient of confidence of 95%, Oreskes has misstated the confidence interval to be a burden of proof consisting of a 95% posterior probability. The “relationship” is either true or not; the p-value or confidence interval provides a probability for the sample statistic, or one more extreme, on the assumption that the null hypothesis is correct. The 95% probability of confidence intervals derives from the long-term frequency that 95% of all confidence intervals, based upon samples of the same size, will contain the true parameter of interest.
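Schachtman’s frequentist reading of the 95% figure is easy to demonstrate numerically. The sketch below (all parameters arbitrary, chosen purely for illustration) constructs many confidence intervals from repeated samples and counts how often the interval-building procedure captures the true mean:

```python
# Monte Carlo illustration of the point above: "95%" describes the
# long-run frequency with which the interval-construction procedure
# captures the true parameter, not a 95% probability that any one
# hypothesis is true. All numbers here are illustrative assumptions.
import random
import statistics

TRUE_MEAN, SIGMA, N, TRIALS = 10.0, 2.0, 25, 10_000
Z95 = 1.96                                   # two-sided 95% z critical value
random.seed(0)

covered = 0
for _ in range(TRIALS):
    sample = [random.gauss(TRUE_MEAN, SIGMA) for _ in range(N)]
    xbar = statistics.fmean(sample)
    half_width = Z95 * SIGMA / N ** 0.5      # known-sigma z-interval
    if xbar - half_width <= TRUE_MEAN <= xbar + half_width:
        covered += 1

print(covered / TRIALS)                      # close to 0.95 in the long run
```

The observed coverage hovers around 0.95, and that long-run frequency is all the “95%” ever promised; no single interval carries a 95% probability of containing the truth.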

To add to this, Schachtman points out:

[A]lthough statisticians have debated the meaning of the confidence interval, they have not wandered from its essential use as an estimation of the parameter (based upon the use of an unbiased, consistent sample statistic) and a measure of random error (not systematic error) about the sample statistic.

All of this might seem a bit convoluted but the essence is this: Oreskes has taken a concept that represents the likelihood of data (i.e. a measure of random error about the sample statistic) and interpreted it as a posterior probability that the hypothesis is true. It’s a classic transposition of the conditional, i.e. treating P(E|H) as if it were P(H|E). As such, it’s the same gaffe that Professor Fenton pointed out when the IPCC had stated in their AR6 Summary for Policymakers that there was at least a 95% degree of certainty that more than half the recent warming is man-made. In fact, what the body of the report had actually said was that the probability of observing the recent warming was only 5% if AGW was not deemed to be contributing over half. That is a very different statement. To turn the latter into the former requires one to take into account the a priori probability that the climate models are a faithful and accurate representation of the warming processes. And that’s a very open question, despite what Oreskes would have you believe.
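The gap between the two conditionals is easy to see with a toy Bayes calculation. The prior below is an arbitrary illustrative assumption, not a claim about climate models:

```python
# A toy Bayes calculation showing why P(evidence | hypothesis) and
# P(hypothesis | evidence) can differ sharply -- the "transposition of
# the conditional" described above. The prior is purely illustrative.

def posterior(p_e_given_h: float, p_e_given_not_h: float, prior_h: float) -> float:
    """P(H | E) via Bayes' theorem."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Suppose the evidence is quite likely under H (0.95) and unlikely
# under not-H (0.05), but H itself has a low prior (0.10):
print(round(posterior(0.95, 0.05, 0.10), 3))   # 0.679, far below 0.95
```

A 95% likelihood of the evidence under the hypothesis becomes only a 68% posterior once a modest prior is folded in, which is exactly why the transposition matters.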

However, the catalogue of Oreskean error does not end there, since she dares to venture further into the expert domain of the statistical lawyer with the following:

But the 95 percent level has no actual basis in nature. It is a convention, a value judgment. The value it reflects is one that says that the worst mistake a scientist can make is to think an effect is real when it is not. This is the familiar “Type 1 error”…The fear of the Type 1 [false positive] error asks us to play dumb; in effect, to start from scratch and act as if we know nothing. That makes sense when we really don’t know what’s going on, as in the early stages of a scientific investigation. It also makes sense in a court of law, where we presume innocence to protect ourselves from government tyranny and overzealous prosecutors — but there are no doubt prosecutors who would argue for a lower standard to protect society from crime.

Once again, the lawyer with the statistics background has to remind Oreskes that you cannot equate the 95% coefficient of confidence in statistical theory with the legal standard known as “beyond a reasonable doubt”:

The truth of climate change opinions do not turn on sampling error, but rather on the desire to draw an inference from messy, incomplete, non-random, and inaccurate measurements, fed into models of uncertain validity. Oreskes suggests that significance probability is keeping us from acknowledging a scientific fact, but the climate change data sets are amply large to rule out sampling error if that were a problem. And Oreskes’ suggestion that somehow statistical significance is placing a burden upon the “victim,” is simply assuming what she hopes to prove; namely, that there is a victim (and a perpetrator).

The bottom line is that if you want to talk about Type I errors and burdens of proof, you need a much better grasp of statistical concepts than would seem to be the case with Oreskes. She goes on to try to rescue the situation with arguments that look Bayesian in nature but even then she falters badly by choosing passive smoking as her example. Schachtman has a long and successful career handling passive smoking claims in the courts, and he takes no prisoners in dismantling her ‘scientific’ arguments – but I’ll let you read that part for yourselves. What I will do here instead is leave you with Schachtman’s own closing statement:

I will leave substance of the climate change issue to others, but Oreskes’ methodological misidentification of the 95% coefficient of confidence with burden of proof is wrong. Regardless of motive, the error obscures the real debate, which is about data quality. More disturbing is that Oreskes’ error confuses significance and posterior probabilities, and distorts the meaning of burden of proof. To be sure, the article by Oreskes is labeled opinion, and Oreskes is entitled to her opinions about climate change and whatever.  To the extent that her opinions, however, are based upon obvious factual errors about statistical methodology, they are entitled to no weight at all.

Ooof!  That’s gonna hurt in the morning.

Epilogue

On his blog, dated 28th February 2023, Schachtman recounts how our intrepid expert, Professor Oreskes, sought to provide her expert testimony in support of Michael E. Mann’s defamation case against National Review magazine, the Competitive Enterprise Institute (CEI), and Mark Steyn. Despite her standing as a professor of the History of Science, Oreskes saw her hopes of putting her weight fully behind Mann dashed when Judge Alfred S. Irving, Jr. decreed that she had no relevant expertise to offer. Oreskes’ opinions, at issue in the Mann case, were on:

  • the general basis for finding scientific research to be reliable, and
  • that “think-tanks” (including the defendant CEI) “ignore, misrepresent, or reject” principled scientific thought on environmental issues.

On the first issue, Irving ruled that her opinions were redundant, given that she is a historian and not a climate scientist. On the second issue, Irving asked what her expert methodology was for deciding whether principled scientific thought had been ignored, misrepresented or rejected. She described for Irving’s consideration something she referred to as a ‘content analysis’ she had performed when investigating Exxon:

We applied a well-established method in social science, which is broadly accepted as being, you know, a reputable method of analyzing something, content analysis, in order to show that there was this fairly substantial disparity between what the company [Exxon] scientists were saying in their private reports and publishing in peer-reviewed scientific literature which was essentially consistent with what other scientists were saying versus what the company was saying in public in advertisements that were aimed at the general public.

Except that, in the Mann case, Oreskes had to admit that she hadn’t actually used ‘content analysis’. Candidly she had to concede:

If you want me to tell you what my method is, it’s reading and thinking. We read. We read documents. And we think about them.

On the basis that it could be assumed that the jury members had already mastered the concepts of reading and thinking for themselves, Oreskes’ undoubted talents were politely declined by the judge, leaving Mann to do his own bullshitting. At least the courts were spared Oreskes’ botched explanations of ‘statistical significance’ and having to listen to her explain why the scientists’ fear of Type I errors is making them far too conservative for the public good.

Footnotes:

[1] I say just for your entertainment because nothing I write here about Oreskes will have the slightest impact on her reputation outside of Cliscep.

[2] I’m afraid that the Oreskes article was published in the New York Times and so it is behind a paywall. However, fortunately for the impoverished readers of Cliscep, Schachtman’s takedown includes extensive quotes from the NYT article, so you needn’t worry about redirecting funds from your jealously protected heat pump savings account.

via Climate Scepticism

https://ift.tt/iIwBRN6

September 5, 2023 at 01:14PM