Month: June 2024

Hot Facts about Heat

Follow-up Essay by Kip Hansen — 28 June 2024 — 900 words

My recent essay titled “Why Do They Lie About Extreme Temperature Deaths?” highlighted the outright falsehoods being repeated in the mainstream media about the dangers of extreme temperatures, heat and cold, prompted and encouraged by the major Climate Crisis Propaganda cabals: Covering Climate Now and Inside Climate News.  Of course, it is worse than that: in the United States, even federal government agencies, under the guise of informing the public, do the same.

For instance, the Environmental Protection Agency (EPA) offers a series of Climate Change Indicators.  Why the EPA has a huge section on climate change is a mystery to me, other than rank agenda-pushing, but there you have it: the massive governmental Enviro-Climatism Agenda writ large. On the page “Climate Change Indicators: Heat-Related Deaths”, the EPA informs us, in the Background section of the page:

“Heat is the leading weather-related killer in the United States”

This is simply a “talking point” and is not true, not even according to EPA’s own Climate Change Indicators.

We at WUWT (and the dozens of other kindred spirits in the blogosphere) are not alone in this fight against propaganda surrounding the Climate Change issue.  Allies are not restricted to those who normally cover climate, weather and related science news beats.

Here I call your attention to a helpful and very well-written editorial at Issues & Insights, “Heat Wave Sets Off New Round Of ‘Climate Crisis’ Lies”, by the I & I Editorial Board on June 19, 2024.  These folks are the professionals: “Issues & Insights is run by seasoned journalists who were behind the Pulitzer Prize-winning IBD Editorials [Investor’s Business Daily] page (before it was summarily shut down).”  It was brought to my attention via the “WUWT Tips and Notes” page by John Merline at I&I (a sample of his past writing here).  This editorial is well worth reading and has some very clear and useful graphics.  There is a companion piece written a few days later, “Google Doesn’t Want You To Know The Truth About Heat Waves And ‘Climate Change’”, which you should read as well.

Their lede:  “There’s a summer heat wave going on, which gives journalists the opportunity to fill up their stories with climate change boilerplate. It no longer matters whether any of it is true. Just the opposite, in fact. If you point out the truth, you’re accused of being a denier.”

The web page has a little note at the top:  “Follow up: As we predicted, Google is blocking its ads from appearing on this page because, according to Google, it contains ‘unreliable and harmful claims.’”

I&I offer this graphic:

This shows annual Deaths per Million with data from the EPA’s two Climate Change Indicator pages, Heat-Related Deaths and Cold-Related Deaths.

[Aside:  There is a funny and on-going story about those two pages, which I will mention in the Author’s Comment section following the essay.]

The EPA’s data, derived from the CDC database of death certificates as entered by physicians and coroners (see my essay on that here), clearly shows that, in the United States, in the most recent years shown, cold kills people at a rate of 5 to 6 per million every year, while heat kills at a rate of between 2 and 3 per million.  (In the “funny story” bit, EPA has updated heat deaths to “almost 5” in 2022 but has neglected to update cold deaths since 2016.)  And, as is the case all over the intellectual map, EPA still insists that “Heat is the leading weather-related killer in the United States”, contrary to its own published data.

It is oddly comforting to find that the Climate Realist viewpoint has allies in the business world, who are not cowed by the yammering propagandists and their enforcers.

What do you think this “Heat Kills More” talking point is based on?

I suspect that it is an agenda-serving comparison between Heat-Related Deaths and deaths from tornadoes, hurricanes, flooding, lightning, cold, winter, wind and ‘rip currents’, based only on the false NOAA NWS graphic I have discussed so many times.

The real data on relative Heat and Cold Related Deaths worldwide are covered in “Why Do They Lie About Extreme Temperature Deaths?” which includes links to the definitive studies at the end.

# # # # #

Author’s Comment:

The funny bit:  As I was working on this, I was making my own, more complete version of the Heat/Cold Death rates graphic from I&I (which I’ll use in a second follow-up next week).  As I labored on, through the morning, checking back to the EPA Climate Indicator pages repeatedly to pick up details, EPA updated the Heat-Related Deaths page adding several more years of (up-trending, of course) data:

I have been communicating with the Climate Indicators team at EPA about this:  they updated the Heat-Related, but not the Cold-Related, Deaths page.  And yes, golly, it does look suspiciously like they have managed to change down-trending data into up-trending data. Not jumping to conclusions yet.  I’ll let readers know when I have sorted it out with EPA.

I appreciate John Merline at I&I for looping us in on their “climate lies” work.

Thanks for reading.

# # # # #

via Watts Up With That?

https://ift.tt/LP2GnOU

June 29, 2024 at 04:01PM

Mining For Hockeysticks

Guest Post by Willis Eschenbach

The iconic “hockeystick” simply refuses to die. It was first created by Mann, Bradley and Hughes in their 1998 paper Global-scale temperature patterns and climate forcing over the past six centuries (hereinafter “MBH98”).

Figure 1. Original hockeystick graph

MBH98 claimed to show that after a long period with very little change, suddenly the world started warming, and warming fast.

Back a couple of decades ago, Steve McIntyre over at Climate Audit did yeoman work in discovering a host of errors in MBH98. And somewhere in that time, someone, likely Steve but perhaps not, noted that the curious (and mathematically incorrect) procedure used in MBH98 could actively mine hockeysticks out of red noise.

Despite all of that, MBH was succeeded by various of what I call “hockalikes”, studies that purported to independently find a hockeystick in the historical record and thus were claimed to support and validate the original MBH98 hockeystick.

Of course, these repeated many of the same errors as had been exposed by McIntyre and others. Here is the money graphic from my post Kill It With Fire, which analyzed the Mann 2008 attempt to rehabilitate the hockeystick (M2008).

Figure 2. Cluster dendrogram showing similar groups in the proxies of the M2008 hockalike

Note that the hockeystick shape depends on only a few groups of proxies.

Now, what I realized a few days ago was that although I’d believed that the MBH98 incorrect math could mine hockeysticks out of red noise, I’d never tried it myself. And more to the point, I’d never tried it with simpler math, straight averages instead of the uncentered principal components method of MBH98. So this is basically my lab notebook from that investigation.

The most expansive of these hockalikes involve the PAGES dataset, which has had three incarnations—PAGES2017, PAGES2019, and PAGES2K. PAGES2K starts in the year 1 AD and contains 600+ proxy records. Here are several PAGES2K reconstructions, from a Nature article promoting the claim that there is “Consistent multidecadal variability in global temperature reconstructions and simulations over the Common Era”.

Figure 3. Several historical reconstructions using the PAGES2K dataset.

Now, as Figure 3 shows, it’s true that several different investigations done by different teams have yielded very similar hockeystick shapes. While this seems to greatly impress the scientists, this post will show why that is both true and meaningless.

To do that, first we need to understand the steps in the process of creating proxy-based historical temperature reconstructions. A “proxy” is a measurement of some variable that changes with the temperature. For example, in general when it is warmer, both trees and coral grow faster. Thus, we can analyze the widths of their annual rings as a proxy for the surrounding temperature. Other temperature proxies are isotopes in ice cores, sediment rates in lakes, speleothems, magnesium/calcium ratios in seashells, and the like.

The process of creating a proxy-based historical dataset goes like this:

  1. Gather a bunch of proxies.
  2. Discard the ones that are not “temperature sensitive”. Temperature-sensitive proxies can be identified by seeing if they vary in general lockstep (or anti-lockstep) with historical temperature observation (high correlation).
  3. They might be positively correlated (both temperature and the proxy go up/down together) or negatively correlated (when one goes up the other goes down). Either one is sensitive to the temperature and thus, is useful. So we need to simply flip over the proxies with negative correlation.
  4. Use some mathematical method, simple or complex, to average all or some subset of the individual proxies.
  5. Declare success.

Seems like a reasonable idea. Find temperature-sensitive proxies, and average them in some fashion to reconstruct the past. So … what’s not to like?

To start with, here’s the description from the paper announcing the PAGES2K, entitled A global multiproxy database for temperature reconstructions of the Common Era.

Reproducible climate reconstructions of the Common Era (1 CE to present) are key to placing industrial-era warming into the context of natural climatic variability.

Here we present a community-sourced database of temperature-sensitive proxy records from the PAGES2k initiative. The database gathers 692 records from 648 locations, including all continental regions and major ocean basins. The records are from trees, ice, sediment, corals, speleothems, documentary evidence, and other archives. They range in length from 50 to 2000 years, with a median of 547 years, while temporal resolution ranges from biweekly to centennial. Nearly half of the proxy time series are significantly correlated with HadCRUT4.2 surface temperature over the period 1850–2014.

So PAGES2K has completed the first step of creating a proxy-based temperature reconstruction. They’ve gathered a host of proxies, and they’ve noted that about half of them are “temperature sensitive” based on their agreement with the HadCRUT surface temperature.

Again … what’s not to like?

To demonstrate what’s not to like, I created groups of 692 “pseudoproxies” to match the size of the PAGES2K dataset. These are randomly generated imitation “time series” starting in the Year 1, to match the length of the PAGES2K. I created them so their autocorrelation roughly matched the autocorrelation of the temperature records, which is quite high. That way they are “lifelike”, a good match for actual temperature records. Here are the first ten of a random batch.

Figure 4. Randomly generated pseudoproxies with high autocorrelation, also called “red noise”.
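Willis doesn’t show his code here, but the kind of highly autocorrelated “red noise” pseudoproxy he describes can be sketched as a simple AR(1) process. This is my sketch, not his script: the `phi` value of 0.95 is my assumption for “quite high” autocorrelation, not a figure from the post.

```python
import random

def pseudoproxy(n=2000, phi=0.95, seed=None):
    """One AR(1) 'red noise' series: x[t] = phi * x[t-1] + white noise.
    A phi near 1 gives the strong lag-1 autocorrelation typical of
    temperature records, so the series wanders in trend-like excursions."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

# Ten "lifelike" two-millennium records, as in Figure 4
batch = [pseudoproxy(seed=i) for i in range(10)]
```

With `phi = 0` this collapses to plain white noise; the wandering, climate-looking excursions in Figure 4 come entirely from the autocorrelation.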

As you can see, all of them could reasonably represent the two-millennia temperature history of some imaginary planet. How good is their correlation with temperature observations? Figure 5 shows that data.

Figure 5. Correlations of 692 random pseudoproxies with the Berkeley Earth modern temperature observations.

This is about what we’d expect, with approximately half of the pseudoproxies having a positive correlation with the observational temperature data, the other half with a negative correlation, and most of the proxies not having a strong correlation with the temperature.

And here’s the average of all of the pseudoproxies.

Figure 6. Average, 692 pseudoproxies. The red line shows the start of the Berkeley Earth instrumental record. Note that there is no hockeystick—to the contrary, in this case, to avoid biasing my results, I’ve chosen a batch of pseudoproxies whose average goes down at the recent end. Nor is there any significant trend in the overall data.

OK, so we have the proxies, and we’ve calculated the correlation of each one with the instrumental record. Then, following Step 3 in the procedure outlined above, I flipped over those proxies which had a negative correlation to the instrumental record. That meant all the proxies were positively correlated with the Berkeley Earth data.

At this point, I was going to see what an average would look like if I selected only the pseudoproxies with a high correlation with the instrumental record, say 0.5 or more … but before that, for no particular reason, I thought I’d look at a bozo-simple average of the whole dataset. Color me gobsmacked.

Figure 7. Average of all of the pseudoproxies after simply flipping over (inverting) those with a negative correlation with the instrumental data.

YOICKS!

Here, we can see why all the different averaging methods yield the same “historical record” … because the procedure listed above actively mines for hockeysticks in random red noise.
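The whole flip-and-average procedure can be reproduced in a few lines. Everything below is my own sketch under stated assumptions, not Willis’s code: the “instrumental record” is just a 150-year linear warming ramp, and the proxies are the same AR(1) red noise as before. The hockeystick appears anyway.

```python
import random

def red_noise(n, phi, rng):
    """AR(1) red-noise series of length n."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def corr(a, b):
    """Pearson correlation of two equal-length series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

rng = random.Random(42)
YEARS, CALIB = 2000, 150                 # 2000 "years"; last 150 are "instrumental"
instrumental = [0.01 * t for t in range(CALIB)]   # a plain warming ramp

proxies = [red_noise(YEARS, 0.95, rng) for _ in range(692)]

# Step 3 of the procedure: flip any proxy negatively correlated
# with the instrumental period
screened = [[-v for v in p] if corr(p[-CALIB:], instrumental) < 0 else p
            for p in proxies]

# Step 4: a bozo-simple average
mean = [sum(p[t] for p in screened) / len(screened) for t in range(YEARS)]

handle = sum(mean[:YEARS - CALIB]) / (YEARS - CALIB)  # pre-instrumental average
blade = mean[-1] - mean[-CALIB]                       # rise over the calibration window
print(f"handle ~ {handle:+.2f}, blade rise = {blade:+.2f}")
```

After flipping, every proxy has a nonnegative covariance with the warming ramp, so the average is guaranteed to rise over the calibration window, while the independent red noise cancels everywhere else: flat handle, sharp blade.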

One interesting detail of Figure 7 is that there is a sharp drop in the average before the start of the period used for the correlation. I assume this is because to get that large an increase, you need to first go down to a low point.

And this dip prior to 1850 is of interest because you can see it in both Panel A and Panel B of the PAGES2K reconstructions shown in Figure 3 above …

Another item of note is that the procedure has introduced a slight downward trend from the beginning to a sharp drop around 1775. I ascribe that to the procedure favoring “U” shaped datasets, but hey, that’s just me.

In any case, the slight downward trend is a real effect of the procedure. We know that because there’s no downward trend in the full dataset. We also know it’s a real effect for a more important reason—we see the same slight downward trend in the original MBH Hockeystick in Fig.1, and also in Panel “a” of Figure 2.

Finally, why is there so little variation in the “handle” of the hockeystick? Are the temperatures of the past really that stable?

Nope. It’s another artifact. The handle of the hockeystick is just an average of some presumably large number of random red noise datasets. When you average a bunch of random red noise datasets, you get a straight line.
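That flattening is easy to demonstrate: average enough independent red-noise series and the wiggles cancel, with the spread of the mean shrinking roughly as one over the square root of the number of series. A quick check (my sketch, using the same kind of AR(1) noise as above):

```python
import random
import statistics

def red_noise(n, phi, rng):
    """AR(1) red-noise series of length n."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

rng = random.Random(0)
series = [red_noise(2000, 0.95, rng) for _ in range(692)]
mean = [sum(s[t] for s in series) / len(series) for t in range(2000)]

spread_one = statistics.pstdev(series[0])   # a single wandering proxy
spread_avg = statistics.pstdev(mean)        # the near-straight-line average
print(f"single proxy std: {spread_one:.2f}, average-of-692 std: {spread_avg:.2f}")
```

With 692 independent series the standard deviation of the average is on the order of 1/26th of a single proxy’s, which is why the handle looks like a straight line.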

Moving along, my next thought was, how much do I have to disturb the pseudoproxies in order to produce a visible hockeystick?

To investigate that, I took the same original dataset. In this case, however, I inverted only 40 proxies, the ones with the greatest negative correlation. So I was flipping only the strongest negative signals, and leaving the rest of the proxies that had negative correlation as untouched red noise. Here’s that result.

Figure 8. Average of all of the pseudoproxies after flipping over the forty with the strongest negative correlations with the instrumental data.

Note that less than six percent (forty) of the pseudoproxies were flipped, and all four hockeystick characteristics are already visible—reduced variation in the “handle” of the hockeystick, a slight downward trend to 1775, a sharp drop to 1850, and a nearly vertical hockeystick “blade” from 1850 on.

How about at the other end, where we select only the ones with the strongest correlation? Here’s the average of only the top quarter of the data (176 pseudoproxies) as measured by their correlation with the observational temperature.

Figure 9. Average of only the top quarter of the data, those with the best correlation with Berkeley Earth data.

Same thing. Straight handle on the hockeystick. Slow decline to 1775. Sharp drop. Vertical hockeystick blade after that.

Finally, after sleeping on it, I realized I’d looked at the best-case scenarios … but what about the worst-case? So here’s the half of the pseudoproxies with the worst correlation with the observational temperature.

Figure 10. Average of only the bottom half of the data, those with the worst correlation with Berkeley Earth data.

Despite using only the half of the pseudoproxies with the poorest correlation with temperatures, those with a correlation of 0.22 or less, we get the same story as before—same straight hockeystick handle, same slight drop to 1775, same sharp drop to 1850, and the same vertical hockeystick blade after 1850.

Now, there’s an interesting and easily missed point in the graphics above. While the shape stays the same, the greater the correlation, the taller the blade of the hockeystick. The different procedures changed the tip of the blade from ~0.1 with only 40 flipped, to ~0.15 using the worst-correlated pseudoproxies, to ~0.3 with all pseudoproxies flipped, to around ~0.7 using only the best correlated. So all of them showed the identical “hockeystick” form, and they only varied in the size of the blade. Curious.

Now I stated up above that this post would show why it is both true and meaningless that various studies all come up with hockeysticks.

The reason is quite evident in the figures above—no matter what the investigators do, since they are all using some variation of the standard procedure I listed at the top of the post, they are guaranteed to get a hockeystick. Can’t escape it. That procedure definitely and very effectively mines hockeysticks out of random red noise.


Here, it’s an irenic long summer evening, with bursts of children’s laughter radiating out of the open windows. I have the great joy of living with my gorgeous ex-fiancée, our daughter and her husband, a granddaughter who is “almost five, Papa!” and a grandson heading towards three.

Is there a lovelier sound than their laughter?

Best to all,

w.

The Usual: When you comment please quote the exact words you are discussing. I can defend my words, but I can’t defend your interpretation of my words. And if you want to show I’m wrong, see How To Show Willis Is Wrong.

via Watts Up With That?

https://ift.tt/ycjDvfz

June 29, 2024 at 12:03PM

Scientists Say: Net Zero Wins Nearly Zero Results

Chris Morrison explains at his Daily Sceptic article Net Zero Will Prevent Almost Zero Warming, Say Three Top Atmospheric Scientists.  Excerpts in italics with my bolds and added images.

Recent calculations by the distinguished atmospheric scientists Richard Lindzen, William Happer and William van Wijngaarden suggest that if the entire world eliminated net carbon dioxide emissions by 2050 it would avert warming of an almost unmeasurable 0.07°C. Even assuming the climate modelled feedbacks and temperature opinions of the politicised Intergovernmental Panel on Climate Change (IPCC), the rise would be only 0.28°C. Year Zero would have been achieved along with the destruction of economic and social life for eight billion people on Planet Earth. “It would be hard to find a better example of a policy of all pain and no gain,” note the scientists. [Paper is Net Zero Averted Temperature Increase  by Lindzen, Happer and van Wijngaarden.]

In the U.K., the current General Election is almost certain to be won by a party that is committed to outright warfare on hydrocarbons. The Labour party will attempt to ‘decarbonise’ the electricity grid by the end of the decade without any realistic instant backup for unreliable wind and solar except oil and gas. Britain is sitting on huge reserves of hydrocarbons but new exploration is to be banned. It is hard to think of a more ruinous energy policy, but the Conservative governing party is little better. Led by the hapless May, a woman over-promoted since her time running the education committee on Merton Council, through to Buffo Boris and Washed-Out Rishi, its leaders have drunk the eco Kool-Aid fed to them by the likes of Roger Hallam, Extinction Rebellion and the Swedish Doom Goblin. Adding to the mix in the new Parliament will be a likely 200 new ‘Labour’ recruits with university degrees in buggerallology and CVs full of parasitical non-jobs in the public sector.

With hardly any science knowledge between them, they even believe that they can spend billions of other people’s money to capture CO2 – perfectly good plant fertiliser – and bury it in the ground. As a privileged, largely middle-class group, they have net zero understanding of how a modern industrial society works, feeds itself and creates the wealth that pays their unnecessary wages. All will be vying to save the planet and stop a temperature rise that is barely a rounding error on any long-term view.

They plan to cull the farting cows, sow wild flowers where food
once grew, take away efficient gas boilers and internal combustion
cars and stop granny visiting her grandchildren in the United States.

On a wider front, banning hydrocarbons will remove almost everything from a modern society including many medicines, building materials, fertilisers, plastics and cleaning products. It might be shorter and easier to list essential items where hydrocarbons are absent than produce one where they are present. Anyone who dissents from their absurd views is said to be in league with fossil fuel interests, a risible suggestion given that they themselves are dependent on hydrocarbon producers to sustain their enviable lifestyles.

Unlike politicians the world over who rant about fire and brimstone, Messrs Lindzen, Happer and van Wijngaarden pay close attention to actual climate observations and analyses of the data. Since it is impossible to determine how much of the gentle warming of the last two centuries is natural or caused by higher levels of CO2, they assume a ‘climate sensitivity’ – the rise in temperature when CO2 doubles in the atmosphere – of 0.8°C. This is about a quarter of the IPCC estimates, which lack any proof. Understandably the IPCC does not make a big issue of this lack of crucial proof at the heart of the so-called 97% anthropogenic ‘consensus’.

The 0.8°C estimate is based on the idea that greenhouse gases like CO2 ‘saturate’ at certain levels and their warming effect falls off a logarithmic cliff. This idea has the advantage of explaining climate records that stretch back 600 million years since CO2 levels have been up to 10-15 times higher in the past compared with the extremely low levels observed today. There is little if any long term causal link between temperature and CO2 over time. In the immediate past record there is evidence that CO2 rises after natural increases in temperature as the gas is released from warmer oceans.

Any argument that the Earth has a ‘boiling’ problem caused by the small CO2 contribution that humans make by using hydrocarbons is ‘settled’ by an invented political crisis, but is backed by no reliable observational data. Most of the fear-mongering is little more than a circular exercise using computer models with improbable opinions fed in, and improbable opinions fed out.

The three scientists use a simple formula based on base-two logarithms to assess the CO2 influence on the atmosphere, grounded in decades of laboratory experiments and atmospheric data collection. They demonstrate how trivial the effect on global temperature will be if humanity stops using hydrocarbons. After years wasted listening to Greta Thunberg, the message is starting to penetrate the political arena. In the United States, the Net Zero project is dead in the water if Trump wins the Presidential election. In Europe, the ruling political elites, both national and supranational, are retreating on their Net Zero commitments. Reality is starting to dawn and alternative political groupings emerge to challenge the comfortable insanity of Net Zero virtue signalling. In New Zealand, the nightmare of the Ardern years is being expunged with a roll-back of Net Zero policies ahead of possible electricity blackouts.
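The base-two-logarithm arithmetic is simple enough to check for yourself. The concentration numbers below are purely illustrative assumptions of mine (420 ppm of CO2 today, roughly 2.5 ppm per year of growth to 2050, and a linear emissions ramp-down under Net Zero), not figures taken from the Lindzen, Happer and van Wijngaarden paper, but they land close to the 0.07°C and 0.28°C quoted above:

```python
from math import log2

# Illustrative assumptions (mine, not the paper's):
C0 = 420.0     # ppm CO2 today
GROWTH = 2.5   # ppm per year, business as usual
YEARS = 26     # from now to 2050

c_bau = C0 + GROWTH * YEARS            # business-as-usual concentration in 2050
c_netzero = C0 + GROWTH * YEARS / 2    # linear ramp to zero emissions halves the added CO2

def averted_warming(sensitivity):
    """Warming averted (deg C) for a given climate sensitivity
    (deg C per doubling of CO2), using the base-two log relation."""
    return sensitivity * log2(c_bau / c_netzero)

print(f"S = 0.8: {averted_warming(0.8):.2f} C averted")
print(f"S = 3.0: {averted_warming(3.0):.2f} C averted")
```

The smallness comes from the logarithm: halving the assumed 65 ppm of growth changes the concentration ratio by only about 7 percent, so even an IPCC-style sensitivity multiplies a base-two log of barely 0.1.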

Only in Britain it seems are citizens prepared to elect a Government obsessed with self-inflicted poverty and deindustrialisation. The only major political grouping committed to scrapping Net Zero is the Nigel Farage-led Reform party and although it could beat the ruling Conservatives into second place in the popular vote, it is unlikely to secure many Parliamentary seats under the U.K.’s first-past-the-post electoral system. Only a few years ago the Labour leader Sir Keir Starmer, who thinks some women have penises, and his imbecilic Deputy Leader Angela Rayner, were bending the knee to an organisation that wanted to cut funding for the police and fling open the borders. The new British Parliament will have plenty of people who still support Net Zero and assorted woke woo woo, and the great tragedy is that they will still be found across most of the represented political parties.

See Also 

Delusions of Davos and Dubai


via Science Matters

https://ift.tt/eI4DLHi

June 29, 2024 at 11:56AM

German Professor Shows That The Road To Green Hydrogen Is Long, Expensive

The long road to green hydrogen

An article on NDR German public broadcasting clearly conveys the dimensions of the hydrogen challenge. It reports on a planned factory for green hydrogen in Neumünster.

“A hydrogen factory with an output of 50 megawatts corresponds to the output of ten wind turbines, which means that the output of ten wind turbines can be stored. A hydrogen factory of this size could produce 3,000 tons of green hydrogen with an energy content of 100 gigawatt hours, says Prof. Oliver Opel from the West Coast University of Applied Sciences (FHW) in Heide. If the green hydrogen were burned, it could be used to heat 5,000 single-family homes per year. And the waste heat could be used to heat a further 2,500 houses, according to Opel. If electricity is made from the green hydrogen again and heat pumps are used, 7,500 single-family homes could be heated, as well as another 7,500 homes with the waste heat.”

To put this into perspective: Schleswig-Holstein has around 650,000 single-family homes, 80,000 two-family homes and 95,000 multi-family homes.  It is in any case no surprise that high German electricity prices are an obstacle.
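The article’s figures hang together arithmetically. A quick check, assuming hydrogen’s lower heating value of roughly 33.3 kWh per kilogram (my assumption for the conversion; the NDR piece does not state it):

```python
LHV_H2_KWH_PER_KG = 33.3   # lower heating value of hydrogen (assumed)

tons = 3_000
# kg * kWh/kg -> kWh, then to GWh
energy_gwh = tons * 1_000 * LHV_H2_KWH_PER_KG / 1e6
print(f"{energy_gwh:.0f} GWh")   # close to the quoted 100 GWh

homes = 5_000
kwh_per_home = energy_gwh * 1e6 / homes
print(f"{kwh_per_home:,.0f} kWh of heat per home per year")
```

Around 20,000 kWh of heat per year is a plausible demand for an older German single-family home, so the 5,000-homes claim is consistent with the quoted 100 gigawatt hours.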

“Prof. Oliver Opel heads the Institute for the Transformation of the Energy System, ITE, at the West Coast University of Applied Sciences in Heide. He says that the construction and operation of electrolysis plants are still too expensive. One crucial aspect is the high price of electricity. Opel explains: ‘In other European countries, the electricity price is much better. One option could be a division according to geographically different electricity price zones, as already exists in other countries.’

Opel also points to another problem: ‘The purchase prices for electrolysis systems have continued to rise, as they are nowhere near mass production.’”

The question of what the hydrogen and the electricity generated in this way will ultimately cost remains unanswered in the article. In any case, the country’s plans are ambitious: a factor of 30 larger than this first project.

“Schleswig-Holstein wants to achieve an electrolysis capacity of 1.5 gigawatts (1,500 megawatts) by 2030, according to the state government’s updated hydrogen strategy. The federal government has also set itself a target. By 2030, the government wants to achieve an electrolysis capacity of 10 gigawatts (10,000 megawatts) to cover the demand of 95 to 130 terawatt hours of electricity per year.

And according to the energy expert at the West Coast University of Applied Sciences, Oliver Opel, this is a real challenge, precisely because of the current poor framework conditions.”

So the road is not only long, but also expensive.


via NoTricksZone

https://ift.tt/4hdSX3e

June 29, 2024 at 09:52AM