Month: September 2024


via JoNova

https://ift.tt/LYxAJyK

September 27, 2024 at 09:30AM

CPS Net Zero Report Misses The Point

By Paul Homewood

The CPS is well known for challenging the left-wing consensus, so it is doubly disappointing to see this report by John Penrose:

Press Release

 As new government data shows that the UK has the most expensive industrial electricity prices in the world, four times those in the US and 46% above the International Energy Agency median, a new CPS report by John Penrose sets out how the UK can reduce the cost of its energy and find a cheaper route to Net Zero:

 image

Despite international gas prices coming back down in the aftermath of Russia’s invasion of Ukraine, British households are still struggling with high energy bills because of outdated regulations pushing up costs.

‘A Cheaper Route to Net Zero’ by John Penrose (former Conservative MP and minister, the ‘Competition Tsar’ who led an independent review of UK competition policy, and co-chair of the Commission for Carbon Competitiveness) argues that the current system is hostage to the volatile international price of gas, to our detriment, and that regulations dictating how energy firms should behave have become steadily more complicated, detailed, expensive and slow – adding to the cost of energy.

The paper sets out twenty recommendations which could be adopted by the new government to help bring energy bills down while still committing to decarbonisation.

The recommendations are broadly summarised as:

  • uncoupling energy bills from the price of gas
  • addressing the intermittency of renewable energy in the most cost-effective way
  • slashing the cost of energy transmission with measures like ‘local discounting’ for customers who agree to pylons or onshore wind turbines being built near them
  • cutting red tape and strengthening Ofgem’s commitment to competition
  • reforming the Energy Price Cap
  • levelling the playing field for UK manufacturers, including through the introduction of a Carbon Border Adjustment Mechanism (CBAM)

https://cps.org.uk/research/a-cheaper-route-to-net-zero/

Penrose long ago signed up to the Net Zero dogma, so is surely the wrong person to write a report on how to cut Net Zero costs.

This new report ignores the elephant in the room. It is Net Zero policy which is responsible for high electricity prices. Fiddling around at the edges, as Penrose suggests, will not alter that fact.

Direct subsidies for renewables, along with the cost of standby capacity, already cost £11.5 billion a year. Add on grid balancing costs, carbon taxes and FITs, and this swells to at least £15 billion.

This is equivalent to £50/MWh, a fifth of retail prices.
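
As a rough sanity check on that figure, here is a minimal sketch. The ~300 TWh of annual UK electricity demand and ~£250/MWh average retail price are my assumptions for illustration, not numbers from the post:

    # Rough check of the £50/MWh claim; demand and retail price are assumed
    annual_levies_gbp = 15e9        # ~£15 billion/year (subsidies, balancing, carbon tax, FITs)
    annual_demand_mwh = 300e6       # assume ~300 TWh of UK annual electricity demand
    retail_gbp_per_mwh = 250        # assume ~£250/MWh (25p/kWh) average retail price

    levy_gbp_per_mwh = annual_levies_gbp / annual_demand_mwh
    print(f"Levy cost: £{levy_gbp_per_mwh:.0f}/MWh")                        # ~£50/MWh
    print(f"Share of retail: {levy_gbp_per_mwh / retail_gbp_per_mwh:.0%}")  # ~20%, i.e. a fifth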

To that can be added the £100 billion of grid upgrades planned over the next ten years, necessitated entirely by Net Zero considerations.

https://notalotofpeopleknowthat.wordpress.com/wp-content/uploads/2024/03/image-54.png

Penrose makes a big play of the fact that power prices have not fallen as sharply as gas prices since the peaks in 2022. He claims that this is the fault of our market system.

But the main factor has been the steady increase in renewable subsidies. Since 2019, these have risen by £3.6 billion, and will continue to rise in years to come.

There are other reasons why we have not seen the full effect of reduced gas prices in our bills.

Firstly, the cost of gas power is artificially raised by carbon taxes. Secondly, CCGT plants cannot run efficiently because of the intermittency imposed by wind and solar power, which effectively have priority access to the grid. If we used CCGT as a first resort rather than a last one, its cost would be much lower.

There is also the issue of coal to consider. Most EU countries still use plenty of coal for electricity generation – last year it accounted for 12% of EU generation. This has helped to keep costs down at a time when gas is still dearer than it was before the Ukraine war.

Another factor is the mix of forward and spot buying on the market. The introduction of wind and solar power has made the spot market far more important. This has brought much greater price volatility, and arguably higher prices overall. In an ideal world, electricity suppliers would source most of their power on the forward market, which CCGTs could then supply at much lower prices.

Perhaps the most ridiculous recommendation is the introduction of a Carbon Border Adjustment Mechanism, i.e. an import tariff on Chinese wind turbines and solar panels. The idea that UK manufacturers could make them as cheaply as China does is absurd. The end result would be higher electricity prices.

If Penrose is serious about cutting electricity bills, there are things that could be done straightaway, for instance:

  • Cutting carbon prices to zero – this would immediately reduce the cost of gas power, and with it wholesale prices, which tend to be set by the price of gas (see the sketch after this list).
  • Cutting Renewable Obligation prices.
  • Cancelling all future CfD auctions – if renewables really are as cheap as claimed, they should be able to compete in the market without government support.
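
To see why cutting carbon prices feeds straight through to wholesale prices, here is a minimal merit-order sketch. The plant names, capacities and costs are invented for illustration; the point is simply that the most expensive plant needed to meet demand (usually gas) sets the price for every megawatt-hour, so any tax on gas raises the whole market:

    # Minimal merit-order sketch: the marginal (most expensive) plant sets the price.
    # All capacities and costs below are illustrative, not real market data.
    def clearing_price(plants, demand_mw):
        """Dispatch cheapest plants first; return the marginal plant and its cost."""
        supplied = 0
        for name, capacity_mw, cost_gbp_mwh in sorted(plants, key=lambda p: p[2]):
            supplied += capacity_mw
            if supplied >= demand_mw:
                return name, cost_gbp_mwh
        raise ValueError("demand exceeds total capacity")

    carbon_tax = 25  # illustrative carbon cost in £/MWh added to gas generation
    plants = [
        ("wind",    10_000, 10),
        ("nuclear",  5_000, 30),
        ("gas",     20_000, 60 + carbon_tax),  # gas is usually the marginal plant
    ]
    print(clearing_price(plants, demand_mw=25_000))  # ('gas', 85): the tax is in every price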

The Penrose report has sadly missed a big opportunity to highlight to the public the obscene costs they are already incurring for Net Zero, and to propose radical changes while we still have a chance to pursue them. Worst of all, it does not tell the public how much higher their bills will go in future to meet Miliband’s mad agenda.

via NOT A LOT OF PEOPLE KNOW THAT

https://ift.tt/59h28dz

September 27, 2024 at 08:45AM

O C Register Publishes a Climate Alarmist “The Heat Rises” Weather Impact Chart Which Should Instead Be Called “The Hype, Distortion & Deception Rises”

Guest essay by Larry Hamlin

The Orange County Register published a full-page chart (shown below) allegedly portraying the impact of rising temperatures on weather fatalities across the U.S. over various time intervals.

The lower portion of the chart is provided below for improved viewing. It shows the hyped weather fatalities for 2023 and other periods, as contrived by NOAA and supposedly compared with other weather events, along with the July 2024 ranking of the highest average temperatures of the 48 contiguous U.S. states.

The chart’s contrived and deceptive weather fatality information regarding “heat” has been addressed in detail by Kip Hansen in his excellent and comprehensive WUWT article, linked here and quoted below.

As noted in Kip’s article:

“In a recent News Brief, I pointed out that the major climate alarm propaganda cabals [CCNow, Inside Climate News] would be flooding the mainstream media outlets all around the world with the news that in the Northern Hemisphere, where most of humanity lives, it is Summer, and summers tend to be hot.”

“Further, encouraged by the Climate Propaganda Cabals, news outlets rely on a report from a world newspaper-of-record like The Guardian [a co-founder of the climate propaganda outlet Covering Climate Now, CCNow]:

Extreme temperatures kill 5 million people a year with heat-related deaths rising, study finds [The Guardian]

Selectively quoting from that piece is common practice, despite the fact that there is a sub-headline that reads: “More people died of cold than heat in past 20 years, but climate change is shifting the balance.” One has to read the piece very carefully to find that it reports on Zhao et al. 2021 (a comprehensive peer reviewed global wide study) which did, in fact, find that heat related deaths were rising (as population also rises) and to discover that the gentle warming of the climate is preventing more cold deaths than the increase in heat deaths — resulting in a net reduction in extreme temperature deaths.

More exactly: 9.43% of all deaths were related to non-optimum temperatures. Of the same 5 million, 8.52% were cold-related and 0.91% were heat-related. Again, over 8.5 percent of deaths were cold-related and only 0.9 percent were heat-related — that is almost 10 times as many cold-related deaths as heat-related deaths.”
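
The arithmetic behind that ratio, using only the percentages quoted above:

    # Check the "almost 10 times" claim from the Zhao et al. 2021 figures quoted above
    cold_related_pct = 8.52   # share of all deaths that were cold-related
    heat_related_pct = 0.91   # share of all deaths that were heat-related
    print(f"Cold-to-heat ratio: {cold_related_pct / heat_related_pct:.1f}x")  # ~9.4x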

The extensive Zhao 2021 study (noted above) involved data analysis for 750 global locations contained within 43 countries and covered the period from 2000 to 2019.  

Kip’s WUWT article further notes:

This lying about heat and cold deaths is subject to a pretty good debunking by Joshua Cohen at Forbes, in his July 2023 piece Excessive Heat Can Kill, But Extreme Cold Still Causes Many More Fatalities.

[quoting below from that Forbes piece – note the author is writing about Zhao et al. 2021]

“Between 2000 and 2019, annual deaths from heat exposure increased globally. The 20-year period coincided with the earth warming by about 0.9 degrees Fahrenheit. The heat-related fatalities disproportionately impacted Asia, Africa and Southern parts of Europe and North America.”

“Interestingly, during the 2000-2019 period examined in the study, while heat-related deaths rose, deaths from cold exposure fell. And they decreased by a larger amount than the increase in heat-related fatalities. Overall, researchers estimated that approximately 650,000 fewer people worldwide died from temperature exposure during the 2000-2019 period than in the 1980s and 1990s.”

“Bluntly, in the recent twenty years studied, about 650,000 lives were saved by the slow and steady warming of the climate 2000-2019.”

There is yet more in Kip’s article, which provides links to three peer-reviewed articles – two of them published in The Lancet (the premier medical journal, founded in 1823) – that address the overwhelming finding that cold produces many times more deaths than heat.

In addition to the Zhao 2021 and 2024 studies (the 2024 Zhao study extended the study period to 30 years, from 1990 to 2019), Kip’s article also addressed a Lancet European-region study which assessed 870 European urban areas across 30 countries.

Our World in Data provides the following chart from the Zhao et al. 2021 study, showing that the dominance of cold-related deaths over heat-related deaths occurs worldwide (note that cold deaths in North America outnumber heat deaths roughly 9 to 1).


The Register’s chart completely misrepresents, conceals and falsely characterizes extensive and overwhelmingly persuasive data from worldwide peer-reviewed scientific studies, which clearly establish that deaths from extreme cold hugely outnumber deaths from extreme heat, and that modestly increasing temperatures lower overall deaths from temperature extremes. How could the Register (and SCNG) manage to get this clearly established global outcome so incredibly wrong?

The Register’s chart of the U.S. states’ July 2024 temperatures (provided above alongside the erroneous “heat” fatality information) is typical of climate alarmist cherry-picking. If the highest average monthly temperatures for June or August 2024 had been used for California instead, those months would have ranked as only California’s 3rd and 17th highest respectively.

The upper portion of the Register’s falsely hyped and misleading chart is shown below.

This chart also misleads readers in a number of very significant ways.

First, note that the starting dates of the temperature graphs in the Register’s article all fall in the late 1950s and 1960s, which conceals the well-established climate science data clearly demonstrating that the highest heat wave indexes across the entire U.S. (by far) occurred in the decade of the 1930s, as shown in the EPA’s Heat Wave Index data below.

The late 1950s and 1960s are representative of the lower levels of the Heat Wave Index across the U.S.

Additionally, the Register’s chart completely ignores the climate science data establishing that the huge growth in population and population density across the U.S. over the 1880–2023 period has produced urban heat island (UHI) effects which contribute to increased summer temperatures and have exaggerated measured city warming by at least 100%, as noted in the study shown below.

The study determined that:

“It is interesting that the spatial (inter-station temperature difference) UHI effect is always stronger in the homogenized GHCN data than in the raw version of those data in Fig. 1. The very fact that there is a strong urban warming signal in the homogenized data necessitates that there must be a UHI impact on trends in those data. This is because the urban stations have grown substantially in the last 130 years. A recent paper by Katata et al. demonstrates that the homogenization technique used by NOAA does not actually correct urban station trends to look like rural station trends. It does breakpoint analysis which ends up adjusting some stations to look like their neighbors, whether urban or rural. To the extent that spurious warming from UHI is gradual through time, it “looks like” global warming and will not be removed through NOAA’s homogenization procedure. And since all classes of station (rural to urban) have undergone average population growth in the last 130 years, one cannot even assume that rural temperature trends are unaffected by UHI (see Fig. 2).”

The study concludes:

“But for the average “suburban” (100-1,000 persons per sq. km) station, UHI is 52% of the calculated temperature trend, and 67% of the urban station trend (>1,000 persons per sq. km). This means warming has been exaggerated by at least a factor of 2 (100%).

This also means that media reports of record high temperatures in cities must be considered suspect, since essentially all those cities have grown substantially over the last 100+ years, and so has their urban heat island.”
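
The arithmetic behind that “factor of 2” is straightforward: if UHI accounts for 52% of a measured trend, only 48% of it is real warming, so the measured trend is roughly double the true one. A minimal check:

    # If UHI is 52% of the measured suburban trend, the remaining real trend is 48%,
    # so measured/real = 1/(1 - 0.52) ~ 2.08, i.e. ~100% exaggeration
    uhi_share = 0.52
    exaggeration = 1 / (1 - uhi_share)
    print(f"Measured trend is {exaggeration:.2f}x the real trend")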

The significant issue of increasing population density contributing to higher measured temperatures, unrelated to claims of “climate change”, is a worldwide one. It goes unaddressed in global measurement systems’ claims of climbing temperatures, as revealed in the further study shown below.

The study highlights the following regarding global temperature measurements systems:

“To review, the dataset is based upon over 13 million station-pairs of monthly average air temperature measurements at closely spaced GHCN stations between 1880 and 2023. It quantifies the average *spatial* relationship between 2-station differences in temperature and population density (basically, quantifying the common observation that urban locations are warmer than suburban, which are in turn warmer than rural). The quantitative relationships are then applied to a global population density dataset extending back through time.

The quantitative relationships between temperature and population are almost the same whether I use GHCN raw or adjusted (homogenized) data, with the homogenized data producing a somewhat stronger UHI signal. They are also roughly the same whether I used data from 1880-1920, or 1960-1980; for this global dataset, all years (1880 through 2023) are used together to derive the quantitative relationships.”

Provided below are some of the study’s results regarding UHI impacts on measured temperatures, illustrated by color-coded warming impacts over defined time intervals worldwide: across the U.S. (note the color codes for California and Los Angeles), Europe, India, China and other Asian regions.

The study concludes:

“Over 50% of the population now lives in urban areas, and that fraction is supposed to approach 70% by 2045. This summer we have seen how the media reports on temperature records being broken for various cities, and they usually conflate urban warmth with global warming even though such record-breaking warmth would increasingly occur even with no global warming.”

Unfortunately, there are yet more significant and troubling problems regarding the quality and credibility of the temperature measurements taken across the U.S. (measurements which are supposed to provide the reliable and accurate climate temperature data used to create government climate policy), as documented in great detail in the 2022 report by Anthony Watts of The Heartland Institute, shown below.

In summary (the document is some 60 pages long), the report includes the original results of an extensive 2009 study and evaluation of more than 850 USHCN temperature measurement stations located across the nation, addressing those stations’ compliance with the NOAA/NWS siting and other requirements needed to achieve reliable and accurate measurement data.

Some of the key results are reflected in the page shown below, including: “Approximately 90 percent of the USHCN stations failed to meet NWS’s own requirements, which stipulate that stations must be 30 meters (100 feet) or more away from artificial or radiating/reflecting heat sources.” This critical issue represents a particularly significant measurement problem that is clearly apparent in the evaluations of the USHCN system presented in the Heartland report.

Based on these troublesome findings, as well as those of other investigations, the GAO conducted its own review in 2011 and found (among many other things) that:

“NOAA does not centrally track whether USHCN stations adhere to siting standards and the requirement to update station records, and it does not have an agency-wide policy regarding stations that do not meet its siting standards…

“Without centrally available information, NOAA cannot easily measure the performance of the USHCN in meeting siting standards and management requirements. Furthermore, federal internal control standards call for agencies to document their policies and procedures to help managers achieve desired results. NOAA has not developed an agencywide policy, however, that clarifies for agency staff whether stations that do not adhere to siting standards should remain open because the continuity of the data is important or should be moved or closed. As a result, weather forecast offices do not have a basis for making consistent decisions to address stations that do not meet the siting standards”.

Additional reviews and evaluations followed the 2009 report, including a key 2015 peer-reviewed study (Watts et al., 2015) which showed that well-sited stations have significantly lower temperature trends, as noted below from the Heartland 2022 report.

Additionally, a 2019 experiment conducted at Oak Ridge National Laboratory proved conclusively that station siting failures (particularly close proximity to artificial heat sinks) can have important impacts on daily maximum and minimum temperatures, as noted in the excerpt from the Heartland 2022 report below.

A typical example (one of many dozens contained in the report) of NOAA’s failure to site USHCN temperature measurement stations a significant distance away from heat sinks is shown below, where heat from a nearby power transformer (infrared scan image) is clearly affecting the station measurement location.

“The abstract of the ATDD’s 2019 report explains the experimental design:

A field experiment was performed at Oak Ridge National Laboratory, TN, with four instrumented towers placed over grass at increasing distances (4, 30, 50, 124, and 300 m) from a built-up area. Stations were aligned in such a way as to simulate the impact of small-scale encroachment on temperature observations. As expected, temperature observations were warmest for the site closest to the built-up environment, with an average temperature difference of 0.31 and 0.24 °C for aspirated and unaspirated sensors respectively. Mean aspirated temperature differences were greater during the evening (0.47 °C) than day (0.16 °C) …

These results suggest that small-scale urban encroachment within 50 meters of a station can have important impacts on daily temperature extrema (maximum and minimum) with the magnitude of these differences dependent upon prevailing environmental conditions and sensing technology.

The 2019 NOAA Oak Ridge Laboratory publication vindicated the findings of the original 2009 Surface Stations publication as well as Watts et al.’s 2015 follow-up.”

The actions NOAA took to address these obvious and significant siting problems near artificial heat sources were half measures at best, with the 2022 Heartland report noting critical measurement problems that continued to plague the validity and accuracy of the temperature measurement system:

“NOAA and its subordinate agencies clearly went to great lengths to defend the quality of the USHCN network.

However, NOAA abruptly stopped using the USHCN dataset in 2014, switching to a new dataset called “nClimDiv.”

“USHCN’s 1,218 stations were dwarfed by the nascent nClimDiv initiative, which incorporates more than 10,000 installations in a network called “nClimGrid.” This new network combines the USHCN stations with thousands of stations from the Global Historical Climatology Network (GHCN).

The switch was likely a strategic maneuver by NOAA to draw attention away from the fact that its long-maintained USHCN had been riddled with poorly sited locations, compromising the temperature records it produced. Perhaps NOAA believed changing the name and the method would shield the system from further criticism.

NOAA/NCDC concurrently rolled out the new U.S. Climate Reference Network (USCRN), which it described thusly:

“NCDC developed the U.S. Climate Reference Network (USCRN) to address the most basic of climate change questions that Americans will ask in the mid-21st century, ‘How has the climate of the Nation changed in the last 50 years?’ The USCRN measures temperature with superior accuracy and continuity in places that land-use change will not likely impact during the next five decades. Built specifically for this purpose, it is unlike any other climate observation network in the United States. NCDC began the USCRN build-out in the lower 48 states in 2000 and completed the last of 114 station installations in 2008. Since 2005, the USCRN has operated a sufficient number of stations to generate accurate annual national temperature averages.”

However, a huge problem with measurement station reliability and accuracy remains hidden from view, as the Heartland report describes:

“Surprisingly, NOAA, NCDC, and NCEI do not use or cite the high-quality temperature data produced by the USCRN in any public reports. Instead, they use nClimDiv data, which contains all of the poorly sited USHCN stations, in addition to thousands of other stations that likely have the same set of station siting problems. NOAA/NCDC claims they then “adjust” the nClimDiv data to closely match the data from the USCRN. This “Band-Aid” approach does little to address problems that have been identified, and instead creates a dataset rife with multitudes of adjustments that may or may not fairly represent long-term temperature trends. Moreover, this approach does not address problems with individual station records, such as heat sink effects and biased temperature readings.

Furthermore, adjusting the nClimDiv data to closely match the data from the USCRN only affects 17 years of data, failing to address any data produced before USCRN became operational in 2005. This means all of the temperature data showing climate warming in the 20th century was not adjusted in the same manner as data gathered after 2005, creating a disjointed U.S. climate dataset.”

The latest USCRN contiguous U.S. maximum temperature anomaly measurements (temperatures measured with superior accuracy and continuity, in places that land-use change will not likely impact during the next five decades and that are away from artificial heat sinks) for the period from January 2005 to August 2024 (shown below) show significant oscillations, with little to indicate any established warming trend from 2005 to 2024, despite the ridiculous claims of climate alarmist propagandists reflected in the Register’s completely misleading chart.

It is extremely unfortunate that the U.S. media are so incapable of addressing the huge climate science shortcomings underlying climate alarmist propaganda, as illustrated by this Register and SCNG chart.

via Watts Up With That?

https://ift.tt/iZ0P9ny

September 27, 2024 at 08:03AM

Hurricane Helene

By Paul Homewood

image

image

https://www.bbc.co.uk/news/live/ckgmx8vm8pnt

Based on the satellite data we have available, Helene was at most a Cat 3 hurricane, with winds of about 125 mph.

The claims of 140 mph are based on aircraft data, which are not backed up by ground or satellite measurements.

The satellite-based Digital Dvorak data used to estimate wind speeds peaked at around 110 kt (roughly 125 mph) and, as is often the case, was well below the Warning Intensity, which is used as the official measurement even though it is by definition a forecast.

image

http://rammb-data.cira.colostate.edu/tc_realtime/products/storms/2024al09/dgtldvor/2024al09_dgtldvor_202409270600.gif
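
For reference, the conversion from the Dvorak estimate in knots to mph (1 kt = 1.15078 mph) works out as follows:

    # Convert the peak Digital Dvorak estimate from knots to mph
    KT_TO_MPH = 1.15078
    peak_kt = 110
    print(f"{peak_kt} kt = {peak_kt * KT_TO_MPH:.0f} mph")  # ~127 mph, i.e. roughly 125 mph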

Ground based data shows much lower wind speeds:

image

image

image

https://www.nhc.noaa.gov/archive/2024/al09/al092024.public.015.shtml?

At ground level, the highest recorded wind speeds seem to have been at Cedar Key, and even there sustained winds were much lower, at 64 mph.

No ground measurements have been given for the eye of the storm, but this highlights the real problem of comparing satellite and aircraft data with the pre-satellite era. If Helene had hit the Panhandle in, say, the 1930s, it probably would have been categorised as Cat 1 at most.

There is a similar issue with minimum pressure readings. Helene is said to have had a minimum pressure of 938mb, again based on Hurricane Hunter estimates. At ground level, the lowest reading was 948mb:

image

https://www.nhc.noaa.gov/archive/2024/al09/al092024.update.09270401.shtml?

As a reminder, US landfalling hurricanes of 940mb and below are common in the historical record:

image

https://www.aoml.noaa.gov/hrd/hurdat/All_U.S._Hurricanes.html

It is the same story with storm surge. Despite warnings of 15 to 20ft, the surge fortunately seems nowhere to have exceeded 9ft.

It may well be that higher winds and surges are found once NOAA get all their data in, but there is certainly no evidence that Helene was anything like as bad as reported.

Finally, let’s compare it with two other hurricanes, which on paper were of similar intensity to Helene.

The Galveston Hurricane of 1915 had winds of 145 mph and a minimum pressure of 940mb – in other words, similar to Helene. Yet Galveston killed 275 people in Texas alone and caused immense damage to buildings, even inland in Houston:

image

image

image

https://en.wikipedia.org/wiki/1915_Galveston_hurricane

.

There followed in 1932 another Texas hurricane, Freeport:

image

Freeport had a minimum pressure of 935mb, supposedly only slightly lower than Helene’s. It left 40 dead. There is clearly no comparison between Freeport and Helene either.

image

https://en.wikipedia.org/wiki/1932_Freeport_hurricane

What is particularly interesting is that Freeport exhibited rapid intensification similar to Helene’s and followed a similar route; both hurricanes suddenly appeared off the Yucatan coast. Wikipedia describes this intensification:

image

We are repeatedly told that rapid intensification is essentially a new phenomenon, but this plainly is not true.

Helene cannot truly be said to be in the same ball park as Galveston and Freeport, despite supposedly being of similar intensity and strength. Luckily, only one death has so far been recorded in Florida from Helene, when someone was hit by a falling road sign. Sadly, this is the sort of accident which can happen in any storm event.

Even Cat 1 hurricanes can be deadly. But the media would like you to believe the hurricanes we get nowadays are unprecedented.

via NOT A LOT OF PEOPLE KNOW THAT

https://ift.tt/JlWstxi

September 27, 2024 at 05:32AM