Month: July 2025

Making PJM all wind and solar would cost over $2.4 trillion in battery backup

From CFACT

By David Wojick

I keep reading how big batteries are all it takes to make wind and solar reliable as the sole grid electricity source. The reality is that making wind and solar work at all requires a fantastic amount of battery backup, far more than is possible.

Below is an example using the PJM grid. PJM is America’s biggest grid operator, with a territory covering the Mid-Atlantic and points west. Their territory includes the Washington, DC metro area, where all the federal bigwigs live, making it a good place to start. I also live there.

We are quantifying a fantasy, so let’s keep it very simple. In fact, the basic question is why hasn’t PJM done this simple analysis? They do a lot of sophisticated grid modeling. Or maybe they have done this crucial assessment, but it is a secret, which is even worse.

Consider a single day in a typical peak demand summer heatwave. The heatwave is due to a stagnant high-pressure system called a Bermuda high, so there is not enough wind to generate usable wind power, no matter how much generating capacity is available.

It is sunny during the day, so let’s assume that for 8 hours we get enough solar to meet demand (or, as I prefer to call it, to meet need). For the other 16 hours, we meet demand using batteries. We import nothing because our neighbors are in the same needy boat.

Finally, for simplicity, I assume the demand is at the peak level for the entire 24 hour day. This overestimates things a bit, but we will find that does not matter. A fancier analysis would use a typical demand curve. PJM can handle that.

My example year is 2030, as that is a standard near-term transition target year for which we have reasonable estimates of peak demand. Here then are the very simple numbers.

PJM’s estimated peak demand for 2030 is about 180,000 MW.

Meeting that for 16 hours with batteries requires 2,880,000 MWh of usable storage.

Batteries are typically cycled between 20% and 80% of charge to preserve their life, so usable storage is 60% of nameplate battery capacity.

Thus we need 4,800,000 MWh of nameplate battery capacity.

Storage facility capital costs vary, but $500,000 per MWh is a reasonable estimate.

This gives a total cost of $2.4 trillion, or $2,400,000,000,000, for the batteries to make wind and solar reliable in this case. This fantastic cost is clearly not feasible.
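For readers who want to check the arithmetic, the whole calculation fits in a few lines of Python. Every figure is the rough planning estimate quoted above, not a precise engineering number.

```python
# Back-of-envelope check of the battery backup cost,
# using the rough figures quoted in the text.

peak_demand_mw = 180_000    # PJM's estimated 2030 peak demand (MW)
battery_hours = 16          # hours per day not covered by solar
usable_fraction = 0.60      # usable window: 20% to 80% of nameplate

usable_mwh = peak_demand_mw * battery_hours      # 2,880,000 MWh
nameplate_mwh = usable_mwh / usable_fraction     # 4,800,000 MWh

cost_per_mwh = 500_000      # rough capital cost, $ per nameplate MWh
total_cost = nameplate_mwh * cost_per_mwh        # $2.4 trillion

print(f"Usable storage:    {usable_mwh:,.0f} MWh")
print(f"Nameplate storage: {nameplate_mwh:,.0f} MWh")
print(f"Total cost:        ${total_cost / 1e12:.1f} trillion")
```

Swap in a different cost per MWh or a longer low-wind spell and the total scales linearly, which is why a multi-day heatwave pushes the figure toward $10 trillion.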

There are things that could make this number go down a bit, such as a falling cost per MWh. But given that last year saw just 130,000 MWh installed worldwide, the production capacity does not exist, so we are talking about new mines and factories. It actually cannot be done by 2030, not even close.

But the realistic numbers would be much higher if this fantasy played out because low wind, near-peak heatwaves often last for several days, even a week. Ten trillion dollars is easily possible. We are, after all, talking about hundreds of thousands of tractor-trailer sized batteries, basically containers full of expensive chemicals. Moreover, this is just for PJM.

Batteries simply cannot make a transition to wind and solar power feasible. The amount, and hence the cost, of storage is far too great.

Given the simplicity of this analysis, using readily available data, the big question is why are these impossible numbers not already widely known? PJM and their big utilities all do detailed modeling and supposed reliability assessments. So does NERC, whose sole mission is reliability. Many utilities file annual Integrated Resource Plans with their state regulators, typically looking out 20 years or more.

That battery backup cannot make wind and solar powered grids possible is obvious given these incredible numbers. The electric power industry must know this, but their silence is deafening.


Discover more from Watts Up With That?

Subscribe to get the latest posts sent to your email.

via Watts Up With That?

https://ift.tt/nI12L4q

July 12, 2025 at 04:08PM

Astwood Bank Update – The Death of Met Office credibility, killed by Weapons Grade Gaslighting.

I have already reviewed Astwood Bank (as well as Cardiff and Aboyne); however, following yesterday’s outrageous and frankly dishonest claims by the Met Office, I feel I should go into further depth to expose their deliberate disinformation agenda. No credible meteorologist could accept that the Astwood Bank reading is in any way reliable but, of course, this has nothing to do with meteorology or indeed any science. Political ideologies must rule.

Firstly, another aerial image of the site to help orientation. Google Maps and aerial images run north to top, south to bottom. The site address is a domestic house known as Alwynne, 48 The Ridgeway, Astwood Bank, Redditch. I do not wish to be the cause of any upset to the owner; however, the Met Office itself freely publishes all this information, and the homeowner is fully aware of the details being in the public arena.

The north of this site is a spur off the main road (The Ridgeway) down which the Google Streetview camera partly went. This is what is shown of that northern boundary.

To give an idea of the scale, the brick pillars are 22 brick courses high. The working height of a brick is 3 inches, so those pillars are 5 feet 6 inches (1.68 metres) plus capping stones. I conservatively estimate the wind barrier to the north in the region of 12 feet (3.7 metres) high. Another image angle offers a different perspective.
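The pillar-height arithmetic can be checked the same way; the 22-course count and the 3-inch working course height are the figures given above.

```python
# Brick-pillar height from course count (figures from the text).
courses = 22
course_height_in = 3                    # working height of a brick, inches

height_in = courses * course_height_in  # 66 inches
feet, inches = divmod(height_in, 12)    # 5 ft 6 in
metres = height_in * 0.0254             # ~1.68 m

print(f"{feet} ft {inches} in ({metres:.2f} m), plus capping stones")
```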

Very obviously indeed, the tree at the end of the hedgeline is much taller again. The front of the house is shown to be a large area of concrete driveway and parking. This elevation is particularly important given the wind conditions on the afternoon of 11th July in this region. There is no published anemometer data for Astwood Bank, but the wind direction is recorded. What very light breeze there may have been was from the north.

To gain a very good indication of wind speed, the nearest official site with an anemometer is the fully equipped Pershore site (formerly RAF Throckmorton, hence very sophisticated wind speed and direction recording equipment), just 8 miles to the south-south-west.

This confirms a northerly very light breeze measured at 10 metres above ground level ranging from 6 kph down to dead flat calm throughout the record period. This will be discussed in detail later, however, it is blatantly obvious that any wind there was at the Astwood Bank area would have been completely blocked by the significant perimeter hedging and trees to this site.

The eastern side of the screen is bounded first by the house itself, with its extensive concrete driveway and parking area. The property is single-storey but with an elevated roof section indicating a loft conversion, and a ridge height in excess of 5 metres. Further trees, shrubs and a conservatory extension form a total wind block on this side.

Now consider the length of shadow cast by the house and relate it to the shadow cast by the predominantly southern hedge perimeter.

It is again obvious that this southern hedging is very high, probably higher than the 5-metre ridge of the house. Again, a total wind block.

The final west elevation is a similarly well manicured thick hedge that sits just 3.5 metres from the screen. A differently timed image indicates the level of visual shade this hedge creates.

All of the above confirms beyond any doubt that this site is completely sheltered from wind from any angle and that, in layman’s terms, this is a “suntrap” garden. The relevance is that it is perfectly well known that in low wind speeds (and in this case at Astwood Bank it was guaranteed dead flat calm) Stevenson screens will, not possibly or maybe, overheat in strong sunny conditions. This defect was in fact identified in 1884, the same year the Met Office formally adopted this type of instrument screen. The Scottish meteorologist John Aitken identified the overheating, which later took his name as the “Aitken Effect”.

The Met Office itself used to openly acknowledge this problem. Their “Factsheet 17” entitled “Observations over land” specifically states “Anomalies may arise when the wind is light and the temperature of the outer wall is markedly different from the air temperature.” Clearly the term “anomalies” is a euphemism for overheating.

Dr Eric Huxter elaborated on this well-known overheating problem and cited research by internationally regarded meteorologists of the Royal Meteorological Society and Reading University, such as Dr Stephen Burt. This research unequivocally supports the fact that Stevenson screens in such low wind speeds and sunny conditions will not give an accurate representation of the real-world temperature.

Is there any more real-world evidence to substantiate this as an over-recorded reading?

One of the nearest weather stations to Astwood Bank is the Class 2 site at Wellesbourne. I have not reviewed Wellesbourne yet as I was actually saving the good quality sites until the latter part of this project. Whilst I have referred to the likes of Rothamsted (England), Thomastown (Northern Ireland), Trawsgoed (Wales) and even Craibstone (Scotland) as good quality sites, these reports were partly to relieve the monotony of reporting on so many junk sites. There are in fact several very good quality sites to review shortly and Wellesbourne is one of them. Without going into details now, simply contrast this site’s location with Astwood Bank.

There certainly is nothing breaking the wind in this open site. Even in low wind speed conditions Wellesbourne is much less likely to encounter any Aitken Effect.

Wellesbourne peaked at 32.6°C, which is 2.1°C lower than recorded at Astwood Bank. The latter’s figure is undoubtedly the result of the well-documented Aitken warming effect, and the Met Office must know that. If not, they are incompetent.

How this Astwood Bank site even came into its adopted Met Office existence reads more like a tale of a cottage industry than of modern science. The official public record is here, with this from the local newspaper.

Hold this thought… a hobbyist’s “passion” is now proof of the anthropogenic global warming narrative, according to the UK taxpayer-funded Met Office, from a 100% junk site giving known corrupted readings.

I challenge any meteorologist to defend the use of this Astwood Bank site’s readings in this way.

This is political disinformation and marks the death of Met Office credibility.

via Tallbloke’s Talkshop

https://ift.tt/kB4d8sF

July 12, 2025 at 01:20PM

Moving, But Not In A Straight Line

Guest Post by Willis Eschenbach.

I must confess, I use WUWT as a lab notebook on steroids. It reflects my latest work, my latest calculations, my latest graphics, my latest theories. My thanks to all participants who make this possible: Anthony Watts, Charles The Moderator, anonymous moderators around the planet, as well as all the commentators and lurkers who keep me from going off the rails and suggest new avenues to explore. What a time to be alive!

Onwards. Here’s my latest.

I was watching a National Geographic documentary about the use of airborne lidar to look straight down through the trees of the Guatemalan jungle and expose Mayan ruins. The narrator said, “If we see straight lines on the ground, it’s not natural. It’s something made by man.”

And it’s true—in general, nature doesn’t do straight lines. As the poet said:

“Glory be to God for dappled things –
        For skies of couple-colour as a brinded cow;
 For rose-moles all in stipple upon trout that swim;
        Fresh-firecoal chestnut-falls; finches’ wings;”

I’m reminded of this by what I see as a ludicrous claim. Regarding the climate, one of the more complex systems we’ve ever tried to analyze and understand, mainstream climate scientists say there is a straight-line, linear relationship between changes in downwelling longwave radiation at the top of the atmosphere and the surface temperature. This is a central belief in their understanding of the climate:

∆T =  λ * ∆F                                       (Equation 1 And Only)

This says that the change (delta, “∆”) in global average surface temperature (“T”) is equal to the change (“∆”) in “forcing” (“F”) times a constant called lambda (“λ”) that is known as the “equilibrium climate sensitivity” (ECS).
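As code, Equation 1 is a one-liner. The lambda value below is purely illustrative (roughly the mainstream mid-range of ~3°C per doubling, i.e. ~0.8°C per W/m²); it is not a figure from this post.

```python
# Equation 1 as code: the claimed straight-line relationship.
def delta_T(delta_F, lam):
    """Temperature change (deg C) for a forcing change delta_F (W/m^2),
    given sensitivity parameter lam (deg C per W/m^2)."""
    return lam * delta_F

# Illustrative value only: a lambda implying 3.0 deg C per 2xCO2.
lam = 3.0 / 3.7
print(delta_T(3.7, lam))   # forcing from 2xCO2 gives back ~3.0 deg C
```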

And what is forcing when it’s at home? Forcing is a term of art in climate science. Radiative forcing is defined by the Intergovernmental Panel on Climate Change (IPCC) as:

“The change in the net, downward minus upward, radiative flux (expressed in W/m²) due to a change in an external driver of climate change, such as a change in the concentration of carbon dioxide (CO₂), the concentration of volcanic aerosols, or the output of the Sun.”

The “downward” radiation at the top of the atmosphere (TOA) is the incoming sunshine. It’s all the radiation entering the system.

The “upward” radiation is the longwave thermal radiation heading to space. It’s the total of all the energy leaving the system.

Now, that claim of linearity makes absolutely no sense to me. Let me explain why.

First, the surface temperature can change without affecting the TOA radiation balance. The climate system is a giant heat engine. Heat comes in at the hot end of any heat engine: in this case it’s the tropics. Then it does work with some of the heat, and the rest of the heat is exhausted at the cold end of the heat engine: in this case, the poles.

Note that only part of this heat is converted into work. The rest is just passing through, carried by the ocean and the atmosphere from the tropics to the poles and back out to space. Any variation in the percentage of the total flow which is converted to work will change the surface temperature without any change in the TOA radiation balance.

Next, the climate system isn’t free to adopt any configuration. It is ruled by the Constructal Law, and like a river meandering to the sea, it doesn’t move in straight lines. Like all flow systems far from equilibrium, the river is maximizing flow, and thus the river picks the longest possible path to the sea.

Similarly, as a constructally ruled system, the climate is always seeking to maximize the flow from the tropics to the poles. And as that flow speed changes, the surface temperature changes … without any corresponding linear change in the TOA radiation balance.

Finally, their Equation 1 equates a quantity that IS conserved (watts per square meter) with a quantity which is NOT conserved (temperature). I’m not clear how that is even possible.

However, for the purpose of this discussion, let’s assume that they are right about that particular relationship between TOA forcing and temperature. We’ll follow that path and see where it leads.

As a first step along that path, let me return to the idea that the climate doesn’t move in straight lines. For example, below is a graph of gridcell-by-gridcell total cloud cooling/warming, which is a combination of the clouds’ effects on longwave and shortwave radiation plus the evaporative cooling related to rainfall. I’ve compared it to gridcell surface temperatures in a scatterplot with contour lines.

Now, I started doing these scatterplots, like Figure 1 below, comparing two variables for every 1° latitude by 1° longitude gridcell of the planet, for a simple reason: they show the long-term relationship between the two variables. Each gridcell on the planet is in a long-term, general steady state regarding the various measurable factors, like say thunderstorm prevalence. Annual averages of these relationships vary little, and a 24-year average reveals the underlying long-term relationship of the variables.
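The construction described above can be sketched in a few lines of NumPy. The array names and the random placeholder data are mine, not the author's; real use would load 24 years of gridded observations.

```python
import numpy as np

# Sketch of the gridcell scatterplot construction: one long-term mean
# point per 1-degree gridcell. Random data stands in for real fields.
rng = np.random.default_rng(0)
years = 24
temp = rng.normal(15, 10, (years, 180, 360))    # annual surface temp, deg C
cloud = rng.normal(-20, 15, (years, 180, 360))  # net cloud effect, W/m^2

# Averaging over the time axis collapses 24 years into one
# (temperature, cloud-effect) point per gridcell.
x = temp.mean(axis=0).ravel()    # 180 x 360 = 64,800 points
y = cloud.mean(axis=0).ravel()
print(x.shape, y.shape)
```

From there, a scatterplot of y against x with a LOWESS smooth gives a figure of the same general kind as Figure 1.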

And this lets us investigate things like the long-term value of the equilibrium climate sensitivity … but I get ahead of myself …

Figure 1. Scatterplot plus density contour lines and LOWESS smooth. Total cloud cooling versus surface temperature, entire planet.

Back to Figure 1, there’s much of interest. First, in gridcells with average temperatures below about -20°C, which is Greenland and Antarctica, the clouds warm the surface. Then, when the frozen ocean comes into play, from -20°C to where the gridcells average about freezing, cooling increases with temperature.

Then the trend reverses, and cooling decreases with temperature up to the gridcells with an average temperature of 25°C or so. Above that, the cooling increases radically and almost vertically, to the point where it is cooling those gridcells by 400 W/m2 or so.

In passing, note the peak around 25°C. If the temperature goes above that, the clouds increase their cooling, eventually to a radical extent. And when the temperature goes below ~ 25°C, the clouds decrease the amount of cooling. This is clear evidence of the thermoregulatory action of clouds, cooling more when it’s warmer and less when it’s cooler.

Finally, the predominant role of the ocean is evident in both the tighter grouping and the larger number of the blue oceanic dots compared to the brown dots showing land gridcells.

And to close the circle, the red/black line showing the change of cooling with temperature is kinda the definition of non-linear …

Now, these kinds of graphs are very useful for a simple reason. The slope of the red/black line at any point gives the average change in the y-axis variable for a 1°C change in the surface temperature. So for example, we can see that when it’s above say 25°C, the total cloud cooling increases extremely rapidly with each 1°C increase in temperature.
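The slope-reading described above can be done numerically. The curve below is a made-up stand-in for the fitted LOWESS line, used only to show the mechanics.

```python
import numpy as np

# Estimating the local slope of a fitted curve with finite differences.
# x and y are a hypothetical smooth curve, not the actual LOWESS fit.
x = np.linspace(-40, 30, 200)    # temperature axis, deg C
y = 0.05 * x**2                  # stand-in fitted curve, W/m^2

slope = np.gradient(y, x)        # dy/dx: W/m^2 per deg C at each point
print(slope[-1])                 # steepest at the warm end
```

For this quadratic stand-in, the slope is 0.1·x, growing steadily toward the warm end, loosely mimicking the rapid increase in cloud cooling above 25°C.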

With all of that in mind, what can such graphs show us about the long-term relationship between temperature and forcing?

The mainstream theory goes like this: 

  • Doubling the amount of CO2 intercepts more of the upwelling longwave radiation headed to space.
  • This leads to a top-of-atmosphere (TOA) radiative imbalance. 
  • The Earth then warms up until the balance is restored.

So the question becomes … how much does the earth have to warm up to restore the 3.7 watts per square meter (W/m2) of TOA radiation imbalance that is said to result from a doubling of CO2 (2xCO2)? 

This amount of warming required to rebalance the TOA radiation imbalance is called the “equilibrium climate sensitivity” (ECS) to a doubling of CO2. It’s the “lambda” in the linear Equation 1 (and only) above.

To investigate the value of the ECS, here is the scatterplot of the TOA imbalance versus the surface temperature.

Figure 2. Scatterplot plus density contour lines and LOWESS smooth. Top of atmosphere radiative imbalance versus surface temperature, entire planet. The percentage (% area) numbers show the percentage of the surface area in each temperature interval.

Man, I love being surprised by my investigations. It’s the best part of my scientific education. I definitely did not expect the graph to look like that. But facts are facts. 

At temperatures below -20°C, the brown dots show it’s just land— Greenland and Antarctica. And there, curiously, the TOA imbalance gets more negative for each 1°C of warming. Then, at around -15°C, the slope reverses as the frozen ocean comes into play. It increases, somewhat linearly, until about 20°C or so, after which the imbalance starts increasing at a faster rate.

We can visualize these changes in detail by calculating the slope at each point in the red/black line. Recall that the slope is the change in TOA radiation imbalance per degree of warming. Here is that result.

Figure 3. Slope of the red/black trend line shown in Figure 2 above. If you wonder why the area-averaged change is so high, look at the percentages of the global area with annual average temperatures in each temperature interval.

I’ve included the area-weighted average of the change in the TOA balance from a 1°C increase in surface temperature. It’s 6.6 W/m2 per °C. Dividing the 3.7 W/m2 forcing from a doubling of CO2 by that slope implies an equilibrium climate sensitivity (ECS) of about 0.6°C per doubling of CO2.
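The last step is simple division, using the two figures quoted in the text: the canonical 3.7 W/m² forcing for a doubling of CO2 and the 6.6 W/m² per °C area-weighted slope.

```python
# Implied equilibrium climate sensitivity from the area-weighted slope.
forcing_2xco2 = 3.7   # W/m^2 per doubling of CO2 (standard figure)
slope = 6.6           # W/m^2 of TOA imbalance per deg C (from Figure 3)

ecs = forcing_2xco2 / slope   # warming needed to rebalance 3.7 W/m^2
print(f"Implied ECS: {ecs:.2f} deg C per doubling of CO2")
```

3.7 / 6.6 ≈ 0.56, which the post rounds to 0.6°C per doubling.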

I’m gonna say that is a reasonable estimate for the ECS, for a couple of reasons.

First, this ECS estimate of 0.6°C is not outside the range of other observational estimates of CO2 sensitivity. The Knutti dataset contains 172 estimates of the ECS, calculated using a variety of methods. My estimate is at the low end, but it isn’t the lowest.

Figure 5. Estimates of ECS from theory and reviews, observations, paleoclimate studies, climatology, and climate models. Note that in the last half century these estimates have grown more scattered, not less. And it’s particularly true for climate models (yellow dots).

The second reason I think that my ECS estimate of 0.6°C per 2xCO2 is valid is that it is consistent with my previous estimate of the ECS, which was based on my implementation of Bejan’s constructal climate model. The model is described in the post below.

I fear most folks don’t understand the importance of the model of the global climate system that Bejan created. As I showed, it does a very accurate job of calculating several critical parameters of the climate using one and only one tuned parameter, the conductance. Conductance in this context is how fast the climate system can move the heat from the hot zone to the cold zone. The agreement of the model with reality is shockingly good. Read the post.

Using that model, I was able to experimentally determine my best estimate of the climate sensitivity. From that previous analysis:

This constructal model points out some interesting things about climate sensitivity.

First, sensitivity is a function of changes in rho (albedo) and gamma (greenhouse fraction). But not a direct function. It is the result of physical processes that maximize “q” [the flow from the hot zone to the cold zone] given the constraints of rho and gamma.

Next, the sensitivity is slightly different depending on whether the changes in albedo and greenhouse fraction are occurring in the hot zone, the cold zone, or both.

Next, assuming that there is a uniform pole-to-pole increase of 3.7 W/m2 in downwelling radiation from changes in either albedo or greenhouse fraction, the constructal model shows a temperature increase of ~1.1°C. (3.7 W/m2 is the amount of radiation increase predicted to occur from a doubling of CO2.)

Finally, this 1.1°C equilibrium climate sensitivity is a maximum sensitivity which does not include the various emergent thermoregulatory mechanisms that tend to oppose any heating or cooling. This means the actual sensitivity is lower than ~1.1°C per 2xCO2.

And my latest estimate, 0.6°C per 2xCO2, is indeed lower than the upper bound of 1.1°C per 2xCO2 found in Bejan’s model, just as I’d predicted.

And thus endeth my disquisition about non-linearity and how it led me to my latest ECS estimate.

My warmest regards to everyone, as always.

w.

PS—When you comment, please quote the exact words you are referring to. I choose my words carefully so I can defend them. I cannot defend your restatement of my words, no matter how well-meaning.



via Watts Up With That?

https://ift.tt/EvtSoZi

July 12, 2025 at 12:06PM

Sunday


via JoNova

https://ift.tt/imhxX2B

July 12, 2025 at 09:29AM