Month: July 2025

The Reification of Averages – Part One. How the Met Office manipulates measurements.

Reification is defined as:

“The act of treating something abstract, such as an idea, concept, or relationship, as if it were a concrete, tangible thing; essentially turning something non-physical into something perceived as a concrete object or entity. This can occur in various contexts, from everyday language to social theory.”

Although the word is rarely used, the concept is something meteorology applies on a very regular basis. When you hear a weather forecaster announcing “that’s 10 degrees warmer than it should be for the time of year”, you are being subjected to the reification of a totally abstract, indeed usually completely non-existent, concept.

The headline image is taken from a short, essential book written by Darrell Huff in the 1950s that is possibly even more relevant in the 2020s than when it was written. It is now free to read online here:

https://mronline.org/wp-content/uploads/2019/05/HowToLieWithStatistics.pdf

Many of the aspects covered by Huff relate very well to how the UK Met Office (and meteorology in general) portrays its data and how this is then “lapped up” by much of the media. I would like to expand upon Stephen Connolly’s excellent article on comparing temperatures past and present, with special regard to how modern instrumentation, Stevenson screen siting, spurious maintenance standards, and Victorian design and technology can readily be used to “achieve” preset end results, and how such abstract concepts take on an almost real life of their own.

This is my 300th post since restarting the Surface Stations Project on 24/8/2024, and it marks the beginning of a summary of the issues identified so far in my research. Further parts will follow; for now, the averaging system.

“Normal” Averaging.

Take a sample of 10 adult humans: 9 are fully able but one has sadly had a leg amputated. Crude averaging of this sample could say the average number of legs is 1.9 per person, i.e. 19 legs divided by 10 people. This could then be extended to claim that 9 people were above average in respect of their number of legs and one was below average. Nobody was actually the “average”.

Obviously the above is a ridiculous example and such a dubious construct would never be used – or would it? In 1960s Britain there was a widely bandied statistic that the average family had 2.4 children, and the term almost passed into folklore, with a TV sitcom spun off it. Even the Office for National Statistics recently referred to it. https://blog.ons.gov.uk/2019/08/02/whatever-happened-to-2-point-4-children/ This was a classic example of reification: a completely abstract and impossible figure (there is no such thing as “point four” of a child and never has been) was being taken as accepted fact. Social science analysis and even house design were being driven by this reified fictional notion.

To take this point a stage further, imagine a sample of 24 people’s earned incomes. One person does not work and has zero earned income, another is a reasonably high earner on £100,000 per annum, and the remaining 22 all earn exactly £30,000 per annum each. What the “average” income actually is depends on which averaging system is chosen. It is reasonable to assume most people would consider the “average” to be £30,000 per annum, i.e. the mode or most common value. An arithmetic mean would derive (22 × £30,000 = £660,000) + £100,000 + £0, all divided by 24, = £31,667 – so not that different from the mode. However, according to meteorological averaging the figure derived would actually be £50,000 – not even remotely close to any of the sample group’s actual earnings. 23 people would be deemed below average income, only 1 above average. How does this work and why?
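The three averaging methods in this example can be checked with a short script (a minimal sketch; the figures are the hypothetical sample from the text, not real data):

```python
from statistics import mean, mode

# Hypothetical sample from the text: 22 people on £30,000,
# one on £100,000 and one on £0.
incomes = [30_000] * 22 + [100_000, 0]

mode_avg = mode(incomes)                      # most common value
mean_avg = mean(incomes)                      # arithmetic mean
midrange = (min(incomes) + max(incomes)) / 2  # "meteorological" min/max style

print(mode_avg)         # 30000
print(round(mean_avg))  # 31667
print(midrange)         # 50000.0
```

The midrange of £50,000 matches nobody in the sample, while the mode and mean both sit close to what most people actually earn.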

Meteorological Averaging.

This works to its own unique system. Daily average (“mean”) temperature is derived by adding the daily minimum to the daily maximum and then dividing by 2. This is how the “average” income of £50,000 in the above example was derived. This is despite the fact that it is routine for the Met Office (and even the public) to have access to 24 separate hourly readings per day, every day of the year – in fact most Met Office automatic weather stations produce 1,440 readings per day on a minute-by-minute basis.
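The effect of the min/max convention can be illustrated with a sketch of a day of one-minute readings (the numbers are invented for illustration, not real station data):

```python
# A steady 15 °C day with one brief spike and one brief dip:
# 1,438 minutes at 15.0 °C, one minute at 27.0 °C, one at 13.0 °C.
readings = [15.0] * 1440
readings[780] = 27.0   # a single one-minute afternoon spike
readings[200] = 13.0   # a single one-minute pre-dawn dip

true_mean = sum(readings) / len(readings)       # uses all 1,440 readings
met_mean = (min(readings) + max(readings)) / 2  # the (min + max) / 2 convention

print(round(true_mean, 2))  # 15.01 – barely moved by the two odd minutes
print(met_mean)             # 20.0  – set entirely by two one-minute extremes
```

Two minutes out of 1,440 shift the conventional “daily mean” by 5 °C, while the full-data mean moves by less than a hundredth of a degree.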

An example of this minute-by-minute “extremes effect” was the detail revealed behind a daily national high at Cavendish. In this case the “high” lasted for just 1 minute and was bizarrely quoted to the 5th decimal place from equipment calibrated only to the first decimal place. As former Met Office manager John Maynard commented, this “just showed the stupidity of the observer.” Similar one-minute readings established the crossing of a somehow “scary” threshold at Wiggonholt.

This meteorological averaging principle emphasises extremes rather than representing the norm, and consequently regularly produces bizarre and nonsensical results. As I reported in the comparison of the months of May 2024 and May 2025, the dank, damp and dismal May 2024 was deemed considerably “hotter” than the record-sunshine, dry and warmer conditions of May 2025. Through the reification of this nonsense by the argument from authority of the Met Office, both months were somehow made indicative of an impending doom.

Deriving data solely from extreme inputs can produce even more absurd outcomes. In the 24-person income scenario above, if that one person formerly without income were to start earning at exactly the same rate as the 22-strong majority (i.e. £30,000 per annum), then meteorological-style averaging would suddenly make the average income (£30,000 minimum + £100,000 maximum) ÷ 2 = £65,000. All those earning £30,000 have just fallen even further behind the “average” income instantaneously!
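Under the same min/max convention, the change described above can be verified in a couple of lines (again using the hypothetical figures from the text):

```python
# The former non-earner now earns £30,000, like the 22-strong majority.
incomes = [30_000] * 23 + [100_000]

met_style_avg = (min(incomes) + max(incomes)) / 2
print(met_style_avg)  # 65000.0 – up from 50000.0, though only the minimum changed
```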

It is worth bearing in mind that, whilst it is standard practice in statistics to eliminate extremes or “outliers” (typically the top and bottom 5 or 10%) in order to avoid skewing averages, meteorologists use solely those normally excluded values. Small marginal changes can cause almost seismic variations.
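A trimmed mean of the kind statisticians use, sketched below, drops exactly the values the min/max convention relies on (the trim fraction and figures are illustrative):

```python
def trimmed_mean(values, trim_frac=0.05):
    """Drop the top and bottom trim_frac of sorted values, then average the rest."""
    s = sorted(values)
    k = int(len(s) * trim_frac)          # number of values to drop from each end
    core = s[k:len(s) - k] if k else s
    return sum(core) / len(core)

incomes = [30_000] * 22 + [100_000, 0]
# With 24 values, a 5% trim drops one value from each end – both outliers.
print(trimmed_mean(incomes))              # 30000.0
print((min(incomes) + max(incomes)) / 2)  # 50000.0 under the min/max convention
```

The trimmed mean lands on the figure nearly everyone in the sample actually earns; the min/max construction lands on a figure nobody earns.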

History

This clearly outdated averaging system derives from the earliest attempts to standardise temperature recording. At one time, knowing the actual highest or lowest temperature of the day in the open air would have required physically viewing the thermometer continuously – unsurprisingly, that never happened. Initially readings were taken at set times each day, but even that presented difficulties, in that time itself was not standardised in the UK until the railway era. Prior to this change, “local time” operated largely by the parish church clock and approximations of solar time, hence no simultaneity of readings could be guaranteed.

GMT was ultimately adopted across Great Britain by the Railway Clearing House in December 1847. It officially became ‘Railway Time’.

By the mid-1850s, almost all public clocks in Britain were set to Greenwich Mean Time and it finally became Britain’s legal standard time in 1880.

Realistically, it was not until the advent of Six’s maximum/minimum thermometer in 1780 that any passably reliable direct readings could be taken, though even these were initially subject to dubious calibration. This thermometer gave fixed points of the extremes that could be read at approximately the same time each day without corruption, if early enough in the morning. It must be noted that even this system works on assumptions. The maximum reading shown at 9:00 is assumed to be from the day before, whilst the minimum is attributed to the day of the reading. This is often not the case, for example when a warm front displaces a cold one overnight or vice versa. A very cold day may be followed by a warmer night and the readings’ attributions can be the wrong way around.

It was not until Rutherford considerably improved the Six design in 1790 that readings became more accurate, but the day-attribution issue still remains to this date at manual stations, despite the fact that every one of them has an electronic Platinum Resistance Thermometer (PRT) for at least the maximum readings, with a continuous readout held in its memory. (N.B. liquid-in-glass thermometers (LIGT) are usually retained for minimum readings.)

This production of just 2 readings per day, and only of the extremes, is the origin of this unrepresentative averaging system – and thence, unlike in any other branch of science, it seems to have been set in stone and apparently CANNOT (or rather will not) be changed.

Despite all the options that have since become available with improved metrology, the Met Office and other meteorological institutions have maintained this averaging system derived almost from antiquity. Those temperatures forecasters claim to be above or below what “it should be”/“ought to be”/“what we would expect for the time of year” et alia are in fact simply the reifications of a terribly bad system and in reality are quite “mean-ingless”.

Problems caused by this system.

Authors Antony Woodward and Robert Penn produced the delightfully interesting book “The Wrong Kind of Snow”, which covers weather extremes for every day of the year together with interesting and entertaining associated weather facts. For most days of the year the book highlights the highest and lowest temperatures ever recorded, revealing the remarkably wide range of UK weather from year to year.

Whilst it might be assumed that extremes would be geographically driven, they are in fact anything but. The overall maximum annual temperature differential according to the Met Office over recent times runs from minus 27.2 °C to 40.3 °C, making a 67.5 °C range. Surprisingly, even places close together can display a very large range:

– On 10/3/1929 a temperature of 21.3 °C was recorded at Roade (Northants), whilst on the same date just 2 years later, and under 40 miles away, Rickmansworth shivered at minus 15 °C … a 36.3 °C differential that was almost entirely temporal.

– The notably mild south-east England climate of Kent has recorded a low of minus 21.3 °C on a 30th of January, whilst on the same date in a different year a high of plus 15.5 °C was recorded in north-east Scotland … a 36.8 °C differential, both temporal and geographic.

What the above figures demonstrate is that there is no specific temperature for any time of year or location, but rather a very large potential range. Claiming any “should be/ought to be” for the time of year is a nonsense derived from the reification of an absurd averaging system resolving extremes down to unrepresentative “daily means”.

The current spate of Met Office “gaslighting” of the UK public into comparing current summers to the exceptional summer of 1976 cannot be supported by the hard numbers of high temperatures exceeding thresholds. In lieu of the reality, the Met Office resorts to its statistical misrepresentation tool, the daily mean.

In 1976 there were “16 consecutive days over 30 °C (86 °F) from 23 June to 8 July”, far in excess of any subsequent summer heatwave. However, coming towards the end of a long-term drought (which had started at the end of summer 1975), the weather also displayed desert-like daily temperature ranges, with radiation frosts and colder overnight temperatures. Exceptionally rare “black frost” events were recorded in southern England from the 28th to the 31st of July. This aridity effect lowered the recorded minima, and thence meteorological averaging took over to distort the reality of continuously very hot days – archetypal “How to Lie with Statistics”.

System Summary.

What is shown by this meteorological averaging system is that:

1. It is controlled by extremes, regardless of how brief they may actually be. Just 2 separate 1-minute readings out of 1,440 can set the mean daily temperature.

2. It is outdated, established only by historic shortcomings, and fails to utilise the benefits of technological improvements.

3. It produces fixed headline numbers for reification.

4. Most importantly for some – it could easily be “played” or manipulated.

Playing the system … to be continued.

via Tallbloke’s Talkshop

https://ift.tt/nKH4l5w

July 23, 2025 at 06:33AM

VCEA could cause Dominion’s average customer to pay over $40,000 for batteries

The Virginia Clean Economy Act (VCEA) mandates that Dominion Energy, the state’s big electric utility, rapidly shift its power generation to wind and solar.

via CFACT

https://ift.tt/B1j2ZNs

July 23, 2025 at 05:06AM

Another Day, Another Model of Future Climate Doom

From the Institute for Basic Science and the “we’re all gonna die” department comes Episode #2971 of model madness via press release:

Earth’s future climate at 9 km worldwide resolution

Global Warming does not affect our planet evenly. Some areas such as the Arctic region or high mountain peaks warm faster than the global average, whereas others, including large parts of the tropical oceans, show reduced temperature trends compared to the mean. The heterogeneity of future rainfall patterns is even more pronounced. To adapt to future climate change, policymakers and stakeholders need detailed regional climate information, often on scales much smaller than the typical resolution (~100-200 km) of climate models used in the reports of the Intergovernmental Panel on Climate Change (IPCC).

A team of scientists from the IBS Center for Climate Physics (ICCP), Pusan National University in South Korea and the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), Bremerhaven, Germany has achieved an important breakthrough in climate modeling, providing unprecedented insights into Earth’s future climate and its variability. Their research was published in the open access journal Earth System Dynamics.

Utilizing the AWI-CM3 earth system model, a novel iterative global modeling protocol, and two of South Korea’s fastest supercomputers (Aleph at the Institute for Basic Science and Guru at the Korea Meteorological Administration), the researchers have simulated climate change at scales of 9 km in the atmosphere and 4-25 km in the ocean. These extensive computer model simulations offer a more accurate representation of future climate conditions, enabling better planning for climate adaptation.

The AWI-CM3 high-resolution model accurately represents global climate, including small-scale phenomena, such as rainfall in mountainous regions, coastal and island climate processes, hurricanes and ocean turbulence (Fig. 1). By resolving more regional details and their interactions with the large-scale atmosphere and ocean circulations, the model demonstrates a superior performance compared to most lower-resolution climate models.

A snapshot of simulated climate conditions. Blue/red shading: sea surface temperature deviations from zonal mean; gray/white shading: low clouds; green/pink shading: 10m wind speed; blue/yellow shading in upper panels: hurricane precipitation. The figure illustrates the ubiquity of mesoscale climate phenomena, such as Tropical Instability Waves in the equatorial Atlantic and Pacific, hurricanes (making landfall in Hawaiʻi in this snapshot), ocean cold wakes generated by hurricanes, stratocumulus cloud decks and patchy day-time convection over the Amazon forest. Credit: Institute for Basic Science

A main product of the simulations is a set of detailed global maps of expected climate change (e.g., temperature, rainfall, winds, ocean currents, etc.) for an anticipated 1 °C of future global warming.

“It is important to keep in mind that Global Warming is spatially quite heterogeneous. For a 1 °C global temperature increase, the Siberian and Canadian Arctic will warm by about 2 °C, whereas the Arctic Ocean will experience warming of up to 5 °C. In high mountain regions, such as the Himalayas, the Andes and the Hindu Kush, the model simulates a 45-60% acceleration relative to the global mean”, says MOON Ja-Yeon from the ICCP, and lead author of the study. To ensure broad access to these high-resolution climate projections, the team has launched an interactive data platform, where users can explore future climate change on regional and global scales (Fig. 2). Normalized climate change data for a 1 °C Global Warming level can be downloaded and opened directly in the Google Earth application. These data can provide information on expected future changes in climate variables, such as windspeed and clouds, which are relevant for the future deployment of wind or solar farms, respectively.

Download data of climate change (e.g., temperature, wind speed, precipitation, etc.) per 1 °C of global warming from the 9 km AWI-CM3 Global Warming simulations. Go to: https://ift.tt/PUIxshW. Credit: Institute for Basic Science

“Our study also highlights the regional impacts of major modes of climate variability, such as the Madden Julian Oscillation, the North Atlantic Oscillation, and the El Niño-Southern Oscillation, as well as their response to greenhouse warming” says Prof. Thomas JUNG from the AWI and co-corresponding author of the study. According to the AWI-CM3 simulations, the amplitude of both the Madden Julian Oscillation and the alternating El Niño and La Niña events will increase in the future, which will lead to intensified rainfall impacts in affected regions. The simulations further indicate an increase in the frequency and intensity of extreme rainfall events (>50 mm/day) in areas such as eastern Asia, the Himalayas, the Andes, Amazonia, mountain-tops in Africa and the east coast of North America, with significant implications for flooding, erosion, and landslides.

“Most global climate models used in the assessment reports of the IPCC are too coarse to resolve small islands, such as those in the western tropical Pacific. These islands are already threatened by global sea level rise. Our new climate model simulations now provide new regional insights into what these regions can expect in terms of changes in ocean currents, temperatures, rainfall patterns and weather extremes. We hope that our dataset will be used extensively by planners, policy- and decision-makers and the public,” says Prof. Axel TIMMERMANN, Director of the ICCP and co-corresponding author of the study.

The study’s findings offer critical information for assessing climate risks and implementing adaptation measures on regional scales.

Journal: Earth System Dynamics



via Watts Up With That?

https://ift.tt/56Ptmwz

July 23, 2025 at 04:03AM

Met Office Still Opening Junk Sites

By Paul Homewood

More from Chris Morrison:


Evidence continues to mount that the UK Met Office is chasing ‘hottest evah’ temperature extremes by deliberately siting new measuring stations in locations likely to be affected by heat spikes and unnaturally warmed ambient air. In the last 10 years to the middle of 2024, 81.5% of new sites were junk Class 4 and 5 operations with potential internationally-recognised errors up to 2°C and 5°C respectively. Incredibly, eight of the 13 newly-opened sites over the last five years were of junk status.

Now comes news of a new site recently opened in Wales at Whitesands that in the words of citizen super sleuth Ray Sanders, “appears to be a deliberate attempt to produce artificially elevated readings both now and ever increasingly in the future”.

Read the full story here.

What may have started simply because of incompetence now has the unmistakeable whiff of corruption. Why else would the Met Office still be deliberately and knowingly opening up junk weather stations? Eight out of the thirteen sites opened in the last five years, for instance.

If the Met Office wants to keep the few remaining shreds of credibility it still holds, it must immediately remove from its network all junk and near junk sites, which are totally unsuitable for climatological purposes. That must include Class 3 sites as well.

via NOT A LOT OF PEOPLE KNOW THAT

https://ift.tt/SJn3eFP

July 23, 2025 at 03:56AM