Month: May 2023

Bureau Capitulates: But Overseas Model Unlikely to Solve All Temperature Measurement Issues

It has taken only ten years: that is how long a few of us have been detailing major problems with how the Australian Bureau of Meteorology measures daily temperatures. Now, I’m informed, the Bureau is ditching the current system and looking to adopt an overseas model that it claims will be more reliable.

There will be no media release.

There was no media release when the Bureau ditched its rainfall forecasting system (POAMA), once described as state of the art, and quietly adopted ACCESS-S1, based on the UK Met Office’s GloSea5-GC2. (As though the British are any better at accurate rainfall and snowfall forecasts.)

Until yesterday the Bureau had claimed that one-second spot temperature readings from its custom-designed resistance probes did not need to be numerically averaged, something overseas bureaus routinely do in an attempt to achieve consistency with measurements from the slower-responding traditional mercury thermometers.

Consistency across long temperature series is, of course, critical to accurately assessing climate variability and change.

The Australian Bureau has long claimed numerical averaging is not necessary because its ‘thick’ probe design exactly mimics the response of a mercury thermometer.

Then this design was phased out and replaced with the ‘slimline’. Still no inter-comparison studies.

I welcome the switch to ‘the overseas model’ if this means that the Bureau will begin numerical averaging of spot readings from its resistance probes in accordance with World Meteorological Organisation recommendations.
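
To make concrete what is at stake, here is a minimal sketch, assuming (as I read the WMO guidance) that a one-minute numerical average of one-second samples is the alternative to reporting a single instantaneous spot value; the numbers are invented for illustration.

```python
from statistics import mean

def spot_reading(samples_1s):
    """What a logger reports if it simply takes the latest one-second sample."""
    return samples_1s[-1]

def averaged_reading(samples_1s):
    """A one-minute numerical average of the most recent 60 one-second samples."""
    return mean(samples_1s[-60:])

# A brief gust of warm air in the final seconds inflates the spot value
# but barely moves the one-minute average.
minute = [20.0] * 57 + [21.5, 21.8, 21.2]
print(spot_reading(minute))      # 21.2
print(averaged_reading(minute))  # 20.075
```

The point of the average is that it behaves more like a sluggish mercury thermometer: a transient fluctuation cannot, on its own, set the day’s maximum.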

But the problem of reliable temperature measurements doesn’t begin or end with numerical averaging.

The Bureau, and the Met Office in the UK, have been tinkering with how they measure temperatures since the transition from mercury thermometers to resistance probes began in the 1990s: not only with how they average the readings (or don’t), but also with probe design and power supply.

It is important to understand that resistance probes hooked up to data loggers measure temperature as a change in electrical resistance across a piece of platinum. And, this is the important bit: the voltage delivered to the probe is critical for accurate temperature measurement, not just in Australia but around the world. And there are no standards.
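
To make the point concrete, here is a minimal sketch, assuming a standard Pt100 element and a logger that infers resistance from a measured voltage and an assumed excitation current; the circuit and values are illustrative, not the Bureau’s actual design. A supply error of just one percent shows up as nearly three degrees Celsius of spurious warming.

```python
# Illustrative only: how an error in the excitation current assumed by a data
# logger flows straight into the reported temperature. Uses the simplified
# linear Pt100 relation R(T) = R0 * (1 + ALPHA * T).

R0 = 100.0       # Pt100 resistance at 0 degrees Celsius, in ohms
ALPHA = 0.00385  # standard Pt100 temperature coefficient, per degree Celsius

def resistance_at(temp_c):
    """True probe resistance at a given air temperature."""
    return R0 * (1.0 + ALPHA * temp_c)

def inferred_temp(voltage, assumed_current):
    """Temperature a logger reports from a voltage reading, given the
    excitation current it assumes is flowing through the probe."""
    apparent_resistance = voltage / assumed_current
    return (apparent_resistance / R0 - 1.0) / ALPHA

assumed_current = 1.0e-3  # the logger assumes 1 mA of excitation

# With a stable supply, a 25 C day is reported correctly.
print(inferred_temp(resistance_at(25.0) * assumed_current, assumed_current))  # 25.0

# If the supply drifts so the actual current runs 1% high, the same probe at
# the same 25 C produces a larger voltage, misread as roughly 27.8 C.
print(inferred_temp(resistance_at(25.0) * assumed_current * 1.01, assumed_current))
```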

When using a traditional mercury thermometer, temperature is read from a scale along a glass tube that registers the thermal expansion of the liquid mercury. The mercury thermometer was once the world standard.

The new, automated and potentially more precise method of measuring temperature via platinum resistance is reliable in controlled environments; satellites measuring temperatures at different depths within the atmosphere use such resistance probes. But it becomes much more complicated when trying to measure temperatures on Earth, especially at busy places like airports, which have become a primary site for the automated electronic weather systems, using platinum resistance probes, from which global average temperatures are now derived.

At airports, the electrical system relied upon to measure temperatures very precisely must be insulated from other electrical systems, including radar and even the radio chatter between a pilot wanting to land a jumbo jet and the control tower.

The electronics now used to measure climate change are not only susceptible to electrical interference at these airports, but also to changes in voltage that can be caused by something as simple as turning the runway lights on and off at dusk and dawn.

To know how reliable the new system is, we need the parallel data not just for Australia but for overseas airports, including Heathrow and Cochin in India, the world’s first airport fully powered by solar energy.

To know that warming globally has not, at least in part, been caused by the move to resistance probes, we need to see the inter-comparison data: the equivalent temperature measurements from mercury thermometers at the same place and on the same day.

I’m reliably informed by a past Bureau employee that upgrading power supplies in 2012 caused a 0.3-to-0.5-degree Celsius increase across about 30 percent of the Australian network. Averaged over the whole network, a step of that size at 30 percent of stations would add roughly 0.1 to 0.15 degrees Celsius to the national mean. (That would get us some way to the 1.5-degree Celsius tipping point, even if we closed down every coal-fired power station.)

Perth-based researcher Chris Gillham documented this uptick in Australian temperatures in correspondence to me last October, showing it as an abrupt change in the difference between the Bureau’s monthly mean temperature as reported from ACORN-SAT and the satellite data for Australia as measured by the University of Alabama in Huntsville.

‘Australia UAH’ is the Australian component of the University of Alabama satellite monitoring. ACORN 2.1 is the official homogenised/remodelled Australian temperature series that the Bureau uses for reporting climate change. More information on the extent to which electronic temperature measuring systems can cause discontinuities in temperature series can be found in an important report by Chris Gillham entitled ‘Have automatic weather stations corrupted Australia’s temperature record’.

The step-up in warming in the official data for the entire Australian continent is also noted in peer-reviewed publications by climate scientists including Sophie Lewis and David Karoly. I have written to Sophie Lewis about the problems in relying on Bureau data. But instead of attributing the change to equipment and voltage, Lewis, Karoly and other climate scientists ascribe it to anthropogenic greenhouse warming.

Because university climate scientists ignore my correspondence, and rely entirely on advice from the Bureau’s current management, they could not know otherwise. The Bureau’s current management refuses to report this change, documented by its technicians and communicated to me unofficially by retired former managers.

A relevant question: why did the Bureau’s Chief Executive, Andrew Johnson, not report the 0.3-to-0.5-degree Celsius increase across about 30 percent of the Australian network (caused entirely by a change to the power supply, not air temperatures) in the 2013-14 Bureau of Meteorology Annual Report to Federal Parliament?

In that same report, Johnson does comment on the infrastructure upgrades, but not on the artificial warming they introduced into the official temperature data.

Meanwhile, the Bureau’s management, including Johnson, continues to stress the need for all Australians to work towards keeping temperatures below the 1.5-degree Celsius tipping point. Just today we are told to expect another hike in the price of electricity because of the need to transition to renewables, including solar.

This headline is from today’s Courier Mail. Other newspapers report: ‘The offers, which cover New South Wales, South Australia and south-east Queensland, indicate prices will rise between 19.6% and 24.9% for residents, similar to the draft levels announced in March. Victoria also announced a 25% rise to its default offer.’

The Bureau continues to support a transition to renewable energy without explaining its potential effect, even on the reliability of the Bureau’s own temperature measurements, just as it has never explained the effect of the transition to resistance probes more generally. (I’ve characterised the Ayers and Warne papers as fake in a six-part ‘jokers’ series, republished by WattsUpWithThat.)

An overseas colleague has explained how something as simple as applying a 100 Hz frequency to a power circuit to extend the life of a battery (necessary with solar systems) can cause maximum temperatures to drift up on sunny days. To be clear: as the voltage increased, the recorded temperature increased over and above any actual change in air temperature!

Go to the NASA page about temperature measurement and you will see a picture of someone atop a mountain in Montana, and a solar panel. That solar panel will be charging a battery that not only provides the voltage used to measure electrical resistance across the platinum wire, but also powers the periodic upload of that same temperature data to a satellite.

Weather stations are set up throughout Glacier National Park in Montana to monitor and collect weather data. These stations must be visited periodically for maintenance and to add or remove new research devices. Credit: GlacierNPS, CC BY 2.0, via Wikimedia Commons.

Since the transition to the resistance probes that use voltage to measure temperature, problems at remote locations, from mountains to lighthouses, have included ageing batteries (solar powered, of course) unable to provide sufficient current at critical times.

I’ve been shown data from one such remote location, where minimum temperatures reliably drop 2 degrees Celsius on the hour, at the same time every hour through the night, as the battery is drained with each satellite upload of temperature data.
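
For anyone handed similar data, a check like the following is straightforward; this is a minimal sketch with a hypothetical file layout and an invented threshold, not the analysis I was shown.

```python
# Screen a one-minute temperature log for on-the-hour dips by comparing each
# on-the-hour reading with the readings five minutes either side of it.

import csv
from datetime import datetime, timedelta

# Each row of the hypothetical CSV: "2023-05-01T02:00, 12.4"
readings = {}
with open("remote_site_1min.csv") as f:
    for time_str, temp_str in csv.reader(f):
        readings[datetime.fromisoformat(time_str.strip())] = float(temp_str)

for t, temp in sorted(readings.items()):
    if t.minute == 0:  # on the hour, when a satellite upload drains the battery
        before = readings.get(t - timedelta(minutes=5))
        after = readings.get(t + timedelta(minutes=5))
        if before is not None and after is not None:
            dip = temp - (before + after) / 2
            if dip <= -1.0:  # at least 1 C colder than its neighbours
                print(t, f"{dip:+.1f} C dip on the hour")
```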

This is the same temperature data that is being used in Australia, and around the world, to justify extreme economic and social intervention in the name of stopping climate change.

IN SUMMARY

In the 1990s, not just in Australia, but around the world, there was a fundamental change in the equipment and methods used to measure temperatures.

This created a discontinuity in the long temperature series that begin around 1880, and that are used by the IPCC to measure climate variability and change.

Neither the IPCC, nor NASA, nor the UK Met Office has documented the effect of this change.

I’ve been asking the Australian Bureau, which provides data into the global databases, how it knows that temperature measurements from the resistance probes at places like Cape Otway lighthouse are consistent with readings from mercury thermometers. (I’ve written extensively about how temperatures are measured at this lighthouse, including in part 3 of my 8-part series about hyping daily maximum temperatures.)

Now I ask: how can the Bureau know that NASA and the UK Met Office are reliably measuring temperatures if it has not seen the US and UK parallel temperature datasets?

The parallel data are the recordings from mercury thermometers measuring at the same location, and at the same time, as the resistance probes. These data will give some indication of the extent of the many discontinuities created in the record by the changeover to probes. I have estimated that the Bureau is holding parallel data for approximately 38 Australian locations, with on average 15 years of data each.

When John Abbot first lodged a Freedom of Information request for some of this data for Brisbane Airport back in 2019, he was told that the parallel data did not exist.

Abbot took the issue to the Australian Information Commissioner, who sided with the Bureau, falsely confirming that the data did not exist.

It was only after an appearance at the Administrative Appeals Tribunal in Brisbane on 3rd February, where I attended as an expert witness, and the drawn-out mediation process that followed, that three years of Brisbane Airport parallel data were finally made available.

As Graham Lloyd explained on the front page of The Weekend Australian thereafter, my analysis of these data shows that the resistance probes at Brisbane Airport measure temperatures quite different from those of the mercury thermometer most of the time. The Bureau has been able neither to confirm nor deny the statistical significance of the difference. But it does not dispute the actual numbers.
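
For readers wondering what testing that significance might look like, here is a minimal sketch, not the Bureau’s analysis: a paired t-test on same-day readings from the two instruments. The file name and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical layout, one row per day: date, probe_max_c, mercury_max_c
parallel = pd.read_csv("brisbane_airport_parallel.csv")

diffs = parallel["probe_max_c"] - parallel["mercury_max_c"]
t_stat, p_value = stats.ttest_rel(parallel["probe_max_c"],
                                  parallel["mercury_max_c"])

print(f"mean probe-minus-mercury difference: {diffs.mean():+.2f} C")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would indicate a systematic difference between the two
# instruments; the mean difference indicates its size and direction.
```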

John Abbot recently lodged another FOI request for more parallel data for Brisbane Airport. This time around, the Bureau has acknowledged the existence of the data, and even that some of the ‘field books’ have already been scanned and so are available in electronic form. But the Bureau claims it will again only be able to release another three years of data for this one site, never mind the 15 years of parallel data that exist for Brisbane Airport and another 37 locations that vary geographically and electrically. We need this comparative data, including to assess the reliability of the current global warming forecasts.

I’ve been reliably informed that the Bureau is intent on drawing out the provision of this Australian parallel data to Abbot and me, while changing the ‘model’ it uses to measure temperatures, as though the overseas systems were reliable.

So I ask, again: where are these numbers for overseas locations, including the parallel data for the overseas mountains and airports, not to mention lighthouses?

It is critical for everyone to be able to see this data, especially if the Australian Bureau is to adopt an overseas model for temperature measurement on the basis that it is more reliable.

*****

The feature image shows me at the Goulburn Airport weather station in late July 2017. This weather station was shown to have had a limit set, for a period of 20 years, on how cold temperatures could be recorded, until Lance Pidgeon and I blew the whistle on the fiasco.

via Jennifer Marohasy

https://ift.tt/a05KYwt

May 25, 2023 at 08:03PM

Professors: The Entire Fossil Fuel Industry Must Be ‘Euthanized’ To Save Humanity From Warmth

Two University of Michigan professors insist we “must reduce the emission of greenhouse gases to zero” to stabilize the planet’s temperature. But because 80% of our energy use still comes from carbon-based sources today, “ending it will not be easy.” The death of the entire fossil fuel industry must be imposed, euthanasia-style.

It has now reached the point that academic elites are no longer concealing their real inclinations and intentions in massaged semantics or subtleties.

Two US business professors argue that the looming climate catastrophe (which they believe has been caused solely by human greenhouse gas emissions) necessitates that “the shape and structure of modern capitalism will have to be changed.”

No more roads. No more plastic, steel, or electronic products. No more air travel. All the industries that use petroleum products of any kind, no matter how essential, must end this practice, effective immediately. Fossil fuel use must be 100% eliminated.

The cost to get to zero greenhouse gas emissions? Estimates range from $100 to $150 trillion over the next 30 years.

And putting a price on carbon use doesn’t nearly go far enough. It’s not possible to get to zero emissions just by making fossil fuel use more expensive. The entire fossil fuel industry – the producers as well as the recipients – must undergo, as the authors put it, “compassionate destruction.”

If there is any resistance to the total destruction of fossil fuel use, then euthanasia, “the act or practice of killing or permitting the death of hopelessly sick or injured individuals,” must be put into practice. Imposed. Forced.

“A future in which we address climate change may require that the entire sector be euthanized, imposing death ahead of its imminent arrival.”

It is not our job to question. The situation is so real, so dire, that our only job now is to “come to terms with the extreme decision that has to be made for the patient.”

Image Source: Hoffman and Ely, 2023

via NoTricksZone

https://ift.tt/ca1xSlK

May 25, 2023 at 04:37PM

Green Schemes Broken by Reality

James E. Hanley provides a roundup of failed and expensive Green ventures in his Real Clear Policy article Green Projects Hit Iron Wall. Excerpts in italics with my bolds and added images.

Developers looking to build thousands of wind turbines off the Mid-Atlantic and New England coast are coming up against a force even more relentless than the Atlantic winds: the Iron Law of Megaprojects, offering a warning of the trouble ahead for green-energy projects.

The Iron Law, coined by Oxford Professor Bent Flyvbjerg, says that “megaprojects” — which cost billions of dollars, take years to complete, and are socially transformative — reliably come in over budget, over time, over and over.

From Boston’s Big Dig to California’s high-speed rail to New York’s East Side Access rail project, 12 years overdue and 300% over budget, big boondoggles routinely demonstrate the validity of the rule.

Offshore wind projects are not immune to the Iron Law, regularly experiencing vast cost overruns before a single watt is generated.

The New York state government, looking to replace oil- and gas-fired powerplants with hundreds of wind towers off Long Island, set out in 2019 to create an offshore wind supply chain from scratch, beginning with a massive state-funded turbine fabrication facility about 100 miles north of New York City on the Hudson River.

Port of Albany factory’s fate at stake as leaders race for a solution: the $700 million-plus project is expected to create work for generations, but hopes are dwindling that more funding will become available.

Ground still hasn’t even been broken, but the budget certainly has: The price of that Port of Albany facility has already doubled from $350 million to $700 million. An additional $100 million may be needed for equipment costs, raising the final price tag to $800 million.

It’s been billed the future hub for wind power infrastructure. So far, though, the only thing that continues to get billed over and over in recent years is the Connecticut taxpayer.

A similar situation is playing out in New London, Connecticut, where a state-funded pier facility being built to support that state’s offshore wind buildout has more than doubled in price from an original estimate of $95 million to $250 million.

Commonwealth Wind declares that the largest offshore wind farm in the state’s pipeline “cannot be financed and built” under existing contracts.

And in Massachusetts, developer Commonwealth Wind has asked the state to scrap its power purchase guarantees and rebid the project, arguing that inflation and supply chain problems mean the project is not financially viable under its current contracts.

Big projects tend to exceed their cost projections for many reasons. One is the unanticipated, and sometimes unprecedented, complexity of these projects. Further uncertainties and costs arise from the challenge of navigating the red tape of the modern regulatory state. In addition, there is the risk of inflation for projects that take years, sometimes decades, to develop.

Underlying all these is often a failure to spend enough time on careful planning that treats reality as a fundamental constraint.

But sometimes project sponsors may simply worry that accurate cost projections could scare away public support at the outset, and choose to employ what Prof. Flyvbjerg politely calls “strategic misrepresentation.”

As former San Francisco Mayor Willie Brown said, “If people knew the real cost from the start, nothing would ever be approved. . . . Start digging a hole and make it so big, there’s no alternative to coming up with the money to fill it in.”

If that sounds too cynical, note that the current Chair of the Connecticut Port Authority has admitted that when officials first proposed the pier facility, they already knew it would cost more than they were claiming.

Ironically, the New York and Connecticut projects aren’t even big enough to be considered megaprojects, and yet even they have run into the Iron Law of being over budget and behind schedule. The challenges won’t diminish with bigger and more ambitious green energy projects.

In New York, the state’s huge Climate Leadership and Community Protection Act — of which the Port of Albany project is the first substantial investment — is projected to cost between $270 and $290 billion. At that price it is a gigaproject composed of numerous individual megaprojects.

The benefits, mostly in the form of greenhouse gas reductions, are supposed to be up to $415 billion. But if the overall cost of the policy climbs by merely 55 percent, which is in the normal range for megaprojects (and much less than the Port of Albany cost overrun), the costs will exceed the benefits, creating a net loss for New Yorkers.

If costs balloon to twice the initial estimates, which is not uncommon, the state stands to spend more than a hundred billion dollars more than it gains in benefits. That would be a loss of over $30,000 per New York household by 2050.
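
The arithmetic behind those thresholds is easy to check; here is a back-of-envelope sketch using only the figures quoted above.

```python
# Back-of-envelope check of the CLCPA figures quoted in the article,
# all in billions of dollars.
low_cost, high_cost = 270.0, 290.0  # projected cost range
benefit = 415.0                     # claimed upper-bound benefit

# A 55% overrun pushes even the low-end cost past the claimed benefit.
print(low_cost * 1.55)          # 418.5 > 415

# Doubling the cost estimates, "not uncommon" for megaprojects,
# leaves a net loss of well over a hundred billion dollars.
print(low_cost * 2 - benefit)   # 125.0
print(high_cost * 2 - benefit)  # 165.0
```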

And that’s assuming the benefits are as good as promised. It gets even worse if, as is common, the benefits have been overstated.

The tale of megaprojects is a cautionary one for the whole country as we attempt to transition away from fossil fuels. Cost estimates for a nationwide transition span from $4.7 trillion to over $60 trillion – almost three times U.S. GDP. Such uncertainty should give us pause for thought before jumping wildly into the financial unknown.

If we’re not careful, we may be digging Willie Brown-style holes, and politically and financially we may find ourselves in too deep to ever get ourselves out.

via Science Matters

https://ift.tt/IlpD7hq

May 25, 2023 at 04:30PM

Open letter to Dr Hoesung Lee, Chair of the IPCC

The following letter was sent to Dr. Hoesung Lee, Chair of the IPCC, earlier today (May 25th, 2023) by Dr. A.J. (Guus) Berkhout, President of Clintel, Emeritus Professor of Geophysics, and a member of the Royal Netherlands Academy of Arts and Sciences.

Professor Dr. Hoesung Lee, Chair of the IPCC,
c/o World Meteorological Organization
7bis Avenue de la Paix C.P. 2300
CH-1211 Geneva 2, Switzerland.

The Hague, May 25, 2023

Dear Dr. Hoesung Lee,

With the recently published Synthesis Report, the IPCC finished its sixth assessment cycle, consisting of seven reports in total. An international team of scientists from the 1500-strong Climate Intelligence Foundation (Clintel) has assessed several claims from the Working Group 1 (The Physical Science Basis) and Working Group 2 (Impacts, Adaptation and Vulnerability) reports. Results have been summarized in Clintel’s report The Frozen Climate Views of the IPCC:

Thorough analysis by Clintel shows serious errors in latest IPCC report – Clintel

As background information, I wish to remind you of the 2010 InterAcademies Council (IAC) review of IPCC procedures, which was commissioned in the aftermath of disastrous publicity regarding errors in earlier IPCC reports and revelations of efforts by IPCC Lead Authors to stifle debate. The IAC concluded in part (emphasis added by me):

Given that the IAC report was prompted in part by the revelation of errors in the last assessment, the committee examined IPCC’s review process as well. It concluded that the process is thorough, but stronger enforcement of existing IPCC review procedures could minimize the number of errors. To that end, IPCC should encourage review editors to fully exercise their authority to ensure that all review comments are adequately considered. Review editors should also ensure that genuine controversies are reflected in the report and be satisfied that due consideration was given to properly documented alternative views. Lead authors should explicitly document that the full range of thoughtful scientific views has been considered.[1]

2010 InterAcademies Council (IAC) review of IPCC procedures

We regrettably conclude that the IPCC has failed to follow this advice and the AR6 exhibits the same flaws as before, namely biased selection of evidence, failure to reflect genuine controversies and failure to give due consideration to properly documented alternative views.

To give one example, the IPCC ignored crucial peer-reviewed literature, showing that normalised disaster losses have decreased since 1990 and that human mortality due to extreme weather has decreased by more than 95% since 1920. The IPCC’s authors asserted the opposite conclusions based on cherry-picked evidence, claiming increases in damage and mortality due to anthropogenic climate change, and the review process failed to correct this inaccuracy.

Clintel’s 180-page report, The Frozen Climate Views of the IPCC, is the first large-scale international ‘assessment’ of the IPCC’s Sixth Assessment Report. In 13 chapters, the Clintel report shows that the IPCC makes numerous serious scientific errors that overall reflect a bias in favour of ‘bad news’ and against ‘good news’. This was the case throughout the report, and especially in the preparation of the Summary for Policy Makers. The good news about disaster losses and climate-related deaths was left out of the Summary for Policy Makers altogether, for instance.

Additionally, where the IPCC AR6 has taken account of evidence that points away from a dismal, worst-case outlook, such as recognition that the RCP8.5, SSP5-8.5 and SSP3-7.0 emission scenarios are low likelihood and that models systematically overstate warming in the tropical troposphere, these findings are buried deep in the chapters and are not emphasized for the media or policy makers. Even worse, despite having concluded in its discussion of emission scenarios that the extreme ones are low likelihood, they are nevertheless given maximum prominence in other parts of the report for the purpose of projecting climate impacts.

Finally, we note that the IPCC has remained silent while the UN Secretary-General and other high-ranking officials repeatedly misrepresented the findings of the IPCC. For example, Secretary-General Guterres said of the Working Group 1 report:[2]

“Today’s IPCC Working Group 1 report is a code red for humanity. The alarm bells are deafening, and the evidence is irrefutable: greenhouse‑gas emissions from fossil-fuel burning and deforestation are choking our planet and putting billions of people at immediate risk.”

Secretary-General Guterres

The AR6 Working Group 1 report did not say these things, yet the IPCC never corrected him nor challenged any of the similarly inaccurate media coverage that distorts the contents of your report.

With all respect Dr. Lee, seriously misleading the world on such an important subject and on such a large scale is unacceptable for a UN organization that claims to be scientific. The errors and biases that Clintel has found in the AR6 report are worse than those that led to the 2010 IAC Review, indicating ongoing failure of the IPCC to live up to its mission.

The Clintel network therefore requests the following:

  • That the IPCC commissions a team with representation from Clintel and other independent persons not involved in IPCC Leadership to review whether the IPCC has fully implemented and followed the reforms recommended by the 2010 IAC Review, and whether more reforms are needed.
  • That the IPCC reviews prominent statements by major world leaders and media outlets paraphrasing the contents of the AR6 and correct the record where those statements are misleading or inaccurate.
  • That the IPCC meets with representatives from Clintel to receive input on the key deficiencies highlighted in our report that require a formal correction.

Looking forward to your response,

Yours sincerely,

Dr. A.J. (Guus) Berkhout, President of Clintel, Emeritus Professor of Geophysics, Member of the Royal Netherlands Academy of Arts and Sciences

P.S. The main objective of the Climate Intelligence Foundation (Clintel) is to generate knowledge and understanding of the causes and effects of climate change, as well as of the effects of climate policy. Clintel published the World Climate Declaration, which has now been signed by more than 1500 scientists and experts worldwide, thus rivalling in size and credentials the IPCC’s Working Group authorship lists. See www.clintel.org.

The letter is lightly edited to get it into blog post format.

  1. https://www.interacademies.org/news/interacademy-council-report-recommends-fundamental-reform-ipcc-management-structure

  2. https://press.un.org/en/2021/sgsm20847.doc.htm

via Watts Up With That?

https://ift.tt/V5i9rSM

May 25, 2023 at 04:29PM