Month: January 2022

Sea Level Scare Machine 2021 Update


Such beach decorations exhibit the fervent belief of activists that sea levels are rising fast and will flood the coastlines if we don’t stop burning fossil fuels.  As we will see below there is a concerted effort to promote this notion empowered with slick imaging tools to frighten the gullible.  Of course there are frequent media releases sounding the alarms.  Recently for example:

From the Guardian: Up to 410 million people at risk from sea level rises – study. Excerpts in italics with my bolds.

The paper, published in Nature Communications, finds that currently 267 million people worldwide live on land less than 2 metres above sea level. Using a remote sensing method called Lidar, which pulses laser light across coastal areas to measure elevation on the Earth’s surface, the researchers predicted that by 2100, with a 1 metre sea level rise and zero population growth, that number could increase to 410 million people.

The climate emergency has caused sea levels to rise and more frequent and severe storms to occur, both of which increase flood risks in coastal environments.

Last year, a survey published by Climate and Atmospheric Science, which aggregated the views of 106 specialists, suggested coastal cities should prepare for rising sea levels that could reach as high as 5 metres by 2300, which could engulf areas home to hundreds of millions of people.

The rest of this post provides a tour of seven US cities demonstrating how the sea level scare machine promotes fear among people living or invested in coastal properties.  In each case there are warnings published in legacy print and TV media, visual simulations powered by computers and desktop publishing, and a comparison of imaginary vs. observed sea level trends.

Prime US Cities on the “Endangered” List

Newport, R.I.

Examples of Media Warnings

Bangor Daily News:  In Maine’s ‘City of Ships,’ climate change’s coastal threat is already here

Bath, the 8,500-resident “City of Ships,” is among the places in Maine facing the greatest risks from increased coastal flooding because so much of it is low-lying. The rising sea level in Bath threatens businesses along Commercial and Washington streets and other parts of the downtown, according to an analysis by Climate Central, a nonprofit science and journalism organization.

Water levels reached their highest in the city during a record-breaking storm in 1978 at a little more than 4 feet over pre-2000 average high tides, and Climate Central’s sea level team found there’s a 1-in-4 chance of a 5-foot flood within 30 years. That level could submerge homes and three miles of road, cutting off communities that live on peninsulas, and inundate sites that manage wastewater and hazardous waste along with several museums.

UConn Today:  Should We Stay or Should We Go? Shoreline Homes and Rising Sea Levels in Connecticut

As global temperatures rise, so does the sea level. Experts predict it could rise as much as 20 inches by 2050, putting coastal communities, including those in Connecticut, in jeopardy.

One possible solution is a retreat from the shoreline, in which coastal homes are removed to take them out of imminent danger. This solution comes with many complications, including reductions in tax revenue for towns and potentially diminished real estate values for surrounding properties. Additionally, it can be difficult to get people to volunteer to relocate their homes.

Computer Simulations of the Future

Newport Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

Boston, Mass.

Example of Media Warnings

From WBUR Radio Boston:  Rising Sea Levels Threaten MBTA’s Blue Line

Could it be the end of the Blue Line as we know it? The Blue Line, which features a mile-long tunnel that travels underwater and connects the North Shore with Boston’s downtown, is at risk as sea levels rise along Boston’s coast. To understand the threat sea-level rise poses to the Blue Line, and what that means for the rest of the city, we’re joined by WBUR reporter Simón Ríos and Julie Wormser, Deputy Director at the Mystic River Watershed Association.

As sea levels continue to rise, the Blue Line and the whole MBTA system face an existential threat. The MBTA is also facing a serious financial crunch, still reeling from the pandemic, as we attempt to fully reopen the city and the region. Joining us to discuss is MBTA General Manager Steve Poftak.

Computer Simulations of the Future

Boston Obs Imaged2

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

New York City

Example of Media Warnings

From Quartz: Sea level rise will flood the neighborhood around the UN building with two degrees warming

Right now, of every US city, New York City has the highest population living inside a floodplain. By 2100, seas could rise around the city by as much as six feet. Extreme rainfall is also predicted to rise, with roughly 1½ times more major precipitation events per year by the 2080s, according to a 2015 report by a group of scientists known as the New York City Panel on Climate Change.

But a two-degree warming scenario, which the world is on track to hit, could lock in dramatic sea level rise—possibly as much as 15 feet.

Computer Simulations of the Future

NYC Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

Philadelphia, PA.

Example of Media Warnings

From NBC Philadelphia:  Climate Change Studies Show Philly Underwater

NBC10 is looking at data and reading studies on climate change to showcase the impact. There are studies that show if the sea levels continue to rise at this rate, parts of Amtrak and Philadelphia International Airport could be underwater in 100 years.

Computer Simulations of the Future

Philly Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

Miami, Florida

Examples of Media Warnings

From WLRN Miami: Miles Of Florida Roads Face ‘Major Problem’ From Sea Rise. Is State Moving Fast Enough?

One 2018 Department of Transportation study has already found that a two-foot rise, expected by mid-century, would imperil a little more than five percent — 250-plus miles — of the state’s most high-traffic highways. That may not sound like a lot, but protecting those highways alone could easily cost several billion dollars. A Cat 5 hurricane could be far worse, with a fifth of the system vulnerable to flooding. The impact to seaports, airports and railroads — likely to also be significant and expensive — is only now under analysis.

From Washington Post:  Before condo collapse, rising seas long pressured Miami coastal properties

Investigators are just beginning to try to unravel what caused the Champlain Towers South to collapse into a heap of rubble, leaving at least 159 people missing as of Friday. Experts on sea-level rise and climate change caution that it is too soon to speculate whether rising seas helped destabilize the oceanfront structure. The 40-year-old building was relatively new compared with others on its stretch of beach in the town of Surfside.

But it is already clear that South Florida has been on the front lines of sea-level rise and that the effects of climate change on the infrastructure of the region — from septic systems to aquifers to shoreline erosion — will be a management problem for years.

Computer Simulations of the Future

Florida Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

Houston, Texas

Example of Media Warnings

From Undark:  A $26-Billion Plan to Save the Houston Area From Rising Seas

As the sea rises, the land is also sinking: In the last century, the Texas coast sank about 2 feet into the sea, partly due to excessive groundwater pumping. Computer models now suggest that climate change will further lift sea levels somewhere between 1 and 6 feet over the next 50 years. Meanwhile, the Texas coastal population is projected to climb from 7 to 9 million people by 2050.

Protecting Galveston Bay is no simple task. The bay is sheltered from the open ocean by two low, sandy strips of land — Galveston Island and Bolivar Peninsula — separated by the narrow passage of Bolivar Roads. When a sufficiently big storm approaches, water begins to rush through that gap and over the island and peninsula, surging into the bay.

Computer Simulations of the Future

Galv Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

San Francisco, Cal.

Example of Media Warnings

From San Francisco Chronicle:  Special Report: SF Bay Sea Level Rise–Hayward

Sea level rise is fueled by higher global temperatures that trigger two forces: Warmer water expands oceans while the increased temperatures hasten the melting of glaciers on Antarctica and Greenland and add yet more water to the oceans.

The California Ocean Protection Council, a branch of state government, forecasts a 1-in-7 chance that the average daily tides in the bay will rise 2 or more feet by 2070. This would cause portions of the marshes and bay trail in Hayward to be underwater during high tides. Add another 2 feet, on the higher end of the council’s projections for 2100, and they’d be permanently submerged. Highway 92 would flood during major storms. So would the streets leading into the power plant.

From San Francisco Chronicle Special Report: SF Bay Sea Level Rise–Mission Creek

Along San Francisco’s Mission Creek, sea level rise unsettles the waters.  Each section of this narrow channel must be tailored differently to meet an uncertain future. Do nothing, and the combination of heavy storms with less than a foot of sea level rise could send Mission Creek spilling over its banks in a half-dozen places, putting nearby housing in peril and closing the two bridges that cross the channel.

Whatever the response, we won’t know for decades if the city’s efforts can keep pace with the impact of global climatic forces that no local government can control.

Though Mission Creek is unique, the larger dilemma is one that affects all nine Bay Area counties.

Computer Simulations of the Future

SF Obs Imaged

Imaginary vs. Observed Sea Level Trends (2021 Update)

 

Summary: This is a relentless, high-tech communications machine raising all kinds of scary future possibilities, based upon climate model projections and the unfounded theory of CO2-driven global warming/climate change.  The graphs above are centered on the year 2000, so the 21st-century sea level rise is projected from that year forward.  In addition, we now have tidal-gauge observations for the first 21 years, one fifth of the century.  The gauge chosen in each city is the one with the longest continuous service record, and wherever possible the locations shown in the simulations are close to that gauge.  For example, NYC’s best gauge is at the Battery, and Fulton St. is also near Manhattan’s southern tip.

Already the imaginary rises are diverging greatly from observations, yet the chorus of alarm goes on.  In fact, the added rise to 2100 from tidal gauges ranges from 6 to 9.5 inches, except for Galveston, projecting 20.6 inches. Meanwhile, the models imagine rises of 69 to 108 inches. Clearly coastal settlements must adapt to evolving conditions, but they also need reasonable rather than fearful forecasts for planning purposes.
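
The gauge-based century figures quoted here come from simple linear extrapolation of a tide-gauge trend. A minimal sketch of that arithmetic, using illustrative trend values rather than the exact NOAA gauge figures behind the post:

```python
MM_PER_INCH = 25.4

def century_rise_inches(trend_mm_per_yr: float, years: int = 100) -> float:
    """Project a constant tide-gauge trend over a century, in inches."""
    return trend_mm_per_yr * years / MM_PER_INCH

# A typical East Coast gauge trend of ~2 mm/yr gives about 8 inches per
# century, within the 6 to 9.5 inch range above, while a subsidence-
# inflated trend of ~5.2 mm/yr (illustrative of Galveston) gives ~20.5 in.
print(round(century_rise_inches(2.0), 1))   # 7.9
print(round(century_rise_inches(5.2), 1))   # 20.5
```

The contrast with the simulations is then just the same arithmetic run with model rates an order of magnitude higher.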

Footnote:  The problem of urban flooding is discussed in some depth at a previous post Urban Flooding: The Philadelphia Story

Background on the current sea level campaign is at USCS Warnings of Coastal Floodings

And as always, an historical perspective is important:

post-glacial_sea_level

 

via Science Matters

https://ift.tt/3r4T9sL

January 27, 2022 at 01:07PM

Good News! Many Vax Mandates end in England, Ireland, Bolivia, Czech Republic, Denmark

It’s a miracle. Common sense.  For the first time in two years government rules are shrinking.

A lot of this is thanks to Omicron, the gift from Africa.

In terms of infections the UK is past the peak, but most of Europe is pyroclastic. France just recorded half a million new Covid cases in a single day, with an awesome test positivity of 31%. Something like 1 – 3% of the entire French nation caught Covid yesterday.

Omicron in Europe

All the past awful waves shrink before Omicron.   | Source OWID

Despite the bonfire of cases, the deaths are lower than any wave:

A glorious contrast in graphs:

Daily new confirmed COVID-19 deaths per million people

Deaths per million in Europe | OWID

With the disaster averted in the hospitals, suddenly political leaders are changing tune. We might think this is all just thanks to the magic of Omicronic nicety, but cases are still rising in places like Denmark and France and yet the Governments are already pulling back. That’s surely thanks to the pressure of mass protests. Without that pain and anger, political leaders wouldn’t be acting so early.

Though it helps that millions of people are catching Omicron and know first-hand that it just doesn’t make sense to force people to get vaccinated.

There’s a wave of mandates disappearing

Vaccine passports were wound back in England last week. But 80,000 doctors, dentists and nurses still face losing their jobs next week due to vaccine requirements for them. It’s about 5% of total healthcare staff. There are many rumours the UK government is considering postponing it.

The US Supreme Court threw out Joe Biden’s illegal mandates for not just private but federal employees, but not before 98% of employees were cajoled, coerced or tricked into getting vaccinated.

And suddenly other countries are winding back the rules.

Ireland ends vaccine mandates:

Epoch Times

Almost all CCP (Chinese Communist Party) virus restrictions in Ireland will end on Saturday, including domestic COVID-19 Certificates, curfews, social distancing, and capacity limits.

From 6 a.m. on Saturday, COVID certificates, which are currently required as proof of vaccination or recovery to access indoor hospitality venues, cinemas, theatres, gyms, and leisure centres, will be scrapped.

Bolivians blocked the roads in protest, and the mandates ended:

Epoch Times

SANTA CRUZ, Bolivia—On Jan. 19 the administration of socialist President Luis Arce canceled the requirement of proof of vaccination against the CCP (Chinese Communist Party) virus to enter any public establishment or place of commerce.

The Movement for Socialism (MAS) party officials announced the original “supreme decrees” 4640 and 4641 on Dec. 28, which triggered nationwide protests and legal backlash in the cities of Santa Cruz, La Paz, Cochabamba, El Alto, and Sucre.

On Jan. 17 protesters established road blockades leading from the city of El Alto into the capital La Paz. The head of the rural magisterium in La Paz, Rudy Callisaya, said the blocks would stay in 20 provinces in La Paz department until the government agreed to dismiss the vaccine decrees. The roads Callisaya pledged to impede with other protesters are a vital part of the supply chain allowing food and essential goods to arrive at La Paz.

No more mandates in The Czech Republic:

The Czech Republic charts its own path on Covid

The new Czech government has pledged to undo the previous regime’s vaccine mandate for over-60s and selected professions in spite of the fact that the Czech Republic has lower levels of vaccination than Austria and Germany. Health Minister Vlastimil Válek has assured people that they “do not have to worry about sanctions” resulting from their vaccination status.

Denmark has so many infections it is dropping all restrictions, including vax mandates

Politico

Denmark is to lift all remaining COVID-19 restrictions, with Omicron hospital admissions and deaths remaining stable and high rates of vaccination.

“Tonight we can … find the smile again. We have incredibly good news, we can now remove the last coronavirus restrictions in Denmark,” Prime Minister Mette Frederiksen said at a press conference, following recommendations from the Epidemic Commission and with all the main political parties’ support. The last restrictions will be dropped on February 1.

The announcement comes as a new subvariant of Omicron, BA.2, is gaining a foothold in Denmark and driving infections up, with 46,000 new COVID-19 cases recorded on Wednesday.

Dutch bars, restaurants and museums were allowed to reopen on Wednesday…Cafes, bars and restaurants closed since mid-December can now reopen with reduced capacity and until 10pm as long as customers have a Covid pass…

France on Tuesday reported a new daily record of 501,635 new cases but, again, while hospital admissions have risen, only about half as many patients are in intensive care as during previous waves, and the number has been falling since 12 January.

Some of the harshest places in the world are lightening up a bit

Austria still has one of the strongest vaccine punishment regimes in the world: from mid-March, a new law will see the unvaccinated subjected to fines of up to €3,600 (£2,994). At the moment this would apply to an astonishing 28% of the population.

At least this week the government announced it would let the unvaccinated out of their homes. They have been locked down since last November and could only leave for work or food. Since those monster fines were only just legislated a week ago, presumably the government needs a bit longer to come up with a face-saving excuse to wind them back too.

Germany is debating whether to mandate vaccination. The threats to force vaccinations on all adults have been discussed for months but are only just being debated in parliament now. There are accusations from the Opposition leader that Scholz hasn’t even produced his own legislation. It sounds like a bit of a go-slow.

Chicago is talking about winding back the passports which started on Jan 3rd, “by spring”.


via JoNova

https://ift.tt/3r70W9A

January 27, 2022 at 12:43PM

Scientists And Media Outlets Increasingly ‘Scolded’ And ‘Pressured’ To Blame Extreme Weather On Humans

Although potentially “misleading” and “specious,” realizing the goal of fomenting “action” on climate change means uncertainties and caveats must be journalistically eliminated. Media outlets are pressured by “green groups” to opportunistically claim every extreme weather event – including the ensuing damage – is caused by human greenhouse gas emissions.

In recent decades there has been a deintensification of extreme weather (precipitation) events.

Image Source: Koutsoyiannis, 2020

Deaths and property losses from extreme weather events have also been on the decline in recent decades (Broccard, 2021).

Image Source: Broccard, 2021

Models cannot simulate extreme events and mechanistically attribute them to human activity (Bellprat and Doblas-Reyes, 2016).

Image Source: Bellprat and Doblas-Reyes, 2016

While they admit “climate-centric framings of disasters can be misleading and problematic,” Lahsen and Ribot (2022) nonetheless seem to defend the practice of journalists and media outlets systematically dismissing uncertainties and doubt in attributing extreme weather to humans. They even acknowledge that alarmism is coached.

Where is the science in this?

“Powerful science leaders hope that identification of the role of climate change in extreme weather events will ‘spur more immediate action’ to mitigate climate change.”

“[T]he progressive research and information center Media Matters for America regularly scolds U.S. media outlets for failing to mention that climate change is driving the conditions that create this ‘new normal’ of frequent crises”

“[L]eading climatology communications advisors associated with the World Meteorological Organization (WMO) invoke examples from around the world to criticize media outlets for ‘far too often’ failing to seize on ‘clear opportunity’ to call attention to the climate as cause (Hassol et al., 2016). They coach experts to begin communications about such events by clearly defining climate change as cause, ‘[r]ather than starting with caveats, uncertainties, and what we cannot say,’ as scientists often do”

Image Source: Lahsen and Ribot, 2022

via NoTricksZone

https://ift.tt/3g1M2ei

January 27, 2022 at 12:43PM

Statistical Analysis Can Provide Evidence That the Official Australian Acornsat 2.1 Temperature Analysis Has Added Non-Climatic Warming to The Record, 1910 To 2019.

By Bob Irvine

Conclusions based on large complex datasets can sometimes be simplified and checked by comparing expected results with the actual numbers generated. This is the case with the 58 temperature stations that cover the period 1910 to 2019 in the Australian BOM record.

Andy May was right to be suspicious when the USA warming, recorded in the raw data as 0.25C per century, is changed by corrections and a gridding algorithm to 1.5C per century. It is, however, nearly impossible to prove non-climatic warming due to data manipulation by normal means. The fact that an algorithm increases warming may be legitimate; “time of observation” (TOB) corrections, for example.

A similar thing has happened in Australia.

“ACORN is the Australian Climate Observation Reference Network of 112 weather stations across Australia. The rewritten ACORN 2 dataset of 2018 updated the ACORN 1 dataset released in 2011. It immediately increased Australia’s per decade rate of mean temperature warming by 23%…” (Chris Gilham)

Again, it is right to be suspicious but difficult to prove by simply looking at the overall result. The changes could be legitimate, despite there being virtually no TOB issue in Australia.

EVIDENCE THAT NON-CLIMATIC WARMING HAS BEEN ADDED BY THE ACORNSAT CORRECTIONS

The term “corrections” here refers to all changes to the raw data for any reason. Some of these changes will be legitimate and some will distort the data in an artificial way. What follows is evidence that the artificial distortion of the data is significant and the “correction algorithms” should be revisited.

SOME BACKGROUND

I have used the following dataset for all calculations and acknowledge Chris Gilham’s effort in compiling it.

http://www.waclimate.net/acorn2/index.html 

My term “magnitude of the corrections” for each station’s max, min and mean is derived as follows, using the Albany maximum data at the above link:

 Average change per decade: ACORN 2.1 0.16C / raw 0.08C

The “magnitude of the correction” is the difference between “Acorn Sat 2.1” and “raw” for each station (max, min, mean). i.e. 0.08 (0.16 – 0.08) in this case.

The “Acorn 2.1 anomaly” in this case is 0.16C.

The Acorn 2.1 corrections are designed to correct the data for station moves, equipment changes etc. and the final result is expected to represent the real temperature anomalies at a particular station more realistically than the raw data.

While it is expected that the corrections will either warm or cool the data at any particular station, it can also be expected that the “magnitude of the correction” bears no relationship to the final temperature anomaly. The final temperature anomaly as represented by Acorn 2.1 should depend entirely on the real temperature anomalies at the particular station and, therefore, be completely unrelated to the “magnitude of the corrections”.

For example, if two hypothetical stations were a few hundred meters apart but showed a large difference in temperature anomaly due to differing vegetation or equipment etc., then Acorn 2.1 would hypothetically adjust one say by 1.0C per decade and the other by say 0.03C per decade to finish with the same temperature anomaly for each station as is expected. They are next to each other after all.

The “magnitude of the corrections”, in this case 1.0C/Dec. and 0.03C/Dec., should not correlate with the final temperature as recorded by Acorn 2.1. i.e., the final temperature anomalies in this hypothetical case will finish up being approximately the same while the “magnitude of the two corrections” are very different. I hope this is clear.

If we then take the 58 Australian stations that cover the period 1910 to 2019, we should find that the “magnitude of their corrections” and their final “Acorn 2.1 anomaly” have approximately zero correlation if the corrections are not artificially affecting the final Acorn 2.1.

To check this, I have calculated the Pearson Correlation Coefficient (PCC) for these two data sets. (“Magnitude of the 58 corrections” compared to the “Acorn 2.1 anomalies” for the 58 stations).

If this PCC is;

  • Negative – then the corrections to the raw data are adding artificial or spurious cooling to the raw data.
  • Approximately zero – then the corrections are not adding artificial cooling or warming and are doing their job as intended.
  • Positive – then the corrections are adding artificial or spurious warming to the raw data.
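
As a sketch of the check being described (hypothetical station values for illustration; the real inputs are the 58 ACORN-SAT stations compiled at waclimate.net), the PCC between the correction magnitudes and the final Acorn 2.1 anomalies can be computed like this:

```python
import numpy as np

# Hypothetical per-station trends in degrees C per decade.
raw     = np.array([0.08, 0.05, 0.12, 0.10, 0.03])   # raw trend
acorn21 = np.array([0.16, 0.14, 0.13, 0.11, 0.15])   # homogenised trend

# "Magnitude of the correction" = homogenised trend minus raw trend,
# e.g. Albany max: 0.16 - 0.08 = 0.08 C/decade.
magnitude = acorn21 - raw

# Pearson correlation between correction magnitude and final anomaly.
# Near zero: corrections unrelated to the final trend, as they should be.
# Strongly positive: the corrections co-vary with the warming produced.
pcc = np.corrcoef(magnitude, acorn21)[0, 1]
print(round(pcc, 2))
```

The same one-line `np.corrcoef` call applied to the 58 real station pairs yields the PCC values reported in the Results and Conclusion sections.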

JUSTIFICATION FOR THE METHOD USED

The BOM comparison station selection method has a number of reference points and two variables per reference point that they are trying to correlate.

My data is exactly the same and I use the same method as they do to correlate my variables.

The BOM’s reference points are the 12 months.

My reference points are the 58 temperature stations.

The BOM’s two variables are the Temp. anomalies at station 1 and station 2 for each particular month.

My two variables are the “Magnitude of the Correction” and the “Acorn-Sat homogenised Temp. anomaly” at each particular station.

A Pearson Correlation Coefficient (PCC) is specifically designed for this application.

The BOM statisticians obviously thought this the best way to approach this issue, as I do.

See also Appendix A for a stress test of the method used.

LEGEND

A PCC only indicates correlation and is not designed to indicate certainty or put a number on probability etc. The significance of a PCC value depends entirely on the intrinsic nature of the datasets being compared.

In their Pairwise Homogenisation Algorithm, AcornSat extensively uses nearby comparison stations to deal with discontinuities at any given station. For a comparison station to be considered useable, they compare the monthly temperatures for a 5-year period either side of the discontinuity. If the PCC for these two monthly datasets is greater than 0.5, then that comparison station is considered accurate enough to one tenth of a degree Celsius.

I have used the AcornSat logic and PCC figure as a guide when comparing the “magnitude of the corrections” with the “Acorn 2.1 anomaly” for the 58 stations.

My legend;

  • If the PCC is greater than 0.5 – then it is almost certain that the corrections are adding significant non-climatic warming to the raw data.
  • If the PCC is greater than 0.3 and less than 0.5 – then it is very likely that the corrections are adding significant non-climatic warming to the raw data.
  • If the PCC is greater than 0.1 and less than 0.3 – then it is likely that the corrections are adding some non-climatic warming to the raw data.
  • If the PCC is between -0.1 and 0.1 – then it is likely that the corrections are not adding non-climatic cooling or warming to the raw data. The corrections are doing their job.
  • If the PCC is less than -0.1 – then it is likely that the corrections are adding some non-climatic cooling to the raw data.
  • Etc.
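
The legend above is just a threshold lookup; a small helper (the function name and wording are my own) makes the mapping explicit:

```python
def verdict(pcc: float) -> str:
    """Map a Pearson correlation coefficient to the legend above."""
    if pcc > 0.5:
        return "almost certain: significant non-climatic warming added"
    if pcc > 0.3:
        return "very likely: significant non-climatic warming added"
    if pcc > 0.1:
        return "likely: some non-climatic warming added"
    if pcc >= -0.1:
        return "corrections not adding spurious warming or cooling"
    return "likely: some non-climatic cooling added"

print(verdict(0.56))  # the full-dataset PCC reported in the Conclusion
print(verdict(0.25))  # the PCC for the 14 minimally-corrected stations
```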

RESULTS

Figure 1 graphs the 58 Australian stations that cover the period 1910 to 2019. The general slope of the data points up to the right of the graph is solid evidence that the Acorn 2.1 corrections are adding non-climatic warming to the record.

Figure 1. If the corrections were doing their job properly and were not adding spurious warming to the record, the 58 station points on this graph would on average be approximately horizontal across the graph. The fact they rise to the right means that the AcornSat correction algorithms are adding spurious non-climatic warming to the record.

Table 1 below shows clearly how the final Acorn 2.1 warming anomaly increases with the magnitude of the corrections. The implication is that the corrections are causing some of that apparent temperature increase.

Table 1. The “magnitude of the corrections” for the 58 stations are divided into 5 categories and their 5 corresponding average temperature anomalies are compared. An increase in the “magnitude of the corrections” coincides with an increase in the Acorn 2.1 temp. anomaly. The red value for “X” covers the 14 stations where AcornSat believed the raw data was accurate and needed little changing.

Table 2 below shows the Pearson Correlation Coefficient (PCC) (magnitude of corrections vs. Acorn temp. anomaly) for the total dataset. The red line has a lower PCC and implies that when less correction is required, the final Acorn 2.1 temperatures contain less non-climatic warming.

Table 2. Shows the PCC for the total dataset and, in red, the PCC for the stations where AcornSat deemed the raw data accurate enough not to need significant correction.

CONCLUSION

According to the precedent set by AcornSat for the use of the Pearson correlation coefficient (PCC) when applied to data of this type, a PCC of 0.56 (magnitude of corrections versus Acorn 2.1 temp. anomaly) as found here indicates that “It is almost certain that the corrections are adding significant non-climatic warming to the raw data”.

For some of the stations in the dataset, AcornSat decided that the raw data was relatively accurate and consequently did not need significant homogenisation or correction. These stations are likely to give a more accurate temperature reading and a better idea of Australia’s temperature history over the period covered.

As a check I calculated the same PCC for the 14 stations that only needed minimal correction (Red data in Table 1 and Table 2). A lower PCC of 0.25 still indicates some artificial warming but not nearly as much as for the whole dataset.

The average temperature anomaly per decade for the 14 stations that needed minimal correction is about 0.1°C/Decade (Table 1). As calculated, this will include some artificial warming so the actual average warming for these more accurate stations is likely to fall in the range, 0.08°C to 0.10°C per decade.

This method could possibly be used to check the adjustments to the USA datasets. These adjustments are fiercely contested so it would be good to have some objective analysis. Let the cards fall where they will.

APPENDIX “A”

This appendix seeks to stress test the method used above. It does this by applying the method to two extreme homogenisation algorithms: one that we know adds zero artificial warming, and one that we know adds large artificial warming in proportion to the correction. In the first instance (“C” in Table 1) we expect the Pearson Correlation Coefficient (PCC) to be zero. In the second (“E” in Table 1) we expect the PCC to be 1.0. If this is true in both cases, then the method used, and the conclusions drawn in the main paper above, are likely to be correct.

THE TEST

We use accurate satellite technology to determine the temperature anomalies for all parts of an imaginary small island.

This imaginary accurate satellite data has been available for the last 100 years and tells us that the temperature of all parts of this island have increased at the same rate. This rate being 0.1C per decade over the last century (“B” in Table 1).

We the readers are the only ones aware of the true and accurate temperature rise for all parts of the island over the last 100 years as described.

The inhabitants of the island are unaware of this accurate satellite data set. In an attempt to determine the temperature rise, over the 100 years on the island, these inhabitants set up 10 temperature stations 100 years ago, all equally spaced around the island.

Three of these stations (Station # 1 to 3) were well sited with good equipment that has not needed to be changed or relocated for any reason for the 100 years. They each showed a correct temperature anomaly of 0.1C per decade.

The other 7 stations (Station # 4 to 10) were subject to various changes related to equipment, vegetation, station moves etc. The raw data from all these 7 stations showed anomalies less than the correct 0.1C per decade. To correct for these inconsistencies the inhabitants developed two homogenisation algorithms (“C” and “E”) that they then applied to these 7 variant stations.

To test which of these two algorithms was suitable, the inhabitants applied the test described in the main paper above.

STATION #   A       B      C      D       E       F
1           0.1     0.1    0.1    0       0.1     0
2           0.1     0.1    0.1    0       0.1     0
3           0.1     0.1    0.1    0       0.1     0
4           0.09    0.1    0.1    0.01    0.11    0.02
5           0.08    0.1    0.1    0.02    0.12    0.04
6           0.07    0.1    0.1    0.03    0.13    0.06
7           0.06    0.1    0.1    0.04    0.14    0.08
8           0.05    0.1    0.1    0.05    0.15    0.1
9           0.04    0.1    0.1    0.06    0.16    0.12
10          0.03    0.1    0.1    0.07    0.17    0.14
Mean        0.072   0.1    0.1    0.028   0.128   0.056

Legend;

A – Raw Temperature data for each station. (°C per decade).

B – Correct Satellite Data. (°C per decade).

C – Final temperature data after first accurate homogenisation algorithm is applied. It is correct and matches the satellite data. (°C per decade).

D – “Magnitude of the Corrections” as applied to “C”. Derived by subtracting Raw data from “C”. (°C per decade).

E – Final temperature data after using a homogenisation algorithm that adds artificial warming to the record. (°C per decade).

F – “Magnitude of the Corrections” as applied to “E”. Derived by subtracting raw data from “E”. (°C per decade).

Table 1. Compares the 10 stations with their raw temperature, correct satellite temperature, and two possible final temperatures after the two homogenisation processes are complete. The two “magnitude of the corrections” (“D” and “F”) are also shown and are derived by subtracting the raw temp. from both final homogenised temp. sets (“C” and “E”).
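
The appendix’s stress test can be reproduced directly from the table columns; a sketch using numpy, with column letters as in the legend above:

```python
import numpy as np

# Columns from the table above (degrees C per decade).
A = np.array([0.1, 0.1, 0.1, 0.09, 0.08, 0.07, 0.06, 0.05, 0.04, 0.03])  # raw
C = np.full(10, 0.1)   # accurate homogenisation: matches the satellite data
D = C - A              # magnitude of corrections for algorithm "C"
E = np.array([0.1, 0.1, 0.1, 0.11, 0.12, 0.13, 0.14, 0.15, 0.16, 0.17])  # biased
F = E - A              # magnitude of corrections for algorithm "E"

# For the biased algorithm, F = 2*(E - 0.1): a perfect linear relation,
# so the PCC is 1.0 as the appendix states.
print(round(float(np.corrcoef(E, F)[0, 1]), 6))  # 1.0

# "C" is identical at every station (zero variance), so numpy reports
# nan for corr(C, D): the degenerate, constant-series limit of the
# "zero correlation" result described in the appendix.
```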

RESULT

The Pearson Correlation Coefficient (PCC) between “C” and “D” is zero as expected.

The Pearson Correlation Coefficient (PCC) between “E” and “F” is 1.0, also as expected.

CONCLUSION

On the imaginary island in this appendix, a homogenisation algorithm that was known to be accurate (“C”) gave a PCC when the “magnitude of its corrections” was compared to its final temperatures, of zero.

A homogenisation algorithm that was known to add artificial warming to the record in proportion to the corrections (“E”) had a similarly calculated PCC of 1.0.

Both these results are consistent and would be expected if the method and results of the main paper above were accurate.

The case for saying that the Australian BOM homogenisation algorithm adds artificial warming to the record is strong and possibly proven.

via Watts Up With That?

https://ift.tt/32FyHWc

January 27, 2022 at 12:18PM