Month: March 2017

Russian Scientists Dismiss CO2 Forcing, Predict Decades Of Cooling, Connect Cosmic Ray Flux To Climate

via NoTricksZone
http://notrickszone.com

Scientific Papers Predict Cooling In Coming Decades


A new scientific paper authored by seven scientists affiliated with the Russian Academy of Sciences was just published in the scientific journal Bulletin of the Russian Academy of Sciences: Physics.

The scientists dismiss both “greenhouse gases” and variations in the Sun’s irradiance as significant climate drivers, and instead embrace cloud cover variations — modulated by cosmic ray flux — as a dominant contributor to climate change.

A concise summary: as cosmic ray flux increases, more clouds form on a global scale. More global cloud cover means correspondingly more solar radiation is blocked from reaching the Earth's surface (primarily the oceans). With an increase in global cloud cover projected for the coming decades (via trend analysis), global cooling is predicted.


Stozhkov et al., 2017

Cosmic Rays, Solar Activity, and Changes in the Earth’s Climate

Stozhkov, Y.I., Bazilevskaya, G.A., Makhmutov, V.S., Svirzhevsky, N.S., Svirzhevskaya, A.K., Logachev, V.I., Okhlopkov, V.P.

“One of the most important problems facing humanity is finding the physical mechanism responsible for global climate change, particularly global warming on the Earth. … Summation of these periodicities for the future (after 2015) allows us to forecast the next few decades. The solid heavy line in Fig. 1 shows that cooling (a drop in ΔT values) is expected in the next few decades.”



“Figure 2 shows the dependence between the annual average changes ΔT in the global temperature in the near-surface air layer and charged particle flux N in the interval of altitudes from 0.3 to 2.2 km. We can see there is a connection between values ΔТ [temperature] and N [charged particle flux]: with an increase in cosmic ray flux N, the values of changes of global temperature decrease. This link is expressed by the relation ΔT = –0.0838N + 4.307 (see the dashed line in Fig. 2), where the ΔT values are given in °C, and the N values (in particle/min units) are related to the charged particle flux measured at an altitude of 1.3 km. The correlation coefficient of the line with the experimental data is r = –0.62 ± 0.08.”
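As an illustration only (ours, not from the paper), the linear relation quoted above can be evaluated numerically. The coefficients come from the quoted text; the sample flux values below are hypothetical, chosen merely to show the sign of the effect.

```python
# Linear relation reported by Stozhkov et al. (2017):
#   dT = -0.0838 * N + 4.307
# where N is the charged-particle flux (particles/min) measured at ~1.3 km
# altitude and dT is the change in global near-surface temperature (deg C).

def delta_t(n_flux: float) -> float:
    """Temperature change (deg C) implied by the quoted regression line."""
    return -0.0838 * n_flux + 4.307

# Hypothetical flux values (illustration only):
for n in (40.0, 50.0, 60.0):
    print(f"N = {n:5.1f} particles/min -> dT = {delta_t(n):+.3f} deg C")
```

Note the negative slope: the larger the cosmic ray flux N, the smaller the temperature change, which is the paper's stated relationship (the reported correlation is r = -0.62 ± 0.08, so this is a statistical tendency, not a tight functional law).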



“Our results could be connected with the mechanism of charged particle fluxes influencing the Earth’s climate; it includes, first of all, the effect charged particles have on the accelerated formation of centers of water vapor condensation, and thus on the increase in global cloud cover. The total cloud cover is directly connected with the global temperature of the near surface air layer.”


Another newly published scientific paper also projects cooling in the coming decades. Dr. Norman Page, a geologist, attributes climate changes to natural (60-year and millennial-scale) cycles of solar activity (and the accompanying cloud cover changes), and he notes that the rise in solar activity since the depths of the Little Ice Age has been the predominant climate driver. The millennial peak in solar activity occurred in about 2004, and within the next few years temperatures are projected to drop significantly. Annotated graphs depicting the robust correlation between cloud cover changes and global temperature, as well as the forecast global cooling, are included below.


Page, 2017

The coming cooling: Usefully accurate climate forecasting for policy makers

“This paper argues that the methods used by the establishment climate science community are not fit for purpose and that a new forecasting paradigm should be adopted. Earth’s climate is the result of resonances and beats between various quasi-cyclic processes of varying wavelengths. It is not possible to forecast the future, unless we have a good understanding of where the earth is in time in relation to the current phases of those different interacting natural quasi periodicities. Evidence is presented specifying the timing and amplitude of the natural 60 ± year and, more importantly, 1000 year periodicities (observed emergent behaviors) that are so obvious in the temperature record. Data related to the solar climate driver are discussed and the solar cycle 22 low in the neutron count (high solar activity) in 1991 is identified as a solar activity millennial peak and correlated with the millennial peak – inversion point – in the RSS temperature trend in about 2004. The cyclic trends are projected forward and predict a probable general temperature decline in the coming decades and centuries. Estimates of the timing and amplitude of the coming cooling are made. If the real climate outcomes follow a trend which approaches the near term forecasts of this working hypothesis, the divergence between the IPCC forecasts and those projected by this paper will be so large by 2021 as to make the current, supposedly actionable, level of confidence in the IPCC forecasts untenable.”



“The global millennial temperature rising trend seen in Figure 11 from 1984 to the peak and trend inversion point in the Hadcrut3 data at 2003/4 is the inverse correlative of the Tropical Cloud Cover fall from 1984 to the Millennial trend change at 2002. The lags in these trends from the solar activity peak at 1991 (Figure 10) are 12 and 11 years, respectively. These correlations suggest possible teleconnections between the GCR flux, clouds, and global temperatures.”



“Unless the range and causes of natural variation, as seen in the natural temperature quasi-periodicities, are known within reasonably narrow limits, it is simply not possible to even begin to estimate the effect of anthropogenic CO2 on climate. Given the lack of any empirical CO2-climate connection reviewed earlier and the inverse relationship between CO2 and temperature [during the Holocene, when CO2 rose as temperatures declined] seen in Figure 2, and for the years 2003.6–2015.2 in Figure 4, during which CO2 rose 20 ppm, the simplest and most rational working hypothesis is that the solar ‘activity’ increase is the chief driver of the global temperature increase since the LIA.”

March 22, 2017 at 08:32PM

Energy Disaster Spells the End for Australia’s Renewable Energy Target

via STOP THESE THINGS
http://ift.tt/2kE7k62

It happened in the blink of an eye: Australia went from energy heavyweight to a country in crisis looking for the equivalent of life support. Since STT got going in December 2012, we have drifted from frustration to despair and back waiting and hoping for mainstream journalists to recognise the existential threat posed by Australia’s […]

March 22, 2017 at 06:32PM

On the Politicization of Electricity (intervention breeding intervention)

via Master Resource
http://ift.tt/1o3KEE1

“Government-orchestrated retail competition in electricity largely failed. With that failure came the return of regulatory-mandated, utility-administered wasteful energy efficiency programs. This time the programs carried the added justification of countering global warming.”

Prior to the oil shocks of the 1970s, energy was just another input in the management of capital, labor and other operating costs. Tradeoffs were made between energy costs and capital spent to increase efficiency. During the natural turnover of capital equipment, energy efficiency improved along with productivity, quality and waste reduction. Effective energy use was a technical matter where efficiency had to make economic sense.

Oil and gas shortages in the 1970s were caused by government price controls, but the news media hyped the concept of "running out" of resources. This brought politics into the use of energy, an example of how the problems from government intervention can breed more intervention.

With the stated goal of being "energy independent," a variety of federal government boondoggles were initiated, ranging from synthetic fuels subsidies to the Strategic Petroleum Reserve. The electric utility industry faced restrictions on the use of natural gas for generation, which encouraged new coal-fired capacity.

Nuclear generation appeared to be the best option given questions about fossil fuels. A significant part of the utility industry exhibited a lemming-like propensity for doing the wrong thing en masse. State utility regulators played a part in encouraging the proliferation of what turned out to be very expensive investments.

Management of energy became a formalized, separate management responsibility among end users. The price shocks from the oil market and, later, the power market made energy efficiency measures attractive. The market poured forth new devices and improvements in energy-using equipment that greatly reduced the energy intensity of the U.S. economy. The managers of enterprises faced an array of new energy control products that continues to grow to this day.

Along with the market response to high energy prices, new laws were complicating the use of energy. Natural gas use was discouraged in industrial as well as utility boilers. Gas supplies were perceived by policy makers as a soon-to-be-exhausted resource. Some industrial processes were switched to electricity; in such cases, overall efficiency took a back seat to conserving natural gas. Tax credits influenced energy investments. Saving energy had patriotic implications. Building codes now included energy standards.

The shock from cost overruns on nuclear generating plants caused a loss of confidence by the upper managements of electric utilities. During this moment of weakness the concept of having a public policy on energy use came to the fore among utility regulators. The idea was that if the utilities invested in energy efficiency, then they would not need to build so many power plants. In regulatory circles the new buzz words were “least-cost planning” and “demand-side management.”

The utilities had the good sense to be suspicious of the grandiose claim that efficiency improvements would slow the overall demand for energy. But they were soon bought off with promises of guaranteed profits on approved but expensive efficiency programs. During the late 1980s and early 1990s, a number of sound economists pointed out the flaws of asking a provider of services to reduce his own sales. Further, with amazing accuracy, these economists predicted failure.

Billions of dollars were spent by the nation's utilities on demand-side programs with little to show for it but inflated claims. The money, of course, came from utility customers. Those who had already invested in energy efficiencies were taxed to pay for the same improvements in their competitors' facilities. According to economist Franz Wirl, both the utilities and customers gamed the system of giveaways for efficiency measures.

Then a wave of rationality struck the electricity industry. Competition for end users was instituted in many states and planned in still more. Customer choice akin to that in telecommunications was seen as the wave of the future. The regulated utilities began the wholesale dumping of their wasteful customer efficiency programs. In the industry this was known as "getting trash off the books."

The regulated suppliers rightly feared unregulated competitors who would not add frivolous costs to their services. End users enjoyed a period of stable and even declining rates as the utilities were forced to make rational decisions in a quasi-market environment. But the retail market concept was flawed. Rather than evolving through the market process, political dictates prevailed in designing the structure.

The result was a Frankenstein's monster. It looked like the real thing but lacked the essential features necessary for a real market. Entrepreneurs were not allowed to develop alternative delivery systems. The scheme could be called mandatory access with guaranteed return on overvalued assets.

However, on the wholesale level, time-sensitive pricing became the norm and made electricity a commodity that is exchanged like other commodities, using marginal cost pricing. This rationality spread to some sectors of the retail market and brought much-needed customer feedback to utilities.

Government-orchestrated retail competition in electricity largely failed. With that failure came the return of regulatory-mandated, utility-administered wasteful efficiency programs. This time the programs carried the added justification of countering global warming. In a typical utility program, $6 is collected for giving away low-energy light bulbs that a consumer can buy for less than $1.50, and the utilities are asking to recover hundreds of millions of dollars without verifying that they achieved any energy savings.

The bad news for consumers has not ended with the rebirth of these demand-side management programs. Another round of high-cost nuclear plants is underway. The lesson that should have been learned from the last round of nuclear plant building is that the utilities need to be subjected to market conditions and their attendant risks. Instead, we see the utilities going to their state regulators and general assemblies and getting guaranteed recovery on investments with no upper cap! Green energy purchase mandates are already driving up the cost of electricity, and this will certainly get worse.

End users are trying to be politically correct and are adopting high-sounding green goals. The end-user energy management function now includes issuing press releases filled with platitudes. One-size-fits-all energy measures are being promoted to obtain certification of an enterprise's commitment to saving the earth. The overhead surrounding energy-saving programs has increased.

Dubious environmental effects are included along with hard energy-saving numbers to justify politically correct investments and practices. Utility rates, which have always included cross-subsidies for politically favored groups, are being distorted even further from market results by additional features encouraging certain green behavior.

Conclusion

The zigs and zags of state and federal energy policies have caused massive mal-investment by both energy suppliers and users. The best energy policy is no energy policy. Market forces should replace regulation; supply and demand with unfettered competition should determine energy prices. Proper energy prices will lead to wise customer investments and practices. Energy efficiency is its own reward and needs no encouragement from governments.

————

Jim Clarkson is founder and head of Resource Supply Management (RSM), an energy procurement and management company based in South Carolina. He is also a director of the Institute for Energy Research and its advocacy arm, The American Energy Alliance.

The post On the Politicization of Electricity (intervention breeding intervention) appeared first on Master Resource.

March 22, 2017 at 06:04PM

New Report: Current Global Warming Is ‘Not Outside The Range Of Natural Variations’

via The Global Warming Policy Forum (GWPF)
http://www.thegwpf.com

A UK-based climate policy group has put out an annual climate assessment based "exclusively on observations rather than climate models" to serve as a counterpoint to those put out by the United Nations and government agencies that warn of unabated global warming.

The Global Warming Policy Foundation's (GWPF) climate assessment, like the World Meteorological Organization's (WMO), noted that 2016 was likely the warmest year on record due to an incredibly strong El Niño warming event that boosted tropical ocean temperatures starting in 2015.

That’s about all the GWPF’s report has in common with the WMO’s assessment for 2016, which warns “the influence of human activities on the climate system has become more and more evident.”

"There is little doubt that we are living in a warm period," said Ole Humlum, a physical geography professor at the University Centre in Svalbard, Norway, and author of the GWPF report.

Humlum is a global warming skeptic who’s spent decades studying glaciers and climate. Humlum argues that while the world is warming, it’s well within the bounds of natural variability.

“However, there is also little doubt that current climate change is not abnormal and not outside the range of natural variations that might be expected,” Humlum said.

Humlum found that while 2016 was the warmest year on record, this was mostly due to the incredibly strong El Niño. The WMO, on the other hand, claims El Niño contributed only 0.1 to 0.2 degrees Celsius of 2016's record 1.1 degree Celsius anomaly.

He argues El Niño was the main driver behind record high temperatures last year because “global air temperatures were essentially back to the level of the years before the recent 2015–16 oceanographic El Niño episode.”

The WMO's report takes a more ominous tone overall when describing climatic conditions in 2016. The WMO reported that the "increase in global temperature is consistent with other changes in the climate system."

“Globally averaged sea-surface temperatures were also the warmest on record; global sea levels continued to rise; and Arctic sea-ice extent was well below average for most of the year,” WMO reported.

“The year was marked by severe droughts that affected agricultural production and left people exposed to food insecurity in southern and eastern Africa and Central America,” WMO reported, also mentioning Hurricane Matthew, heavy flooding in Asia and coral reef bleaching.

Humlum counters that unseasonably high temperatures in the Arctic were driven by El Niño heat transported from the tropics to the poles. Both poles saw record-low sea-ice levels, but that too could be driven by natural cycles.

“In the Arctic, a 5.3-year periodic variation is important, while for the Antarctic a cycle of about 4.5 years duration is important,” Humlum wrote. “Both these variations reached their minima simultaneously in 2016, which explains the recent minimum in global sea-ice extent.”
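As an illustration only (ours, not from the GWPF report), two sinusoids with the quoted 5.3-year and 4.5-year periods can be phased so that both reach their minima in 2016, producing a combined minimum in that year. The phases and equal amplitudes below are hypothetical.

```python
import math

def ice_anomaly(year: float) -> float:
    """Hypothetical combined sea-ice anomaly from two sinusoidal cycles,
    phased so both reach their individual minima in 2016."""
    arctic    = -math.cos(2 * math.pi * (year - 2016.0) / 5.3)  # 5.3-yr cycle
    antarctic = -math.cos(2 * math.pi * (year - 2016.0) / 4.5)  # 4.5-yr cycle
    return arctic + antarctic

# Scan 2000-2030 on a fine grid for the deepest combined minimum:
years = [2000 + 0.01 * i for i in range(3001)]
low = min(years, key=ice_anomaly)
print(f"combined minimum near {low:.2f}")
```

Because the two periods are incommensurate over any short window, the cycles drift in and out of phase, so a simultaneous minimum of both is comparatively rare; this is the sense in which the 2016 alignment is presented as explaining the global sea-ice minimum.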

Humlum also noted how surface-based temperature datasets have diverged from satellite-based readings since 2003. Surface data shows about 1 degree Celsius more warming than satellites.

Full post

March 22, 2017 at 09:44AM