Month: June 2018

Frackingphobia: Facts vs. Fears

Hydraulic fracturing (a.k.a. “fracking”) is in the news every day, often disparagingly, despite the great benefits it has bestowed on the nations applying it, especially the US.

On a recent river cruise I found myself at a table with a couple from California, and the woman began spouting about the dangers and horribleness of fracking. My civility censor was suppressed by the wine I’d consumed, and I interrupted to say she was talking bullshit. She halted, then asked her husband, a retired geologist, to comment, and he stated that fracking is a risky business. He presented no evidence for this view; in my opinion he was only speaking to support his spouse. I said I respected his opinion but still disagreed. The next day I apologized for my rudeness, but said I still thought she had been misled. We shared a congenial dinner later on, but avoided the subject.

The experience revealed that I was unprepared to engage on the details of the fracking issue. So this post summarizes some research, assembling persuasive facts and resources to counter the fear-mongering on this subject.

1. Obama’s EPA Found Fracking Has Not Contaminated Drinking Water

(Source: EPA Has Not Actually Changed Its Conclusion On Risks Of Fracking To Drinking Water by Robert Rapier for Forbes) Excerpts in italics with my bolds.

First, let me provide a bit of background on hydraulic fracturing. I find that most people who are against fracking don’t actually know what it is. The EPA report goes out of its way to blur the lines as well by lumping it all into “activities in the hydraulic fracturing water cycle.” By doing this, if a guy driving a truck filled with fracking chemicals has a wreck, it’s a “fracking issue.” So let’s define some terms.

Hydraulic fracturing has been around since the late 1940s, and has now been used in the U.S. more than a million times to increase production from oil and gas wells. Fracking involves pumping water, chemicals and a proppant down an oil or gas well under high pressure to break open channels (fractures) in the reservoir rock trapping the deposit. Oil and gas do not travel easily through these tight formations, which is why they need to be fractured. The proppant is a granular material (usually sand) designed to hold those channels open, allowing the oil (or natural gas) to flow to the well bore.
While fracking has been around for decades, two developments in recent years are responsible for thrusting the technique into the public eye. The first is the fairly recent development in which fracking was combined with horizontal drilling, another common technique used in the oil and gas industry.

Like fracking, horizontal drilling was invented decades ago, and has been widely used in the oil and gas industry since the 1980s. As its name implies, horizontal drilling involves drilling down to an oil or gas deposit and then turning the drill horizontal to the formation to access a greater fraction of the deposit.

The marriage of these two techniques of hydraulic fracturing and horizontal drilling enabled the shale oil and gas boom in the U.S.

But the second development is what primarily thrust the technique(s) into the public spotlight. Some of the shale oil and gas formations are in areas that had never experienced significant fossil fuel development. Many locals resented this intrusion into their lives, and anti-fracking sentiments fed into a great deal of misinformation around the technique.

The movie Gasland is a perfect example. Director Josh Fox, whose family farm lies atop the Marcellus Shale in Pennsylvania, relied on misinformation and appeals to emotion instead of scientific data. Nevertheless, it was embraced by anti-fracking activists, and many who had never heard of fracking became convinced the technique was regularly polluting water supplies.

The concern among anti-fracking activists was that the fractures that allowed oil and gas to reach the well bore could also allow oil, gas, and chemicals to seep into water supplies. But this is only a remote possibility, because a mile or more of rock typically separates an oil and gas formation being fractured from any underground water resource. The fractures themselves extend only a few hundred feet, so it is unsurprising that there has never been a proven case of chemicals migrating from a fracked zone into water supplies.

That hasn’t stopped some from claiming that fracking has contaminated water supplies. However, those cases have always been a result of some activity peripheral to fracking. For example, if a well is improperly cemented it can leak. That in fact has happened, leading to the charge that “fracking contaminated the water.” There is an important distinction, however, and that is that this is not a result of the fracking process. A well may leak regardless of whether it was fracked. But activists (and now the EPA) seem bent on blurring the lines to the greatest extent possible by lumping lots of peripheral activities into the “fracking process.”

In 2010, Congress asked the EPA to investigate the safety of fracking. In 2015, the EPA issued a draft report. The bombshell statement from that report was that there was no evidence that fracking had “led to widespread, systemic impacts on drinking water resources in the United States.” This report was cheered by the fossil fuel industry, but caused a backlash with environmentalists, and spawned many counterclaims that the “fracking process” had led to contaminated water.

In December 2016 the EPA released its final report on the topic: Hydraulic Fracturing for Oil and Gas: Impacts from the Hydraulic Fracturing Water Cycle on Drinking Water Resources in the United States. Environmentalists were quick to note that the EPA had deleted its previous claim of no evidence of widespread water contamination, and were now reporting that “hydraulic fracturing activities can impact drinking water resources under some circumstances.” This story from The New York Times, for instance, was pretty typical of the reporting on the issue: Reversing Course, E.P.A. Says Fracking Can Contaminate Drinking Water.

But did the EPA actually reverse course? No. They gave examples where fracking could contaminate water. For instance they state that “Injection of hydraulic fracturing fluids directly into groundwater resources” can cause contamination. Yeah, no joke. Likewise, filling your car with gasoline can contaminate drinking water, because if you spill the gasoline all over the ground, it can get into the drinking water.

The EPA’s final report on hydraulic fracturing wasn’t that much different from the draft report. As in the previous report, the EPA noted that activities related to — but not exclusive to — fracking, have contaminated water supplies. Chemical spills happen all the time, but if the chemicals in question are for fracking, it becomes a “fracking issue.” Note that if the chemicals in question are to be used for fighting fires, we don’t say “firefighting contaminates water.” We should properly identify and address the actual problem, which in this instance would be the cause of the chemical spill.

Ultimately, the final report deleted a phrase from the draft report that there was no evidence of widespread impact on water supplies, and selectively used hypotheticals to show how fracking “could” contaminate water supplies. This is the Obama Administration laying down one more speed bump for the oil and gas industry while it still can.

Shale gas drilling rig in Ohio.

2. Discredited Fracking Studies are used to Target School Children
(Source: New Activist Report Rehashes Discredited Fracking Studies to Target School Children by Seth Whitehead for Energy In Depth)

A new Environment America “report” uses a couple old anti-fracking tactics — exploitation of children and blatant misinformation from activist studies — to try to stoke fears and rally support for its extremist call to ban fracking nationwide.

The ominously-titled “Dangerous and Close: Fracking Puts the Nation’s Most Vulnerable People at Risk” finds there are nearly 2,000 child care facilities, more than 1,300 schools, nearly 250 nursing care providers and more than 100 hospitals within a one-mile radius of fracked wells in the nine states examined, stating:

“Given the scale and severity of fracking’s impacts, fracking should be prohibited wherever possible and existing wells should be shut down beginning with those near institutions that serve our most vulnerable populations.”

Here are the report’s most egregious claims, followed by the facts.

Environment America Claim: “Fracking creates a range of threats to our health, including creating toxic air pollution that can reduce lung function even among healthy people, trigger asthma attacks, and has been linked to premature death. Children and the elderly are especially vulnerable to fracking’s health risks.”

A pumpjack works in the Bakken shale of North Dakota.

REALITY: There is actually ample evidence that fracking is improving overall air quality and health by reducing major pollutants such as fine particulate matter, sulfur dioxide and nitrogen dioxide. Furthermore, all three studies EA singles out as “evidence” that close proximity to fracking sites can lead to a myriad of adverse health effects have been thoroughly debunked.

EA even cites an Earthworks study that claims “A series of 2012 measurements by officials of the Texas Commission on Environmental Quality (TCEQ) found VOCs levels so high at one fracking location that the officials themselves were forced to stop taking measurements and leave the site because it was too dangerous for them to remain.”

EA fails to mention TCEQ responded to Earthworks’ report by saying the agency has collected “several millions of data points for volatile organic compounds” in the Barnett Shale and Eagle Ford Shale and “Overall, the monitoring data provide evidence that shale play activity does not significantly impact air quality or pose a threat to human health.”

EA also conveniently ignores that the West Virginia Department of Environmental Protection (DEP) and the Colorado Department of Public Health (CDPH) have conducted air monitoring near well sites as well and found no credible risk to public health.

Environment America Claim: “Currently, oil and gas companies are exempt from key provisions in the Safe Drinking Water Act, the Clean Air Act, the Clean Water Act, and the Resource Conservation and Recovery Act.”

REALITY: The notion that the oil and natural gas industry is under-regulated is an absolutely absurd narrative that activists such as EA continue to push. Oil and gas production activities are subject to eight federal laws, including all relevant provisions of the Safe Drinking Water Act (SDWA); Clean Water Act (CWA); Clean Air Act (CAA); Resource Conservation and Recovery Act (RCRA); Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA); Emergency Planning and Community Right-to-Know Act (EPCRA); Toxic Substances Control Act (TSCA); and Federal Insecticide, Fungicide and Rodenticide Act (FIFRA). The oil and gas production sector is also heavily regulated at the state level.

A drilling rig works in the Eagle Ford shale region of South Texas.

Environment America Claim: “Exposure to low levels of many of the chemicals used in or generated by oil and gas extraction activities can contribute to a variety of health effects, including asthma, cancer, birth defects, damage to the reproductive system and impaired brain development. For example, children’s long-term exposure to low levels of benzene, generally classified as a carcinogen, also harms respiratory health.”

REALITY: It is essential to understand that toxicity depends entirely on dose and exposure. The mere presence of benzene, for example, does not mean it is present at toxic levels, as the numerous air monitoring studies referred to earlier illustrate. EA insinuates that even low-level benzene exposure is harmful. But benzene is actually present in countless everyday products such as shampoo, toothpaste, paint, PVC pipes and many plastics.

Environment America Claim: “Fracking targets the oil and gas trapped in shale formations… Sometimes that means wells are drilled in rural areas, such as portions of Colorado or North Dakota, and sometimes that wells are in densely populated areas, such as Los Angeles…”

REALITY: There are no fracking or unconventional oil production operations in the city of Los Angeles — none. EA attempts to justify this claim by employing the common activist tactic of expanding the definition of fracking to encompass all oil and gas related activity:

“Throughout this report, we refer to “fracking” as including all of the activities needed to bring a well into production using high-volume hydraulic fracturing. This includes drilling the well, operating that well, processing the gas or oil produced from that well, and delivering the gas or oil to market. The oil and gas industry often uses a more restrictive definition of “fracking” that includes only the actual moment in the extraction process when rock is fractured – a definition that obscures the broad changes to environmental, health and community conditions that result from the use of high-volume hydraulic fracturing in oil and gas extraction.”

Fracking is not used as a completion technique at any of the urban drill sites in the city. All of the facilities recover oil through traditional water flood operations. The attempt to shoehorn these operations into a report about fracking proves that EA is not engaged in an honest attempt to inform the public.

Environment America Claim: “Because of the health hazard created by radon, Pennsylvania has a long record of radon measurements in homes. An analysis of those radon measurements by researchers at Johns Hopkins School of Public Health found that radon levels have increased in counties with extensive fracking since 2004, and also found elevated radon levels on the first floor of houses located within 12.5 miles of a fracked well.”

REALITY: The Johns Hopkins study EA is referring to actually found the highest concentrations of radon were in areas with no shale development and direct sampling found radon not linked to fracking. As is the case with so many of the studies EA uses as evidence, the authors merely speculated fracking was the cause.

Environment America Claim: “Oil and gas production at fracked wells releases volatile organic compounds and nitrogen oxides that contribute to the formation of smog.”

REALITY: Oil and gas production is not a major contributor to ground-level ozone.

As EID has emphasized before, publicly available information demonstrates oil and gas production is not a significant contributor to ozone levels. Vehicle exhaust adds far more non-methane volatile organic compounds (NMVOCs) and nitrogen oxides (NOx) — both precursors to ground-level ozone — to the atmosphere than oil and gas production, as data from the EPA’s 2016 Greenhouse Gas Inventory clearly demonstrate.

Oil and gas activities account for just six percent of total NOx emissions, which play more of a role in ground-level ozone formation than VOCs, and a recent NOAA report found that “The increased use of natural gas has…led to emissions reductions of NOx (40%) and SO2 (44%).”

Environment America Claim: “Contaminants can reach water supplies through faulty well construction, through surface spills, through improper wastewater disposal, or potentially through migration from the shale layer itself.”

REALITY: The EPA’s landmark five-year study confirmed, “hydraulic fracturing activities have not led to widespread, systemic impacts to drinking water resources,” and at least 15 other studies have found that the fracking process, specifically, has not contaminated groundwater.

Conclusion

EA’s claims in this report — aimed at generating headlines — are quite profound.

“Schools and day care centers should be safe places for kids to play and learn,” said Rachel Richardson, director of Environment America’s Stop Drilling program and co-author of the report. “Unfortunately our research shows far too many kids may be exposed to dirty air and toxic chemicals from fracking right next door.”

The problem is that EA’s “research” merely found that there are some schools, nursing homes and hospitals near oil and natural gas development. It made no effort to collect its own data to support its claim that this is leading to adverse health effects.

Instead, it relied on long-debunked studies and tired fear tactics. Maybe that’s why the report’s hyperbolic claim that it “serves as a reminder of the unacceptable dangers of fracking, its potential to harm, and the need to bring this risky form of drilling to an end” was virtually ignored by the media.

3. Extensive Research Study Found No Link Between Groundwater Pollution and Fracking
(Source: National Science Foundation and Duke University study, summarized by Jeffrey Folks for American Thinker in The science is settled, fracking is safe.)

Among the 130 wells studied, the researchers found only a subset of cases, including seven in Pennsylvania and one in Texas, in which faulty well construction or cementing was to blame for the seepage of gases into groundwater. According to Professor Avner Vengosh of Duke University, “[t]hese results appear to rule out the migration of methane up into drinking water aquifers from depth because of horizontal drilling or hydraulic fracturing.” That is to say, in the rare cases where it occurs, gas is entering the water supply from outside the borehole as a result of faulty well construction or poor cementing, both of which are manageable problems.

While the new report answers the most important question, proving beyond doubt that fracking itself does not cause gas to seep into the water supply, it does not address several other important questions. One of these is the frequency of contamination of water supplies by naturally occurring petroleum, methane, and other gases.

Natural pollution of this kind would seem to be extremely common, and in fact this natural process has been known for millennia. At sites where petroleum seeped to the surface, as in the vicinity of the 19th-century Drake oil field in Pennsylvania, Native Americans had made use of the oily substance as a lubricant for hundreds if not thousands of years. That oil, flowing naturally to the surface, was “contaminating” nearby streams and groundwater.

What humans add to natural emissions as a result of drilling is so minor as to be of little consequence. If some future study confirmed this fact, it would help to counter the myth that oil and gas drilling is polluting an otherwise pure land and sea environment. The reality is that wherever shale and other carbon-rich formations occur, natural leakage of petroleum and/or methane is inevitable. Oil and gas are naturally occurring substances that constantly interact with the environment and enter the water supply through natural processes. As is so often the case, the idea that there once existed an environment free of all that modern intellectuals might consider unpleasant is simply a fantasy.

The NSF/Duke report is crucial to the debate over the safety of hydraulic fracturing. The oil and gas industry has already achieved a near perfect safety record, given the handful of failed wells in proportion to more than one million that have been fracked. The industry needs to continue working to achieve certainty that wells do not fail. It also needs to do a better job of communicating its intention to do so to a skeptical public.

4. Is Fracking Safe? The 10 Most Controversial Claims About Natural Gas Drilling
(Source: article by Seamus McGraw for Popular Mechanics, 2016)

Members of Congress, gas companies, news organizations, drilling opponents: They’ve all made bold claims about hydraulic fracturing (fracking) and the U.S. supply of underground natural gas. We take on 10 controversial quotes about natural gas and set the record straight.

“WE ARE THE SAUDI ARABIA OF NATURAL GAS.” SEN. JOHN KERRY, D-MASS., MAY 2010

Less than a decade ago, industry analysts and government officials fretted that the United States was in danger of running out of gas. No more. Over the past several years, vast caches of natural gas trapped in deeply buried rock have been made accessible by advances in two key technologies: horizontal drilling, which allows vertical wells to turn and snake more than a mile sideways through the earth, and hydraulic fracturing, or fracking. Developed more than 60 years ago, fracking involves pumping millions of gallons of chemically treated water into deep shale formations at pressures of 9000 pounds per square inch or more. This fluid cracks the shale or widens existing cracks, freeing hydrocarbons to flow toward the well.

These advances have led to an eightfold increase in shale gas production over the past decade. According to the Energy Information Administration, shale gas will account for nearly half of the natural gas produced in the U.S. by 2035. But the bonanza is not without controversy, and nowhere, perhaps, has the dispute over fracking grown more heated than in the vicinity of the Marcellus Shale. According to Terry Engelder, a professor of geosciences at Penn State, the vast formation sprawling primarily beneath West Virginia, Pennsylvania and New York could produce an estimated 493 trillion cubic feet of gas over its 50- to 100-year life span. That’s nowhere close to Saudi Arabia’s total energy reserves, but it is enough to power every natural gas-burning device in the country for more than 20 years. The debate over the Marcellus Shale will shape national energy policy—including how fully, and at what cost, we exploit this vast resource.
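As a rough sanity check on the “more than 20 years” figure, a quick back-of-envelope calculation works, assuming total U.S. natural gas consumption of roughly 24 trillion cubic feet per year (my assumption; the article does not supply this number):

```python
# Back-of-envelope check of the "more than 20 years" claim.
# Assumption: total U.S. natural gas consumption of ~24 trillion cubic
# feet per year, its approximate level around the time of the article.
marcellus_reserve_tcf = 493       # estimated recoverable gas, trillion cubic feet
us_consumption_tcf_per_year = 24  # assumed annual U.S. consumption

years_of_supply = marcellus_reserve_tcf / us_consumption_tcf_per_year
print(f"{years_of_supply:.1f} years")  # roughly 20 years, consistent with the claim
```

Under that assumed consumption rate, the Marcellus estimate alone covers about two decades of total U.S. gas use, which is in line with Engelder’s claim.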

“HYDRAULIC FRACTURING SQUANDERS OUR PRECIOUS WATER RESOURCES.” Green Party of Pennsylvania, April 2011

There is no question that hydraulic fracturing uses a lot of water: It can take up to 7 million gallons to frack a single well, and at least 30 percent of that water is lost forever, after being trapped deep in the shale. And while there is some evidence that fracking has contributed to the depletion of water supplies in drought-stricken Texas, a study by Carnegie Mellon University indicates the Marcellus region has plenty of water and, in most cases, an adequate system to regulate its usage. The amount of water required to drill all 2916 of the Marcellus wells permitted in Pennsylvania in the first 11 months of 2010 would equal the amount of drinking water used by just one city, Pittsburgh, during the same period, says environmental engineering professor Jeanne VanBriesen, the study’s lead author. Plus, she notes, water withdrawals of this new industry are taking the place of water once used by industries, like steel manufacturing, that the state has lost. Hydrogeologist David Yoxtheimer of Penn State’s Marcellus Center for Outreach and Research gives the withdrawals more context: Of the 9.5 billion gallons of water used daily in Pennsylvania, natural gas development consumes 1.9 million gallons a day (mgd); livestock use 62 mgd; mining, 96 mgd; and industry, 770 mgd.
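To put Yoxtheimer’s figures in perspective, here is a minimal sketch (all numbers taken from the paragraph above; mgd = million gallons per day) computing each use’s share of statewide water consumption:

```python
# Shares of Pennsylvania's daily water use, per the figures quoted above.
total_mgd = 9_500  # 9.5 billion gallons per day statewide

uses = {
    "natural gas development": 1.9,
    "livestock": 62,
    "mining": 96,
    "industry": 770,
}

for name, mgd in uses.items():
    share = 100 * mgd / total_mgd
    print(f"{name}: {mgd} mgd ({share:.2f}% of statewide use)")
```

By these figures, gas development draws about 0.02 percent of the state’s daily water use, two orders of magnitude less than mining or livestock.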

“NATURAL GAS IS CLEANER, CHEAPER, DOMESTIC, AND IT’S VIABLE NOW.” OILMAN TURNED NATURAL-GAS CHEERLEADER T. BOONE PICKENS, SEPTEMBER 2009

Burning natural gas is cleaner than oil or gasoline, and it emits half as much carbon dioxide, less than one-third the nitrogen oxides, and 1 percent as much sulfur oxides as coal combustion. But not all shale gas makes it to the fuel tank or power plant. The methane that escapes during the drilling process, and later as the fuel is shipped via pipelines, is a significant greenhouse gas. At least one scientist, Robert Howarth at Cornell University, has calculated that methane losses could be as high as 8 percent. Industry officials concede that they could be losing anywhere between 1 and 3 percent. Some of those leaks can be prevented by aggressively sealing condensers, pipelines and wellheads. But there’s another upstream factor to consider: Drilling is an energy-intensive business. It relies on diesel engines and generators running around the clock to power rigs, and heavy trucks making hundreds of trips to drill sites before a well is completed. Those in the industry say there’s a solution at hand to lower emissions—using natural gas itself to power the process. So far, however, few companies have done that.

“[THERE’S] NEVER BEEN ONE CASE—DOCUMENTED CASE—OF GROUNDWATER CONTAMINATION IN THE HISTORY OF THE THOUSANDS AND THOUSANDS OF HYDRAULIC FRACTURING [WELLS].” SEN. JAMES INHOFE, R-OKLA., APRIL 2011

The senator is incorrect. In the past two years alone, a series of surface spills, including two blowouts at wells operated by Chesapeake Energy and EOG Resources and a spill of 8000 gallons of fracking fluid at a site in Dimock, Pa., have contaminated groundwater in the Marcellus Shale region. But the idea stressed by fracking critics that deep-injected fluids will migrate into groundwater is mostly false. Basic geology prevents such contamination from starting below ground. A fracture caused by the drilling process would have to extend through the several thousand feet of rock that separate deep shale gas deposits from freshwater aquifers. According to geologist Gary Lash of the State University of New York at Fredonia, the intervening layers of rock have distinct mechanical properties that would prevent the fissures from expanding a mile or more toward the surface. It would be like stacking a dozen bricks on top of each other, he says, and expecting a crack in the bottom brick to extend all the way to the top one. What’s more, the fracking fluid itself, thickened with additives, is too dense to ascend upward through such a channel. EPA officials are closely watching one place for evidence otherwise: tiny Pavillion, Wyo., a remote town of 160 where high levels of chemicals linked to fracking have been found in groundwater supplies. Pavillion’s aquifer sits several hundred feet above the gas cache, far closer than aquifers atop other gas fields. If the investigation documents the first case of fracking fluid seeping into groundwater directly from gas wells, drillers may be forced to abandon shallow deposits—which wouldn’t affect Marcellus wells.

“THE GAS ERA IS COMING, AND THE LANDSCAPE NORTH AND WEST OF [NEW YORK CITY] WILL INEVITABLY BE TRANSFORMED AS A RESULT. WHEN THE VALVES START OPENING NEXT YEAR, A LOT OF POOR FARM FOLK MAY BECOME TEXAS RICH. AND A LOT OF OTHER PEOPLE—ESPECIALLY THE ECOSENSITIVE NEW YORK CITY CROWD THAT HAS SETTLED AMONG THEM—WILL BE APOPLECTIC AS THEIR PRISTINE WEEKEND SANCTUARY IS CONVERTED INTO AN INDUSTRIAL ZONE, CRISSCROSSED WITH DRILL PADS, PIPELINES, AND ACCESS ROADS.” New York magazine, Sept. 21, 2008

Much of the political opposition to fracking has focused on the Catskill region, headwaters of the Delaware River and the source of most of New York City’s drinking water. But the expected boom never happened—there’s not enough gas in the watershed to make drilling worthwhile. “No one has to get excited about contaminated New York City drinking water,” Penn State’s Engelder told the Times Herald-Record of Middletown, N.Y., in April. The shale is so close to the surface that it’s not concentrated in large enough quantities to make recovering it economically feasible. But just to the west, natural gas development is dramatically changing the landscape. Drilling rigs are running around the clock in western Pennsylvania. Though buoyed by the economic windfall, residents fear that regulators can’t keep up with the pace of development. “It’s going to be hard to freeze-frame and say, ‘Let’s slow down.’” Sen. Robert P. Casey Jr., D-Pa., said last fall. “That makes it more difficult for folks like us, who say we want to create the jobs and opportunity in the new industry, but we don’t want to do it at the expense of water quality and quality of life.”

“NATURAL GAS IS AFFORDABLE, ABUNDANT AND AMERICAN. IT COSTS ONE-THIRD LESS TO FILL UP WITH NATURAL GAS THAN TRADITIONAL GASOLINE.” REP. JOHN LARSON, D-CONN., CO-SPONSOR OF H.R. 1380, A MEASURE THAT WOULD PROVIDE TAX INCENTIVES FOR THE DEVELOPMENT AND PURCHASE OF NATURAL GAS VEHICLES, MARCH 2011

That may be true. Plus, there’s another incentive: Vehicles powered by liquefied natural gas, propane or compressed natural gas run cleaner than cars with either gasoline or diesel in the tank. According to the Department of Energy, if the transportation sector switched to natural gas, it would cut the nation’s carbon-monoxide emissions by at least 90 percent, carbon-dioxide emissions by 25 percent and nitrogen-oxide emissions by up to 60 percent. But it’s not realistic: Nationwide, there are only about 3500 service stations (out of 120,000) that offer natural gas-based automotive fuel, and it would cost billions of dollars and take years to develop sufficient infrastructure to make that fuel competitive with gasoline or diesel. And only Honda makes a car that can run on natural gas. That doesn’t mean natural gas has no role in meeting the nation’s short-term transportation needs. In fact, buses in several cities now rely on it, getting around the lack of widespread refueling opportunities by returning to a central terminal for a fill-up. The same could be done for local truck fleets. But perhaps the biggest contribution natural gas could make to America’s transportation picture would be more indirect—as a fuel for electric-generation plants that will power the increasingly popular plug-in hybrid vehicles.

“DO NOT DRINK THIS WATER” HANDWRITTEN SIGN IN THE DOCUMENTARY GASLAND, 2010

It’s an iconic image, captured in the 2010 Academy Award-nominated documentary GasLand. A Colorado man holds a flame to his kitchen faucet and turns on the water. The pipes rattle and hiss, and suddenly a ball of fire erupts. It appears a damning indictment of the gas drilling nearby. But Colorado officials determined the gas wells weren’t to blame; instead, the homeowner’s own water well had been drilled into a naturally occurring pocket of methane. Nonetheless, up to 50 layers of natural gas can occur between the surface and deep shale formations, and methane from these shallow deposits has intruded on groundwater near fracking sites. In May, Pennsylvania officials fined Chesapeake Energy $1 million for contaminating the water supplies of 16 families in Bradford County. Because the company had not properly cemented its boreholes, gas migrated up along the outside of the well, between the rock and steel casing, into aquifers. The problem can be corrected by using stronger cement and processing casings to create a better bond, ensuring an impermeable seal.

“AS NEW YORK GEARS UP FOR A MASSIVE EXPANSION OF GAS DRILLING IN THE MARCELLUS SHALE, STATE OFFICIALS HAVE MADE A POTENTIALLY TROUBLING DISCOVERY ABOUT THE WASTEWATER CREATED BY THE PROCESS: IT’S RADIOACTIVE.” ProPublica, November 2009

Shale has a radioactive signature—from uranium isotopes such as radium-226 and radium-228—that geologists and drillers often measure to chart the vast underground formations. The higher the radiation levels, the greater the likelihood those deposits will yield significant amounts of gas. But that does not necessarily mean the radioactivity poses a public health hazard; after all, some homes in Pennsylvania and New York have been built directly on Marcellus shale. Tests conducted earlier this year in Pennsylvania waterways that had received treated water—both produced water (the fracking fluid that returns to the surface) and brine (naturally occurring water that contains radioactive elements, as well as other toxins and heavy metals from the shale)—found no evidence of elevated radiation levels. Conrad Dan Volz, former scientific director of the Center for Healthy Environments and Communities at the University of Pittsburgh, is a vocal critic of the speed with which the Marcellus is being developed—but even he says that radioactivity is probably one of the least pressing issues. “If I were to bet on this, I’d bet that it’s not going to be a problem,” he says.

“CLAIMING THAT THE INFORMATION IS PROPRIETARY, DRILLING COMPANIES HAVE STILL NOT COME OUT AND FULLY DISCLOSED WHAT FRACKING FLUID IS MADE OF.” Vanity Fair, June 2010

Under mounting pressure, companies such as Schlumberger and Range Resources have posted the chemical compounds used in some of their wells, and in June, Texas became the first state to pass a law requiring full public disclosure. This greater transparency has revealed some oddly benign ingredients, such as instant coffee and walnut shells—but also some known and suspected carcinogens, including benzene and methanol. Even if these chemicals can be found under kitchen sinks, as industry points out, they’re poured down wells in much greater volumes: about 5,000 gallons of additives for every 1 million gallons of water and sand (roughly 0.5 percent by volume). A more pressing question is what to do with this fluid once it rises back to the surface. In Texas’s Barnett Shale, wastewater can be reinjected into porous rock formations 1.5 miles below ground. This isn’t feasible in the Marcellus Shale region; the underlying rocks are not porous enough. Currently, a handful of facilities in Pennsylvania are approved to treat the wastewater. More plants, purpose-built for the task, are planned. In the meantime, most companies now recycle this water to drill their next well.

“THE INCREASING ABUNDANCE OF CHEAP NATURAL GAS, COUPLED WITH RISING DEMAND FOR THE FUEL FROM CHINA AND THE FALL-OUT FROM THE FUKUSHIMA NUCLEAR DISASTER IN JAPAN, MAY HAVE SET THE STAGE FOR A ‘GOLDEN AGE OF GAS.’” WALL STREET JOURNAL SUMMARIZING AN INTERNATIONAL ENERGY AGENCY REPORT, JUNE 6, 2011

There’s little question that the United States, with 110 years’ worth of natural gas (at the 2009 rate of consumption), is destined to play a major role in the fuel’s development. But even its most ardent supporters, men like T. Boone Pickens, concede that it should be a bridge fuel between more polluting fossil fuels and cleaner, renewable energy. In the meantime, the U.S. should continue to invest in solar and wind, conserve power and implement energy-efficient technology. Whether we can effectively manage our natural gas resource while developing next-gen sources remains to be seen. Margie Tatro, director of fuel and water systems at Sandia National Laboratories, says, “I think natural gas is a transitioning fuel for the electricity sector until we can get a greater percentage of nuclear and renewables on the grid.”

 

5. Compendium of Studies Demonstrating the Safety and Health Benefits of Fracking

The United States has made massive improvements in air quality over the past decade, and study after study has shown that the increased use of natural gas for electricity generation – made possible by the shale revolution – is the reason we’ve achieved this feat.

This progress is the centerpiece of Energy In Depth’s new report – Compendium of Studies Demonstrating the Safety and Health Benefits of Fracking – which includes data from 23 peer-reviewed studies, 17 government health and regulatory agencies, and reports from 10 research institutions that clearly demonstrate:

• Increased natural gas use – thanks to hydraulic fracturing – has led to dramatic declines in air pollution. The United States is the number one oil and gas producer in the world, and it has some of the lowest death rates from air pollution in the world. Numerous studies have shown that pollution has plummeted as natural gas production has soared.

• Emissions from well sites and associated infrastructure are below thresholds regulatory authorities consider to be a threat to public health – that’s the conclusion of multiple studies using air monitors that measure emissions directly.

• There is no credible evidence that fracking causes or exacerbates asthma. In fact, asthma rates and asthma hospitalizations across the United States have declined as natural gas production has ramped up.

• There is no credible evidence that fracking causes cancer. Studies that have directly measured emissions at fracking sites have found emissions below the threshold that would be harmful to public health.

• There is no credible evidence that fracking leads to adverse birth outcomes. In fact, adverse birth outcomes have decreased while life expectancy has increased in areas that are ramping up natural gas use.

• Fracking is not a credible threat to groundwater. Study after study has shown that there are no widespread, systemic impacts to drinking water from hydraulic fracturing.

It is well known that the shale revolution has been a boon to our nation’s economy, its geopolitical position, and the millions of consumers and manufacturers who continue to benefit from historically low energy costs. But shale’s salubrious effect on air quality and health remains underreported – this new report puts the health benefits of our increased use of natural gas in the spotlight.

Conclusion
To be clear, no form of energy development, whether we’re talking about fossil fuels or
renewables, is risk free. But the data clearly show, time and time again, that emissions
from fracking are not a credible risk to public health.

In fact, the data show that enormous reductions in pollution across the board are
attributable to the significant increases in natural gas consumption that hydraulic
fracturing has made possible.

They show power plant emissions of SO2 declining by 86 percent, emissions of NOx
declining by 67 percent, and emissions of mercury by 55 percent. They also show
hospitalizations for asthma declining as natural gas ramps up. At the same time life
expectancy and birth outcomes have improved.

And, of course, all these positive health outcomes can be largely traced back to
significantly cleaner air, thanks to fracking.

via Science Matters

https://ift.tt/2HLiGO1

June 13, 2018 at 06:57PM

Yesterday’s Climate Debate of the Decade: a summary from an attendee

Summary of the event Conversations on Climate Change held in Charleston, WV, Geary Auditorium, on June 12th, 2018.

By Brian Lindauer

“Our premise was this: Climate change is undeniable, but there is disagreement as to whether human activity is causing it, and if so, to what degree.”

So states the informational material provided by Spilman Thomas & Battle to the attendees of their privately organized forum on climate change. It’s quite curious that the evening was put together by, of all things, a law firm. One might expect this sort of event to have been put together by a university’s science department, or perhaps one of the national scientific organizations, such as our own David Middleton’s favorite, the American Association for the Advancement of Science. In this case, however, it was a confluence of interests that prompted Spilman Thomas & Battle to organize the evening.

They describe themselves as a super-regional law firm based in the mid-Atlantic, but given that they’re headquartered in Charleston, their client base includes businesses in the energy sector, manufacturing, and related industry. As such, climate change and its potential regulatory impacts are of deep concern to them. Adding to this, several of the partners hold a personal interest in the subject. So when it came time for the firm to choose a subject for one of their periodic public forums, climate change seemed an obvious choice.

So the organizers arranged for two sides to be represented, with Dr. Michael Mann and Dr. David Titley on one side, and Dr. Judith Curry and Dr. Patrick Moore on the other.

The event was held at University of Charleston’s Geary Auditorium, but just to be clear, was not actually an official university event.

With Spilman partner Nicholas Preservati moderating and introducing the topic and speakers, the attendees were informed that the position of the organizers was simple: it’s not IF there’s climate change, but rather how much man has contributed to it through the addition of CO2 into the atmosphere. This was to be the framework under which all the presenters agreed to speak.

The format for the evening’s discussion was simple. Each speaker would have fifteen minutes to present. After the four presentations, a question and answer session would follow, in which the moderator would present previously vetted questions to the speakers. The speakers each addressed three questions during this phase. Finally, the speakers were provided with the opportunity to give a final comment, limited to two minutes each. The order of speakers for the presentations was Mann, Curry, Titley, Moore, with this order being maintained through each of the phases.

What was promised was a collegial discussion, “a fascinating and enlightening conversation” between world-renowned scientists on an issue that has been divisive, and at times, vitriolic. This was largely what was delivered. Not a debate. But rather the presentation of a diversity of viewpoints.

Now, in the interest of completeness, I’ll offer a synopsis of each speaker and their main points below. But before I do, I think it might be useful to offer some overall thoughts regarding the event. Going in, I think it’s unlikely that anyone would find an event such as this sufficient to change a mind that’s already made up. What it can do, however, is introduce a topic, or suggest an idea, that might lead an individual to do some deeper exploring. I know this was the true hope of the organizers. And listening to the information presented, I do believe there was enough there for a curious mind to be intrigued.

The forum presented information ranging from Mann’s hockey stick to the paleoclimate record. We heard claims of induced ice melt causing irreversible sea level rise, as well as a counter-claim showing a completely natural explanation that has nothing to do with CO2 driven warming.

National security concerns were discussed related to a potential “500 million people in play,” migrating due to sea level rise. (As a point of reference, we were reminded of the staggering impact on Europe that one million Syrian refugees had, with it being left up to us to infer the impact of 500 times this number.) And extreme weather events, such as droughts and flooding were repeatedly referenced.

In the end, I think Dr. Curry was most accurate when she described the CO2 control knob theory as “overly simplistic.” The idea that man is responsible for permanently harming the climate is an easy thing to believe. We burn fossil fuels. This releases CO2, which warms the atmosphere. This warming makes all these other bad things inevitable. It’s simple. Direct. And there’s enough evidence easily available to convince an unwary scientist of its veracity.

It’s only when you dig deeper though, and eschew the seductive easy explanation, that you begin to note that the evidence might not be so easily explained by your theory after all. This can be hard for people to accept, though, and there’s no telling what will trigger it for each individual. Did people walk away from the event believing Dr. Mann’s claim that there’s “no worthy debate to be had” on the science? Or did they hear Dr. Curry’s scientific questioning and Dr. Moore’s unfettered passion and wonder, if there’s no debate to be had, how is it these two incredibly intelligent individuals, and noted scientists in their fields, don’t agree?

No one can answer this for sure, but we can certainly hope…after all, besides wanton destruction of the earth’s climate, isn’t hope what we humans do best?

Speaker Summaries:

Dr. Michael Mann, at ease and confident at the podium, led off the evening by stating his hope for “a robust conversation” on how to address climate change. His presentation was based around the idea that the only debate to be had is on what to do about man-made climate change. Indeed, he stated this position several times, reinforcing it by clarifying that there’s no worthy debate to be had on whether there’s a problem, or that man has caused it. As a justification for this, Dr. Mann explained that the science behind anthropogenic climate change is verifiable fact. Incontrovertible. Well known and agreed upon for over a hundred years.

Of all the claims made throughout the evening, this is the one I found most personally problematic. Clearly scientists such as Curry and Moore aren’t, to borrow a tired phrase, “denying” the basic science of atmospheric and radiative physics. To claim otherwise, or even to imply through omission that they do, is unfair, untrue, and frankly does nothing to increase the credibility of the presenter.

At any rate, moving on, as anyone familiar with this subject could guess, Dr. Mann’s presentation centered on his “iconic” hockey stick graph, noting that this year marks the 20th anniversary of its publication. The point he made sure to emphasize with the hockey stick was that the “warming spike” of the late 20th century is unnatural, and unprecedented in tens of thousands of years. He noted that 2014, 2015, and 2016 were each record-breaking years for global temperatures, and cited his 2017 paper which ostensibly demonstrated there was only a 1 in 3,000 chance that three consecutive record-breaking years would be due to natural causes. In the course of his presentation, Dr. Mann made two specific claims: temperatures are now likely to rise by 4 to 5 degrees Celsius and sea levels by 6 to 8 feet.

Dr. Judith Curry’s careful and precise approach was an interesting contrast to Dr. Mann’s. Whereas he spoke engagingly, but quickly, Dr. Curry never broke stride from her measured and deliberate pace. I’m not sure how much lecturing she did during her tenure at Georgia Tech, but she certainly seemed practiced and poised at the podium during her presentation.

In it she systematically described how she moved from agreement with the IPCC to a skeptic position. She noted the areas where there is agreement between scientists: that global temperatures have increased, that humans have contributed to the rise in CO2, and that CO2 is a greenhouse gas. She also pointed out that the crucial point of disagreement was not related to these basic scientific premises, but rather was in how much of the temperature increase can be attributed to CO2. She pointed out that upon deeper investigation, many of the observations used by scientists such as Dr. Mann to support the man-made climate change theory had natural explanations, and needed no help from CO2 to understand. Sea level rise made its second appearance of the evening, when Dr. Curry used it as a specific example with natural explanations.

She also presented a clear delineation between the two competing understandings of the climate: CO2 Control Knob versus Natural Variability. If you were to summarize Dr. Curry’s general position from this one brief presentation, you would conclude that she strongly believes “you get what you get” with the earth’s climate, and that it’s unlikely that mankind has forced any significant perturbation.

It’s interesting to note that Dr. Curry took the time in her short presentation to describe the madhouse effect, and how it’s being played out in the climate science community. I found her most scathing critique of the night summed up in her first point on this: a “rampant overconfidence in an overly simplistic theory of climate change”. Without claiming to know her personally, I would describe this as classic Curry. Precise. Sharp. And to the point.

Dr. David Titley was next, and he was clearly a gifted speaker, with light jests and humor sprinkled throughout his presentation. Interestingly, in this crowd his jokes seemed to miss more often than they hit, despite it being what he described as a “target rich environment”. All I can say is, tough crowd, doc! Keep at it and you’ll find your groove eventually.

In all seriousness, Dr. Titley, like Dr. Mann, attempted to conflate the uncontested scientific premises of John Tyndall and Svante Arrhenius with the claims of man-made climate change. In doing this, he drew an analogy between a three-legged stool and the three bases for his scientific convictions: fundamental theory, observations, and predictions. In Dr. Titley’s estimation, we have a fundamental theory that matches our observations, and as for our predictions, if anything, they’re too conservative.

As evidence, Dr. Titley showed a graph which purported to demonstrate Dr. James Hansen’s analysis from 20 years ago (apologies for the vague description…my notes are a bit unclear here) and how it had fared against observations. In the analysis presented by Dr. Titley, Hansen actually under-predicted the warming (or sea level rise…again, I’m unsure what the chart actually was). With this three-legged stool thus secured comfortably beneath him, Dr. Titley was able to focus on the implications for national security, as well as on how we as a society can alleviate the economic stress our mitigation efforts will necessarily cause.

On this point, credit is given where due; Dr. Titley expressed a clear and unambiguous concern for those whose livelihoods might be impacted due to policy choices and increased regulations. His conclusion, though, was that we have the capacity to help those impacted, and should not let it stand in the way of moving away from fossil fuels.

Dr. Patrick Moore was the final presenter, and spoke with passion about the increase in CO2 being a wonderful boon to all life. If you ever wondered why or how he got into environmental activism, you understood immediately upon hearing him speak. This is an individual who feels strongly, and believes fervently, in his message. His presentation began with some pictures of himself from the heady days of free love, cheap drugs, and…Russian whaling? Yes. Our dear Patrick Moore, in his life on the edge, has pictures of himself in an inflatable boat pulling a Tiananmen Square with a Russian whaler. Oh, and he also had hair…but that’s a different topic.

With regards to his actual discussion, Dr. Moore began by running through what we know of paleoclimate history, showing charts that indicated temperatures and CO2 were not in sync throughout the record. It was a whirlwind tour through some five hundred million years of the earth’s climate history, with his basic premise being that CO2 has never been the cause of the earth’s many, and significant, climate fluctuations, so there’s no reason to assume it is today either. Furthermore, he clarified that despite the warming of the last 150 years, it’s still colder now than during the peak of the last five interglacials. Nor have there been any climate or weather events out of line with those experienced in the last ten thousand years.

Dr. Moore moved on from the paleo record quickly, though, and spent a good portion of the remaining time discussing all the benefits of CO2, concluding with his charge to “celebrate CO2”. (If I were in his marketing department, I’d suggest making this even more catchy by saying, “Celebrate, Don’t Regulate!”)

As Dr. Moore’s time ran out, it was clear that there were several other points he wanted to make, and my opinion is he may have tried to fit too much in.

As noted, following the presentations, there was a Q&A session and a closing comment opportunity. Here are my notes on interesting points made:

Mann:

  • The Pages 2K project validated the results of his original hockey stick
  • We don’t have any confidence in the paleo climate record more than 30K – 40K years old
  • Recent warming is unprecedented in totality of known climate record (the 30K – 40K year)
  • 350 – 380ppm is ideal CO2 level
  • No honest debate can be had about the basics of the science
  • Visit skepticalscience.com for more info on how to talk to skeptics

Curry:

  • Risk mitigation strategies must match the level of the risk
  • The precautionary principle is dangerous because it may set you down the wrong path
  • Beware the cure that’s worse than the disease
  • Man is not capable of controlling the climate, we will get what we get
  • IPCC set a range of 1.5 – 4.5 degrees Celsius for ECS, but GCMs seem tuned to about 3.2C (high-end)
  • There’s too much uncertainty in our understanding to make broad sweeping claims
  • We have no idea what the optimal CO2 level should be

Titley:

  • Droughts and temperatures are the specific components of climate change most attributable to CO2
  • Extreme weather events will get worse
  • Sea level rise is the single biggest concern, with up to 25′ – 30′ likely (100 – 200 years out)
  • Orlando could be the southernmost point of a future Florida
  • The optimal CO2 level is the level that caused climate stability, which in turn allowed mankind to flourish (starting 8,000 years ago)…so mid-300s ppm is ideal.

Moore:

  • Consensus is a political word, not a scientific one
  • The impact of 2C increase would be equivalent to moving to Florida (insignificant)
  • Civilization began to flourish during the Holocene maximum, which was warmer than today (glacier advance and subsequent retreat since then demonstrate that it was warmer then than it is today)
  • Total reduction of man-made CO2 emissions is not only impossible, but it’s undesirable
  • Man’s accidental intervention into the carbon cycle may have inadvertently halted the slow death of plant life by reintroducing needed CO2 into the cycle
  • Ideal CO2 levels for plants are around 1000ppm, and there’s no reason to seek to avoid this level

Although I’ll provide expanded details for each presenter below, in the interest of keeping this digestible, I think it’s fair to provide my overall take up front, with a more detailed summary following at the end.

Dr. Mann’s key points and claims could be summarized as follows:

  • CO2 has now reached 410ppm, a level not seen in millions of years
  • There is now a veritable “hockey league” of graphs validating his original hockey stick graph
  • Based on current projections and “business as usual”, 4 to 5 degrees C warming is likely, and twice that in the Arctic
  • The models are wrong on sea ice…it’s melting FASTER than they projected (it’s not clear if the graph Dr. Mann displayed was global, arctic, or antarctic sea ice projections and observations)
  • The melting of the ice sheet represents a tipping point. Once it starts, it’s impossible to stop, as substantial feedbacks kick in
  • Climate change is now changing the jet stream, inducing large meanders into it
  • Sea level rise expectations have increased from 3′ – 4′ to 6′ – 8′

During the Q&A phase Dr. Mann addressed three questions:

1) Why don’t we move towards clean coal?
Dr. Mann’s position is that clean coal is not currently economically competitive against natural gas, but otherwise is perfectly acceptable as an energy source from a climate change perspective as long as it “keeps the genie in the bottle”
2) What, if anything, would he do differently on his hockey stick graph if he were doing it today?
Dr. Mann noted that his hockey stick graph was the first time this type of analysis had been attempted. And like any “seminal piece of work”, there were things to improve. However, though much has been learned in the intervening years, the hockey stick has been validated, most notably by the Pages2K project. See here for Willis’ take on it at the time it was published: https://ift.tt/2t0CNSS
3) What is the optimal level of CO2? (Note: This last question was a general question addressed by each of the speakers.)
Dr. Mann’s original answer to this question was a bit evasive, or perhaps it’d be fairer to categorize it as equivocal. Either way, Nick (the moderator) pressed him to give a concrete answer. Upon being pressed Dr. Mann hypothesized that 350 – 380 ppm were optimal. He also stated that CO2 levels greater than 400ppm could result in up to 60′ to 80′ of sea level rise.


Footnote by Anthony:

I offer my sincere thanks to Brian for his excellent summary.

As many WUWT readers know, the live video feed yesterday was a disaster. The organizers recognized this, and to their credit, sent this email:

We are aware of technical challenges that made it very difficult to hear the live broadcast and are so sorry for the disappointment. We did have a separate professional recording made of the event and will share a copy of that with you as soon as it is available. Please accept our sincere apologies for the sound quality during the live event. Thank you for your patience as we work to get this remedied.

I hope that when the video is made available, we’ll be able to share it here on WUWT.

via Watts Up With That?

https://ift.tt/2t7xlNY

June 13, 2018 at 06:48PM

New paper shows issues with temperature records: Comparing the current and early 20th century warm periods in China

Guest post by Dr. Willie Soon, Dr. Ronan Connolly & Dr. Michael Connolly

Recently, a new paper which we co-authored with five other researchers was published in Earth-Science Reviews entitled, “Comparing the current and early 20th century warm periods in China”. The paper is paywalled, but the journal has kindly allowed free access to the article until 20th July 2018 at this link. If you’re reading this post after that date, you can download a pre-print here: Soon et al, 2018 ESR – China SAT trends (PDF)

The Supplementary Information and data for the paper are available here (Excel file): Soon et al, 2018 ESR – China SAT trends – SI

The paper is quite technical and focuses specifically on Chinese temperature trends. But, we think that it will still be of interest to many readers here, especially anybody who is interested in any of the following topics:

  1. Urbanization bias
  2. The homogenization of temperature data
  3. The “early 20th century warm period” found in many parts of the Northern Hemisphere, and
  4. Comparing temperature proxies to instrumental records

Background to the project

Most studies of regional temperature trends for China have found at least two warm periods for the last century – the current warm period, and an early warm period from the 1920s until the 1940s. As part of our 2015 study of Northern Hemisphere temperature trends (abstract here; preprint here), we did our own analysis of Chinese temperature trends. We found that – for rural China – the 1940s warm period was actually hotter than present.

[Figure: rural China surface temperature trends, showing the 1940s warm period]

This was quite surprising in terms of the current human-caused global warming explanation. In the 1940s, the atmospheric concentration of carbon dioxide (CO2), at 0.03% (300 ppm), was still relatively close to the pre-industrial concentrations of 0.027-0.029% (270-290 ppm), while currently the concentration is about 0.04% (400 ppm). For that reason, the current Global Climate Models calculate that (a) there shouldn’t have been an early 20th century warm period, and (b) the current warm period should be the hottest on record. This can be seen from the plot below, which shows the mean “hindcast” for China of all 41 of the “CMIP5” climate models. These are the models which were used for the most recent UN IPCC reports:

[Figure: mean CMIP5 hindcast of Chinese temperature trends]

In our 2015 paper, we argued that the most likely reason for the poor performance of the climate models is that they are underestimating the role of natural climate change and overestimating the role of carbon dioxide. [See also Soon et al., 2011 (abstract here; pdf here)].

However, while we were finding a very warm 1940s period for China, other groups were finding that the recent warm period was the “hottest on record” by far. For instance, the figure below is adapted from Liu et al., 2017 (abstract, paywalled unfortunately) – a study entitled, “Unprecedented warming revealed from multi-proxy reconstruction of temperature in southern China for the past 160 years”:

[Figure: temperature reconstruction for southern China, adapted from Liu et al., 2017]

Liu et al., 2017 did find a slight warm period in the 1940s, but they found that the recent warm period was much hotter.

When we combed through the literature on Chinese temperature trends, we found that there was actually a lot of debate over how the early-20th century and current warm periods compared to each other. Some studies found they were both quite similar, e.g., Ren et al., 2017 (abstract here; pdf here). But, others concluded that the earlier “warm period” was just a temporary localised blip, e.g., Li et al., 2017 (open access).

Why were different studies getting such different answers?

Along with two other colleagues (Prof. Hong Yan and Peter O’Neill), we decided to collaborate with three of the scientists who had reached the opposite conclusion to us (Prof. Jingyun Zheng, Prof. Quansheng Ge, Prof. Zhixin Hao) and investigate.

Prof. Zheng and Prof. Hao were both co-authors of the Liu et al., 2017 study mentioned above, i.e., the one claiming the current warm period was “unprecedented” in the last 160 years. And along with Prof. Ge, they have co-authored many papers which often reached very different conclusions from us, e.g., see Wang et al., 2018 (Open access). However, even though we each have different views on this subject, we all agreed that it is important to establish the reasons for the disagreements. Our new paper describes the results of this collaboration.

Summary of our key findings

1. Limitation of data for pre-1950s period

Probably, the biggest challenge for comparing the early 20th century and current warm periods in China is the shortage of long-term records. After the People’s Republic of China was founded in 1949, a nation-wide network of weather stations was installed across much of China, and so there is a relatively large amount of data for the post-1950s period. However, most of these stations were not available during the 1940s, i.e., at the time of the earlier warm period!

This can be seen from the figures below showing the number of stations in two of the most widely-used temperature datasets: the Climate Research Unit’s “CRUTEM” dataset and NOAA’s “GHCN” dataset.

[Figure: number of Chinese stations over time in the CRUTEM and GHCN datasets]

There is a lot of ongoing work to try and track down and digitize more data for China in the pre-1950s period, e.g., the so-called “ACRE” project – see Williamson et al., 2017 (Open access). But, for now, the available data is quite limited.

Some of you may have noticed that there was a surprising sudden drop in station numbers after 1990 for version 3 of both datasets. This was particularly pronounced for the GHCN dataset, which had a relatively large number of stations for the 1961-1990 period. There are several legacy reasons for this odd fall off, e.g., when GHCN was initially compiled, several of their main data sources had only been updated to 1990. However, with version 4 of both datasets, this particular problem seems to have been mostly resolved.

2. The urbanization bias problem

A related problem is that most of those stations with data for the early-20th century are in urban areas. This makes sense in that it is harder to staff a weather station continuously for decades if it is located in an isolated spot far from any urban area. However, urban areas tend to be warmer than the surrounding countryside due to the “Urban Heat Island” (UHI) effect. Since China has experienced a dramatic increase in urbanization over the last few decades, many of these urban station records have experienced a strong “urban warming” due to the growth of the local UHI.

This urban warming is a real climatic effect, and it is a growing problem for the Chinese population, since most people in China now live in urban areas. However, urban areas account for less than 1% of the land area of China. So, if you want to study the actual climatic trends for China, you need to correct for this “urbanization bias” problem.

To study this problem, we ranked all 494 Chinese stations in the GHCN version 4 dataset based on how urbanized they are today. We then divided the stations into five subsets according to these rankings. Below are the results for the most urban and most rural subsets:

[Figure: temperature series for the most urban and most rural subsets of the Chinese stations]

We found that the more urbanized the subset, the hotter the recent warm period seemed to be. This suggests that at least some of the recent “warming” is an artefact of urbanization bias. Unfortunately, however, the more rural the subset, the less data there was for the early 20th century – leading to larger error bars. So, if we want to compare the two periods, we can’t just rely on the rural data.

Also, we found that most of the urban stations were in eastern China, while most of the rural stations were in western and central China.

3. The homogenization debate

Urbanization bias is by no means the only non-climatic bias associated with the temperature records. There are many different ways in which changes to the thermometer station and its environment can artificially alter the values of the “measured temperatures” independently of the real climate. For example, station moves, changes in the time of observation, changes in the immediate surroundings of the thermometer, etc.

Several groups have come up with different statistically based “homogenization” algorithms to adjust the station records to attempt to correct for such “step change” non-climatic biases, such as the Menne & Williams, 2009 algorithm used by NOAA (Open access).

Meanwhile, studies such as Venema et al., 2012 (Open access) have tested these algorithms by artificially adding synthetic biases to unbiased data, and seeing how good they are at removing them. On these tests, the main algorithms seem to do quite well. This has led many people to assume that the “homogenized” temperature datasets are more reliable than the “raw” (unhomogenized) datasets.
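The idea behind these validation tests can be illustrated with a toy example. The sketch below injects a known step bias into a clean series and checks that a naive detector can locate it; real homogenization algorithms, such as the pairwise approach of Menne & Williams, are far more sophisticated than this:

```python
# Toy validation exercise: inject a known step bias into clean data,
# then check a simple detector recovers it. Real tests (e.g. Venema et
# al., 2012) use much richer synthetic biases and detection methods.

def inject_step(series, index, size):
    """Add a constant offset to all values from `index` onwards."""
    return [v + size if i >= index else v for i, v in enumerate(series)]

def detect_step(series):
    """Return the split point with the largest before/after mean difference."""
    best_i, best_gap = None, 0.0
    for i in range(1, len(series)):
        before = sum(series[:i]) / i
        after = sum(series[i:]) / (len(series) - i)
        if abs(after - before) > best_gap:
            best_i, best_gap = i, abs(after - before)
    return best_i, best_gap

clean = [0.0] * 10
biased = inject_step(clean, 6, 1.5)
print(detect_step(biased))  # recovers the injected break point and size
```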

As it happens, when the Chinese temperature datasets are homogenized, this tends to slightly reduce the warmth of the early 20th century period and increase the warmth of the current period. This has led some researchers to conclude that the 1940s warm period was at least partially an artefact of non-climatic biases, e.g., Li et al., 2017 (Open access).

However, in Sections 3.2.3-3.2.5 of the paper, we point out that it ain’t necessarily so. We show that the current homogenization algorithms have a theoretical flaw called the “urban blending problem”. Whenever there are a lot of urban stations, and urban stations are used as neighbours when homogenizing rural stations, some of the urban heat island signal from the urban stations gets blended into the “rural” station records. This means that after homogenization, all of the stations have some urban warming in their records – even if they were originally rural!
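A toy numerical example may help illustrate the blending effect. The trends below are invented for illustration, and the “adjustment” is a crude neighbour-composite pull, not any published algorithm:

```python
import statistics

# Toy illustration of "urban blending" (invented numbers, not real data):
# a rural station adjusted toward its neighbours' average inherits part of
# their urban warming trend.

# Hypothetical 1978-2008 warming trends (deg C/decade).
rural_trend = 0.10
neighbour_trends = [0.45, 0.40, 0.35, 0.12]  # three urban, one rural neighbour

# Crude neighbour-composite adjustment: pull the target halfway toward the
# neighbour average (real pairwise algorithms are far more sophisticated).
neighbour_mean = statistics.mean(neighbour_trends)
blended_trend = 0.5 * rural_trend + 0.5 * neighbour_mean

print(round(blended_trend, 3))  # now warmer than the original 0.10
```

Even though the rural station’s own trend was small, the “homogenized” version inherits a substantial fraction of its urban neighbours’ warming.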

We can see the results in the following plot which shows the effects of homogenization on a sample of 10 stations near Beijing. The plot is adapted from He and Jia, 2012 (Open access).

[Figure: 1978-2008 warming trends vs. urbanization for 10 stations near Beijing, before and after homogenization]

We can see that before homogenization, there was a very strong correlation between the 1978-2008 warming trend and the rate of urbanization. That is, most of the warming trend was a result of urbanization bias. After homogenization, the differences in trends between the stations are reduced, and all of the stations have similar rates of warming. But, while the most heavily urbanized station has been cooled, the rural stations have been warmed. Instead of the urbanization bias being removed by homogenization, it has been merely spread among all stations (rural and urban).

This urban blending effect seems to be the main reason why homogenization “cools” the 1940s warm period for China. So, before we can rely on the homogenized data for China, we have to avoid urban blending. However, because there are so few Chinese rural stations for the early warm period, nobody has yet managed to construct a homogenized Chinese dataset for the early 20th century which only uses rural stations.

4. Inconsistency of temperature proxies

One possible approach to overcoming the urbanization bias problem could be to use so-called “temperature proxies”. These are datasets constructed from temperature-influenced measurements such as tree rings and ice cores. Although these are usually only indirect measurements of temperature, if they are calibrated against local thermometer records, they can be used to estimate long-term temperature trends for the region. The records often cover hundreds (or even thousands) of years. They also tend to be located in isolated areas, such as in the mountains, and therefore are usually unaffected by urbanization.
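As a concrete (entirely hypothetical) illustration of the calibration step, the sketch below fits a linear relation between invented proxy values and overlapping thermometer readings, then uses it to estimate a pre-instrumental temperature:

```python
# Sketch of proxy calibration (hypothetical ring widths and temperatures):
# fit a linear relation over the overlap period, then use it to estimate
# temperatures from proxy values predating the instrumental record.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Overlap period: proxy values (e.g. standardized ring widths) vs measured deg C.
proxy = [0.8, 1.0, 1.2, 1.4]
temp = [8.4, 8.8, 9.2, 9.6]
a, b = fit_linear(proxy, temp)

# Reconstruct a pre-instrumental temperature from an older proxy value.
estimate = a * 0.6 + b
print(round(estimate, 2))
```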

We think that temperature proxies are a very powerful tool for the climate science community and several of our co-authors on this paper do a lot of work with temperature proxies. However, they also have their limitations and should be used cautiously. We have noticed that there has been a tendency for researchers to focus mostly on the bits where different proxy series agree with each other. But, in Section 3.5, we point out that we should be comparing and contrasting the proxy series, rather than just focusing on the points of agreement.

Below are four different proxies for the same region, i.e., northeast China (although Wiles et al., 2014 is technically for northeast Asia). All of them show warming and cooling periods, and it is possible to find points of agreement between any two series. However, when you try to directly compare any two specific periods, e.g., the 1920s-1940s and 1980s-present, the results depend on which series you pick!

[Figure: four temperature proxy series for northeast China]

For instance, Zhu et al., 2015 implies that both periods were similarly warm (and that there were also similar warm periods in the 19th century). However, Zhu et al., 2016 implies that the 1940s were one of the coldest decades since the 19th century!

Therefore, we believe more research is needed before temperature proxies can be used for directly comparing periods such as the early 20th century and recent warm periods.

Conclusions

So, what can we say? The main conclusions of our study are as follows:

  • The 1940s warm period seems to have been a real phenomenon in China. However, the data is probably still too limited to establish exactly how warm it was compared to the current warm period.
  • The current climate models are unable to reproduce this early-20th century warm period, and can only simulate the recent warm period. The models attribute the recent warming to greenhouse gas emissions, and calculate that greenhouse gas emissions should not have caused an early-20th century warm period. This suggests that the earlier warm period was probably due to natural climate change – and that the current models are underestimating natural climate change.
  • But, given that the models can’t explain the earlier warm period, it is quite plausible that some (or even most) of the recent warm period could also be due to similar natural climate changes to the ones which caused the 1940s warm period. That would imply that the current models are also overestimating the role of greenhouse gases in the recent warm period.

Disclaimer:

Given the fact that conspiracy theories abound in climate science, we should stress that all of the research for this collaborative paper was carried out during our free time and at our own expense. None of us received any funding for this research. The motivation for all eight of us in writing this paper was to try to advance the science on this topic.

via Watts Up With That?

https://ift.tt/2LPudhH

June 13, 2018 at 04:12PM

Huge planet-wide dust storm on Mars knocks out NASA’s ‘Opportunity’ rover – scientists worried

Dust storm is now on the verge of circumnavigating the entire planet

Mars rover Opportunity is in trouble.

NASA engineers attempted to contact Opportunity yesterday, June 12th, but did not hear back from the nearly 15-year-old rover. The problem: A huge dust storm is blanketing Perseverance Valley where Opportunity has been working. This sequence of images from NASA’s Mars Reconnaissance Orbiter shows the progression of the storm:

The huge dust cloud is highlighted in red. Soon after it appeared on May 31st, it swirled south to envelop Opportunity. Right now, the dust is so thick in Perseverance Valley that day has been turned into night. The solar-powered rover is being deprived of the sunlight it needs to charge its batteries.

NASA is now operating under the assumption that the charge in Opportunity’s batteries has dipped below 24 volts and the rover has entered low power fault mode, a condition where all subsystems, except a mission clock, are turned off. The rover’s mission clock is programmed to wake Opportunity at intervals so it can check power levels. If the batteries don’t have enough charge, the rover will put itself back to sleep again.
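The wake/check cycle described above amounts to a very simple decision rule. Here is a toy Python sketch of that logic (the 24-volt threshold comes from the article; everything else is a simplified assumption, not actual NASA flight software):

```python
# Toy model of the mission-clock wake/check cycle described above.
# Simplified assumption for illustration; not NASA flight software.

LOW_POWER_THRESHOLD = 24.0  # volts, per the article

def on_clock_wakeup(battery_voltage):
    """Mission clock wakes the rover; it checks power, then stays up or sleeps."""
    if battery_voltage < LOW_POWER_THRESHOLD:
        return "sleep"  # back to low power fault mode
    return "wake"       # enough charge to resume operations

print(on_clock_wakeup(22.5), on_clock_wakeup(25.0))
```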

In a teleconference today, NASA planners expressed optimism that Opportunity can weather this storm and wake up again after the skies clear. It may take days or weeks for this to occur, however.

This is a dust storm of rare size. It is now on the verge of circumnavigating the entire globe, covering more than a quarter of the planet’s surface. It is so large that astronomers can photograph it using amateur telescopes. Indeed, Joseph Rueck saw the storm starting on May 31st using his backyard telescope in Sebastian, Florida.

via NASA spaceweather.com


The last time there was such a planet-wide dust storm was in 2001, as seen in this Hubble Space Telescope image:

via Watts Up With That?

https://ift.tt/2t6KSVZ

June 13, 2018 at 03:44PM