The Good, the Bad and the Null Hypothesis

via Watts Up With That?
http://ift.tt/1Viafi3

Guest post by David Middleton

Introduction

When debating the merits of the CAGW (catastrophic anthropogenic global warming) hypothesis, I often encounter this sort of straw man fallacy:

All that stuff is a distraction. Disprove the science of the greenhouse effect. Win a nobel prize get a million bucks. Forget the models and look at the facts. Global temperatures are year after year reaching record temperatures. Or do you want to deny that.

Source

This is akin to arguing that one would have to disprove convection in order to falsify plate tectonics or genetics in order to falsify evolution.  Plate tectonics and evolution are extremely robust scientific theories which rely on a combination of empirical and correlative evidence.  Neither theory can be directly tested through controlled experimentation.  However, both theories have been tested through decades of observations.  Subsequent observations have largely conformed to these theories.

Note: I will not engage in debates about the validity of the scientific theories of plate tectonics or evolution.

The power of such scientific theories is demonstrated through their predictive skill: Theories are predictive of subsequent observations.  This is why a robust scientific theory is even more powerful than facts (AKA observations).

CAGW is a similar type of hypothesis.  It relies on a combination of empirical evidence (the “good”) and correlative evidence (the “bad”).

The Good

Carbon dioxide is a so-called “greenhouse” gas.  It retards radiative cooling.  All other factors held equal, increasing the atmospheric concentration of CO2 will lead to a somewhat higher atmospheric temperature.  However, all other things are never held equal in Earth and Atmospheric Science… The atmosphere is not air in a jar; references to Arrhenius have no significance.

Atmospheric CO2 has risen since the 19th century.

Figure 2. Atmospheric CO2 from instrumental records, Antarctic ice cores and plant stomata.

Humans are responsible for at least half of this rise in atmospheric CO2.

Figure 3. Natural sources probably account for ~50% of the rise in atmospheric CO2 since 1750.

While anthropogenic sources are a tiny fraction of the total sources, we are removing carbon from geologic sequestration and returning it to the active carbon cycle.

The average temperature of Earth’s surface and troposphere has generally risen over the past 150 years.

Figure 5. Surface temperature anomalies: BEST (land only), HadCRUT4 & GISTEMP. Satellite lower troposphere: UAH & RSS.

Atmospheric CO2 has risen and warming has occurred.

The Bad

The modern warming began long before the recent rise in atmospheric CO2, and prior to the 19th century, temperature and CO2 were decoupled:

Figure 6. Temperature reconstruction (Moberg et al., 2005) and Law Dome CO2 (MacFarling Meure et al., 2006)

The recent rise in temperature is no more anomalous than the Medieval Warm Period or the Little Ice Age:

Figure 7. Temperature reconstruction (Ljungqvist, 2010), northern hemisphere instrumental temperature (HadCRUT4) and Law Dome CO2 (MacFarling Meure et al., 2006). Temperatures are 30-yr averages to reflect changing climatology.

Over the past 2,000 years, the average temperature of the Northern Hemisphere has exceeded natural variability (defined as two standard deviations from the pre-1865 mean) three times: 1) the peak of the Medieval Warm Period, 2) the nadir of the Little Ice Age and 3) since 1998.  Human activities clearly were not the cause of the first two deviations.  70% of the warming since the early 1600s clearly falls within the range of natural variability.
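The “two standard deviations from the pre-1865 mean” yardstick is easy to compute. A minimal Python sketch, using synthetic decadal anomalies for illustration rather than the actual Ljungqvist reconstruction:

```python
import random
import statistics

random.seed(42)

# Synthetic anomalies standing in for a proxy reconstruction;
# purely illustrative, not the Ljungqvist data.
pre_1865 = [random.gauss(0.0, 0.15) for _ in range(180)]  # pre-1865 baseline
recent = [0.1, 0.25, 0.4, 0.55]                           # hypothetical recent values

mean = statistics.mean(pre_1865)
sd = statistics.pstdev(pre_1865)
upper, lower = mean + 2 * sd, mean - 2 * sd

# A value "exceeds natural variability" under this definition only if it
# falls outside the two-sigma band around the pre-1865 mean.
exceedances = [v for v in recent if v > upper or v < lower]
print(f"2-sigma band: [{lower:.2f}, {upper:.2f}] degC")
print("values outside the band:", exceedances)
```

Anything inside the band is, by this definition, indistinguishable from natural variability.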

While it is possible that the current warm period is about 0.2 °C warmer than the peak of the Medieval Warm Period, this could be due to the differing resolutions of the proxy reconstruction and instrumental data:

Figure 8. The instrumental data demonstrate higher-frequency and higher-amplitude temperature variations than the proxy reconstructions.

The amplitude of the reconstructed temperature variability on centennial time-scales exceeds 0.6°C. This reconstruction is the first to show a distinct Roman Warm Period c. AD 1-300, reaching up to the 1961-1990 mean temperature level, followed by the Dark Age Cold Period c. AD 300-800. The Medieval Warm Period is seen c. AD 800–1300 and the Little Ice Age is clearly visible c. AD 1300-1900, followed by a rapid temperature increase in the twentieth century. The highest average temperatures in the reconstruction are encountered in the mid to late tenth century and the lowest in the late seventeenth century. Decadal mean temperatures seem to have reached or exceeded the 1961-1990 mean temperature level during substantial parts of the Roman Warm Period and the Medieval Warm Period. The temperature of the last two decades, however, is possibly higher than during any previous time in the past two millennia, although this is only seen in the instrumental temperature data and not in the multi-proxy reconstruction itself.

[…]

The proxy reconstruction itself does not show such an unprecedented warming but we must consider that only a few records used in the reconstruction extend into the 1990s. Nevertheless, a very cautious interpretation of the level of warmth since AD 1990 compared to that of the peak warming during the Roman Warm Period and the Medieval Warm Period is strongly suggested.

[…]

The amplitude of the temperature variability on multi-decadal to centennial time-scales reconstructed here should presumably be considered to be the minimum of the true variability on those time-scales.

[…]

Ljungqvist, 2010

Figure 9. Ljungqvist demonstrates that the modern warming has not unambiguously exceeded the range of natural variability. The bold black dashed line is the instrumental record. I added the red lines to highlight the margin of error.

The climate of the Holocene has been characterized by a roughly millennial cycle of warming and cooling (for those who don’t like the word “cycle,” pretend that I typed “quasi-periodic fluctuation”):

Figure 10. Millennial cycle apparent on Ljungqvist reconstruction.

Figure 11. Millennial scale cycle apparent on Moberg reconstruction.

These cycles (quasi-periodic fluctuations) even have names:

Figure 12. Late Holocene climate cycles (quasi-periodic fluctuations).

These cycles have been long recognized by Quaternary geologists:

Figure 12. The millennial scale climate cycle can clearly be traced back to the end of the Holocene Climatic Optimum and the onset of the Neoglaciation.

Fourier analysis of the GISP2 ice core clearly demonstrates that the millennial scale climate cycle is the dominant signal in the Holocene (Davis & Bohling, 2001).
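The kind of spectral analysis Davis &amp; Bohling performed can be illustrated with a naive discrete Fourier transform. This sketch picks the dominant period out of a synthetic series with a built-in 1,000-year cycle; it is an assumption-laden toy, not the GISP2 data:

```python
import cmath
import math

# Synthetic "Holocene" series: 10,000 years sampled every 50 years with a
# dominant 1,000-year cycle plus a weaker 300-year cycle.
dt = 50   # years per sample
n = 200   # 10,000 years total
series = [math.sin(2 * math.pi * i * dt / 1000)
          + 0.3 * math.sin(2 * math.pi * i * dt / 300)
          for i in range(n)]

def dft_power(x):
    """Naive one-sided DFT power spectrum (skipping the k=0 mean term)."""
    m = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / m)
                    for t in range(m))) ** 2
            for k in range(1, m // 2)]

power = dft_power(series)
k_peak = power.index(max(power)) + 1   # +1 because k starts at 1
dominant_period = n * dt / k_peak
print(f"dominant period: {dominant_period:.0f} years")  # -> 1000 years
```

The strongest spectral peak lands on the 1,000-year component, which is the sense in which a millennial cycle can be called the “dominant signal” in a record.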

Figure 13. The Holocene climate has been dominated by a millennial scale climate cycle.

The industrial era climate has not changed in any manner inconsistent with the well-established natural millennial scale cycle. Assuming that the ice core CO2 record is reliable, the modern rise in CO2 has had little, if any, effect on climate.

The Null Hypothesis

What is a ‘Null Hypothesis’

A null hypothesis is a type of hypothesis used in statistics that proposes that no statistical significance exists in a set of given observations. The null hypothesis attempts to show that no variation exists between variables or that a single variable is no different than its mean. It is presumed to be true until statistical evidence nullifies it for an alternative hypothesis.

Source: Investopedia, “Null Hypothesis” (http://ift.tt/2onbTQD)
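As a concrete illustration of the null-hypothesis machinery, here is a minimal Python sketch testing H0 (“the trend is zero”) with an ordinary least-squares slope and its t statistic. The data and the simple linear model are assumptions for illustration, not a climate analysis:

```python
import math
import random

random.seed(0)

# Synthetic data: a weak linear trend buried in noise.
n = 40
x = list(range(n))
y = [0.02 * xi + random.gauss(0, 0.3) for xi in x]

# Ordinary least-squares slope and intercept.
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
intercept = my - slope * mx

# Standard error of the slope, then the t statistic against H0: slope = 0.
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
t = slope / se
print(f"slope = {slope:.4f}, t = {t:.2f}")
# |t| greater than roughly 2 would reject H0 at about the 5% level
# for this sample size; otherwise the null hypothesis stands.
```

The point of the exercise: the null hypothesis is presumed true until the statistic pushes it out, which is exactly the burden the rest of this post applies to CAGW.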

Since it is impossible to run a controlled experiment on Earth’s climate (there is no control planet), the only way to “test” the CAGW hypothesis is through models.  If the CAGW hypothesis is valid, the models should demonstrate predictive skill.  The models have utterly failed.

The models have failed because they produce a climate sensitivity 2-3 times that supported by observations.

From Hansen et al. 1988 through every IPCC assessment report, the observed temperatures have consistently tracked the strong mitigation scenarios in which the rise in atmospheric CO2 has been slowed and/or halted.

Apart from the strong El Niño events of 1998 and 2015-16, GISTEMP has tracked Scenario C, in which CO2 levels stopped rising in 2000, holding at 368 ppm.

Figure 16. Hansen’s 1988 model and GISTEMP.

The utter failure of this model is most apparent on the more climate-relevant 5-yr running mean:

Figure 17. Hansen’s 1988 model and GISTEMP, 5-yr running mean.
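The 5-yr running mean referred to above is just a centered moving average. A minimal sketch, with made-up annual anomalies:

```python
# Centered 5-point (5-yr) running mean; the annual values are invented
# for illustration, not an actual temperature record.
anoms = [0.10, 0.18, 0.05, 0.22, 0.30, 0.15, 0.28, 0.35, 0.25, 0.40]

def running_mean(values, window=5):
    """Centered running mean; endpoints without a full window are dropped."""
    half = window // 2
    return [sum(values[i - half:i + half + 1]) / window
            for i in range(half, len(values) - half)]

smoothed = running_mean(anoms)
print([round(v, 3) for v in smoothed])
```

Smoothing this way suppresses year-to-year noise (ENSO spikes, for instance), which is why the multi-year mean is the more climate-relevant comparison.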

This is from IPCC’s First Assessment Report:

Figure 18.  IPCC First Assessment Report (FAR).  Model vs. HadCRUT4.

HadCRUT4 has tracked below Scenario D.

Figure 19. IPCC FAR scenarios.

This is from the IPCC’s Third Assessment Report (TAR):

Figure 20. IPCC TAR model vs. HadCRUT4.

HadCRUT4 has tracked the strong mitigation scenarios, despite a general lack of mitigation.

The climate models have never demonstrated any predictive skill.

And the models aren’t getting better. Even when the model runs start in 2006, the observed temperatures consistently track at or below the low end of the 5-95% range.  Observed temperatures only approach the ensemble median (P50) in 2006, 2015 and 2016.

The ensemble consists of 138 model runs using a range of representative concentration pathways (RCP), from a worst-case scenario (RCP 8.5), often referred to as “business as usual,” to varying grades of mitigation scenarios (RCP 2.6, 4.5 and 6.0).

Figure 22. Figure 21 with individual model runs displayed.

SOURCE

When we drill wells, we run probability distributions to estimate the oil and gas reserves we will add if the well is successful.  The model inputs consist of a range of estimates of reservoir thickness, area and petrophysical characteristics.  The model output consists of a probability distribution from P10 to P90.

  • P10 = Maximum Case.  There is a 10% probability that the well will produce at least this much oil and/or gas.
  • P50 = Median Case.  There is a 50% probability that the well will produce at least this much oil and/or gas.  Probable reserves are >P50.
  • P90 = Minimum Case.  There is a 90% probability that the well will produce at least this much oil and/or gas.  Proved reserves are P90.

Over time, a drilling program should track near P50.  If your drilling results track close to P10 or P90, your model input is seriously flawed.
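The P10/P50/P90 workflow described above can be sketched as a Monte Carlo over invented input ranges (the distributions and units below are placeholders, not a real reserves model):

```python
import random

random.seed(1)

# Monte Carlo sketch of the reserves workflow: sample reservoir thickness,
# area and recovery factor, multiply into a volume, then read P10/P50/P90
# off the exceedance distribution. All input ranges are made up.
def sample_reserves():
    thickness = random.uniform(10, 50)    # ft
    area = random.uniform(100, 600)       # acres
    recovery = random.uniform(0.1, 0.4)   # fraction
    return thickness * area * recovery    # arbitrary volume units

trials = sorted(sample_reserves() for _ in range(10_000))

def exceedance(p, values):
    """P-values in the 'at least this much' convention used above:
    P10 is the high case, exceeded in only 10% of trials."""
    return values[int(len(values) * (1 - p / 100))]

p10, p50, p90 = (exceedance(p, trials) for p in (10, 50, 90))
print(f"P90 (min) = {p90:.0f}, P50 = {p50:.0f}, P10 (max) = {p10:.0f}")
```

By construction P90 &lt; P50 &lt; P10, and a well-calibrated drilling program should land near P50 over many wells, which is the calibration standard the post applies to the CMIP5 ensemble below.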

If the CMIP5 model ensemble had predictive skill, the observations should track around P50: half the runs should predict more warming and half less warming than is actually observed. During the predictive run of the models, HadCRUT4.5 has not tracked anywhere near P50…

Figure 23. Figure 21 zoomed in on model run period with probability distributions annotated.

I “eyeballed” the instrumental observations to estimate a probability distribution for the predictive run of the models.

Prediction run: approximate distribution

  • 2006: P60 (60% of the models predicted a warmer temperature)
  • 2007: P75
  • 2008: P95
  • 2009: P80
  • 2010: P70
  • 2011-2013: >P95
  • 2014: P90
  • 2015-2016: P55

Note that during the 1998-99 El Niño, the observations spiked above P05 (less than 5% of the models predicted this). During the 2015-16 El Niño, HadCRUT only spiked to P55.  El Niño events are not P50 conditions. Strong El Niño and La Niña events should spike toward the P05 and P95 boundaries.
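The “eyeball” percentile estimates above amount to asking what fraction of ensemble members predicted more warming than was observed. A sketch against a synthetic 138-member ensemble (the ensemble values are random placeholders, not CMIP5 output):

```python
import random

random.seed(7)

# Where does one observed value sit inside an ensemble of model runs?
# The "P60"-style figures above are just this percentile-of-ensemble rank.
ensemble = sorted(random.gauss(0.8, 0.15) for _ in range(138))  # 138 runs
observed = 0.65

warmer = sum(1 for run in ensemble if run > observed)
p_rank = 100 * warmer / len(ensemble)
print(f"P{p_rank:.0f}: {p_rank:.0f}% of runs predicted more warming than observed")
```

If the observation repeatedly lands near P80-P95 by this measure, far more than half the runs are running hot, which is the post’s complaint about the ensemble.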

The temperature observations are clearly tracking much closer to strong mitigation scenarios rather than RCP 8.5, the bogus “business as usual” scenario.

The red hachured trapezoid indicates that HadCRUT4.5 will continue to track between P50 and P100. This is indicative of a miserable failure of the models and a pretty good clue that the models need to be adjusted downward.

In any other field of science CAGW would be a long-discarded falsified hypothesis.

Conclusion

Claims that AGW or CAGW have earned an exemption from the Null Hypothesis principle are patently ridiculous.

Theory: In science, a broad, natural explanation for a wide range of phenomena. Theories are concise, coherent, systematic, predictive, and broadly applicable, often integrating and generalizing many hypotheses. Theories accepted by the scientific community are generally strongly supported by many different lines of evidence, but even theories may be modified or overturned if warranted by new evidence and perspectives.

UC Berkeley

This is not a scientific hypothesis:

More CO2 will cause some warming.

It is arm waving.

This is a scientific hypothesis:

A doubling of atmospheric CO2 will cause the lower troposphere to warm by ___ °C.

Thirty-plus years of failed climate models have never been able to fill in the blank.  The IPCC’s Fifth Assessment Report essentially stated that it was no longer necessary to fill in the blank.

While it is very likely that human activities are the cause of at least some of the warming over the past 150 years, there is no robust statistical correlation.  The failure of the climate models clearly demonstrates that the null hypothesis still holds true for atmospheric CO2 and temperature.

Selected References

Davis, J. C., and G. C. Bohling, The search for patterns in ice-core temperature curves, 2001, in L. C. Gerhard, W. E. Harrison, and B. M. Hanson, eds., Geological perspectives of global climate change, p. 213–229.

Finsinger, W. and F. Wagner-Cremer. Stomatal-based inference models for reconstruction of atmospheric CO2 concentration: a method assessment using a calibration and validation approach. The Holocene 19,5 (2009) pp. 757–764

Grosjean, M., Suter, P. J., Trachsel, M. and Wanner, H. 2007. Ice-borne prehistoric finds in the Swiss Alps reflect Holocene glacier fluctuations. J. Quaternary Sci.,Vol. 22 pp. 203–207. ISSN 0267-8179.

Hansen, J., I. Fung, A. Lacis, D. Rind, Lebedeff, R. Ruedy, G. Russell, and P. Stone, 1988: Global climate changes as forecast by Goddard Institute for Space Studies three-dimensional model. J. Geophys. Res., 93, 9341-9364, doi:10.1029/88JD00231.

Kouwenberg, LLR, Wagner F, Kurschner WM, Visscher H (2005) Atmospheric CO2 fluctuations during the last millennium reconstructed by stomatal frequency analysis of Tsuga heterophylla needles. Geology 33:33–36

Ljungqvist, F.C. 2009. N. Hemisphere Extra-Tropics 2,000yr Decadal Temperature Reconstruction. IGBP PAGES/World Data Center for Paleoclimatology Data Contribution Series # 2010-089. NOAA/NCDC Paleoclimatology Program, Boulder CO, USA.

Ljungqvist, F.C. 2010. A new reconstruction of temperature variability in the extra-tropical Northern Hemisphere during the last two millennia. Geografiska Annaler: Physical Geography, Vol. 92 A(3), pp. 339-351, September 2010. DOI: 10.1111/j.1468-0459.2010.00399.x

MacFarling Meure, C., D. Etheridge, C. Trudinger, P. Steele, R. Langenfelds, T. van Ommen, A. Smith, and J. Elkins. 2006. The Law Dome CO2, CH4 and N2O Ice Core Records Extended to 2000 years BP. Geophysical Research Letters, Vol. 33, No. 14, L14810 10.1029/2006GL026152.

Moberg, A., D.M. Sonechkin, K. Holmgren, N.M. Datsenko and W. Karlén. 2005. Highly variable Northern Hemisphere temperatures reconstructed from low-and high-resolution proxy data. Nature, Vol. 433, No. 7026, pp. 613-617, 10 February 2005.

Instrumental Temperature Data from Hadley Centre / UEA CRU, NASA Goddard Institute for Space Studies and Berkeley Earth Surface Temperature Project via Wood for Trees.
