*Guest Post by Willis Eschenbach*

In my recent post “Putting It Into Reverse” I discussed the relationship between temperature and total surface radiation absorbed. By “total surface radiation absorbed”, I mean the total of the downwelling longwave radiation from the clouds and the atmosphere, plus downwelling sunlight at the surface, minus the upwelling reflected sunlight.

Here’s a graphic from that post. If you haven’t read it yet, you might do so as an intro to this post. Not necessary, however, as this one stands on its own.

Original Caption: *Figure 1: Gridcell by gridcell correlation of surface absorbed radiation (shortwave + longwave) and surface temperature. Gridcells are 1° latitude by 1° longitude.*

I focus in this post on the radiation balance at the surface—how much radiation is absorbed versus how much is emitted. I’ve done this because it is a very simple and transparent part of the whole. There are no intermediate steps—the surface absorbs radiation, it warms, and it emits radiation.

According to the CERES satellite data, as a 24/7 global average, upwelling (headed to space) thermal radiation from the surface is just under 400 watts per square meter (W/m2). Downwelling (headed to earth) thermal radiation from the clouds/atmosphere absorbed by the surface is about 345 W/m2. And net solar (surface downwelling less surface reflected) energy absorbed by the surface is just under 165 W/m2.

This gives a global 24/7 average of just over 500 W/m2, about half a kilowatt/m2, of radiation absorbed by the surface.

But only about 400 W/m2 are radiated away. What about the other 100 W/m2 of absorbed energy?

Well, first, something on the order of three-quarters of that energy is used to evaporate water from the surface. It’s called “latent heat”. This, of course, leaves the surface cooler than it would otherwise be if there were no latent heat loss.

The other quarter is lost via conduction to the atmosphere and subsequent convection away from the surface. It’s called “sensible heat”. This also leaves the surface cooler than it would be without sensible heat loss.
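For readers who like to verify the arithmetic, here's a quick sketch of the budget above in Python. The round numbers are the CERES figures quoted in the text (I've used 398 and 163 as plausible "just under 400" and "just under 165" values); exact values depend on the CERES product and period, so treat this as illustrative.

```python
# Approximate global 24/7-average surface energy budget (W/m^2),
# using round CERES numbers from the text (illustrative only).
downwelling_longwave = 345.0   # thermal radiation from clouds/atmosphere
net_solar = 163.0              # downwelling sunlight minus reflected sunlight
absorbed = downwelling_longwave + net_solar   # "just over 500"

upwelling_longwave = 398.0     # thermal radiation emitted by the surface
residual = absorbed - upwelling_longwave      # the "other ~100 W/m^2"

latent = 0.75 * residual     # ~3/4 goes to evaporating water ("latent heat")
sensible = 0.25 * residual   # ~1/4 to conduction/convection ("sensible heat")

print(absorbed, residual, latent, sensible)
```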

Here is a scatterplot showing the relationship and the trend of upwelling emitted surface radiation with respect to absorbed downwelling radiation.

*Figure 2. Scatterplot, where each dot is a month. For each month, the horizontal “X” scale shows the radiation absorbed that month, and the vertical “Y” scale shows the radiation emitted in that same month. The seasonal swings have been removed from the data in all graphics.*

Figure 2 shows that for each watt per square meter absorbed, only three-quarters of a watt per square meter is emitted as upwelling radiation from the surface. The rest goes to sensible and latent heat loss. (Yes, there is a tiny residual term, less than 1/2%, of energy from/to storage, mainly in the ocean. However, because it is so small, it is typically ignored in this type of first-order analysis.)

So … why is any of this of interest?

Well, back around 1880 a couple of very smart guys named Joseph Stefan and Ludwig Boltzmann figured out that there is a mathematical relationship between the temperature of an object and its thermal radiation. The relationship is given by the “Stefan-Boltzmann Law”. Using that law, if you know the radiation, you can calculate the temperature and vice versa. Figure 3 shows the same data as in Figure 2, but this time I’ve used the Stefan-Boltzmann Law to convert the upwelling surface radiation shown in Figure 2 to temperature. So in Figure 3, the vertical “Y” axis is in degrees Celsius.
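The conversion itself is a one-liner. Here's a minimal Python sketch of the Stefan-Boltzmann conversion used for Figure 3, with an emissivity of 1.0 as discussed in Math Note 01 below. The 398 W/m2 input is just the round global-average figure from above, not a measured monthly value.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiation_to_temp_c(flux_wm2, emissivity=1.0):
    """Invert the Stefan-Boltzmann law F = eps * sigma * T^4."""
    t_kelvin = (flux_wm2 / (emissivity * SIGMA)) ** 0.25
    return t_kelvin - 273.15

def temp_c_to_radiation(temp_c, emissivity=1.0):
    """Stefan-Boltzmann law: flux emitted at a given temperature."""
    return emissivity * SIGMA * (temp_c + 273.15) ** 4

# The ~398 W/m^2 global-average upwelling surface flux corresponds to:
print(round(radiation_to_temp_c(398.0), 1))  # about 16.3 degrees C
```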

*Figure 3. Scatterplot, where each dot is a month. For each month, the bottom “X” scale shows radiation absorbed that month, and the vertical “Y” scale shows the temperature of that same month.*

What this says is that because only part of the absorbed radiation turns into upwelling surface longwave emission (and thus into temperature), it takes almost 7 watts per square meter of extra absorbed energy to raise the earth’s surface temperature by 1°C.
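You can get to roughly that number on the back of an envelope. Near the present surface temperature, the Stefan-Boltzmann law says emission rises by about 5.5 W/m2 per degree; divide by the ~0.75 slope from Figure 2 and you need around 7 W/m2 of extra absorbed radiation per degree. A sketch (the 289 K figure is an approximate global-mean temperature, not a measured input):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

T = 289.0                # approximate global-mean surface temperature, K (~16 C)
emitted_fraction = 0.75  # share of absorbed radiation re-emitted (Figure 2 slope)

# Local slope of the Stefan-Boltzmann law: dF/dT = 4 * sigma * T^3
dF_emit_per_K = 4 * SIGMA * T ** 3

# Only ~3/4 of each absorbed W/m^2 shows up as emission, so the extra
# absorbed radiation needed per kelvin of warming is correspondingly larger:
dF_abs_per_K = dF_emit_per_K / emitted_fraction

print(round(dF_emit_per_K, 1))  # ~5.5 W/m^2 per K of emission
print(round(dF_abs_per_K, 1))   # ~7.3 W/m^2 per K, near the ~7 in the text
```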

And that’s a lot. A doubling of atmospheric CO2 levels is said to increase downwelling radiation by 3.7 W/m2. So if that extra ~7 W/m2 needed to raise temperatures by 1°C were to come solely from increased CO2, it would take almost two doublings from our present level of 410 ppmv. The CO2 level would have to reach ~1,500 ppmv to get a 1°C rise from the present temperature.
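The doubling arithmetic, for those who want to check it (3.7 W/m2 per doubling is the conventional figure, and the ~7 W/m2 per degree comes from Figure 3):

```python
forcing_per_doubling = 3.7   # W/m^2 per CO2 doubling, conventional value
needed = 7.0                 # W/m^2 of absorbed radiation per 1 C (Figure 3)
current_co2 = 410.0          # ppmv

doublings = needed / forcing_per_doubling   # almost two doublings
co2_needed = current_co2 * 2 ** doublings

print(round(doublings, 2))   # ~1.89
print(round(co2_needed))     # roughly 1,500 ppmv
```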

Here’s a graph showing how the surface temperature and the absorbed radiation at the surface are closely related.

*Figure 4. Absorbed total radiation at the surface (blue, right scale) versus temperature (red, left scale). Total radiation is the sum of the downwelling longwave (thermal) radiation from the atmosphere, plus the shortwave (solar) radiation. Also shown are the theoretical forcing increase due to CO2 over the period (yellow/black line), and the trend of total radiation absorbed (dashed cyan/black line). Dashed black line is horizontal, showing what would happen with no increase in surface absorbed radiation.*

Clearly, much more than CO2 is at play … I’ll leave this all here for further contemplation and discussion.

Evening now, dinner is over here on the hillside. Three generations live here in a rambling house I built with my own hands. We eat together every night. My gorgeous ex-fiancee and I make our dinner. Our daughter makes dinner for her husband and two kids, a 3-year-old girl and a 10-month-old boy. Works perfectly, we all get to eat together and I’d starve on what they eat …

Here’s my ex-fiancee at this very moment as seen from my window, tending her beloved garden …

With the hope that your lives contain joy, laughter, family, friends, and true sweethearts, I remain,

Yr. Obt. Svt.,

w.

**Math Note 01:** As is common in the field, I’ve used an emissivity of 1.0 to convert radiation to temperature. I could refine that, but a) Earth’s emissivity is quite high, on the order of 0.95 or greater, and b) changing the emissivity changes the absolute values, but it makes very, very little difference to the trends of interest.
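A quick way to see point (b): changing the emissivity shifts the absolute temperatures by a few degrees, but it barely changes the factor that converts radiation trends into temperature trends. A sketch, using the ~398 W/m2 round figure from the text:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def temp_k(flux, emissivity):
    """Invert F = eps * sigma * T^4 to get temperature in kelvins."""
    return (flux / (emissivity * SIGMA)) ** 0.25

f = 398.0  # approximate global-mean upwelling surface flux, W/m^2

t_black = temp_k(f, 1.0)    # blackbody assumption (emissivity 1.0)
t_gray = temp_k(f, 0.95)    # graybody assumption (emissivity 0.95)
print(round(t_gray - t_black, 1))  # absolute offset of a few kelvins

# Local sensitivity dT/dF = T / (4F) is the trend-conversion factor:
s_black = t_black / (4 * f)
s_gray = t_gray / (4 * f)
print(round(100 * (s_gray / s_black - 1), 1))  # only ~1.3% change in trend
```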

**Math Note 02:** Because there is uncertainty in the “X” axis values (total radiation absorbed) in Figures 2 and 3, I’ve used Deming Regression to determine the correct trend, rather than Ordinary Least Squares Regression, which underestimates the trend when there is uncertainty. And if you don’t know what Deming Regression is, don’t worry—most folks don’t know either, including most climate scientists.
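For the curious, Deming regression has a closed-form slope, and the attenuation it corrects is easy to demonstrate. Here's a self-contained sketch with synthetic data; the toy slope of 0.75 echoes Figure 2, but the data here are made up purely for illustration:

```python
import math
import random

def ols_slope(x, y):
    """Ordinary least squares slope: biased low when x is noisy."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def deming_slope(x, y, delta=1.0):
    """Deming regression slope.
    delta is the ratio of the y-error variance to the x-error variance
    (delta=1 gives orthogonal regression). Unlike OLS, this allows for
    noise in x, which otherwise attenuates (flattens) the fitted slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    term = syy - delta * sxx
    return (term + math.sqrt(term ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)

# Toy example: true slope 0.75, equal noise in x and y (so delta = 1).
random.seed(1)
truth = [i / 10 for i in range(200)]
x_obs = [t + random.gauss(0, 2.0) for t in truth]
y_obs = [0.75 * t + random.gauss(0, 2.0) for t in truth]

ols = ols_slope(x_obs, y_obs)
dem = deming_slope(x_obs, y_obs, delta=1.0)
print(round(ols, 2), round(dem, 2))  # OLS is biased low; Deming is closer to 0.75
```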

**My Note:** When you comment, please **quote the exact words you are responding to**. This avoids many misunderstandings. It is also the only way to refute someone’s idea.

via Watts Up With That?

August 30, 2022 at 04:29PM