There have been articles on WUWT recently, here and here, commemorating the 30 years since James Hansen gave Senate committee testimony about his view of the human influence on climate. Some apologists for Hansen have claimed, without doing more than subjectively comparing graphs, that his prediction was extremely accurate. The following is his official 1988 prediction for three different scenarios of future trace gases implicated in anthropogenic global warming:
I have highlighted the observed 1958-1988 annual average temperatures in red to make the line more legible.
The apologists’ claims of extreme accuracy are based on the subjective impression that temperatures over the last 30 years have tracked his prediction for the intermediate ‘greenhouse gas,’ other trace-gas, and aerosol assumptions (Scenario B). He assumed two significant volcanic eruptions during that 30-year period. However, there was only one, Mt. Pinatubo (1991, VEI 6). Therefore, had he assumed only one eruption, his estimates would have been higher and would have tracked Scenario B even more poorly than they have. Were it not for two exceptionally strong El Niño events in the last 20 years, it is unlikely that current temperatures would be anywhere near as high as they are. However, he did not consider the role of El Niños in his computer model. Therefore, it is just luck that his predictions came as close to reality as they did. The greatest intellectual ‘sin’ for a scientist is to be right for the wrong reasons!
Hansen dramatically emphasized that “The most recent two seasons (Dec.-Jan.-Feb. and Mar.-Apr.-May, 1988) are the warmest in the entire record.” This is really a non sequitur. It would be notable if the last point(s) in a long upward-trending series were not the warmest in the series. And, indeed, the 27 seasons preceding the two 1988 record temperatures were all lower than the 1981 seasonal high! (See the next graph, below.) Basically, Hansen got lucky again that he had a couple of warm seasons that allowed him to make such a statement to impress the uncritical Senators. Otherwise, he would have had to truncate his graph at 1981 to make a similar claim. He also added an extra season of data to his ‘30-year’ time-series, probably to accentuate the claim; two seasons sounds more impressive than one.
Hansen claimed “The warming is almost 0.4 degrees Centigrade [sic] by 1987 relative to … the 30 year mean, 1950 to 1980 … The probability of a chance warming of that magnitude is about 1 percent.” The first graph, above, with the red line, shows that 0.3 °C would be a more accurate estimate. One should be suspicious of such a claim when his own data demonstrate that the temperature had already exceeded that for one season in 1981! Are we to believe that at least two events with a 1% probability occurred within 7 years of each other? He then claimed that the recent temperatures were about three times the standard deviation (0.13) of the baseline annual temperature average. Actually, the standard deviation of the annual averages for the 1958 to 1988 period is more like 0.15. Thus, the 1988 quarterly temperatures were about two standard deviations above the previous 30 years of temperatures, not three! He was playing fast and loose with the facts!
The values at the beginning and end of a noise-free, increasing, linear time-series will have the largest differences from the mean; that is, they are expected to show the largest deviations from the mean. The standard deviation of such a time-series varies directly with the slope of a smooth trend-line and with the number of samples. To analyze time-series data properly, it should be de-trended, the mean set to zero, and the residuals used to get an accurate estimate of the probability of a random deviation from the mean. Hansen should know that! He is describing behavior (the first two data points of 1988) that is a function of the slope, not the internal variance of the data. Again, Hansen is trying to snow the Senators. No one is arguing that temperatures aren’t increasing; it is evident from the graphs. Nevertheless, he is offering sophistry to convince the Senate committee that what we are seeing is extremely rare.
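The de-trending procedure described above can be sketched in a few lines of Python. The series below is synthetic (a hypothetical linear trend plus noise, not Hansen’s digitized data), but it illustrates the point: the standard deviation taken about the raw mean is inflated by the trend, while the standard deviation of the de-trended residuals measures the actual internal variability.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 31-year series (1958-1988): linear trend plus noise.
# The slope and noise level are illustrative assumptions only.
years = np.arange(1958, 1989)
t = years - years[0]
true_slope = 0.012            # deg C per year, hypothetical
noise_sd = 0.10               # deg C, hypothetical
temps = true_slope * t + noise_sd * rng.standard_normal(t.size)

# Naive standard deviation about the mean -- inflated by the trend.
sd_raw = temps.std(ddof=1)

# Proper approach: remove the linear trend, then measure the scatter
# of the residuals about zero.
slope, intercept = np.polyfit(t, temps, 1)
residuals = temps - (slope * t + intercept)
sd_detrended = residuals.std(ddof=1)

print(f"std about the mean:      {sd_raw:.3f}")
print(f"std of detrended series: {sd_detrended:.3f}")
```

Because ordinary least squares minimizes the residual sum of squares, the de-trended scatter is always less than or equal to the raw scatter; a probability statement about a “rare” excursion should be made against the former, not the latter.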
However, most of his apologists are engaging in qualitative hand waving and not using any mathematical or statistical analysis to quantify the quality of his prediction, even when based on false assumptions. The question is, “How skillful was Hansen in quantitatively predicting the climate for the next 30 years, based on a computer program that assumes CO2 is the ‘control knob’ on global temperatures?”
If one were to fit a linear, least-squares regression to Hansen’s 30-year data, and extrapolate it 30 years into the present, how would it compare with Hansen’s prediction? I’ll call this a naive prediction, a simple extrapolation of past trends, without making any assumptions about the cause of the warming, or the presence of extenuating influences such as volcanic aerosols. Implicit in this naive prediction is that it represents “Business as Usual,” the trace-gas assumption made by Hansen for his Scenario A.
The following graph is derived from Hansen’s Figure 2 in his Senate testimony, with higher temporal resolution than the prediction graph above. According to Hansen, the baseline for the calculation of the ΔT (anomaly) is the average temperature for the period of 1950 through 1980, although he does not provide that average temperature.
The R2 value for the red trend-line is lower than I would like to see, with time accounting for only about 19% of the variance in temperature. Smoothing with annual averages and removing the seasonality would increase the R2 value. Nevertheless, substituting 60 years for x in the regression equation gives a predicted anomaly of 0.467 °C for 2018. That is lower than even Hansen’s Scenario C, the scenario of “draconian emission cuts!” in the things driving warming, was supposed to be. Note that the slope of the regression line predicts a change of less than 1 °C per century, about the amount commonly claimed for the last century. Therefore, it may be a reasonable long-term approximation of future warming.
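For readers who want to reproduce the method, a least-squares fit and 30-year extrapolation of this kind takes only a few lines of Python. The anomaly values below are synthetic stand-ins (the digitized 1958-1988 data are not reproduced here), so the printed slope and R2 are illustrative only; the procedure, substituting x = 60 for the year 2018, is the same as in the text.

```python
import numpy as np

# Hypothetical annual anomalies standing in for the 1958-1988 record;
# the trend and noise parameters are assumptions for illustration.
rng = np.random.default_rng(0)
years = np.arange(1958, 1989)
x = years - 1958.0
anomalies = -0.1 + 0.008 * x + 0.12 * rng.standard_normal(x.size)

# Ordinary least-squares fit: anomaly = slope * x + intercept.
slope, intercept = np.polyfit(x, anomalies, 1)

# Naive prediction: extrapolate 30 years past the end of the record.
# x = 60 corresponds to 2018, as in the text.
predicted_2018 = slope * 60 + intercept
print(f"slope: {slope:.4f} deg C/yr, 2018 anomaly: {predicted_2018:.3f} deg C")

# R2: the fraction of the temperature variance explained by time.
fitted = slope * x + intercept
ss_res = ((anomalies - fitted) ** 2).sum()
ss_tot = ((anomalies - anomalies.mean()) ** 2).sum()
r_squared = 1.0 - ss_res / ss_tot
print(f"R2: {r_squared:.2f}")
```

Restricting `x` and `anomalies` to a sub-interval (e.g., 1964 onward) before calling `np.polyfit` gives the alternative fit discussed in the next paragraph.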
However, the period of 1958 to 1988 is probably not the best interval to use for prediction. The R2 value is low partly because there are two distinct trends: one negative and a longer one that is positive. It appears that the 30-year interval he used was a pragmatic choice to provide the most current measurements for the Senate testimony. The changeover between the two trends occurs between about 1964 and 1970. Calculating the slope of the trend-line for the period of 1964 through 1988 would probably be the better choice. Doing so more than doubles the R2 value and increases the slope of the trend-line (green) to 0.0181 °C per year, for a predicted anomaly of about 0.803 °C for 2018.
Note that the slope of the trend-line (green) for the period from 1970 through 2018 is almost the same as that obtained above: 0.017 versus 0.018 °C per year. In hindsight, it appears that the selection of the period from 1964 through 1988 was a better choice for future prediction than the entire 30-year set. This graph shows an estimated anomaly of about 0.80 °C for 2018. Unfortunately, he does not provide an R2 value for his “Best Linear Fit,” nor error bars for the temperatures, which is typical of his work.
It appears that the average global-temperature trend has been reasonably well behaved since at least 1970, and probably since about 1964. That is, it has not shown the longer-term non-linearity characteristic of Hansen’s 1988 predictions.
Let’s return now to the graph first shown, which was Hansen’s 1988 prediction with Scenarios A, B, and C. I have plotted the naive extrapolation, in green, on top of the original graph.
In the early years, the three scenarios are all close to each other, and probably within the uncertainty of the temperature measurements. Therefore, I’ll focus on the 21st-century behavior. Note that the trend-line (green) best tracks Scenario C, the “draconian emission cuts!” scenario. I suspect that Hansen scheduled one of his two hypothetical volcanic eruptions for about 2014, driving the Scenario B and C temperatures down temporarily. The trend-line prediction is clearly lower than Scenario B (intermediate trace-gas), and much lower than Scenario A, the supposed “Business As Usual.” How much lower, you ask? For 2019, the trend-line predicts an anomaly of about 0.80, while Hansen’s Scenario A is about 1.55; that is almost 94% higher than the trend-line prediction. Looked at another way, Scenario A has a slope (post-1988) of about 0.033, while the observed temperatures have a slope of about 0.017. That is, Scenario A has twice the slope of the naive prediction, which matches the observed data recorded since 1988.
Using only Hansen’s own data, the above demonstrates that Hansen was not “extremely accurate” in his 1988 predictions, because a simple, commonly unreliable, linear extrapolation predicted the last 30 years of temperatures better than his model did. One consequence of showing that a ‘Business As Usual’ linear extrapolation of past temperatures is superior to Hansen’s model is that it isn’t necessary to appeal to anthropogenic influences to account for a phenomenon that started 12 millennia ago, with the end of the last major glaciation. Occam’s Razor suggests that the best explanation for something is the simplest one; that is, there is no compelling need to complicate the explanation with human interference. Climate changes. That is what it does. That is why climatologists use a 30-year average of weather to define a climate regime or episode. While I’m sure that humans are having an impact on climate, it isn’t just their CO2 emissions, and it certainly isn’t fossil fuel combustion that is the primary control of temperature. Notwithstanding how poor Hansen’s predictions actually were, I think we should still keep before us his assessment of computer modeling:
“There are major [my emphasis added] uncertainties in the model, which arise especially from assumptions about (1) global climate sensitivity and (2) heat uptake and transport by the ocean, …”
He should have mentioned also the need for parameterization of clouds in the models. In any event, we should take computer model ‘projections’ with a grain of sea salt – and anything that Hansen says with a block of salt.
Steven Mosher graciously provided the original graphs and quotes from Hansen through a link to a copy of the Senate Committee testimony, which he uploaded to WUWT comments.
Richard Rhodes won the Pulitzer Prize for The Making of the Atomic Bomb. He has written many other books, and his new book about energy is called Energy: A Human History.
He takes us on a journey through the history of energy transitions over time, from wood to coal to oil to renewables and beyond. Some stories are well known, some much less so, featuring a fascinating set of characters going back all the way to Elizabethan England.
It also provides fascinating insights into how energy history can help us understand a possible energy transition of the future toward a lower-carbon economy, to provide affordable, reliable, and sustainable energy for a growing global population. Rhodes shared some transition stories in an interview with Jason Bordoff at Columbia University, in a podcast and transcription entitled Richard Rhodes — Energy: A Human History. Excerpts below in italics are from Rhodes unless otherwise indicated, with my light editing, headers, bolds and images.
Jason Bordoff: And so tell me, let’s go back further and start a little bit at the beginning. I think some people may be familiar with more recent energy innovations: electricity obviously, oil, nuclear power. But you really start with animals, with wood, and what it meant for human civilization as we know it to depend on that for their energy sources, and how transformative it was to then convert initially to coal and then beyond. Talk a little bit about how big a deal that was and what it meant for the human experience as we know it.
From Wood to Coal in Elizabethan England
Richard Rhodes: Well, the story of the transition by the Elizabethan English from wood to coal was one of the most fascinating, and in some ways comical, parts of the whole history, although of course it wasn’t comical for them. And I have to say, one of the things that I wanted to do with this book was to tell the human stories behind the technologies, because so many books on the history of energy focus almost entirely on the technological changes.
The Wood Burning Society
And of course there are vast human stories, because changing from one source of energy to another is as much a social phenomenon as it is a technical phenomenon, perhaps more so. So, the Elizabethans had been cutting down their trees in vast numbers, primarily for firewood for their homes. And they usually burned firewood on stone platforms or in fireplaces set against the wall that didn’t have chimneys.
They liked the smell of wood, and they thought that the smoke hardened their rafters, so either there was just a hole in the roof leading straight up from the fireplace, or they let the smoke drift through the rooms and out through the windows. Well, that was fine as long as they had enough wood, but as they cut the wood down farther and farther away from London, it got more and more expensive to transport.
Substituting Coal for Wood
So, eventually it reached the point where it was really too expensive for the common people to afford. At that point the only alternative they had was really smelly bituminous coal from Newcastle, up the country in the northeast, and they didn’t like its characteristics compared to wood. First of all, imagine lighting a bituminous coal fire in the middle of your living room with no place for the smoke to go, and imagine the coughing and choking from that. And then, on top of that, imagine roasting your beef, your good English beef, over a coal fire with all the sulfur that’s in coal smoke.
In a way, England was just one vast coal mine, but these layers of coal, which were black and dirty and smelled sulfurous when burned, were literally thought of as the devil’s excrement: if the devil had hell down in the center of the earth, this is where his body waste accumulated toward the surface. Well, that obviously didn’t endear coal to the populace. So, they really struggled with it, and basically what happened is the rich kept buying wood, which they could afford, and the poor had to find a way to survive with coal, and they hated it.
A New King Adopts Coal
The transition really was a social transition. When Elizabeth died at the beginning of the seventeenth century, in 1603, King James VI of Scotland became King of England and came down to London as James I. The Scots, who had much thinner forests to begin with up north than the English had, had already switched to coal long before; they had been burning coal for a hundred years.
And Scottish coal was of better quality, without so much sulfur in it, so when the King came to London and started burning coal in the castle, it became fashionable. Well, if the King does it, I suppose we can too, was the result, and after that the transition was much facilitated. In addition, they had to retrofit all the homes that didn’t have proper chimneys with chimneys, which is another lesson that extends across the entire history of energy transitions.
Converting Society to Burn Coal
In this regard it seems so simple: you find a new source of energy when the old one is causing you trouble, and you switch over to it. I mean, that’s the way people talk today about wind and solar and other renewables. But it turns out it takes anywhere from 50 to 100 years to make a full-scale energy transition, because it’s not just a matter of the technology at all; it’s a matter of all sorts of social and societal changes.
In this case, for example, they had to retrofit all the chimneys. They had to open coal mines and find a way to transport the coal down to London. They had to develop markets where they would sell the coal. And most of all, you had to figure out how to burn it in your home without making the place uninhabitable. So, it took a while. It was not really until the 1650s and 1660s that coal had really taken hold in England, and then of course they had the problem of air pollution.
Inventing the Coal Industry and Society
Well, just staying with the English: once they started digging coal, they first dug, of course, the superficial layers that tended to outcrop on hillsides. So, they could easily drain their mines just by putting in what they called adits, which were basically channels for the water to flow out. But as they continued to dig deeper, having used up the superficial coal, they began to intersect the water table, and the mines began flooding. They tried pumping them out with horses and what were called gins, which were basically horse-turned pumps.
But that got more and more difficult as the mines continued to deepen; they were going down as far as 800 feet below ground to dig their coal. It’s hard to pump water that far with just a couple of horses. So, the solution that they found as time went on, and this is now the early to middle eighteenth century, was to develop an engine: an early form of steam engine that was very inefficient, less than 1% efficient.
So, it was a big thing, the size of a house; it would sit on top of the coal mine opening at the surface and pump out the water so that the mines could continue to be mined. This was the Newcomen engine, which basically produced a vacuum that then allowed atmospheric pressure to rush in and function as a pump. That limited its function to the pressure of the atmosphere, about 32 feet of lift. And therefore there continued to be a desire for innovation, a better way to pump water farther, because suppose you had, let’s say, a 300-foot shaft in a coal mine.
Energy Necessity Calls for Innovation
The only way you could pump with a Newcomen engine would be to put engines every 32 or so feet up and down the shaft, which was not a very efficient idea, especially since coal mines tend to release a certain amount of methane and other gases, and there were lots of explosions that people had to deal with. So, it quickly became apparent that there was a place for a better steam engine. That’s where James Watt, the Scotsman, came along and invented a true steam engine, one that worked by using steam to expand and push the piston back and forth.
And it could pump as much as its capacity was built to pump. Then, of course, they had the problem of moving the coal from the mine down to the river or the ocean in order to barge it to London. Again, moving stuff around turns out to be a large part of the problem in dealing with these forms of energy. At first the mines were close enough to the water to simply put the coal on a cart and roll it downhill. They used rails to do that, originally wooden rails, but then they started covering the wooden rails with cast-iron plates on top to make them more efficient.
Transformation into a Coal Society
And you know, once you switch from wood to coal, and as the country began to industrialize, particularly with the advent of the steam engine, coal production got more and more enormous. And it wasn’t just a matter of heating homes anymore; it became a matter of running factories as well. And you couldn’t do that with bags of coal on the saddles of horses; you needed some larger-scale way to move the coal around.
Once the mines were farther back from the valleys where the rivers ran, or the canals as they came to be, you had to find a way to move the material uphill as well as downhill. And horses weren’t going to do that job, not at the scale that England was operating by then. So, someone realized that if you had a small steam engine, and by then Watt’s engine could be made fairly small, you could mount it on wheels and move the coal with the steam engine, which is essentially a railroad engine.
The Coal-Based Society Emerges
This whole story is about how self-reinforcing all of these things were. You need to go deeper and deeper to get the coal for heating and cooking. You develop an innovation like the steam engine to do that, which enables an innovation to transport the coal, which then uses the very energy you were trying to get for another purpose to power that new innovation.
And once it was clear that you could move railroad carts of coal with a steam engine, someone realized that you could move people too. And England suddenly blossomed with railroads all over the country. The canal age was over and the railroad age began, and it all followed from this early transition from wood to coal, along with all the industrialization that came with the development of a stable, reliable source of continual power, which water power had not been and which animal power could supply only in limited measure.
Here was an engine: you fed it coal and it gave you a turning wheel that would turn mills to weave cotton, turn mills to make steel, whatever you needed to do. So, it really was an innovation stage by stage, each kind of piggybacking on the last; really a fascinating transition time in history.
The Downside of Coal Energy
One of the interesting things that’s very clear, and it’s as clear today in Beijing as it was in London in 1660: the first thing you do is get your energy; you do what you have to do to increase the energy supply to your country or your society. Then, as a kind of luxury good in a way, you start looking at how to reduce the baleful side effects, such as air pollution, that come along with that source of energy.
The first paper published by the newly formed Royal Society of London, in 1664, was a study of how to improve the air in London, and it was remarkably similar to today’s ideas: move industry into the suburbs, ring the city with plant life, trees. So, this particular writer proposed all sorts of wonderful trees that put out perfume during their flowering season, which should be planted in a belt around London.
The King, having just been restored to power after the Roundhead Revolution that had caused his father to be beheaded, was much too busy selling monopolies and refilling his coffers to actually do anything about it. But the point is, people were thinking about this prospect, and it was not different from what happened when Pittsburgh, at the turn of the century, was so filled with coal smoke that from a nearby hill you could barely see the city.
Pittsburgh Faces Coal Air Pollution
And that was pretty much true in most American cities up until the 1950s. As they went about cleaning up their air in the early 1950s, there was a proposal by the United States government to share the cost of building the first commercial nuclear power plant in the United States, at a place called Shippingport near Pittsburgh, on the river. I talked to the president of Duquesne Light, which was the company that was going to be the private contractor for this power plant.
He said you know, we sold this power plant to the city council of Pittsburgh as a green technology. People have come to think of nuclear as the devil’s excrement, but compared to burning coal, compared to burning what they had available to burn at the time, nuclear was great with its total absence of carbon production. The past can really inform the present when you look at how things have been done before and why they were done that way and what lessons they offer us to learn in the process.
Smog in Los Angeles
The stories that I tell are very much intertwined. So let’s jump to Los Angeles in the 1950s when what we now call smog was beginning to be a very serious problem there. The companies that refined oil in and around Los Angeles wanted to do whatever was available to clean up the air pollution, because it was commonly believed that it all was coming from their refineries or from trash burning.
Previously, cities and states had been focused primarily on coal smoke, on smoke and its baleful effects on the atmosphere. Smog was originally smoke and fog, the two words combined. But in the ’50s in Los Angeles there was this photochemical phenomenon going on in the atmosphere that was making everything look brown. And the question now was, what do you do about that?
Dr. Haagen-Smit at Caltech was carrying out an exercise in identifying the perfume essence of ripe pineapple. He had a room full of ripe pineapples, and he was sucking the air in the room through a machine that used liquid nitrogen to freeze the essential aroma chemicals out of the air. It was to Dr. Haagen-Smit that the county people turned, asking him if he could identify the components of this smog that was in the air.
So, he used the same machinery, but he put the pineapples away, opened the window, and sucked about 30,000 liters of California smog through his machine, ending up with a few drops of very nasty, brownish, sticky material, which was the essence of California photochemical smog, and he identified where it came from. Other sources, like the refineries and so forth, were certainly a part of it, but the main component was automobile exhaust. That gave Los Angeles the beginning of what turned out to be a large national struggle with the automobile manufacturers to get them to put catalytic converters on their automobiles and eventually to get rid of the nitrogen oxides, another component of automobile exhaust that was deadly. I repeat, this is not merely a technical book; it is really a collection of the most amazing human stories.
Haagen-Smit, of course, was put down by the great laboratories that the automobile companies had turned to in order to refute his work. But he prevailed with his simple experiments; he was a survivor of World War II, so he knew how to make things simple. With his laboratory work he was able to identify what needed to be done, and finally, by the 1980s, the entire country was dealing with smog by adding catalytic converters to cars.
Whale Oil and Petroleum
It’s a truism of the oil industry that petroleum saved the whales. And they say that because one of the main sources of lighting for wealthier people was whale oil, which was pretty expensive, particularly spermaceti, the very lovely refined oil that whales carry in their heads as a way of controlling their buoyancy. By heating and cooling the oil in their heads, they can adjust their neutral buoyancy and therefore don’t sink to the bottom or rise to the top unless they want to.
So, these beautiful whales were used to make candles and were killed at the rate of 10,000 whales a year at the height of the whaling industry, as Herman Melville beautifully describes in Moby Dick. But most people couldn’t afford whale oil; it was a pretty expensive item. What they actually used, and I and most people had never known about this, was something called burning fluid, which was basically the sap of the longleaf pines of the southeastern United States, which could be refined into turpentine, and the turpentine could then be mixed with plain alcohol.
And with a little bit of menthol added to sweeten the smell, because burning turpentine is not a great smell, this became something called burning fluid, which is what almost everyone used in their lamps. One particular brand of burning fluid was called kerosene; we know that name from its later application to petroleum, which I’ll jump to in a second. So most people burned such lamps, or they simply burned cheap tallow candles, which smell like burning beef fat, not a great smell in your home either. Then came the discovery of petroleum in 1859, or rather the discovery of “rock oil” or “coal oil.” If you could drill for this stuff and pump it out in vast quantities, you could make all the kerosene you wanted.
There had been petroleum seeps in various parts of the country, particularly one in Pennsylvania, where they tried to use the petroleum by soaking it up in blankets as it floated on the surface of streams where it oozed out from underground, squeezing the blankets out into a jug, and then selling that as a liniment to rub on your sores and on your gums, and to swallow as a health tonic and so forth, if you can imagine. Anyway, then Colonel Drake went off to Oil City, as it came to be called, drilled a well, and showed how you could pump oil out of the ground; indeed, some wells would pump it for you and spout it into the air.
Gasoline, A Dangerous Byproduct, Transforms Society
All of a sudden petroleum was the new stuff, but it wasn’t yet the new stuff for powering machinery; nobody had found that use. Its first use, for the next 50 years, was for lighting, once they figured out how to refine petroleum into what was now called kerosene, or for lubrication. But since the automobile hadn’t been invented, the refiners had among their waste products this stuff called gasoline, which was much too volatile to put in a lamp.
The lamp would blow up from the fumes, so they would either pour it out on the ground to evaporate into the air, or they dumped it into the streams and rivers of America in the dark of night, as so much other waste was in those days. Beginning in the 1880s, it was becoming a question among oil refiners whether they would kind of run out of possible uses for their stuff. In a way, the automobile saved petroleum. It was the automobile that came along just at the turn of the century, and the industry took off.
Beyond the Petroleum Society
Jason Bordoff: You write about the disruptive, unexpected consequences of these innovations. As you said, the whales were being slaughtered by the tens of thousands, then oil was discovered, and that significantly reduced demand for whale oil. And then you wrote about the automobile, and how one of the major problems at the turn of the century was horse populations and horse manure in cities like London and New York. Problems were sort of solved by technological innovations that we didn’t expect. Does that tell you anything about what’s coming around the corner, and maybe the level of humility we should have about our ability to anticipate it?
Richard Rhodes: We’re now in the middle of what I think is the largest energy transition in human history. And you know, I’ve written so much around this subject, and here was the chance to look all the way back to what was really the beginning. One of the things that I discovered when I was working on The Making of the Atomic Bomb, in fact one of the reasons I wrote that book, is that in the early 1980s we seemed to be at a crossroads where things looked so dangerous for the world, with all the nuclear weapons abroad in the world.
And it seemed to me that if we went back to the beginning and took another look, there might have been alternative pathways that would have led in a safer and better direction. And I thought the same thing might be true for our energy dilemmas of today. So, that’s the reason I wrote the book.
I simply say we have to use every available energy source that isn’t carbon-heavy in order to survive this largest of all energy transitions. But much of the world is just in the process of developing; that is to say, people who have lived for millennia in deep poverty are slowly beginning to see the possibility, China being the most obvious example, of moving up to the kind of middle-class lives that we in the United States pretty much take for granted.
So, we have a double problem: increasing the energy usage of large numbers of people around the world while at the same time reducing the carbon intensity of the energy we use. That is a really big challenge, bigger than people realize. And that means that we’re not going to be able to sit down and say, well, nuclear is dangerous because once in a while a nuclear power plant blows up, which is true of any energy source and is actually quite unusual in the nuclear world, by the way.
We’re going to have to find a way to work with nuclear as well as these other energy sources. You cannot power the world on renewables. The United States is rich enough that, if it really wanted to, it could probably work out a way to run its entire energy economy on renewables, although I don’t think it would be a very efficient system; it would be a very expensive system. But the rest of the world doesn’t really have that luxury. Right now China has on the drawing boards, or in development, some 125 nuclear power reactors. They are not even to deal with global warming; they are to deal with air pollution. And the Chinese are selling coal to the rest of the world, unfortunately.
So, when Germany, for example, decided to eliminate its nuclear power and go all renewables, it found itself compelled by its own energy demands to increase its use of brown coal, which is the most carbon-producing of all the various kinds of coal. Germany has actually increased its production of carbon dioxide since it decided to eliminate its nuclear power supply.
The Italians eliminated their nuclear power, so now they buy their electricity from the French, and the French, of course, are about 80% nuclear, which is pretty hypocritical of the Italians. I mean, this is the kind of discussion that I think we’re all going to have to have: swallow hard and look again at nuclear, look again at all the other sources of energy we can think of that are not carbon-producing, to deal with what is history’s most enormous energy transition yet.
Last Chance to Book for Tony Abbott Lecture: Melbourne, 3 July 2018
The place to be on Tuesday night.
“Climate Change & Restraining Greenhouse Gas Emissions”
Last days to book your tickets for the Bob Carter Commemorative Lecture given by the Hon. Tony Abbott—the former PM and current MHR for Warringah in NSW—on 3 July 2018.
Tickets: book through Eventbrite; $35 for AEF members and $42 for others.
Book Tickets here.
The AEF is hosting the Lecture at CQ Functions in the Melbourne CBD commencing at 5:30 pm. It will be the second in the lecture series, which AEF has established to commemorate the life and work of Professor Robert (Bob) Carter, who was, among many other things, a former AEF Director and its Scientific Adviser.
Mr Abbott’s address will conclude with a question and answer session.
Dr Peter Ridd will move the vote of thanks.
Light refreshments will be served before and after the formalities. During this time, attendees may purchase drinks from the bar.
Please invite your friends and colleagues to join us for what promises to be an exceptional evening involving one of the most critical public policy issues of the […]