Month: February 2020

The Mind-Boggling Cost of Net Zero

IS THE decision to reduce net carbon dioxide emissions to zero by 2050 one of extraordinary ambition or extreme foolhardiness? What we know for certain is that it isn’t an example of ‘evidence-based policymaking’. The Government and Parliament have given little thought to how it might be achieved, and how much it might cost.

Image: Shutterstock

Rather than publish their own estimates, they have outsourced this crucial task to the Committee on Climate Change (CCC). This is the organisation chaired by Lord Deben, who himself receives vast sums from green business interests through his ‘sustainability consultancy’ Sancroft. Sancroft’s clients include Drax, the largest recipient of renewable energy subsidies in the UK. If that wasn’t bad enough, Drax have also effectively appointed their own ‘Head of Sustainability’ to the CCC. These are the very last people you would go to for an independent assessment.

The CCC have kindly informed Parliament that the cost of achieving net zero emissions will be between 1 per cent and 2 per cent of GDP in 2050, with a headline figure of £50 billion. The small print of their report describes how at least half of this cost is due to increasing the target from an 80 per cent to a 100 per cent reduction in emissions. Very quietly, MPs appear to have nodded through a doubling of the cost of British climate policy with no real debate and no objections.

The limited information they received was misleading. The CCC's estimate was for a single year, 2050, despite the enormous capital costs that would be incurred in the intervening years. When blogger Ben Pile asked for estimates for the years 2020–2049, he was told that no such estimates existed.

Equally significantly, they would not explain their methodology. This means that it is impossible to say how reliable an estimate this is, or subject it to any scrutiny. The Government is therefore relying on an estimate that it can’t possibly explain or understand, because the CCC themselves cannot explain it.

Behind the scenes there are signs of private concern within Government that it will cost much more. Before they both left office, Philip Hammond wrote a letter to Theresa May warning that the Department for Business, Energy and Industrial Strategy (BEIS) had estimated the cost would be 40 per cent higher, at £70 billion per annum. This would mean households paying an average of £2,400 every year between now and 2050.

Unfortunately, BEIS has now gone all quiet, and the Government appears reluctant to admit that these calculations ever existed.

The GWPF was determined to get to the bottom of how much Net Zero might cost, so we commissioned some of the best in the business to apply their expertise to this important question. Their work has now been published in a selection of new reports. Unlike the CCC and the Government, our calculations are there for everyone to see, and we hope that they will trigger an important debate about the costs of Net Zero.

Michael Kelly is emeritus Prince Philip Professor of Technology at the University of Cambridge, and a fellow of both the Royal Society and the Royal Academy of Engineering. His report, Decarbonising Housing: The Net Zero Fantasy, looks at data from pilot studies to refine his cost estimates of achieving an 80 per cent decarbonisation target through retrofitting existing homes. That is, making them more energy efficient through improved insulation, draught-proofing, cladding and so on.

The results are staggering, but not surprising to those familiar with the impossible economics of deep retrofits. Professor Kelly found that £2.1 trillion would be required, or £75,000 per house. The findings are very much in line with those from the Energy Technologies Institute (ETI), which found that £2 trillion would be required for the entire UK housing stock. A further £1.5 trillion would be needed to decarbonise non-residential buildings, resulting in a total cost of £3.5 trillion.
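
As a quick sanity check on that arithmetic, here is a minimal sketch (assuming a UK housing stock of roughly 28 million homes, a figure consistent with the numbers quoted above):

```python
# Rough sanity check on the retrofit cost figures quoted above.
# Assumption: roughly 28 million homes in the UK housing stock.
homes = 28_000_000
cost_per_home = 75_000                          # Kelly's per-house estimate (GBP)
print(homes * cost_per_home / 1e12)             # -> 2.1 (trillion GBP), as reported

eti_housing = 2.0e12                            # ETI figure for the whole housing stock
non_residential = 1.5e12                        # non-residential buildings
print((eti_housing + non_residential) / 1e12)   # -> 3.5 (trillion GBP) total
```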

The other side of the coin is electricity. Not only is it proposed that the National Grid completely decarbonise, but it must also expand significantly to power the electrification of transport and heat. Both Dr Capell Aris and Colin Gibson have decades of experience in the power sector, and they have used a levelised cost of energy (LCOE) approach to calculate the additional costs of a renewables-based grid compared to one based on gas generation. LCOE can be thought of as the minimum constant price at which electricity must be sold in order to break even over the lifetime of the project (in present value terms).
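
For readers unfamiliar with the measure, here is a minimal sketch of the standard LCOE calculation, using purely illustrative plant figures rather than anything from the report:

```python
def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """Levelised cost of energy: discounted lifetime costs divided by
    discounted lifetime output, giving the constant price (GBP/MWh) at
    which the project breaks even in present-value terms."""
    costs = capex + sum(
        annual_opex / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    output = sum(
        annual_mwh / (1 + discount_rate) ** t
        for t in range(1, lifetime_years + 1)
    )
    return costs / output

# Purely illustrative numbers (not from the GWPF report): a 10 MW plant,
# £10m upfront, £200k/yr to run, 35% capacity factor, 25-year life, 5% rate.
annual_mwh = 10 * 8760 * 0.35
print(f"LCOE: £{lcoe(10e6, 200e3, annual_mwh, 25, 0.05):.0f}/MWh")
```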

Their analysis in The Future of GB Electricity Supply: Security, Cost and Emissions in a Net-zero System explains that to deliver net zero carbon emissions you would have to build far more wind turbines than is necessary merely to cover peak energy demand. This is because wind cannot be relied upon to deliver power when it is needed. In this scenario, generation will also often exceed demand when it is windy, meaning that excess power cannot be used. The additional capital costs of building what is termed overcapacity, and the cost of paying wind farms to switch off on windy days, entail a far more expensive system. In monetary terms, the cost is £1.4 trillion higher than in a scenario dominated by gas-fired power stations.
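
The overcapacity argument can be caricatured with a toy calculation. All the demand and capacity figures below are assumptions for illustration, not the authors' numbers:

```python
# Toy illustration of wind overcapacity and curtailment. All numbers are
# assumptions for illustration, not figures from the Aris/Gibson report.
peak_demand_gw = 60        # assumed GB peak demand
firm_fraction = 0.10       # assumed fraction of wind nameplate capacity
                           # reliably available when demand peaks

# Gas plants run on demand, so capacity only slightly above peak suffices.
gas_capacity_gw = peak_demand_gw * 1.1

# Wind must be sized for what it can be *relied on* to deliver at peak,
# not its average output, so nameplate capacity balloons...
wind_capacity_gw = peak_demand_gw / firm_fraction

# ...and on a strong-wind day (say 80% of nameplate) output far exceeds
# demand, so the surplus must be curtailed (paid to switch off).
surplus_gw = wind_capacity_gw * 0.8 - peak_demand_gw

print(f"Gas capacity needed:      {gas_capacity_gw:.0f} GW")
print(f"Wind nameplate needed:    {wind_capacity_gw:.0f} GW")
print(f"Curtailed on a windy day: {surplus_gw:.0f} GW")
```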

In theory, you could deliver net zero with a zero-carbon electricity grid alone, and not bother with energy efficiency (albeit while still making significant investments in heat pumps to replace gas boilers). In reality, a combination of approaches is needed to offset the price increases expected under electric heating systems, which would otherwise make them even more unpalatable. Indeed, the National Grid assume reductions of 10–26 per cent in the demand for heat in their net-zero-compliant scenarios.

Andrew Montford has looked at both new papers in the round to calculate the total costs of reaching net zero. He concludes in £3 Trillion and Counting: Net Zero and the National Ruin that these costs would in all likelihood exceed £3 trillion once additional investments in decarbonising industry and transport were taken into account.

It is beyond belief that Boris Johnson, Dominic Cummings and indeed most Tory MPs are yet to realise that such costs are not going to go down well with voters. These large sums may sound theoretical, but they would impose a brutal form of green austerity upon the public. The Government must be forced to level with people about the significance of the intervention they are planning to undertake, and when they do so, they must surely realise that such a damaging policy is unconscionable.


via The Global Warming Policy Forum (GWPF)

https://ift.tt/2SYiWlC

February 25, 2020 at 08:01AM

Magnetic field at Martian surface ten times stronger than expected

Mars from NASA’s Hubble Space Telescope

Tales of the unexpected on Mars: ‘Day-night fluctuations and things that pulse in the dark’, and other mysteries. What’s unique to Mars?

New data gleaned from the magnetic sensor aboard NASA’s InSight spacecraft is offering an unprecedented close-up of magnetic fields on Mars, says Phys.org.

“One of the big unknowns from previous satellite missions was what the magnetization looked like over small areas,” said lead author Catherine Johnson, a professor at the University of British Columbia and senior scientist at the Planetary Science Institute.

“By placing the first magnetic sensor at the surface, we have gained valuable new clues about the interior structure and upper atmosphere of Mars that will help us understand how it – and other planets like it – formed.”

Zooming in on magnetic fields

Before the InSight mission, the best estimates of Martian magnetic fields came from satellites orbiting high above the planet, and were averaged over large distances of more than 150 kilometres.

“The ground-level data give us a much more sensitive picture of magnetization over smaller areas, and where it’s coming from,” said Johnson. “In addition to showing that the magnetic field at the landing site was ten times stronger than the satellites anticipated, the data implied it was coming from nearby sources.”

Scientists have long known that Mars had a global magnetic field billions of years ago that magnetized rocks on the planet, before it mysteriously switched off. Because most rocks at the surface are too young to have been magnetized by this ancient field, the team thinks the signal must come from deeper underground.

“We think it’s coming from much older rocks that are buried anywhere from a couple hundred feet to ten kilometres below ground,” said Johnson. “We wouldn’t have been able to deduce this without the magnetic data and the geology and seismic information InSight has provided.”

The team hopes that by combining these InSight results with satellite magnetic data and future studies of Martian rocks, they can identify exactly which rocks carry the magnetization and how old they are.

Day-night fluctuations and things that pulse in the dark

The magnetic sensor has also provided new clues about phenomena that occur high in the upper atmosphere and the space environment around Mars.

Just like Earth, Mars is exposed to solar wind, which is a stream of charged particles from the Sun that carries an interplanetary magnetic field (IMF) with it, and can cause disturbances like solar storms. But because Mars lacks a global magnetic field, it is less protected from solar weather.

“Because all of our previous observations of Mars have been from the top of its atmosphere or even higher altitudes, we didn’t know whether disturbances in solar wind would propagate to the surface,” said Johnson. “That’s an important thing to understand for future astronaut missions to Mars.”

The sensor captured fluctuations in the magnetic field between day and night and short, mysterious pulsations around midnight, confirming that events in and above the upper atmosphere can be detected at the surface.

The team believe that the day-night fluctuations arise from a combination of the way the solar wind and IMF drape around the planet and solar radiation charging the upper atmosphere, which produces electrical currents that in turn generate magnetic fields.

“What we’re getting is an indirect picture of the atmospheric properties of Mars – how charged it becomes and what currents are in the upper atmosphere,” said co-author Anna Mittelholz, a postdoctoral fellow at the University of British Columbia.

And the mysterious pulsations that mostly appear at midnight and last only a few minutes?

“We think these pulses are also related to the solar wind interaction with Mars, but we don’t yet know exactly what causes them,” said Johnson. “Whenever you get to make measurements for the first time, you find surprises and this is one of our ‘magnetic’ surprises.”

Full report here.

via Tallbloke’s Talkshop

https://ift.tt/2SWl792

February 25, 2020 at 07:38AM

New Video : 3983 Days Left For The Planet

via Real Climate Science

https://ift.tt/2HSyTn5

February 25, 2020 at 05:35AM

Want to catch a photon? Start by silencing the sun

Quantum breakthrough uses light’s quirky properties to boost 3D imaging, paving the way for enhanced performance in self-driving cars, medical imaging and deep-space communications

Stevens Institute of Technology

Image: Even with a mesh screen covering an object (top), the Stevens quantum 3D imaging technique generates images 40,000 times clearer (middle) than current technologies (bottom). Credit: Stevens Institute of Technology

Researchers at Stevens Institute of Technology have created a 3D imaging system that uses light’s quantum properties to create images 40,000 times crisper than current technologies, paving the way for never-before-seen LIDAR sensing and detection in self-driving cars, satellite mapping systems, deep-space communications and medical imaging of the human retina.

The work, led by Yuping Huang, director of the Center for Quantum Science and Engineering at Stevens, addresses a decades-old problem with LIDAR, which fires lasers at distant targets, then detects the reflected light. While the light detectors used in these systems are sensitive enough to create detailed images from just a few photons – minuscule particles of light that can be encoded with information – it’s tough to differentiate reflected fragments of laser light from brighter background light such as sunbeams.
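
The ranging principle itself is simple time-of-flight arithmetic; here is a minimal sketch (the 200-microsecond delay is just an illustrative value):

```python
C = 299_792_458.0  # speed of light in m/s

def lidar_range_m(round_trip_s: float) -> float:
    """The pulse travels to the target and back, so halve the round trip."""
    return C * round_trip_s / 2

# A return after 200 microseconds implies a target ~30 km away -- the
# ballpark range quoted for QPMS-assisted LIDAR later in the article.
print(f"{lidar_range_m(200e-6) / 1000:.0f} km")
```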

“The more sensitive our sensors get, the more sensitive they become to background noise,” said Huang, whose work appears in the Feb. 17 advanced online issue of Nature Communications. “That’s the problem we’re now trying to solve.”

The technology is the first real-world demonstration of single-photon noise reduction using a method called Quantum Parametric Mode Sorting, or QPMS, which was first proposed by Huang and his team in a 2017 Nature paper. Unlike most noise-filtering tools, which rely on software-based post-processing to clean up noisy images, QPMS checks light’s quantum signatures through exotic nonlinear optics to create an exponentially cleaner image at the level of the sensor itself.

Detecting a specific information-bearing photon amid the roar of background noise is like trying to pluck a single snowflake from a blizzard — but that’s exactly what Huang’s team has managed to do. Huang and colleagues describe a method for imprinting specific quantum properties onto an outgoing pulse of laser light, and then filtering incoming light so that only photons with matching quantum properties are registered by the sensor.

The result: an imaging system that is incredibly sensitive to photons returning from its target, but that ignores virtually all unwanted noisy photons. The team’s approach yields sharp 3D images even when every signal-carrying photon is drowned out by 34 times as many noisy photons.
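
The filtering idea can be caricatured with a toy Monte Carlo: tag each detection with a mode label and keep only those matching the outgoing pulse. This is a loose classical analogy (the mode count and photon numbers are invented), not the actual nonlinear-optics implementation of QPMS:

```python
import random

random.seed(0)

N_SIGNAL = 1_000          # detected signal photons (invented number)
NOISE_RATIO = 34          # noise photons per signal photon, as in the article
N_MODES = 1_000           # assumed number of distinguishable modes
SIGNAL_MODE = 0           # mode label "imprinted" on the outgoing pulse

# Signal photons carry the imprinted mode; background photons arrive in
# random modes and only rarely happen to match it.
detections = [SIGNAL_MODE] * N_SIGNAL
detections += [random.randrange(N_MODES) for _ in range(N_SIGNAL * NOISE_RATIO)]

kept = sum(1 for mode in detections if mode == SIGNAL_MODE)
print(f"Raw detections: {len(detections)} "
      f"({NOISE_RATIO} noise photons per signal photon)")
print(f"After mode filtering: {kept} kept, "
      f"of which {kept - N_SIGNAL} are noise leaking through")
```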

“By cleaning up initial photon detection, we’re pushing the limits of accurate 3D imaging in a noisy environment,” said Patrick Rehain, a Stevens doctoral candidate and the study’s lead author. “We’ve shown that we can reduce the amount of noise about 40,000 times better than the top current imaging technologies.”

That hardware-based approach could facilitate the use of LIDAR in noisy settings where computationally intensive post-processing isn’t possible. The technology could also be combined with software-based noise reduction to yield even better results. “We aren’t trying to compete with computational approaches — we’re giving them new platforms to work in,” Rehain said.

In practical terms, QPMS noise reduction could allow LIDAR to be used to generate accurate, detailed 3D images at ranges of up to 30 kilometers. It could also be used for deep-space communication, where the sun’s harsh glare would ordinarily drown out distant laser pulses.

Perhaps most excitingly, the technology could also give researchers a closer look at the most sensitive parts of the human body. By enabling virtually noise-free single-photon imaging, the Stevens imaging system will help researchers create crisp, highly detailed images of the human retina using almost invisibly faint laser beams that won’t damage the eye’s sensitive tissues.

“The single-photon imaging field is booming,” said Huang. “But it’s been a long time since we’ve seen such a big step forward in noise reduction, and the benefits it could impart to so many technologies.”

From EurekAlert!

via Watts Up With That?

https://ift.tt/2SVimVG

February 25, 2020 at 04:06AM