By KEVIN KILTY
This past March 12 the Center of the American Experiment (CAE) released a study of projected power costs for Minnesota on the basis of its new policy mandating 50% renewable energy by year 2030 [1]. This study was soon afterward reported on the blogs PowerLine and Manhattan Contrarian.
Among the assumptions CAE made to calculate levelized cost of energy (LCOE) was that the capacity factor for wind plants supplying Minnesota in year 2030 would average 40% over the course of a year. While this is not as high as the 44% projected by the Energy Information Administration (EIA), or the 40-60% forecast by the National Renewable Energy Laboratory (NREL) for year 2030 [2], it still seemed high to me, and I began a short study of capacity factor to verify these assumptions. As sources of information I searched the EIA's annual State Electricity Profiles along with technical documents of the EIA, the Federal Energy Regulatory Commission (FERC), and NREL.
1. Capacity for Electric Generation
As explained in a technical document accompanying the various EIA State Electricity Profiles [3], capacity factor is the ratio of actual electrical energy generated over a year to the maximum possible energy generation (plant capacity) adjusted by anticipated downtime. This adjustment, known as the availability factor, ranges from 0.97 for thermal plants to 0.99 for wind turbines [4], and is thus a small factor in most capacity calculations. In fact, it is not clear that it is even used at all for wind turbines.
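In code, the definition works out as follows. This is a minimal sketch in Python; the plant sizes and generation figures are hypothetical, and the availability values are the 0.97 and 0.99 quoted above:

```python
def capacity_factor(energy_mwh, capacity_mw, hours=8760, availability=1.0):
    """Ratio of actual annual generation to maximum possible generation,
    with the maximum adjusted downward by the anticipated-downtime
    (availability) factor."""
    max_possible_mwh = capacity_mw * hours * availability
    return energy_mwh / max_possible_mwh

# Hypothetical 100 MW thermal plant generating 500,000 MWh in a year:
cf_thermal = capacity_factor(500_000, 100, availability=0.97)  # ~0.588

# The same calculation for a hypothetical wind plant, where the 0.99
# availability adjustment barely moves the result:
cf_wind = capacity_factor(300_000, 100, availability=0.99)     # ~0.346
```

Note how close the wind result is to the unadjusted ratio (300,000 / 876,000 = 0.342), which is why the availability adjustment matters so little for wind turbines.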
Thus, capacity factor seems a very straightforward concept. Yet, the EIA State Electricity Profiles do not list capacity, but rather include a tab in their spreadsheet labeled Capability. Capability is defined nowhere among the glossaries or technical documents on the EIA site. However, a phone call to a staffer at EIA revealed that capability is most likely the Generating Summer Capacity or Net Summer Capacity [5]. It is, though, undoubtedly the basis of capacity factor calculations.
Seasonal generating capacity is defined in many places, and always in the following way. Net summer capacity is the net power a generating station can deliver to a load during a multi-hour test operating at summer (May through October) conditions. Winter capacity would be similar, except for being tested under conditions appropriate to the winter season of November through April. There is good reason for making such distinctions and tests for thermal plants. During the summer season the condenser side of a thermal plant might be unable to reach its low design temperature, which inhibits its thermal efficiency. During winter, a different set of thermal parasitics comes into play; but these also reduce the net power delivered to load.
When it comes to wind turbines, however, all this careful adjustment for power delivered to load under realistic conditions goes out the window. For wind turbines the net summer capacity is just the nameplate rating of the equipment. Yet, it is patently obvious that summer conditions would never produce net power from a wind turbine into a load at nameplate rating, because the wind does not blow steadily at rated speed.
Table 1 shows, as an example, the extreme (high and low) capacity factors for wind plants in selected states during the year 2017, and the summer or winter months in which these occurred.
| State | Summer Month | Capacity Factor | Winter Month | Capacity Factor |
|-------|--------------|-----------------|--------------|-----------------|
Table 1. Capacity factors for wind energy during summer and winter seasons in selected states. Statistics for year 2017. Data from EIA Annual Electricity Profile reports.
During the summer season, generally in August, capacity factors are always below 25%, and actually decline from west to east across the country to values as low as 16% in Iowa and Minnesota. The actual numbers might change somewhat year to year, but the pattern is clear. Net summer capacity at wind plants is nowhere close to the capability published by the EIA. Compounding this is the typical pattern of electrical power usage, which peaks nationwide during the late summer months of July and August, exactly the times of lowest net capacity in wind plants.
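The monthly figures behind Table 1 are straightforward to recompute from the EIA profile spreadsheets: actual monthly generation over nameplate capacity times the hours in that month. A sketch of the calculation, where the fleet size and generation figure are hypothetical placeholders rather than actual state data:

```python
import calendar

def monthly_capacity_factor(gen_mwh, capacity_mw, year, month):
    """Capacity factor for a single calendar month: actual generation
    divided by nameplate capacity times the hours in that month."""
    days = calendar.monthrange(year, month)[1]
    return gen_mwh / (capacity_mw * days * 24)

# Hypothetical August 2017 numbers for a 1,000 MW state wind fleet:
cf_aug = monthly_capacity_factor(119_000, 1_000, 2017, 8)
print(f"{cf_aug:.0%}")  # -> 16%
```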
2. A Modest Suggestion
LCOE calculations may use projected average capacity factors ranging from 40% to 60%, but engineers do not design for average conditions. Instead, recognizing that the world presents uncertainties, they design with particular uncertainties in mind. These may be worst-case scenarios, or 99% certainty, or something similar, but never average conditions. We possess wind-speed data at sufficiently fine time scales, measured over long enough periods, to calculate an expected capacity for wind plants in any season or month, and could calculate reasonable figures to any level of certainty required. In fact, even without plant-specific data, a person can take weather station data at nearby sites and make a very reasonable calculation of net capacity in any month. Yet, we don't bother. I suggest we should.
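One way such a calculation might look, sketched with a simplified cubic power curve and synthetic wind-speed data standing in for real weather-station records. The cut-in, rated, and cut-out speeds and the Weibull parameters here are illustrative assumptions, not measured values:

```python
import random

def power_curve(v, cut_in=3.0, rated_v=12.0, cut_out=25.0):
    """Simplified turbine power curve (fraction of nameplate): zero below
    cut-in and above cut-out, cubic ramp to rated speed, flat at 1.0."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_v:
        return 1.0
    return ((v - cut_in) / (rated_v - cut_in)) ** 3

def monthly_cf_samples(wind_speeds_by_month):
    """One capacity-factor estimate per historical month of hourly speeds."""
    return [sum(map(power_curve, speeds)) / len(speeds)
            for speeds in wind_speeds_by_month]

def design_capacity_factor(samples, confidence=0.99):
    """The capacity factor exceeded in `confidence` of historical months;
    a design value, not an average."""
    ordered = sorted(samples)
    idx = int((1 - confidence) * len(ordered))
    return ordered[idx]

# Synthetic example: 120 months of hourly speeds drawn from a Weibull
# distribution (scale 7.0 m/s, shape 2.0 -- typical-looking, but assumed).
random.seed(1)
months = [[random.weibullvariate(7.0, 2.0) for _ in range(720)]
          for _ in range(120)]
samples = monthly_cf_samples(months)
print(f"mean CF: {sum(samples)/len(samples):.0%}, "
      f"99% design CF: {design_capacity_factor(samples):.0%}")
```

With real hourly wind records substituted for the synthetic data, the same percentile approach yields a defensible monthly design capacity at whatever confidence level the grid planner requires.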
To add a realistic summer capacity to other measures of capability would have two positive effects on discussions of renewable energy.
First, with regard to discussions of power supply margins, one notices that government agencies treat the margin that new renewables add to the grid on the basis of nameplate rating [6]. This has little negative effect as long as wind and solar remain minor contributors to the seasonal net power landscape. However, as renewables attempt to reach the 40% to 50% of capacity envisioned in renewable portfolios, people will eventually have to acknowledge that 1 MW of nameplate wind power added to a system is actually only about 160 kW in the operation of that system in August.
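The arithmetic here is a simple derating of the nameplate rating by the worst-month capacity factor, 16% being the August figure cited above:

```python
def effective_capacity_kw(nameplate_kw, worst_month_cf):
    """Nameplate rating derated by the worst-month capacity factor."""
    return nameplate_kw * worst_month_cf

# 1 MW of nameplate wind at a 16% August capacity factor:
print(effective_capacity_kw(1_000, 0.16))  # -> 160.0 kW
```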
Second, the so-called penetration of renewables into a system is often computed on the basis of nameplate rating, or on actual generation with renewables given priority to the grid. This has the pernicious effect of making renewables appear simple to integrate, and making renewable portfolios appear trouble-free to mandate. Offering something like a worst-summer-month net capacity to augment other capability measures for renewable plants wouldn't perfectly capture the complexity that uncertainties present, but it would inject more realism into any discussion. It would at minimum provide an explicit nod to the amount of overbuild required to make a reliable grid from renewables.
[1] Isaac M. Orr, Mitch Rolling, and John Phelan, "Doubling Down on Failure: How a 50 Percent by 2030 Renewable Energy Standard Would Cost Minnesota $80.2 Billion." Find at: https://www.americanexperiment.org
[2] Dykes et al., "Power Plant of the Future through Science-Based Innovation," National Renewable Energy Laboratory, Technical Report NREL/TP-5000-68123, August 2017.
[3] These can be found online; using Colorado as an example: https://www.eia.gov/electricity/state/colorado/
[4] For example, at the Duke Energy-owned Top of the World wind farm in Wyoming, the nameplate ratings of the various wind turbines add to 200.2 MW, and the EIA lists the plant capacity as 200 MW, a ratio of about 0.999, well above an availability factor of 0.99.
[5] This exercise of chasing down the true meaning of terms always reveals that definitions are slippery and sometimes difficult to locate; rarely does anyone exhibit complete confidence about how particular figures are calculated.
[6] For instance, refer to the assessment the Offices of Electric Reliability and Enforcement at FERC made for summer 2017, found at: https://www.ferc.gov/market-oversight/reports-analyses/mkt-views/2017/2017-summer-assessment.pdf
via Watts Up With That?
April 9, 2019 at 08:07PM