American Scientist, May–June 2011


Global Energy: The Latest Infatuations

In energy matters, what goes around, comes around—but perhaps should go away

Vaclav Smil

Renewable Renaissance?

Unfortunately, this has led to exaggerated expectations rather than to realistic appraisals. This is true even after excluding what might be termed zealous sectarian infatuations with those renewable conversions whose limited, exceedingly diffuse or hard-to-capture resources (be they jet-stream winds or ocean waves) prevent them from becoming meaningful economic players during the next few decades. Promoters of the new renewable energy conversions that now appear to have the best prospects for significant near-term contributions—modern biofuels (ethanol and biodiesel) and wind and solar electricity generation—do not give sufficient weight to important physical realities concerning the global shift away from fossil fuels: to the scale of the required transformation, to its likely duration, to the unit capacities of new converters, and to the enormous infrastructural requirements that result from the inherently low power densities with which we can harvest renewable energy flows and from their immutable stochasticity.

[Figure: 2011-05SmilF4.jpg]

The scale of the required transition is immense. Ours remains an overwhelmingly fossil-fueled civilization: In 2009 it derived 88 percent of its modern energies (leaving aside traditional biomass fuels, wood and crop residues) from oil, coal and natural gas, whose global market shares are now surprisingly close at, respectively, 35, 29 and 24 percent. Annual combustion of these fuels has now reached 10 billion tonnes of oil equivalent, or about 420 exajoules (420 × 10^18 joules). This is an annual fossil fuel flux nearly 20 times larger than at the beginning of the 20th century, when the epochal transition from biomass fuels had just passed its pivotal point (coal and oil began to account for more than half of the global energy supply sometime during the late 1890s).
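These magnitudes can be verified with a back-of-the-envelope conversion. The sketch below uses the standard factor of about 41.87 gigajoules per tonne of oil equivalent (an assumption not stated in the text); the circa-1900 baseline simply follows from the stated 20-fold growth:

```python
# Back-of-the-envelope check of the fossil fuel flux figures quoted above.
GJ_PER_TOE = 41.868  # gigajoules per tonne of oil equivalent (standard conversion factor)

tonnes_oil_equiv = 10e9                        # 10 billion tonnes of oil equivalent per year
joules = tonnes_oil_equiv * GJ_PER_TOE * 1e9   # total annual energy in joules
exajoules = joules / 1e18                      # 1 EJ = 10^18 J

print(f"Annual fossil fuel combustion: ~{exajoules:.0f} EJ")   # ~419 EJ, i.e. "about 420"

# A flux "nearly 20 times larger" than in 1900 implies a baseline of roughly:
print(f"Implied circa-1900 fossil flux: ~{exajoules / 20:.0f} EJ")
```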

Energy transitions—shifts from a dominant source (or a combination of sources) of energy to a new supply arrangement, or from a dominant prime mover to a new converter—are inherently prolonged affairs whose duration is measured in decades or generations, not in years. The latest shift of worldwide energy supply, from coal and oil to natural gas, illustrates how the gradual pace of transitions is dictated by the necessity to secure sufficient resources, to develop requisite infrastructures and to achieve competitive costs: It took natural gas about 60 years from the beginning of its commercial extraction (in the early 1870s) to reach 5 percent of the global energy market, and then another 55 years to account for 25 percent of all primary energy supply. Time spans for the United States, the pioneer of natural gas use, were shorter but still considerable: 53 years to reach 5 percent, and another 31 years to get to 25 percent.

Displacing even just a third of today’s fossil fuel consumption by renewable energy conversions will be an immensely challenging task; how far it has to go is attested by the most recent shares claimed by modern biofuels and by wind and photovoltaic electricity generation. In 2010 ethanol and biodiesel supplied only about 0.5 percent of the world’s primary energy, wind generated about 2 percent of global electricity and photovoltaics (PV) produced less than 0.05 percent. Contrast this with assorted mandated or wished-for targets: 18 percent of Germany’s total energy and 35 percent of electricity from renewable flows by 2020, 10 percent of U.S. electricity from PV by 2025 and 30 percent from wind by 2030 and 15 percent, perhaps even 20 percent, of China’s energy from renewables by 2020.
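The gap between aspiration and current contribution can be made concrete with simple arithmetic on the figures quoted above (the ~420 EJ total, the 88 percent fossil share and the ~0.5 percent biofuel share come from the text; the multiplication is the only addition):

```python
# Rough scale of the displacement task, using figures quoted in the text.
total_primary_EJ = 420        # annual global primary energy supply (text figure)
fossil_share = 0.88           # fossil fuels' share of modern energies in 2009
target_fraction = 1 / 3       # "even just a third of today's fossil fuel consumption"

to_displace_EJ = total_primary_EJ * fossil_share * target_fraction
biofuels_EJ = total_primary_EJ * 0.005   # ethanol + biodiesel: ~0.5% of primary energy

print(f"Energy to displace: ~{to_displace_EJ:.0f} EJ/year")
print(f"Current modern biofuel supply: ~{biofuels_EJ:.1f} EJ/year")
print(f"Required scale-up versus biofuels alone: ~{to_displace_EJ / biofuels_EJ:.0f}x")
```

Even this deliberately modest one-third target is thus roughly 60 times today's entire modern biofuel output.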

Unit sizes of new converters will not make the transition any easier. Ratings of 500–800 megawatts (MW) are the norm for coal-fired turbogenerators, and large gas turbines have capacities of 200–300 MW, whereas typical ratings of large wind turbines are two orders of magnitude smaller, between 2 and 4 MW, and the world's largest PV plant needed more than a million panels for its 80 MW of peak capacity. Moreover, differences in capacity factors will always remain large. In 2009 the load factor averaged 74 percent for U.S. coal-fired stations and 92 percent for nuclear ones, whereas wind turbines managed only about 25 percent; in the European Union their mean load factor was less than 21 percent between 2003 and 2007, and the largest PV plant in sunny Spain has an annual capacity factor of only 16 percent.
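The capacity-factor gap compounds the unit-size gap. A hedged illustration using round numbers consistent with the text (a 600 MW coal unit at a 74 percent load factor, 3 MW turbines at 25 percent — the specific unit sizes are chosen from the quoted ranges, not stated by the author):

```python
# How many wind turbines match the *average* output of one coal-fired unit,
# combining nameplate ratings with the capacity factors quoted in the text.
coal_mw, coal_cf = 600, 0.74   # mid-range coal turbogenerator, 2009 U.S. load factor
wind_mw, wind_cf = 3, 0.25     # typical large wind turbine, ~25% capacity factor

coal_avg = coal_mw * coal_cf   # average power actually delivered, MW
wind_avg = wind_mw * wind_cf

turbines_needed = coal_avg / wind_avg
print(f"Average coal-unit output: {coal_avg:.0f} MW")
print(f"Average output per turbine: {wind_avg:.2f} MW")
print(f"Turbines to match one coal unit: ~{turbines_needed:.0f}")
```

On these assumptions, roughly 200 turbines match the coal unit's nameplate rating, but nearly 600 are needed to match its delivered average output.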

[Figure: 2011-05SmilF5.jpg]

As I write this, a pronounced high-pressure cell brings a deep freeze, and calm lasting for days, to the usually windy heart of North America: If Manitoba or North Dakota relied heavily on wind generation (fortunately, Manitoba gets all of its electricity from flowing water and exports it south), either would need many days of large imports—yet the mid-continent has no high-capacity east–west transmission lines. Rising shares of both wind and PV generation will thus require considerable construction of new long-distance high-voltage lines, both to connect the windiest and sunniest places to major consumption centers and to assure uninterrupted supply when relying on only partially predictable energy flows. As the distances involved are on truly continental scales—be they from the windy Great Plains to the East Coast or, as European plans such as Desertec call for, from the reliably sunny Sahara to cloudy Germany—those expensive new supergrids cannot be completed in a matter of years. And those who fantasize about the imminent benefits of new smart grids should remember that the 2009 report card on American infrastructure gave the existing U.S. grid a near-failing grade of D+.

And no substantial contribution can be expected from the only well-tested non-fossil electricity generation technique that has achieved significant market penetration: Nuclear fission now generates about 13 percent of global electricity, with national shares of 75 percent in France and about 20 percent in the United States. Nuclear engineers have been searching for superior (efficient, safe and inexpensive) reactor designs ever since it became clear that the first generation of reactors was not the best choice for a second, larger wave of nuclear expansion. Alvin Weinberg published a paper on inherently safe reactors of the second nuclear era as early as 1984; at the time of his death in 2003, Edward Teller was working on a design for a thorium-fueled underground power plant; and Lowell Wood argues the benefits of his traveling-wave breeder reactor, fueled with depleted uranium, of which the U.S. stockpile now amounts to about 700,000 tonnes.

But since 2005, construction has begun annually on only about a dozen new reactors worldwide, most of them in China—where nuclear generation supplies only about 2 percent of all electricity—and in early 2011 there were no signs of any Western nuclear renaissance. Except for the completion of the Tennessee Valley Authority's Watts Bar Unit 2 (abandoned in 1988, scheduled to go on line in 2012), there was no construction underway in the United States, and the delays and cost overruns of Europe's supposed new showcase units, Finland's Olkiluoto and France's Flamanville, resembled the U.S. nuclear industry horror stories of the 1980s. Then, in March 2011, an earthquake and tsunami struck Japan, leading to Fukushima's loss of coolant, the destruction of reactor buildings in explosions, and radiation leaks; regardless of the eventual outcome of this catastrophe, these events will cast a long, suppressive shadow over the future of nuclear electricity.
