
The Black Swan and the Bell Curve

Many developments in the last few years have suggested that general public concern about global warming is much shallower than one might have supposed earlier in the decade, most recently Germany's decisions to end reliance on nuclear energy entirely and accelerate the phase-out of currently operating reactors. That means much more dependence in the next decade on domestic coal and natural gas mainly imported from the Russian sphere, and probably somewhat greater imports of nuclear-generated electricity from France. And it means that Germany is very unlikely to meet its ambitious greenhouse gas reduction goal--a cut of 40 percent by 2020, vis-a-vis the 1990 level.

Deutsche Bank now predicts that greenhouse gas emissions from Germany's power sector will more than double in the next decade as a result of the country's new course, an astonishing prospect for a country that until now had been a leader in global efforts to address climate change.

The retreat of Republican senators John McCain and Lindsey Graham, who earlier had cosponsored cap-and-trade greenhouse gas reduction bills; the refusal of Congress to take up such a bill and the Obama administration's decision not to fight for one; the global breakdown of the Kyoto process and the acquiescence of the once-activist European nations to a program of purely voluntary carbon cuts; the relative indifference of general publics to the whole subject of global warming and climate change, despite a plague of violent and weird weather that has afflicted the United States, western Europe, Russia, China, and Pakistan in the last few years--what's going on?

"Planet Earth Doesn't Know How to Make It Any Clearer It Wants Everybody to Leave," led the satirical weekly The Onion last week. In this week's opener to The New Yorker, Elizabeth Kolbert muses in a more serious vein about why people aren't getting the message, despite the consistency of extreme weather events with climate model predictions. Why did the president brood about such events being "beyond our power to control," she wonders, in his recent visit to tornado-ravaged Joplin, Missouri?

Of course the president is not exactly wrong. Unusually violent events will take place occasionally under any circumstances. And--the public probably grasps--the likelihood of such events will continue to increase even if we all start sharply cutting greenhouse gas emissions tomorrow, because of the warming "inertia" already injected into the system.

What's easily lost sight of in that kind of reflection, however, is the simple fact that we can work to reduce the probability of ever-more extreme climate events, or not. Right now, we have drifted into a zone where global greenhouse gas concentrations are much higher than they have been at any time in the history of Homo sapiens and close to 50 percent higher than they were at the dawn of the industrial revolution, 250 years ago. We are in uncharted waters and getting deeper into those waters all the time. The rational decision is to reverse course and start navigating our way out of those waters as fast as can be prudently done.

How fast is that? There are those--a lot of those--who argue, in light of the Fukushima and other worse-than-worst case nuclear accidents, that we don't need to do anything so desperate as resort to more reactor construction just to mitigate the risks of climate change. Respectfully, I would suggest that the risks associated with global warming are of a completely different order than the risks attached to atomic energy. For the weird weather we've seen so far may be no more than a nasty foretaste of what's to come.

Suppose, to take a different kind of worse-than-worst case, some catastrophic combination of flood, drought, and pest conspired to completely shut down one of the world's major breadbaskets for three or four years--say Illinois-Iowa, the Yangtze or Yellow River basin, the Mekong, Ganges, Indus, or Nile. The effect would be a global food shortage, mass starvation, and global convulsions the like of which have never been seen before.

To focus for a moment on the Illinois-Iowa scenario, standard business as usual projections say that this part of the Midwest will have, by the end of the century, a climate like that in East Texas or Alabama--a temperate climate zone will have changed to a subtropical one characteristic of the deep South.

That's the mainstream scenario. But what if things were even worse than that, much worse? "Black swan" scenarios associated with the mathematicians Nassim Nicholas Taleb and Benoit Mandelbrot postulate that seemingly very improbable events take place because the probability distribution turns out (mainly in hindsight!) to have "fat tails"--that is, instead of conforming to the "normal" bell-curve shape, the distribution has a form in which the probability of extreme events is greater than normal.

We don't actually have to resort to black swan theory, even if geoscience is one of its more obvious applications. The politically conservative Chicago jurist Richard A. Posner, in Catastrophe: Risk and Response (2004), has pointed out that the more uncertain we are about mainstream climate projections, the more we should be worrying about worse-than-worst case scenarios. Thus, he suggests, climate skeptics are shooting themselves in the foot when they emphasize uncertainties. By the symmetrical logic of the bell curve, if climate scientists are underestimating the probability that things will turn out not nearly as bad as they predict, they also are underestimating the probability that things will turn out much worse.
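To make the fat-tail point concrete, here is a small illustrative calculation comparing the chance of a "5-sigma" event under a thin-tailed normal distribution and under a Cauchy distribution, a standard stand-in for fat tails with a closed-form tail probability. This is purely a sketch of the statistical idea; real climate variables need not follow either distribution.

```python
import math

def normal_tail(x):
    """P(Z > x) for a standard normal (thin-tailed) variable."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def cauchy_tail(x):
    """P(X > x) for a standard Cauchy (fat-tailed) variable, closed form."""
    return 0.5 - math.atan(x) / math.pi

x = 5.0
print(f"normal tail: {normal_tail(x):.2e}")  # ~2.9e-07
print(f"cauchy tail: {cauchy_tail(x):.2e}")  # ~6.3e-02
```

Under the normal curve a 5-sigma event is essentially impossible; under the fat-tailed curve it happens more than 6 percent of the time, a difference of over five orders of magnitude. That gap is why the shape of the tail, not the central forecast, dominates worst-case risk assessments.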

How much worse could they be? In an agricultural collapse scenario, we could be talking about millions, hundreds of millions or even billions of deaths--not the "mere" 20,000-30,000 premature deaths that have followed from the Chernobyl accident.


The Financial Times reported yesterday that 46 percent of the United States has been either abnormally wet or abnormally dry this spring; 21 percent is normal. "It is highly improbable that the remarkable extreme weather events of 2010 and 2011 could have all happened in such a short period of time without some powerful climate-altering force at work," said Southern Methodist University business professor Bernard Weinstein, quoted in the FT. "I expect that by 20 to 30 years from now, extreme weather years like we witnessed in 2010 will become the new normal."

The re-insurance company Munich Re has arrived at a similar conclusion: "The only plausible explanation for the rise in weather-related catastrophes is climate change. The view that weather extremes are more frequent and intense due to global warming coincides with the current state of scientific knowledge."

Case for Accelerating Dry Cask Storage of Spent Nuclear Fuel

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: John Boyd is an IEEE Spectrum contributor reporting from Kawasaki, Japan. This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer and our timeline.

A newly released report from the International Panel on Fissile Materials contains information that implicitly bolsters the case for moving spent fuel out of cooling ponds and into dry cask storage, both in the United States and in most other parts of the world as well. After 9/11 it already was apparent that fuel in cooling ponds could make a tempting target for terrorists--and one much easier to hit than reactor cores. Now, in the wake of the dangerous fire in the Fukushima cooling pond, the case for accelerating dry cask storage is inescapable.

With plans for permanent disposal of nuclear wastes stalled just about everywhere except for Finland and Sweden, spent fuel should be moved as fast as possible out of cooling ponds and into dry casks.

What does that mean? As the Fissile Materials report usefully explains, "In dry cask storage, spent fuel assemblies are typically placed in steel canisters that are surrounded by a heavy shielding shell of reinforced concrete, with the shell containing vents allowing air to flow through to the wall of the canister and cool the fuel. A typical dry cask for Pressurized Water Reactor fuel contains about 10 tonnes of spent fuel, roughly one half of an annual discharge from a 1 GWe reactor." The large cylindrical containers (seen in the Nuclear Regulatory Commission photo above) generally are located close to reactor sites in the United States, but are much "harder" than the spent fuel ponds also typically found at the sites.

Worldwide, about 90 percent of spent fuel is in vulnerable cooling ponds and only a tenth in dry casks, according to the report. The numbers are somewhat better for the United States, where, of roughly 64,500 tonnes of heavy metal (uranium and plutonium, basically), 15,250--almost a quarter of the total--is in dry casks. Princeton University's Harold Feiveson, a participant in the ongoing fissile materials studies, points out that some 50,000 tonnes or more could be promptly moved to dry casks, having already cooled for the requisite five years.

(The U.S. reactor fleet produces about 2,000 tonnes of spent fuel a year, so of the 65,000 tonnes of total spent fuel, only about 10,000 tonnes still needs to be in cooling ponds.)
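The arithmetic in that parenthetical can be laid out explicitly, using the round figures quoted above from the report (all in tonnes of heavy metal):

```python
# Back-of-envelope check of the dry-cask arithmetic, using the
# round figures cited in the text (tonnes of heavy metal).

TOTAL_SPENT_FUEL = 64_500   # total U.S. spent-fuel inventory
ANNUAL_DISCHARGE = 2_000    # produced by the U.S. fleet per year
COOLING_YEARS = 5           # minimum pond-cooling time before casking

# Only the last five years of discharges must remain in ponds.
must_stay_in_ponds = ANNUAL_DISCHARGE * COOLING_YEARS   # 10,000 t
could_be_casked = TOTAL_SPENT_FUEL - must_stay_in_ponds  # 54,500 t

print(must_stay_in_ponds, could_be_casked)
```

The result, roughly 54,500 tonnes eligible for casking, is consistent with Feiveson's figure of 50,000 tonnes or more.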

Robert Alvarez, a former senior official in Bill Clinton's Energy Department, recently made the same argument. Writing in the Huffington Post, Alvarez said that the United States "should promptly take steps to reduce these risks [associated with cooling ponds] by placing all spent nuclear fuel older than five years in dry, hardened storage casks [as] Germany did 25 years ago. It would take about 10 years at a cost between $3.5 and $7 billion. If the cost were transferred to energy consumers, the expenditure would result in a marginal increase of less than 0.4 cents per kilowatt hour for consumers of nuclear-generated electricity."
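Alvarez's "less than 0.4 cents per kilowatt-hour" figure is easy to sanity-check by spreading the high end of the cost over a decade of U.S. nuclear generation. The 800 TWh/year figure below is my own rough assumption for annual U.S. nuclear output, not a number from his article:

```python
# Sanity check on the quoted "less than 0.4 cents per kWh":
# spread the high-end cost over ten years of U.S. nuclear generation.

TOTAL_COST_USD = 7e9         # high end of the $3.5-7 billion estimate
YEARS = 10
NUCLEAR_TWH_PER_YEAR = 800   # assumed annual U.S. nuclear generation

total_kwh = NUCLEAR_TWH_PER_YEAR * 1e9 * YEARS  # kWh over the period
cents_per_kwh = TOTAL_COST_USD * 100 / total_kwh

print(f"{cents_per_kwh:.2f} cents/kWh")
```

Under these assumptions the surcharge comes out below a tenth of a cent per kilowatt-hour, comfortably inside Alvarez's "less than 0.4 cents" bound.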

Dry casks at Fukushima, Alvarez notes, weathered the tsunami and earthquake unscathed, unlike the reactors and cooling ponds.

Arguably, with 9/11 the large quantity of U.S. spent fuel being held in unhardened cooling ponds already represented a national emergency. Now, with the collapse of plans for a permanent geologic spent fuel repository at Yucca Mountain, Nevada, the situation is more urgent than ever.

The logic behind the selection of Yucca Mountain grew from a perception that the United States should just pick a place to put its spent fuel, already, and get on with the job. Since the area around the U.S. nuclear weapons testing area was already highly contaminated, what serious objection could there be to putting well-secured reactor wastes in the vicinity? But it turned out that Nevadans objected strenuously, and with their voices ever more significant in closely contested presidential elections, they ultimately prevailed and killed the project. As it happens, says the fissile materials report, Yucca Mountain turns out to have been "a poor choice on technical grounds." Ideally, spent fuel should be stored in oxygen-free conditions, for example in deep granite or clay. But Yucca Mountain would have been exposed to flows of oxygen-rich water.

The mission of the International Panel on Fissile Materials is to establish a technical basis for securing, consolidating and reducing stockpiles of highly enriched uranium and plutonium, so that they will be less available for use in nuclear weapons by errant states or terrorists. This latest report, however, has special salience because of its relevance to nuclear accident scenarios and power plant security.

Big City Climate Meeting in Mega-Sao Paulo

The Earth System Research Laboratory on Hawaii's Mauna Loa reported at the end of May that the atmospheric carbon dioxide concentration is now 395 ppm (394.97, to be exact), 46 percent higher than the pre-industrial level. Meanwhile, the International Energy Agency reported that 2010 greenhouse gas emissions, despite the global economic crisis, were at a record-high level.

As the world steadily approaches a landmark where the global carbon dioxide concentration will be 400 ppm and 50 percent higher than the pre-industrial level, experts are concluding that we will be unable to contain the increase in average global temperatures to 2 degrees Celsius, our official goal. Evidently we're on track for a 4 degree rise by the end of this century.

Yet the major world economies are deadlocked over whether to stick with the Kyoto program of agreed-upon greenhouse gas reductions, the United States having opted out because of concerns about countries like China and India, and the emerging market economies having refused to opt in because of concerns about how their growth prospects would be affected. 

So, under the circumstances, it's perhaps not surprising that others are seizing the initiative. Representatives of 40 large metropolises (the C40) convened in Sao Paulo last week to discuss what they can do. Half the world's people live in cities, where they consume two thirds of the world's energy and generate 70 percent of our greenhouse gas emissions. 

The C40, originally brought together by Bill Clinton, now is chaired by New York City Mayor Michael Bloomberg. Mega-egos from mega-cities confronting a mega-problem, perhaps the greatest ethical challenge of our day. Can they succeed?

I'm old enough to remember when I first heard that Sao Paulo, a place I had barely heard of, was the world's largest city. Now the world's largest city is most likely Chongqing, on the Yangtze River just upstream from the Three Gorges Dam. Soon it will be some other place we've never heard of. Globalization, industrialization, and urbanization are occurring at such a breakneck pace, it's hard to have faith we'll be able to cope successfully with the ramifications.

The C40 cities agreed last week to regularize a system of emissions accounting, to be submitted to the next big global climate meeting, in Durban, South Africa, next November. The World Bank agreed to establish a single office where cities can go for one-stop shopping for climate action grants. Bloomberg's foundation promised to give the C40 $6 million per year for the next three years to support its activities, a 12-fold increase over what it had been getting from the Clinton Foundation. (Bill Clinton ceded personal leadership of the group to Bloomberg, an aide to the former president commenting that the "golden rule" applied: He who has the gold rules.)

Making progress on climate depends partly on many people doing many small things in many places, as Bloomberg said in Sao Paulo. (He is seen above unveiling an electric car charging station in New York, last year.) Around the world, cities are tightening energy efficiency standards, accelerating installation of LED lighting, promoting lower-emissions vehicles, and so on.

But progress also depends, even more crucially, on the really big players doing really big things. Though there is much the cities can do to conserve energy, improve energy efficiency, and promote greener technologies, ultimately, unless something is done about how the huge quantities of energy they consume are produced, the climate problem will keep getting worse. Inevitably, then, the emphasis of city programs will be not on slowing climate change but on adapting to it. And the cities that will shine will be those, like Chicago, that prove to have done the most the fastest to be ready for climate change when things get really bad.

Italian Voters Turn Against Nuclear Power Ahead of Referendum

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.


Italy is hurtling towards a referendum on nuclear power this month that could deliver yet another blow to the beleaguered low-carbon energy option, following recent reversals in Switzerland, Germany and Japan. Political graffiti and propaganda that I recorded last week in Genoa mirror opinion polls that show Italian voters souring rapidly on nuclear energy.

Fukushima Mon Amour (top right) is a riff on the 1959 French film Hiroshima Mon Amour, set in post-war Japan. Another image found on Genoa's medieval walls (lower right) closes with one of the Italian language's strongest insults, porcodio, to read: No to god-damned nuclear.

For Italy, this month's referendum is a case of deja vu: The country shut down the last of its four nuclear reactors in 1990, after voters approved an antinuclear referendum inspired by the 1986 Chernobyl disaster. That made Italy the only G8 nation without nuclear power. Italy's top court ordered a second referendum for June 12 and 13 of this year after Italian Premier Silvio Berlusconi passed legislation to restart Italy's nuclear program, proposing to supply a quarter of Italy's power via nuclear energy by 2030.

Last week, sensing an impending loss at the polls, Berlusconi's government put off the plan for two years and sought to scrap this month's referendum. If the courts nevertheless allow the vote to go forward, Italy's nuclear renaissance could be nipped in the bud.

But for all the anti-nuclear indignation that Italian voters may muster, their rejection of nuclear power is likely to be less than definitive. Roughly 10 percent of Italy's electricity is now nuclear power imported from France and Switzerland, according to the World Nuclear Association, an industry group.

Increasingly Fractious Questions about Gas Fracking

The U.S. natural gas industry has been advertising itself as a bridge to an energy future belonging to wind, and that may be an understatement or even be the opposite of the truth: It may turn out--when we look back on the present day from mid century or later--that wind will have been a bridge to what's called in the industry "unconventional" gas. That refers in theory both to methane obtained as a byproduct of coal mining and to gas extracted by means of hydraulic fracturing or "fracking." But for all practical purposes--so dramatic has been the revolution in shale gas--it refers just to fracking.

But the higher its profile, the more unconventional gas is destined to be questioned. New York State has put a moratorium on gas drilling to protect New York City's pristine water supply. France has flatly banned it, at least till further notice. And this last week, in ExxonMobil's and Chevron's annual shareholder meetings, resolutions calling for more transparency in corporate fracking plans garnered substantial support--41 percent of voting stockholders in the case of Chevron, 28 percent in the case of ExxonMobil.

Concerns about integrity of water supplies are paramount, but they are not the only concerns. (A recent study published in the Proceedings of the National Academy of Sciences found 17 times as much methane in water wells located in fracking areas as the average in non-fracking areas.) Another major one from a policy standpoint is leakage of gas from drilling operations and distribution systems. In terms of public policy, one of gas's selling points is its relatively small carbon footprint--for any given amount of electricity generated, emissions from gas-fired turbines are about half those from a coal-fired generating plant. But methane is a much more potent greenhouse gas than carbon dioxide, and so if large quantities are leaking into the atmosphere, the supposedly benign impact of switching from coal to gas could be cancelled.
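How much leakage would it take to cancel gas's advantage? A rough calculation is possible from combustion stoichiometry. The sketch below uses round illustrative assumptions, not measured values: coal at 1.0 kg CO2/kWh, gas at half that (as stated above), and a 100-year global warming potential of 25 for methane.

```python
# Rough breakeven: at what methane leak rate does switching from coal
# to gas stop helping the climate? All inputs are round assumptions.

CO2_PER_KWH_COAL = 1.0    # kg CO2/kWh, typical coal plant (assumed)
CO2_PER_KWH_GAS = 0.5     # kg CO2/kWh, "about half" of coal
CO2_PER_KG_CH4 = 44 / 16  # stoichiometry: CH4 + 2 O2 -> CO2 + 2 H2O
GWP_METHANE = 25          # 100-year warming potential (assumed)

fuel_burned = CO2_PER_KWH_GAS / CO2_PER_KG_CH4  # kg CH4 burned per kWh

def effective_emissions(leak_fraction):
    """kg CO2-equivalent per kWh when a fraction of produced gas leaks."""
    leaked = fuel_burned * leak_fraction / (1 - leak_fraction)
    return CO2_PER_KWH_GAS + leaked * GWP_METHANE

# Bisect for the leak rate at which gas is no better than coal.
lo, hi = 0.0, 0.5
for _ in range(60):
    mid = (lo + hi) / 2
    if effective_emissions(mid) < CO2_PER_KWH_COAL:
        lo = mid
    else:
        hi = mid

print(f"breakeven leak rate ~ {lo:.1%}")
```

Under these assumptions the breakeven leak rate comes out near 10 percent, uncomfortably close to the 7.9 percent figure discussed below; using methane's much higher 20-year warming potential would push the breakeven rate lower still.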

Recent studies have found that in the United States "as much as 7.9 percent of [natural gas is] puffing out from shale gas wells, intentionally vented or flared, or seeping from loose pipe fittings along gas distribution lines," The New York Times reported. This is an important subject we will know a lot more about when scientists start to directly measure greenhouse gas emissions on a regional basis, an endeavor that's in its infancy. To date, estimates of national emissions are derived from known sources, most importantly electricity generators and gasoline (petrol) burned in internal combustion engines. Direct emissions measurements taken in the atmosphere may reveal significant sources that have been underestimated in the past.

Smart Grid Developments

California's plan to have smart meters installed universally by the end of next year is running into some opposition in PG&E's northern territory. According to a recent report in EnergyBiz magazine, some customers believe they are "suffering health effects including migraine headaches, heart palpitations and nausea from the emissions of the radio frequency meters." Because of such complaints, the state's utility commission has ordered the utility to prepare some kind of opt-out option for disgruntled customers. It remains to be seen how much the development may impede installation of some 17 million advanced meters.

On a more positive note, Siemens is doing a two-year test in a German city to evaluate a self-organizing, automated electricity distribution system in which an increased share of energy from renewables must be accommodated. To be carried out in cooperation with the local utility, a local college, and the Aachen technical university, the project is dubbed Irene, for Integration of Renewable Energies and Electric Mobility. Besides incorporating power from wind turbines, solar cells, and biogas generators, Irene will provide for electric vehicle charging, and for the EVs to store electricity. The "Siemens role in this smart grid pilot project involves installation of software developed by its global research [arm] Corporate Technology," says the company's press release.

Siemens also has announced a global contest in which five smart grid business or technology proposals that could "help the world become a better place" will be recognized: "The world around us is changing. To lower CO2 emissions, we need to rely on renewable energy sources. For that, the current energy network needs to become more flexible and intelligent," says the contest site. Winners will receive $21,000 each and a trip to Berlin to meet with Siemens smart grid experts. Siemens has also promised to spend more than $1.5 million to test the best ideas in the real world. Proposals will be accepted through June 15.

Not Neutral on Nukes: Switzerland to Phase Out Nuclear Power

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.

Following the downward trend on nuclear power in Europe, Switzerland has announced plans to allow its five reactors to reach the ends of their 50-year lifespans; no new nuclear facilities would then be built to replace them. The Swiss Federal Council cited the Fukushima disaster as the push to phase out nuclear power, writing in a press release that "the people of Switzerland would like to see a reduction in the residual risk associated with the use of nuclear energy."

Switzerland, a country of 7.8 million people -- fewer than live in New York City -- currently gets 39 percent of its electricity from those five reactors. Amazingly, another 56 percent comes from hydropower, with only five percent coming from more conventional sources.

The Federal Council's decision -- which will now go to the country's Parliament for debate -- apparently will cost the country up to 0.7 percent of its GDP. A renewed focus on efficiency, expanded hydropower, and a reduction in overall energy consumption are among the ways Switzerland hopes to replace the lost power generation.

The first of the five reactors -- one of two at Beznau, 30 miles northwest of Zurich -- is due to be decommissioned in 2019. The others would go offline by 2034.

This appears to be much more of a long-term policy decision than a fear that a Fukushima-type disaster could befall the country. As the press release notes: "The Federal Council sees no reason to seek early decommissioning. Tests conducted by the Swiss Federal Nuclear Safety Inspectorate have shown that the safe operation of Switzerland's nuclear power plants is currently assured." Switzerland follows others in Europe -- especially Germany -- in attempts to phase out nuclear power.

Image of Gosgen plant in Switzerland via Luigi Rosa

Radiation Risks

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer.

The Fukushima catastrophe was a worse than worst case accident set off by a much worse than anticipated combination of tsunami and earthquake--"beyond imagination," as a former manager of the Dai-1 reactor put it. But in one way Japan got lucky: Even though there was a radiation release equivalent to a significant fraction of Chernobyl's--perhaps as high as 20 percent--most of the radiation happened to be carried out to sea by winds and water, rather than spreading to the country's densely populated regions.

Nonetheless, the accident showed that large radiation releases can occur in a light water reactor accident, not just in a defectively designed and operated Soviet-era reactor. So it's important to be clear about Chernobyl's long-term health impact.

Somehow the notion has got abroad that Chernobyl's health impact was minimal. This is because the illnesses and premature deaths statistically expected from a Chernobyl-scale radiation release cannot be detected, individually or even collectively, among the hundreds of millions of cancer cases that would occur anyway, and because of continued debate about morbidity and mortality at low radiation dose levels.

What then is the basis for assigning thousands or even tens of thousands of premature deaths to Chernobyl?

Probably contrary to some popular misconceptions, the effect of exposure to radiation is relatively well understood. Because radiation can be precisely measured and leaves a clear trail, the health impact of exposure is perhaps "the best understood, and certainly the most highly quantified dose-response relationship for any common environmental human carcinogen," a report of the U.S. National Cancer Institute concluded.

Committees of the National Academies have produced a succession of studies on the Biological Effects of Ionizing Radiation, and the seventh of them, the 2006 BEIR VII report, is widely considered to have produced the most authoritative estimates. They are based mainly on detailed studies of Hiroshima and Nagasaki bombing survivors, but also populations exposed to atmospheric nuclear weapon testing, workers in the nuclear industry, and so on. (A useful concise account can be readily found by searching online on "a summary of BEIR VII," which is by the chairman and cochair of the committee.)

When the BEIR VII dose-response estimates are applied to populations exposed to Chernobyl radiation, as students of the subject like Tom Cochran and Elisabeth Cardis have done, the result consistently is in the range of 20,000-30,000 deaths; the range of uncertainty admittedly is large, ranging from several thousand to 70,000 or 80,000. So we can and should say, based on the best available science, that Chernobyl is causing at least several thousand deaths and probably about 25,000.
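The structure of such an estimate is simple under the linear no-threshold model: multiply the collective dose received by the exposed population by a risk coefficient. The sketch below uses round illustrative numbers of my own choosing, a fatal-cancer risk of roughly 5 percent per person-sievert (the approximate magnitude implied by BEIR VII) and an assumed global collective dose, so the point is the form of the calculation, not the precise inputs:

```python
# Sketch of a linear no-threshold death estimate: collective dose
# times a risk coefficient. Both inputs are round illustrative values.

RISK_PER_PERSON_SV = 0.05            # fatal cancers per person-Sv (approx.)
COLLECTIVE_DOSE_PERSON_SV = 500_000  # assumed global collective dose

expected_deaths = RISK_PER_PERSON_SV * COLLECTIVE_DOSE_PERSON_SV
print(expected_deaths)  # 25000.0
```

The wide published uncertainty range reflects uncertainty in both inputs, the collective dose actually received and the risk per sievert at low doses.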

To be sure, not all national regulatory authorities have accepted the BEIR VII results as definitive, among them France's. That may be why a 2008 United Nations report declined to assign any specific number to long-term health effects of Chernobyl. Some, unfortunately, have chosen to take that as a statement there have been no long-term effects.

Ten years after the accident, assessing Chernobyl's health effects, Spectrum magazine referred to them in a headline as "stressful." That referred partly to the demonstrable fact that near-term adverse effects were mainly psychological, though not to be minimized. But the headline also meant to highlight the seemingly paradoxical contrast between higher-than-expected incidence of thyroid cancer, mainly in children, and lower-than-expected leukemia and solid cancer incidence.

In hindsight, we could and should have made it clearer that there was little prospect of detecting and measuring leukemia and solid cancer incidence at the expected rates. For the most stressful aspect of the Chernobyl results, in the final analysis, is that the accident surely caused thousands of deaths that cannot actually be identified. The Chernobyl fatalities, in that sense, are like the deaths of unknown soldiers.

Conversion Records and the Promise of Super-Efficient Solar

Increasing the share of the energy hitting a solar cell that can be converted into usable electricity, to the point where solar becomes cost-competitive with other energy sources, always seems just out of reach. But as more promising developments appear, perhaps we will see the goal become reality soon. Two stories in that category have cropped up in recent days.

First, a research group in Switzerland has increased the efficiency record for flexible cells made from what is known as CIGS -- copper indium gallium selenide -- from 17.6 percent to 18.7 percent. A minor upgrade, maybe, but it's up from 14.1 percent as recently as 2005, and each tiny improvement brings the technology closer to mass adoption.

As Gian-Luca Bona, the director of Empa, the Swiss Federal Laboratories for Materials and Technology, said in a press release: "Next, we need to transfer these innovations to industry for large scale production of low-cost solar modules to take off." This is a key point -- we often hear of efficiency improvements or conceptual designs in a laboratory, but bringing these technologies to the point where they can be mass produced is another phase altogether. Apparently, scientists at Empa are actually focused on that goal, working with a Swiss startup called Flisom to bring the thin-film solar cells to market.

And in less practical but potentially, possibly, in-a-few-years game-changing news, researchers at the University of Missouri have used what they call a "nantenna" to create another flexible solar design that theoretically could provide -- amazingly -- 95 percent conversion efficiency. To be clear, this work is in a very early stage of development, and at first it will only be capable of harvesting excess heat from industrial processes and the like. But as a Missouri associate professor of chemical engineering said: "If successful, this product will put us orders of magnitudes ahead of the current solar energy technologies we have available to us today."

Such statements do tend to give me pause, as there are any number of plans out there that will theoretically boost solar efficiencies into the stratosphere -- like, say, this singlet fission idea, or these super thin-film cells that can bring 12-fold improvements. Of course, none of these have yet revolutionized solar power development. Hopefully, one or more of them will start that process soon.

(Image via Empa)

Smart Grid's Experimental Phase

Medium-term prospects for the smart grid will be among the key technology topics addressed next month at the IEEE’s Technology Time Machine conference in Hong Kong. The purpose of the small and, frankly, elite meeting is to assemble people who are betting their corporate and national futures on when critical technologies will mature and take off. To judge from preliminary assessments laid out in a white paper prepared for the conference [PDF], some technologies, such as cloud computing, already are at a hockey-stick inflection point, while others, such as the so-called "Internet of Things," will reach that point in perhaps ten years' time. With the smart grid, due to immense technical challenges and acute engineering shortages, the inflection point may be closer to two decades away.

To put it a little differently, the process of merging traditional power transmission and distribution technology with state-of-the-art communications and sensing technologies is in its infancy. As George W. Arnold and Wanda K. Reder explain in the Time Machine article, the trained engineers and technicians needed to design and build the smart grid are simply not there. The smart grid projects currently underway are therefore essentially experimental, and not every demonstration will go well.

Xcel Energy's Smart Grid City project in Boulder, Colo., featured two years ago in IEEE Spectrum magazine, is a sobering case in point. At the beginning of this year, the Colorado Public Service Commission declined to give Xcel full cost recovery on the project, cutting the requested $44.5 million to $27.9 million. The regulator said benefits to local ratepayers could not be demonstrated and that Xcel's shareholders would have to cover the difference. The poorly planned and executed project appears to have been a fiasco.

