Energywise

Not Neutral on Nukes: Switzerland to Phase Out Nuclear Power

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.


Following the downward trend on nuclear power in Europe, Switzerland has announced plans to allow its five reactors to reach the ends of their 50-year lifespans; no new nuclear facilities would then be built to replace them. The Swiss Federal Council cited the Fukushima disaster as the push to phase out nuclear power, writing in a press release that "the people of Switzerland would like to see a reduction in the residual risk associated with the use of nuclear energy."

Switzerland, a country of 7.8 million people -- fewer than live in New York City -- currently gets 39 percent of its electricity from those five reactors. Amazingly, another 56 percent comes from hydropower, with only 5 percent coming from more conventional sources.

The Federal Council's decision -- which will now go to the country's Parliament for debate -- apparently will cost the country up to 0.7 percent of its GDP. A renewed focus on efficiency, expanded hydropower, and a reduction in overall energy consumption are among the ways Switzerland hopes to replace the lost power generation.

The first of the five reactors -- one of two at Beznau, 30 miles northwest of Zurich -- is due to be decommissioned in 2019. The others would go offline by 2034.

This appears to be much more of a long-term policy decision than a fear that a Fukushima-type disaster could befall the country. As the press release notes: "The Federal Council sees no reason to seek early decommissioning. Tests conducted by the Swiss Federal Nuclear Safety Inspectorate have shown that the safe operation of Switzerland's nuclear power plants is currently assured." Switzerland follows others in Europe -- especially Germany -- in attempts to phase out nuclear power.

Image of Gosgen plant in Switzerland via Luigi Rosa


Radiation Risks

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency. For more details on how Fukushima Dai-1's nuclear reactors work and what has gone wrong so far, see our explainer.

The Fukushima catastrophe was a worse than worst case accident set off by a much worse than anticipated combination of tsunami and earthquake--"beyond imagination," as a former manager of the Dai-1 reactor put it. But in one way Japan got lucky: Even though there was a radiation release equivalent to a significant fraction of Chernobyl's--perhaps as high as 20 percent--most of the radiation happened to be carried out to sea by winds and water, rather than spreading to the country's densely populated regions.

Nonetheless, the accident showed that large radiation releases can occur in a light water reactor accident, not just in a defectively designed and operated Soviet-era reactor. So it's important to be clear about Chernobyl's long-term health impact.

Somehow the notion has got abroad that Chernobyl's health impact was minimal. This is because the illnesses and premature deaths statistically expected from a Chernobyl-scale radiation release cannot be detected, individually or even collectively, among the hundreds of millions of cancer cases that would occur anyway, and because of continued debate about morbidity and mortality at low radiation dose levels.

What then is the basis for assigning thousands or even tens of thousands of premature deaths to Chernobyl?

Probably contrary to some popular misconceptions, the effect of exposure to radiation is relatively well understood. Because radiation can be precisely measured and leaves a clear trail, the health impact of exposure is perhaps "the best understood, and certainly the most highly quantified dose-response relationship for any common environmental human carcinogen," a report of the U.S. National Cancer Institute concluded.

Committees of the National Academies have produced a succession of studies on the Biological Effects of Ionizing Radiation, and the seventh of them, the 2006 BEIR VII report, is widely considered to have produced the most authoritative estimates. They are based mainly on detailed studies of Hiroshima and Nagasaki bombing survivors, but also on populations exposed to atmospheric nuclear weapons testing, workers in the nuclear industry, and so on. (A useful concise account, written by the chairman and cochair of the committee, can readily be found by searching online for "a summary of BEIR VII.")

When the BEIR VII dose-response estimates are applied to populations exposed to Chernobyl radiation, as students of the subject like Tom Cochran and Elisabeth Cardis have done, the result is consistently in the range of 20,000 to 30,000 deaths; the uncertainty admittedly is large, ranging from several thousand to 70,000 or 80,000. So we can and should say, based on the best available science, that Chernobyl is causing at least several thousand deaths and probably about 25,000.
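For a feel for where such numbers come from, here is a minimal back-of-the-envelope sketch of a linear no-threshold calculation. The collective-dose figures and risk coefficient are illustrative assumptions in the general ballpark of published values, not numbers taken from BEIR VII or from Cochran's or Cardis's analyses.

```python
# Back-of-the-envelope LNT estimate of excess cancer deaths from a large
# radiation release. All inputs are illustrative assumptions, not values
# taken from BEIR VII or any specific Chernobyl study.

def excess_cancer_deaths(collective_dose_person_sv, risk_per_sv):
    """Linear no-threshold model: expected excess fatal cancers are
    proportional to the population's collective dose."""
    return collective_dose_person_sv * risk_per_sv

# Assumed lifetime fatal-cancer risk coefficient, roughly 5 percent per
# sievert (the order of magnitude commonly quoted for low-dose-rate exposure).
RISK_PER_SV = 0.05

# Assumed range of collective dose to the exposed population, in
# person-sieverts (published estimates span a wide range).
for collective_dose in (300_000, 500_000, 600_000):
    deaths = excess_cancer_deaths(collective_dose, RISK_PER_SV)
    print(f"{collective_dose:>8,} person-Sv -> ~{deaths:,.0f} excess deaths")
```

With those assumed inputs the arithmetic lands in the same tens-of-thousands range, which is why the spread in published figures is dominated by uncertainty in the collective dose rather than by the arithmetic itself.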

To be sure, not all national regulatory authorities have accepted the BEIR VII results as definitive, among them France's. That may be why a 2008 United Nations report declined to assign any specific number to long-term health effects of Chernobyl. Some, unfortunately, have chosen to take that as a statement that there have been no long-term effects.

Ten years after the accident, assessing Chernobyl's health effects, Spectrum magazine referred to them in a headline as "stressful." That referred partly to the demonstrable fact that near-term adverse effects were mainly psychological, though not to be minimized. But the headline also meant to highlight the seemingly paradoxical contrast between higher-than-expected incidence of thyroid cancer, mainly in children, and lower-than-expected leukemia and solid cancer incidence.

In hindsight, we could and should have made it clearer that there was little prospect of detecting and measuring leukemia and solid cancer incidence at the expected rates. For the most stressful aspect of the Chernobyl results, in the final analysis, is that the accident surely caused thousands of deaths that cannot actually be identified. The Chernobyl fatalities, in that sense, are like the deaths of unknown soldiers.

Conversion Records and the Promise of Super-Efficient Solar

Increasing the share of the energy hitting a solar cell that can be converted into usable electricity -- to the point that solar becomes cost-competitive with other energy sources -- always seems just out of reach. But as more promising approaches crop up, perhaps we will see that goal become reality soon. Two such stories have surfaced in recent days.

First, a research group in Switzerland has increased the efficiency record for flexible cells made from what is known as CIGS -- copper indium gallium selenide -- from 17.6 percent to 18.7 percent. A minor upgrade, maybe, but it's up from 14.1 percent as recently as 2005, and each tiny improvement brings the technology closer to mass adoption.
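To put those percentage points in concrete terms, here is a minimal sketch that converts cell efficiency into electrical output under standard test-condition sunlight; the irradiance and the one-square-meter panel area are illustrative assumptions, not Empa's figures.

```python
# What a percentage-point efficiency gain means in delivered power.
# Irradiance and panel area are illustrative assumptions.

STC_IRRADIANCE_W_PER_M2 = 1000.0   # standard test-condition sunlight
PANEL_AREA_M2 = 1.0                # hypothetical 1 m^2 module

def output_watts(efficiency, area_m2=PANEL_AREA_M2,
                 irradiance=STC_IRRADIANCE_W_PER_M2):
    """Electrical output = incident power * conversion efficiency."""
    return irradiance * area_m2 * efficiency

for label, eff in [("2005 record", 0.141),
                   ("previous record", 0.176),
                   ("new record", 0.187)]:
    print(f"{label}: {eff:.1%} -> {output_watts(eff):.0f} W per m^2 of cell")

gain = output_watts(0.187) / output_watts(0.176) - 1
print(f"Relative gain over the previous record: {gain:.1%}")
```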

As the director of Empa, the Swiss Federal Laboratories for Materials and Technology, Gian-Luca Bona said in a press release: "Next, we need to transfer these innovations to industry for large scale production of low-cost solar modules to take off." This is a key point -- we often hear of efficiency improvements or conceptual designs in a laboratory, but bringing these technologies to the point where they can be mass produced is another phase altogether. Apparently, scientists at Empa are actually focused on that goal, working with a Swiss startup called Flisom to bring the thin-film solar cells to market.

And in less practical but potentially, possibly, in-a-few-years game-changing news, researchers at the University of Missouri have used what they call a nantenna to create another flexible solar design that theoretically could provide -- amazingly -- 95 percent conversion efficiency. To be clear, this is at a very early stage of development, and at first the device would only be capable of harvesting excess heat from industrial processes and the like. But as a University of Missouri associate professor of chemical engineering put it: "If successful, this product will put us orders of magnitudes ahead of the current solar energy technologies we have available to us today."

Such statements do tend to give me pause, as there are any number of plans out there that will theoretically boost solar efficiencies into the stratosphere -- like, say, this singlet fission idea, or these super thin-film cells that can bring 12-fold improvements. Of course, none of these have yet revolutionized solar power development. Hopefully, one or more of them will start that process soon.

(Image via Empa)


Smart Grid's Experimental Phase

Medium-term prospects for the smart grid will be among the key technology topics addressed next month at the IEEE’s Technology Time Machine conference in Hong Kong. The purpose of the small and, frankly, elite meeting is to assemble people who are betting their corporate and national futures on when critical technologies will mature and take off. To judge from preliminary assessments laid out in a white paper prepared for the conference [PDF], some technologies, such as cloud computing, already are at a hockey-stick inflection point, while others, such as the so-called "Internet of Things," will reach that point in perhaps ten years' time. With the smart grid, due to immense technical challenges and acute engineering shortages, the inflection point may be closer to two decades away.

To put it a little differently, the process of merging traditional power transmission and distribution technology with state-of-the-art communications and sensing technologies is in its infancy. As George W. Arnold and Wanda K. Reder explain in the Time Machine article, the trained engineers and technicians needed to design and build the smart grid are simply not there. The smart grid projects currently underway are therefore essentially experimental, and not every demonstration will go well.

Xcel Energy's Smart Grid City project in Boulder, Colo., featured two years ago in IEEE Spectrum magazine, is a sobering case in point. At the beginning of this year, the Colorado Public Service Commission declined to give Xcel full cost recovery on the project, cutting the requested $44.5 million to $27.9 million. The regulator said benefits to local ratepayers could not be demonstrated and that Xcel's shareholders would have to cover the difference. The poorly planned and executed project appears to have been a fiasco.


U.S. National Academies Call for Strong Federal Climate Policy

A decade or so ago, when geophysicists and policy wonks started to talk about climate change rather than global warming, certain people seemed to think this represented some kind of insidious propaganda. Actually it just reflected an awareness that some of the consequences of global warming may be counter-intuitive or not what comes right to mind: increased winter precipitation, springtime flooding, more violent storminess, and so on.

During the past two years, states in the U.S. Mid-Atlantic, Northeastern, and upper Midwestern regions have had extraordinarily heavy and intense winter snowfalls. Spring snowmelts have been greater and more rapid, with the Ohio and Mississippi basins now experiencing the worst flooding in close to a century. Weeks ago the Southeast was swept by tornadoes unlike anything most people alive can remember, producing images reminiscent of the climate horror movie "The Day After Tomorrow." Climate change? So it seems, in some sense.

"Although there is some uncertainty about future risks," says a major report issued by the National Research Council last week, "changes in climate and related factors have already been observed in various parts of the United States; and the impacts of climate change can generally be expected to intensify with increasing greenhouse gas emissions." The report notes that average U.S. temperatures have climbed two degrees Fahrenheit in the last 50 years, and that water levels are rising among many American towns and cities.

The report, "America's Climate Choice," makes favorable mention of greenhouse gas reduction efforts by cities, states, and regions, but says that a strong Federal policy would have much more impact. And if the United States is to live up to the commitment it made at Copenhagen in December 2009, where the agreed-upon accord promised to limit global increases in temperature to no more than 2 degrees Celsius by comparison with pre-industrial levels, there will have to be "a significant departure from 'business-as-usual' in how we produce and use energy." The most effective policy device, says the report, would be "a comprehensive, nationally uniform price on CO2 emissions, with a price trajectory sufficient to drive major investments in energy efficiency and low-carbon technologies."

The group that produced the NRC report was unusually diverse, as The New York Times noted. Though it included some stalwarts long associated with energy policy like Robert W. Fri and its chairman, Albert Carnesale, and some prominent climate experts like Susan Solomon and Robert H. Socolow, it also drew in people from the business community and from conservative politics, like Jim Geringer, a former Wyoming governor now head of the Environmental Systems Research Institute in Cheyenne.


Chernobyl's 20,000-30,000 Fatalities

Since at least the mid-1990s, the standard estimate of the long-term human toll of the Chernobyl catastrophe has been 20,000 to 30,000 premature deaths from leukemia and other cancers, almost entirely in the greater European region. So how did the New York Times's editorial writers get the idea that Chernobyl's impact was minimal?

Writing earlier this week, they said: "The latest evaluation — a United Nations committee in 2008 — concludes that emergency workers who struggled to bring the plant under control suffered great harm but the wider public was barely affected. In the three countries hit with the most fallout — Belarus, Ukraine and parts of the Russian Federation — the committee found that the only significant harm was several thousand cases of highly curable thyroid cancer among people who were exposed as children, mostly by drinking contaminated milk. Only a handful have died."

I don't know what UN evaluation the Times is referring to, but it seems obvious that they have misinterpreted something. The explanation almost certainly has to do with a dilemma discussed here in Spectrum several weeks ago, on the 25th anniversary of the Chernobyl catastrophe. Not only are the premature fatalities from Chernobyl undetectable individually, so that there is no way of knowing which particular people died as a result of the accident; the excess deaths are undetectable even as a group, because they are buried in the statistical "noise" associated with hundreds of millions of cancer deaths in the relevant time period.
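To see why, here is a minimal sketch of the detection problem. The counts are round, assumed numbers for illustration, not real epidemiological data: an excess of roughly 25,000 deaths is comparable to the pure counting noise on such a baseline and far smaller than even a modest systematic drift in cancer registration.

```python
# Why ~25,000 excess deaths are statistically invisible against a huge
# baseline. All counts are round, illustrative assumptions.

import math

baseline_cancer_deaths = 200_000_000   # assumed cancer deaths in the relevant
                                       # population over the relevant decades
excess_from_accident   = 25_000        # hypothesized Chernobyl excess

# Even under ideal conditions, pure counting (Poisson) noise on the baseline:
poisson_noise = math.sqrt(baseline_cancer_deaths)

# Real registries also drift for unrelated reasons (smoking, screening,
# diagnostics); assume even a modest 1 percent systematic wobble.
systematic_wobble = 0.01 * baseline_cancer_deaths

print(f"Excess attributed to the accident: {excess_from_accident:>12,}")
print(f"Poisson noise on the baseline:     {poisson_noise:>12,.0f}")
print(f"1% systematic variation:           {systematic_wobble:>12,.0f}")
print("Excess / systematic variation:     "
      f"{excess_from_accident / systematic_wobble:.3f}")
```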

So the problem, in a nutshell, is that not seeing is not believing. Since the deaths can't be identified or even measured, the Times's editorialists are treating them as if they're not occurring. They really ought to know better. At a time when countries routinely are led by Harvard or Yale law review editors (like Obama and the Clintons), chemists or physicists (like Thatcher or Merkel), or the long string of "enarchs" who almost always run France,* we can safely assume that policy is made with an awareness of statistics basics. Increasingly, for better or worse, the top journalists are educated at the same schools. So they ought to appreciate that if dose-response models predict a certain level of fatalities, those fatalities must be assumed to be occurring, even if they aren't seen.

To be sure, like the fatalities and illnesses known to result yearly from exposure to coal emissions, the deaths and morbidity from nuclear accidents are not easy to factor into policy making. Nobody wants to weigh deaths from nuclear accidents against deaths from pollution and make choices on the basis of which technologies cause the most premature fatalities. In the final analysis, as in air traffic regulation, policy will inevitably aim to eliminate deaths entirely. That has been the thrust of the EPA's coal regulation in recent decades, and it's bound to be the thrust of nuclear regulation as well.

In the long run, if nuclear energy is to remain viable, there can be no more Chernobyls or Fukushimas.

___________________________________________________

* The enarchs are graduates of France's ultra-prestigious Ecole Nationale d'Administration (ENA).


European Shale Gas "Bonanza"

Two recent reports, one from the U.S. Energy Information Administration (EIA) and one sponsored by the European Centre for Energy and Resource Security, draw attention to Europe's large estimated reserves of shale gas and their wider implications. The revolution in natural gas plainly will not be confined to the United States. But how it plays out, country by country, is less obvious.

The EIA report estimates that Europe has hundreds of trillions of cubic feet of unconventional gas reserves, enough--says the Centre for Energy and Resource Security--to last 60 years at current demand levels. (That compares to the usual estimate for the United States of "enough for 100 years.") Yet Germany, which is the European country most concerned about its growing dependence on Russian gas and which appears to be in the process of negotiating a once-and-for-all nuclear exit, has relatively little--just 8 trillion cubic feet.
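The "60 years at current demand" figure is simple arithmetic: estimated recoverable reserves divided by annual consumption. Here is a minimal sketch with round, illustrative inputs (not the reports' own numbers) showing how estimates in that range translate into decades of supply.

```python
# Years of supply = estimated recoverable reserves / annual consumption.
# Both inputs are round, illustrative assumptions, not values taken from
# the EIA report or the Centre for Energy and Resource Security.

def years_of_supply(reserves_tcf, annual_demand_tcf):
    return reserves_tcf / annual_demand_tcf

annual_demand_tcf = 18.0   # rough order of magnitude for European gas use

for reserves_tcf in (600, 900, 1_100):   # "hundreds of trillions of cubic feet"
    years = years_of_supply(reserves_tcf, annual_demand_tcf)
    print(f"{reserves_tcf:>5} Tcf recoverable -> ~{years:.0f} years at current demand")
```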

France, on the other hand, has an estimated 180 trillion cubic feet. Yet the French National Assembly just yesterday voted to ban any gas or oil fracking in the country. Whether or not France's upper legislative body follows suit, the country will remain committed to its longstanding "all-nuclear" energy strategy.

The geostrategic impact of shale gas may be greatest, initially, in eastern Europe, especially Poland. No country is more apprehensive about Russia than Poland, and none so keen to minimize any dependence on Russian gas. With estimated shale gas reserves roughly equal to France's, Poland will be pleased to see Chevron drill its first Polish fracking well later this year.


Germany and Japan Back Away from Nuclear

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.

In recent days, the two countries usually described as the world's third and fourth largest economies have taken steps that may signal the ultimate end of their reliance on nuclear energy. Yesterday Japan's beleaguered prime minister said his government would drop plans to build 14 more nuclear power plants; with plants already being taken offline in the wake of the Fukushima catastrophe, the economic and political ramifications are sure to be very far-reaching. Meanwhile, Germany's government is being advised in a draft report by a panel it appointed to actually exit from nuclear energy within the next decade, rather than exit from the exit, which is what Chancellor Angela Merkel would have preferred pre-Fukushima.

It was never particularly obvious why Germany would have built one of its first atomic power plants, Biblis (above), at a scenic spot right on the Rhine River. So it was rather a foregone conclusion after Fukushima that such plants would soon be shuttered for good. But it was not a foregone conclusion that the government would decide to eventually close down all plants and never build any new ones.

Because the political impact of Fukushima will play out over years, it will not be surprising if ultimately Japan's nuclear phobia prevails and the country decides to opt out of nuclear energy completely, following in Germany's steps.

For the record, Japan and Germany are not actually the world's third and fourth largest economies, China is not second, and the United States is not first. In strictly economic terms, the world's largest economy by a wide margin is the European Union; the United States is second, China third, and Japan fourth. The import and impact of Germany's and Japan's turning their backs on nuclear energy are not to be minimized; they are huge. But neither should they be overstated. Germany is the largest state in Europe, but it is not all of Europe. France, Sweden, and Finland remain highly committed to nuclear power, and that is not likely to change. Germany's decision to gradually stop producing nuclear electricity and adopt an all-green energy strategy is comparable to California's. It will have influence in European states that are undecided about nuclear, but not in those that have firmly made up their minds one way or the other.

Fukushima's impact in Asia will be similar. It's not likely to dissuade China and South Korea from sticking with plans to sharply increase reliance on nuclear energy. But in countries like India, which are wavering, its influence could be considerable.


Physics Panel Dismisses Atmospheric Carbon Capture

Rather than undertake the challenging job of cutting carbon emissions--whether by switching to lower-carbon or zero-carbon fuels, or by capturing emissions at their source--why not just draw carbon out of the atmosphere and bind it chemically to some substance, which could then release it chemically for storage or industrial use? A panel of the American Physical Society (APS), the world's leading physics association, has subjected that tempting idea to close scrutiny and has dismissed it in unusually blunt terms.

Direct air capture of carbon dioxide [DAC] "is not currently an economically viable approach to mitigating climate change. Any commercially interesting DAC system would require significantly lower avoided CO2 costs. . . . In a world that still has centralized sources of carbon emissions, any future deployment that relies on low-carbon energy sources for powering DAC would usually be less cost-effective than simply using the low-carbon energy to displace those centralized carbon sources. Thus, coherent CO2 mitigation postpones deployment of DAC until large, centralized CO2 sources have been nearly eliminated on a global scale. . . . This report provides no support for arguments in favor of delay in dealing with climate change that are based on the availability of DAC as a compensating strategy."
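The panel's core argument, that low-carbon energy does more good displacing fossil generation than powering air capture as long as fossil plants remain on the grid, can be seen with a very rough comparison. The energy requirement and emissions intensity below are illustrative assumptions for the sketch, not figures from the APS report.

```python
# Rough comparison of two uses for 1 MWh of low-carbon electricity:
# (a) displace a coal plant's output, or (b) power direct air capture (DAC).
# All figures are illustrative assumptions, not numbers from the APS report.

COAL_EMISSIONS_T_PER_MWH = 0.9     # assumed CO2 intensity of displaced coal
DAC_ENERGY_GJ_PER_TONNE  = 9.0     # assumed total energy need of a DAC plant
GJ_PER_MWH               = 3.6

def co2_avoided_by_displacing_coal(clean_mwh):
    """Every clean MWh that replaces coal avoids that coal plant's emissions."""
    return clean_mwh * COAL_EMISSIONS_T_PER_MWH

def co2_captured_by_dac(clean_mwh):
    """CO2 a DAC plant could pull from the air using the same clean energy."""
    return clean_mwh * GJ_PER_MWH / DAC_ENERGY_GJ_PER_TONNE

mwh = 1.0
print(f"Displacing coal: ~{co2_avoided_by_displacing_coal(mwh):.2f} t CO2 avoided per MWh")
print(f"Powering DAC:    ~{co2_captured_by_dac(mwh):.2f} t CO2 captured per MWh")
```

Under these assumptions, a clean megawatt-hour does roughly twice as much for the atmosphere displacing coal as it does feeding an air-capture plant, which is the logic behind the panel's conclusion.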

The APS report, prepared by a task force of its Panel on Public Affairs (POPA), is strictly about carbon capture, not carbon storage. For its analysis, it evaluated a benchmark technological scheme developed by the University of Rome's Renato Baciocchi and colleagues and published in 2006. "This scheme was chosen because it both relies largely on existing technology and provides detailed information on material and energy balances that are necessary for a cost analysis of an industrial process," the POPA report says. Princeton University's Robert Socolow, who chaired the POPA task force, says the committee found no other analysis done at a comparable level.

Socolow (pictured above) is best known to the general educated public as the co-inventor of a user-friendly analytic model for assessing carbon reduction strategies based on "carbon wedges." That model, it can safely be said, has been by far the most influential carbon policy tool developed in the last decade. POPA also has a formidable track record. It has produced a series of occasional reports on matters of great public import, most notably the massive study it sponsored in the mid-1980s on "directed energy weapons," the so-called DEW study. It subjected the Reagan administration's Star Wars program to withering criticism and contributed to the administration's decision to downplay missile defense systems based on lasers and instead focus on kinetic kill systems--"smart rocks" that knock out incoming missiles by colliding with them.

If the past is any guide to the present, the latest POPA report, "Direct Air Capture of CO2 with Chemicals," will take atmospheric capture of carbon off the policy agenda. This means, together with the collapse of an anticipated nuclear renaissance, that coming to terms with climate change will be more challenging than ever.


Compressed-Air Car Proponents Losing Faith

Licensees of the much-hyped AirPOD minicar are pressing for results from Motor Development International, the Luxembourg-registered firm behind the compressed-air-powered vehicle. In recent postings to their websites and coverage by European news sources, some of MDI's partners are now openly questioning the technology and MDI's capacity to develop it -- questions that Spectrum raised in November 2009 in the investigative feature, "Deflating the Air Car."

When Spectrum's feature went to print, MDI was guaranteeing mass-production of AirPODs within a few months at its development base on France's Cote d'Azur. A year and a half later there is no sign of the promised minicars and their advertised 140-kilometer range, and outspoken licensees are blaming MDI.

Their highest-profile critic is Swiss developer Catecar S.A., which purchased rights to produce and market MDI's vehicles in Switzerland and Liechtenstein for CHF 650,000 ($741,000) according to a report last month by Radio Jura Bernois. Catecar intended to begin turning out AirPODs in March 2010 in nearby Reconvillier, but has yet to receive the technology. MDI says Catecar must pay an additional fee prior to tech transfer (close to CHF 1.7 million according to RJB), which Catecar says is contrary to their licensing agreement. "Catecar does not owe any amount to MDI until it has produced its technology, which to date it has failed to do," writes Catecar principal Henri-Philippe Sambuc in an open letter posted on the firm's website.

Sambuc's letter states that Catecar served MDI with legal notice on January 12 that the company was late in delivering its technology. The letter ends with an appeal to MDI's independent shareholders to "start paying attention" to the plight of its licensees.

Christchurch-based alternative energy developer IndraNet Technologies is also expressing frustration. IndraNet formed a joint venture with MDI to build and sell AirPODs in Australia and New Zealand. This spring The Future, a website created to tout the joint venture's sustainability goals, announced the formation of "a group of concerned stakeholders" seeking dialogue on "how best to take what was a promising technology out of the trouble [into which] it appears to have fallen."

The post is entitled "How to mess-up good technology? Deep Concerns about the fate of MDI" and links to a 7-page document with the same title outlining the group's concerns.

MDI's website appears not to have been updated since November, and makes no mention of the licensees' concerns or an eventual date to begin selling AirPODs.
