Energywise

Downsizing Nuclear: Difficulties With Big Plants Spur Interest in Mini Reactors

By now, we've all heard plenty about the "nuclear renaissance," or revival, or whatever it's being called. The Obama administration wants $54 billion in loan guarantees to build new reactors, and various states around the country are ramping up efforts to overturn moratoria and bring in those government dollars. It remains unclear, though, if the relatively sudden momentum will actually yield a bevy of new reactors, or if it will be stopped in its tracks.

In Georgia, where the first of those loan guarantees was headed, a judge ruled that the certification process for the new reactors was illegal, setting back the construction process. An attempt in the Illinois legislature to overturn a longstanding moratorium on new nuclear construction in the industry's flagship state failed to make it past the House. Earlier similar attempts in a number of states, including Minnesota, Hawaii and Kentucky, have also run aground. Perhaps appetites for decade-long construction and costs that routinely jump up by the billion are thinner than they appeared.

Some companies, though, have an alternative to the large-scale, 1,200-megawatt-and-up monsters that are running into such opposition. The small modular reactor, delivering only 25 MW--enough to power about 20,000 homes--for eight or 10 years before being replaced, could come before the Nuclear Regulatory Commission for approval this year.

Hyperion Power Generation is one such company, having revealed last year the design for its Hyperion Power Module. Using a liquid metal cooling design (specifically, lead-bismuth eutectic), the 1.5-meter-wide, 2.5-meter-high reactor would be shipped to customers fully sealed, and shipped back in the same state a decade later, requiring no refueling or on-site storage of nuclear waste. Of course, if the company really starts delivering on the reported 150-plus purchase agreements already in place, that means a whole lot of nuclear material being shipped around the country, even in small and supposedly sealed amounts.

Other nuclear players are also reportedly getting in on the act, from French company Areva to Toshiba and its subsidiary Westinghouse. The small reactors, intended for hard-to-reach locales, military bases, factories or anything else that might not get connected to the traditional power grid easily, will come with substantially reduced price tags compared to traditional nuclear projects. Instead of, say, a $3 billion estimate that likely gets bumped up by a factor of two or three by the time the plant is completed, Hyperion will ask $50 million.

Even if the immense construction difficulties that big nuclear runs into are not in play, the regulatory hurdles companies must leap are largely the same. The NRC licensing process can take years, and concerns over leaking material and the potential for terrorism remain in spite of company assurances. And environmentalists who feel that the many billions spent on nuclear power would be better directed toward adoption of renewable energy sources like wind and solar likely won't find comfort in the small nuclear designs. After all, even if the units do come in at $50 million apiece, 150 of them come out to $7.5 billion and the electricity output of about three big nuclear reactors. Progress?
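The arithmetic behind that comparison is easy to check. A quick back-of-envelope sketch, using the $50 million price and 25 MW output per module from the paragraphs above and taking 1,200 MW as the size of a big conventional reactor:

```python
# Back-of-envelope check on 150 small modular reactors vs. big plants.
# Figures from the post: $50 million and 25 MW per Hyperion module;
# 1,200 MW is assumed here as the size of a large conventional reactor.
units = 150
cost_per_unit = 50e6        # dollars
output_per_unit = 25        # MW

total_cost = units * cost_per_unit             # $7.5 billion
total_output = units * output_per_unit         # 3,750 MW
big_reactor_equivalents = total_output / 1200  # ~3 large reactors

print(f"${total_cost / 1e9:.1f} billion for {total_output} MW, "
      f"about {big_reactor_equivalents:.1f} big reactors' worth")
```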

Photo via Hyperion Power Generation

One Million Chernobyl Fatalities?

A new book claims that almost one million people died between 1986 and 2004 from exposure to Chernobyl accident radiation. The claim, based mainly on a survey of scholarly literature in Slavic languages, is orders of magnitude higher than the most authoritative previous estimates. Yet the book is published by the New York Academy of Sciences, which says that earlier estimates "have largely downplayed or ignored many of the findings in the Eastern European scientific literature and consequently have erred by not including these assessments."

The book, "Chernobyl: Consequences of the Catastrophe for People and the Environment," is by Alexey Yablokov of the Center for Russian Environmental Policy in Moscow and Vassily and Alexey Nesterenko of the Institute of Radiation Safety in Minsk.

Global assessments made ten years after the accident and reported at an IAEA conference in 1996 estimated that in the long run, the toll from Chernobyl in terms of premature or "excess" deaths would come to about 8,650. But because the number of "background" cancer deaths in the population most severely affected--the 600,000-800,000 involved in clean-up operations--would come to 825,000, most of the excess cancer deaths would be "hard to detect epidemiologically," said Elizabeth Cardis, probably the world's leading expert on the subject.

Cardis's detailed predictions were discussed in an IEEE Spectrum article that appeared in November 1996, reporting on the IAEA conference. Though that article is not available online, the general picture it presents remains largely valid, according to the most recent update of Cardis's analysis. Thyroid cancer incidence among children was found to be much higher than models would have predicted, but leukemia incidence was lower. Mortality and morbidity associated with psychological stress might exceed casualties attributable to radiation exposure.

In 2005, the Chernobyl Forum--a consortium of global health agencies and governmental organizations, including the IAEA and the World Health Organization--put the death toll at about 4,000. That still makes Chernobyl the worst industrial accident in history. As such, the consequences of the accident are not to be minimized or trivialized, but bear in mind that hundreds of thousands of people die yearly from exposure to emissions from coal-fired power plants.

I have not seen the new book and am not predisposed to give much credence to its claims. Any such treatment of Chernobyl health effects would have to somehow rigorously distinguish consequences of the accident from consequences of the general public health catastrophe that has engulfed Russia and some of the Soviet successor states in recent decades. I draw attention to the book here mainly, as said, because it carries the imprimatur of the New York Academy.

Strained MIT Climate Friendship

A feature story in yesterday's Boston Globe depicts an increasingly strained friendship between Richard Lindzen and Kerry Emanuel--the former well known as one of the leading U.S. climate change skeptics, the latter best known for his pre-Katrina prediction that fierce hurricanes would become more frequent. When Emanuel first joined Lindzen at MIT, Lindzen was a registered Democrat and Emanuel a Reagan voter. Emanuel in the meantime has come around to the view that there's a growing risk of catastrophic climate change: "None of the evidence is perfect, but it all points in one direction," Emanuel told the Globe. Lindzen took a jab at Emanuel at last year's Heartland Institute conference, saying he and fellow alarmists take the position they do because it "just makes their lives easier," presumably in terms of research funding and peer pressure. Emanuel has accused Lindzen of endorsing dishonest science.

The two men reportedly still have a collegial relationship but are not vacationing together with their families in France these days or inviting each other to dinner.


Trouble Brewing for Wind?

Amid much good news for wind--an ongoing global surge in wind energy installations, the go-ahead from the U.S. government for the immensely controversial Cape Wind project--comes a report detailing a sharp rise in wind operating costs and poor performance relative to other countries. Prepared by the independent business intelligence service Wind Energy Update, the Wind Energy Operations & Maintenance Report finds that current O&M costs are two or three times higher than first projected and that there has been a 21 percent decrease in returns on investments in wind farms. O&M costs were found to be especially high in the United States, "now the world's largest wind power market."

Based on surveys, the report estimates average world O&M costs at 2.7 U.S. cents per kilowatthour, which compares with the 2 c/kWh at which costs roughly equal the value of U.S. wind production credits.* The report says that while close to 80 percent of the world's wind turbines are still under warranty, "this is about to change." R&D is focusing especially on gearbox reliability: "Many gearboxes, designed for a 20-year life, are failing after six to eight years of operation."
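To put those per-kilowatthour figures in perspective, here is an illustrative calculation for a hypothetical 100 MW wind farm. The 30 percent capacity factor is my assumption for the sketch, not a number from the report:

```python
# Illustrative only: what a 2.7 c/kWh O&M cost means for a hypothetical
# 100 MW wind farm, compared against the 2.0 c/kWh production credit value.
# The 30% capacity factor is an assumed round number, not from the report.
capacity_mw = 100
capacity_factor = 0.30
hours_per_year = 8760

annual_kwh = capacity_mw * 1000 * capacity_factor * hours_per_year
om_cost = annual_kwh * 0.027    # the report's 2.7 c/kWh world average
credit_value = annual_kwh * 0.020  # the 2.0 c/kWh U.S. production credit

print(f"O&M: ${om_cost / 1e6:.1f}M/yr vs. "
      f"credit value: ${credit_value / 1e6:.2f}M/yr")
```

On those assumptions, operating costs alone exceed the entire value of the production credits, which is what makes the reported cost escalation significant.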

POSTSCRIPT (5/27/10): a podcast interview with the author of the report is available.

* My apologies to readers for the inexcusable typo that appeared in the original version of this post, and for my having been so slow to notice alerts to the mistake in comments

Big Chills

As every educated European knows--that is to say, virtually every European--the continent's benign climate depends on an anomaly: Atmospheric warming by the Gulf Stream, which Benjamin Franklin first noticed. Were it not for the Atlantic's warming surface currents--and, perhaps too, deflection of high-atmosphere winds by the Rocky Mountains--Paris and London might resemble Winnipeg, and Scandinavia would be virtually uninhabitable. Because of this precariousness, notions of abrupt or catastrophic climate change have more currency in Europe than in the United States--especially the "big chill" scenario developed, ironically, by the American geochemist Wallace Broecker.

Broecker's 50 years of work at Columbia University's Lamont Doherty Earth Observatory got a recent celebration at the laboratory. Anybody wanting a quick and easy introduction to his main accomplishments can do no better than to watch and listen to the songs written for the occasion by folksinger Tom Chapin, who happens to be Broecker's brother-in-law, and Penn State geologist Richard Alley, author of a nice general-reader book about ice coring.

Broecker's big chill started about 10,500 years ago and lasted about 1,200 years, in a period now known as the Younger Dryas, named after a Scandinavian flower whose wanderings testified to sudden climate change. In that event, an ice dam blocking a huge inland lake, Agassiz, burst, sending a flood of freshwater into the North Atlantic; the effect was to shut down the North Atlantic conveyor, plunging western and northern Europe into a mini-ice age.

In 1997, Alley published a paper identifying a similar event, one that occurred about 8,200 years ago, in which freshwater abruptly flooded the Hudson Bay. Now, in a recent Science paper, Shi-Yong Yu and colleagues report on an event about 9,300 years ago, with a similar pattern yet again. That episode had "a Northern Hemispheric expression with a spatial pattern nearly identical to that of [Alley's] '8.2 kyr event,' a widespread cooling associated with the sudden drainage of the glacial Lake Agassiz-Ojibway complex through the Hudson Strait."

Commenting, Alley says there do indeed seem to be several significant "wiggles" in Earth's recent temperature record. The basic mechanism starts with the relative saltiness of the Atlantic, a result of trade winds carrying vapor from the Atlantic across Central America to the Pacific. When the salty Atlantic waters reach the region around Greenland and Norway they sink, to start their return journey south. But when there's a sudden freshwater infusion, the waters fail to sink, temporarily shutting down the conveyor mechanism. In the extreme case, says Alley, the surface freshwater freezes off Norway, giving the regional climate a really nasty kick. In any case, "the emerging picture is that the North Atlantic [climate] does care about freshwater."

Sun King

The New York Times has scored a coup this week, running a rare interview with the reclusive green investor and environmental philanthropist David Gelbaum. Besides taking stakes in U.S. solar and clean-tech companies like Entech Solar, eSolar, First Solar, and SunPower, as well as similar companies in Australia and China (not to mention Toyota, because of the Prius), Gelbaum also has put money into smart-grid-relevant companies like GridPoint and startups developing energy storage systems to back up renewables. Yet Gelbaum’s support for solar is not indiscriminate. He has fought a large solar project slated for protected land in the Mojave Desert, which the Wildlands Conservancy acquired to preserve. A co-founder of Wildlands, Gelbaum has donated $250 million to the conservancy, as well as $200 million to the Sierra Club. (He also has given similar amounts to support veterans of the Iraq and Afghanistan wars, and to the American Civil Liberties Union.) According to the Times profile, which is worth reading in its entirety, Gelbaum lives modestly and gives away most of the money he makes through his main investment vehicle, Quercus Trust.


Gas Up, Coal and CO2 Down Sharply

The Worldwatch Institute, for decades a leading player in sustainability research, has issued a report with startling findings about recent changes in the U.S. fossil fuel mix and their implications for the country's greenhouse gas emissions. From 2007 to 2009, the share of natural gas in U.S. electricity generation increased from 20 to 23 percent, while the share of coal dropped from 52 to 45 percent. That may not look like much at first glance but in fact may be the beginning of a sea change in U.S. energy. The decrease in coal generation accounted for half the decline in carbon emissions from 2007 to 2009, a period in which they dropped a jaw-dropping 10 percent.

"In just two years, we wiped out half the increase in U.S. greenhouse gas emissions that had taken place during the previous 15 years," observes Christopher Flavin, president of Worldwatch and principal author of the report, "The Role of Natural Gas in a Low-Carbon Energy Economy." Could that dramatic decrease in U.S. carbon suggest that the U.S. goal of cutting greenhouse gas emissions 17.5 percent by 2020 is unduly unambitious? "It could," says Flavin.

Easily the most important factor enabling U.S. coal generation and carbon emissions to fall so dramatically has been the fast-growing role of natural gas in electricity generation. That in turn is attributable to the revolution in "unconventional gas," that is, natural gas extracted from deep shale formations by means of horizontal drilling and hydraulic fracturing. But almost as important, arguably, is the bear hug that prominent organizations in the environmental community--Worldwatch and the Sierra Club among them--are giving gas.

Gas, after all, is a nonrenewable fossil fuel. So why do a lot of environmentalists like it so much?

If you go to a place like Dimock, in northeast Pennsylvania, where Marcellus Shale gas drilling is going on great guns, you'll see huge trucks carrying drill pipe through the middle of what used to be a sleepy rural town. Methane has contaminated drinking water wells and must now be vented; everywhere around town there are huge unsightly tanks (see photo above), some holding water for injection into wells, some holding the flowback water which contains all manner of chemicals--some put into the injection water to expedite gas extraction, some picked up from the ground on the way back to the surface.

So how can national environmental leaders be for gas? Actually, the advantages, as enumerated by Flavin and his coauthor Saya Kitasei, are overwhelming. First: "Burning natural gas produces virtually none of the sulfur, mercury, or particulates that are among the most health-threatening of pollutants that result from coal combustion," they explain, citing a study that found the environmental cost of gas-generated electricity was one-twentieth--yes, one-twentieth!--that of coal.

Second, electricity from natural gas results on average in about half the carbon emissions of coal-fired power. And that's just the average: the gap is even wider between the oldest, dirtiest coal plants and the best gas-fired ones. "New combined-cycle gas plants produce 55 percent less carbon dioxide than new coal plants do and 62 percent less than the average U.S. coal plant," say Flavin and Kitasei.

Because of dispatch rules favoring coal and nuclear baseload plants ahead of gas, and because gas often is used in peaking plants that come onstream only intermittently, gas-fired plants represent a much larger share of U.S. generating capacity than they do of actual generation. This implies that changes in rules and policy could induce an even more rapid conversion of coal to gas, if prices stabilize. Flavin thinks it's now conceivable that in the next 10 or 12 years we could cut coal's share of U.S. electricity generation to about 25 percent, yielding roughly a 12.5 percent cut in U.S. carbon emissions.

But prices and policies are the wild cards. Traditionally gas prices have fluctuated drastically; their stabilization will depend on finding ways to develop shale gas consistent with meeting local concerns, which are serious. Needed policies include more uniform Federal and state regulation of gas development, revised electricity dispatch rules, tighter air quality regulation, and--above all--a strong cap-and-trade climate bill that discourages coal generation instead of "locking it in," as Flavin puts it.

Smart Grid Feedback (1)

In a posting two weeks ago I raised the question of whether the much vaunted smart grid will produce actual energy and carbon savings . . . and save customers money. My prediction was that we will not know until next year at the earliest.

That posting elicited a lively response from readers. Though it isn't normally my practice to respond directly to comments or repeat them in follow-on posts, in this case interest in smart grid prospects is so wide and intense that it's worth culling some of the more telling reports.

To be clear: I'm going to ignore the more opinionated postings--those complaining, for example, that the smart grid is just another excuse for the government to control our lives and violate our privacy, or invoking the "global warming hoax," the undue influence of the U.S. military-industrial complex, "Stalinist regulations issued by Washington," and the need for technical project leaders with "dictatorial powers."

The question of why so many engineers believe they want more liberty but at the same time betray a yearning for authoritarian leadership--that is a whole other topic, and not one that will ever be addressed in this space.

So let's turn instead to the more factual reports:

--Aggravated risk of network failure. AW worries about recent research suggesting that networked networks are "prone to epic failures."

--Higher electricity bills in Canada. Daniel Fingas reports that in Toronto, smart metering with time-of-day pricing seems to be yielding higher average electricity costs.  "If your water heater (retrofitted with the appropriate control system) or heating/AC is electric you might see some savings, but other than that your only reasonable adjustable consumers are your clothes dryer and dishwasher."

--Radio interference and erroneous data gathering in California. Tom Kirkpatrick, reporting from PG&E country, says that in his home, he could no longer get AM radio after his smart meter was installed, because of RF interference. Further, he says that there have been local news reports of supposedly smarter meters delivering wrong information about power usage and communications failures.

--Texas turmoil. Lyel Upshaw says that a smart grid installation program "has been temporarily suspended while investigations are ongoing regarding consumers' electric bills doubling, tripling, and even more on their first billing after installation." Problems with calibration are probably involved, and Staples is confident that they will be promptly solved, but "the new digital meters [have] not made a good impression on consumers." TMCC says that in Dallas, "tests have had to be run to prove the smart meters are actually set and running correctly in the field since the test program end users found much higher energy bills after installation."

Permit me, since readers so often complain about my alleged biases and obsessions, to express gratitude to IEEE Life Fellow  Jim Crescenzi, who says he considers my reports "especially informative and helpful" and me to be "unusually objective."


Cape Wind Coda

Last week my posting about Federal approval of the immensely controversial Cape Wind project expressed skepticism about its significance and the general potential of offshore wind in the United States.

Tom Vinson, director of Federal regulatory affairs at the American Wind Energy Association, begs to differ. He points out that this is the first such project to obtain Washington's approval, and that all offshore projects farther than 3 miles from the coastline require Federal approval. He says Cape Wind now has all required state, local, and Federal permits, except possibly one from the FAA, which must approve any structure higher than 200 feet.

Regarding wind's general potential, Vinson recalls that in a major study that found the United States could generate 20 percent of its electricity from wind, an estimated 54 GW of 300 GW total would be produced by offshore turbines. So that's tantamount to 3 or 4 percent of U.S. electricity coming from offshore wind.
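Vinson's arithmetic checks out, at least as a sketch. Note that this treats the ratio of installed capacities as a ratio of energy produced, which assumes offshore and onshore turbines have similar capacity factors (offshore factors are typically somewhat higher):

```python
# Offshore wind's implied share of U.S. electricity in the 20%-wind study.
# Simplification: capacity ratio is used as a proxy for generation ratio.
offshore_gw = 54     # offshore portion of the scenario's wind capacity
total_wind_gw = 300  # total wind capacity in the scenario
wind_share = 0.20    # wind's share of U.S. electricity in the scenario

offshore_share = (offshore_gw / total_wind_gw) * wind_share
print(f"{offshore_share:.1%} of U.S. electricity")  # 3.6%
```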

That's not trivial. But I stand by my observation that potential is not the same as realistically developable. Every offshore project will be controversial, and I'll be very surprised if oceanic offshore wind is generating 3 percent of U.S. electricity by 2020 or 2030.

POSTSCRIPT (May 11, 2010):

AWEA, in its annual report, lists 12 offshore wind projects on the books. Four are in New Jersey, two in Massachusetts and two in Rhode Island, one each in Delaware, North Carolina and Texas, and a Great Lakes project near Cleveland.


Fixing the Gulf Oil Leak: Blowout Preventers and Robotic Submarines

Updated June 20


We are now 62 days into the Gulf of Mexico oil leak, and since this post was first published a number of different fixes have been attempted with varying degrees of success. (They varied from "abject failure" to "capturing some of the oil.")

As estimates of oil flow rates climbed steadily from 5,000 barrels per day up to a stunning 60,000 barrels per day, the strangely named efforts ramped up. For a few days, the top kill approach seemed promising: it involved injecting large amounts of mud and drilling fluid into the well to stem the flow, but after several days BP admitted defeat.

Next, the company did manage to install a lower marine riser package cap over the top of the damaged blowout preventer. To do this, diamond wire saws were deployed to cut off the damaged pipe; with the new cap then maneuvered into place, captured oil and gas are brought up a pipe to a waiting ship, the Discoverer Enterprise. The gas is flared off, and the oil stored. This, along with a secondary siphoning technique using the same equipment used for the top kill injection, now has a maximum capacity of 28,000 barrels per day of captured oil. We're still a ways off from President Obama's promise to be capturing 90 percent of the oil within a short timeframe.

Clearly, the technical difficulties involved with an enormous leak under 5,000 feet of water have been, to this point, nearly insurmountable. Still, the flow of ideas from everywhere, including extensive suggestions in the comments section below, is impressive. Among those ideas are the use of explosive charges (including nuclear, although that option has been largely rejected as "crazy" by officials), installing a heavier blowout preventer capable of withstanding the pressure of the escaping oil, and disconnecting the pipe's flanges and installing an open gate valve.

Though some of the oil is now being captured, such ideas could still play a role before the leak can be shut off completely. Recent reports indicate that drilling of the relief wells is ahead of schedule, but BP still estimates they won't be fully operational until August. By then, this spill will have become among the worst the world has ever seen.


When the Deepwater Horizon offshore oil rig exploded on April 20, much of the oil that was actually on the rig ended up in the water. We have since learned that as far as oil spillage is concerned, this was not the problem. The problem actually lies 5,000 feet under water, where the well from which Deepwater Horizon was pulling oil has since been spewing about 5,000 barrels, or 210,000 gallons, of oil each day. The thing is, the rig actually had a piece of technology that should have prevented this. It didn't work.

Oil wells both on and offshore have contraptions called blowout preventers. Those iconic old images of oil well gushers? The blowout preventer stops that from happening, sparing the cost and cleanup of oil that once spewed uncontrolled until a well could be capped. According to BP's CEO Tony Hayward, the blowout preventer should have kicked in the day that the explosion occurred, but failed to do so. They don't seem to know why.

Because the 450-ton blowout preventer still sits atop the leaking well, option A for stemming the flow of oil has been to kick it into action. To do so, BP has sent in remote-controlled submarines with robotic arms. As cool as that sounds, it too has failed, and again we don't quite know why.

Another less catchy option for preventing further leakage into the Gulf is to create chambers to sit around the three leaks, and connect pipes from those chambers that will funnel the oil up to the surface. Makes sense, but it will take at least a few weeks to get it going. And according to CNN, that technique has never even been tested in water so deep.

Finally, BP has one more idea: by this weekend the Transocean Development Driller III will begin work on a "relief-well" nearby that could stem the flow through the leaking well. Good idea, except that one will take up to three months. Let's see, 210,000 gallons per day, up to 90 days... almost 19 million gallons, or more than 61,000 tonnes, not even including what has leaked so far. If we get to that point, the constant comparisons to Exxon Valdez won't be so hyperbolic; that disaster totaled about 35,000 tonnes of oil (although there have been many other larger spills around the world).
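The Exxon Valdez comparison rests on a unit conversion worth spelling out. A quick sketch, assuming 42 gallons per barrel and roughly 0.136 tonnes of crude per barrel (the exact figure varies with the density of the crude):

```python
# Converting the leak's daily rate into the totals quoted above.
GALLONS_PER_BARREL = 42
TONNES_PER_BARREL = 0.136   # approximate; crude oil density varies

gallons_per_day = 210_000   # the 5,000 barrels/day estimate
days = 90                   # up to three months to drill the relief well

total_gallons = gallons_per_day * days   # 18.9 million gallons
total_tonnes = total_gallons / GALLONS_PER_BARREL * TONNES_PER_BARREL

print(f"{total_gallons / 1e6:.1f} million gallons, "
      f"about {total_tonnes:,.0f} tonnes")  # ~61,200 tonnes
```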

This is the second major offshore oil rig incident in less than a year. In 2009, an Australian rig in the Timor Sea spewed oil into the water for 10 weeks before a decidedly low-tech fix stopped the flow: they pumped huge amounts of mud into the well to plug it up.

Pulling oil up from deep underground that is in turn deep underwater is, clearly, a complicated business. Fixing a giant mistake in that process is proving even harder. A spokesman for BP aptly summed up the efforts to stem the river of oil: "We're not sure it's going to work, but it's certainly something worth attempting."

(Photos via US Coast Guard and Philbentley)

