Energywise

Stuxnet Sends Ominous Message

Two months ago the German cybersecurity expert Frank Rieger published a compelling analysis of Stuxnet suggesting it targeted Iranian nuclear facilities, quite possibly the big uranium enrichment complex at Natanz. Two weeks ago the U.S. cybersecurity firm Symantec published an exhaustive analysis that showed beyond any reasonable doubt that Natanz was the main target, though perhaps not the only target. All that is arresting enough. But there's also a larger message, namely that any large networked system--from the smart grid to oil refineries or nuclear reactors--could be vulnerable to malware of similar sophistication.

To quote the summary that concludes the Symantec report: "Stuxnet represents the first of many milestones in malicious code history--it is the first to exploit four operating system vulnerabilities, compromise two digital certificates, and inject code into industrial control systems and hide the code from the operator... Stuxnet has highlighted direct-attack attempts on critical infrastructure are possible and not just theory or movie possibilities... Stuxnet is the type of threat we hope to never see again."

Huge Stockpile of Kazakh Weapons-Grade Plutonium Is Secured

The Kazakh and U.S. governments recently moved about 100 tons of high-grade plutonium from a poorly secured location on the Caspian Sea to a better-secured secret location at the diagonally opposite corner of Kazakhstan. The operation was described last week in a three-part eyewitness report on National Public Radio by Mike Shuster, NPR's longtime roving ace reporter.

What made this maneuver so especially important, Shuster explained, was that the plutonium happened to be of the highest quality for nuclear weapons--"ivory" grade, as people in the trade put it.

For those accustomed to thinking of all plutonium as weapons-grade, the Kazakh situation calls for somewhat more explanation than Shuster deemed appropriate for a general audience. It's well known that uranium can be natural, somewhat enriched for reactors, or highly enriched for weapons, but plutonium is generally thought to be weapons-grade by definition. That's essentially true, but with an important qualification. If the plutonium is produced by neutron bombardment of uranium in a standard power reactor being operated normally, it ends up containing excessive quantities of the higher isotopes (notably Pu-240), rendering any weapon made from it prone to predetonation. Therefore, any country wanting to build good and reliable nuclear weapons will not want to make them from plutonium extracted from standard reactor fuel. (This is why Iran's Bushehr reactor, built with Russian help, is essentially irrelevant to its nuclear-weapons program.)

Even for a general audience, Shuster might have mentioned that the Kazakh plutonium was made in an early breeder reactor, the BN-350, which was customized both to produce electricity and (in the facility seen above) to desalinate water from the Caspian. Breeder-bred plutonium is higher grade, which is one of several reasons why breeders are such a singularly bad idea.

As Tom Cochran explained in a recent interview here, Russia's breeder program ultimately was a failure because the breeder fuel cycle was never "closed." That is to say, the plutonium produced in the breeders was never processed into fresh reactor fuel, to realize the dream of an endlessly self-propagating fuel supply. Thus it came about that Kazakhstan found itself sitting on enough material to make nearly 800 quite satisfactory atomic bombs.

Great Lakes Wind

Four years ago, surveying the U.S. president's excellent strategic plan for climate change technology, I was struck by a detail on a map charting the country's wind resources: It showed, of course, a lot of strong winds through the Great Plains and off the two ocean coasts; but it also showed something less obvious—namely, that the entire surface area of the Great Lakes is among the country's very windiest regions.

Why not, I wondered, focus wind energy development just there? In contrast to the Great Plains, which are notoriously distant from the country's major load centers, the Great Lakes are smack dab in the middle of the country's highly populated old industrial heartland, a region that happens to get most of its electricity from carbon-intense coal. And surely it might be easier and less expensive to develop offshore wind on the lakes than on the coasts.

It turns out I'm not the only genius having such thoughts. In one of his recent posts at Energy Central, Bill Opalka speculates that regulatory hurdles and opportunities for citizen intervention may be somewhat less daunting when it comes to Great Lakes wind development, by comparison with projects off the ocean coasts. That may be debatable. And certainly the jury's out on whether Great Lakes projects would be technically easier and overall cheaper.

But a lot of people and a growing number of organizations are exploring these questions. Two months ago, public officials, developers, and stakeholders in the Great Lakes Wind Collaborative met at Case Western Reserve University in Cleveland, for their third annual meeting. A member of the National Renewable Energy Laboratory told them that lakes' wind might be a $100 billion market.

Two years ago, Michigan State University's Land Policy Institute estimated that 100,000 wind turbines situated off Michigan's coasts--the state borders four of the five Great Lakes--could generate as much as 321 GW of electricity, if towers were sunk at all depths. If towers were built only at depths comparable to those already achieved elsewhere in offshore wind farms, the lakes' wind potential would be closer to 100 GW--still a prodigious amount.
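As a quick sanity check on those figures (a back-of-envelope sketch using only the numbers quoted above, not anything from the institute's report), the all-depths estimate implies an average nameplate rating of about 3.2 MW per machine, which is in the normal range for offshore turbines:

```python
# Figures quoted from the Land Policy Institute estimate above.
turbines = 100_000
all_depths_gw = 321   # capacity if towers were sunk at all depths

# Implied average nameplate rating per turbine, in megawatts.
per_turbine_mw = all_depths_gw * 1000 / turbines
print(per_turbine_mw)  # 3.21 -> about 3.2 MW per machine
```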

At the end of October, the White House, the Energy Department, and the lakes collaborative co-hosted a two-day workshop in Chicago to discuss Great Lakes siting issues in detail. Specific projects are under consideration near Cleveland on Lake Erie, off the eastern shore of Lake Michigan near Pentwater, Mich., and in Lakes Erie and Ontario off New York State's shores.

More on IPCC Reform

As long as we're on the subject of how the Intergovernmental Panel on Climate Change needs to be reformed to be more authoritative, believable, and helpful, we may as well take note of a conference held earlier this week at Columbia University. The panel on IPCC Reform and the Global Climate Challenge brought together scholars from fields ranging from political science to meteorology and produced a remarkable amount of agreement on some key points.

--There's an excessively wide gap between the scientific assessments, on the one hand, and political leaders, policymakers, and members of the general public, on the other. "The IPCC is an assessment of science, by scientists, for scientists," as climatologist Gavin Schmidt put it.

Peter Haas, a specialist on environmental governance, said the assessment reports, besides needing to be more widely seen as legitimate, also should be more timely and pertinent. Only the second of the four assessment reports that have appeared so far was published on the eve of important political decisions, Haas observed.

(Nobody mentioned that the assessment reports also are extremely expensive and bulky, and are never actually seen outside a scientist's office. Why is there not a 175-page summary written for the educated public and published as a paperback?)

--One reason the IPCC has come under criticism for the way it handles uncertainty is the clumsy way its authors are asked to attach error bars to statements, regardless of whether they have any real basis for evaluating the uncertainties. The authors need to source what they know more carefully, and refrain from talking about what they don't know, said Syukuro Manabe, the computer modeling pioneer.

--The separation of science and policy is too sharp, and the analysis of climate impacts and mitigation strategies too weak, as economist Jeffrey Sachs put it. Manabe noted that most of the errors found in the last assessment were in the impacts section--a subject all the more important now that the Kyoto process is being overwhelmed by fast-growing emissions from the BRIC countries.

Suki Manabe (photo), who can be reasonably described as the father of climate modeling,* was saying that if we can't significantly slow global warming, then we'd better start preparing for its consequences, which are going to be far-reaching and serious. We need to understand them much better, he said, but we know for example that water is going to be a huge issue as the world's wet areas get wetter and its dry areas get drier. We're going to have to worry a lot more about things like water transport and desalination, he said.

*Information on his remarkable career can be found in Spencer Weart's book The Discovery of Global Warming, at the climate website Weart maintains at the American Institute of Physics, or in my book Kicking the Carbon Habit, one chapter of which is a profile of Manabe.

"Cool It" Instead Fuels It

The title of the new movie featuring "skeptical environmentalist" Bjorn Lomborg is of course an intentional pun: It refers at once to cooling the debate over global warming and finding a way to actually cool the planet. The second point may come as a surprise to some, who may not be aware that Lomborg is not and never has been a climate change denier. What he has argued is that proposed solutions to global warming are too expensive and detract from spending on more serious global problems such as disease, ignorance, and poverty.

Cool It opens with a rather long and boring personal introduction of Lomborg and a defense of his controversial book. Only after 20 minutes do we arrive at the 1992 Rio conference, where the U.N. Framework Convention on Climate Change was adopted; it is subjected to an inadequate and tendentious discussion, along with the follow-on Kyoto Protocol and last year's Copenhagen climate conference. We see President Obama telling the assembled delegates in Denmark that we have been talking about climate change for two decades "without much to show for it."

That's a falsehood, and the film does its viewers no service and Lomborg no credit in repeating and endorsing it. The Kyoto Protocol called on the advanced industrial countries to cut their greenhouse gas emissions 7 to 8 percent below 1990 levels by 2008-12. As of 2007, Germany had cut its emissions by 21.3 percent, the United Kingdom by 17.3 percent, France by 5.3 percent, and the members of the European Union that adopted the protocol by a combined 4.3 percent. In the United States, however, emissions increased by 16.8 percent; in Australia, the other major country that declined to ratify the protocol, they were up 30.1 percent.

Had there been no Rio treaty or no Kyoto Protocol, it's reasonable to assume that European emissions would have climbed from 1990 to 2007 almost as much as U.S. and Australian emissions did. So to say we have "nothing" to show for the two agreements is as if five buddies pledged to stop drinking, four succeeded but the biggest guzzler of all relapsed, and we thereupon declared the abstinence program a total failure. The failure is the hopeless drunk, not the program.

Though the argument in Cool It is based on the relative costs of greenhouse gas reduction versus other strategies, the film is remarkably casual in the way it throws numbers around. It does not explain carefully where its major cost estimates come from or what their basis is, nor does it put them in a meaningful context. Not surprisingly, the movie has left viewers confused. One reviewer came away with the impression that the combined industrial-country cost of complying with Kyoto was said to be $280 million per year, an absurdly small number. The film's actual number, $280 billion, sounds big but actually is not so huge: Assuming the combined annual GDP of the industrial countries is at least $50 trillion, $280 billion is a fraction of 1 percent.

The same kind of problem arises with the estimate of how much it will cost Europe to attain its 20-20-20 goals, which Lomborg puts at $250 billion. We don't really know what the basis of that number is, and to me it sounds rather inflated. But even if it's correct, let's remember that it represents about 1.6 percent of Europe's $15 trillion domestic product and about $500/person/year. Is that too much to pay for insurance against the long-term risks of dangerous global warming? Lomborg says, unintelligibly, that it would be like buying home insurance that only covers the doorframe.
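The arithmetic behind both percentages can be checked in a few lines (a sketch using only the figures quoted above; the EU population is my own round-number assumption):

```python
# Kyoto compliance cost as a share of industrial-country GDP (film's figure
# and the post's lower-bound GDP assumption).
kyoto_cost = 280e9            # dollars per year
industrial_gdp = 50e12        # dollars per year
print(kyoto_cost / industrial_gdp)   # 0.0056 -> a bit over half a percent

# Europe's 20-20-20 cost as a share of GDP and per person.
eu_cost = 250e9               # dollars per year, Lomborg's estimate
eu_gdp = 15e12                # dollars per year, from the post
eu_population = 500e6         # assumption: roughly the EU's population
print(eu_cost / eu_gdp)              # ~0.0167 -> about 1.6-1.7 percent
print(eu_cost / eu_population)       # 500.0 -> $500 per person per year
```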

Lomborg wants to think he's clarifying the debate about global warming, but he's not. What he proposes is that instead of spending $250 billion per year on greenhouse gas reduction the money instead should be spent on the development of futuristic gee-whiz energy technologies, which he continues to be amazingly credulous about. Where is the $250 billion to come from? One reviewer came away with the impression that it would come from a carbon tax, and that may indeed be what Lomborg is suggesting.

If so, the supposed maverick is saying essentially what just about every other serious advocate of action on climate is saying: We should either tax carbon or auction emission credits in a cap-and-trade system, and use the very large revenues to fund development of low- and zero-carbon energy technologies. Lomborg is purveying the conventional wisdom, but in claiming to be saying something distinctively different, he's only sowing confusion and tossing random kindling on what's already a blazing fire.

Trenberth Goes Back to Basics on Climate Change

At last count we had about 40 comments on the interview with you about IPCC reform, many of them hostile. Where do you think that extreme hostility is coming from? You and your colleagues must think of yourselves as honest, hard-working scientists, and yet you stand accused of being a “liar,” part of a corrupt enterprise, “the system.”

This kind of thing is not confined to climate science. There are a lot of other examples out there. It's a nasty atmosphere, and any hot topic brings this out. Climate science has become highly politicized, and climate change denial is closely related to the strong opposition many people feel at having the government interfere in their affairs in any way. Under these circumstances something like “climategate” was bound to happen. I received something like 19 pages of e-mail with extremely abusive language in them. I kept asking the mail people why our filters weren’t catching them.

Many of the hostile comments on the interview appear to come from engineers. Some express anger, for example, about IEEE’s stand on climate change, which must be some kind of misunderstanding, as IEEE as such doesn’t take positions on controversial matters of politics and policy. Are you surprised to see well educated, well credentialed individuals expressing such anger?

Yes, I am rather. Last year I gave an invited lecture at an IEEE meeting of aeronautical engineers, with about 700 in attendance. Afterwards a tremendous number of people came up to me and said they had no idea as to the nature of the evidence regarding climate change, how everything fits together. In general scientists and engineers have open minds.

In terms of the major claims that are so hotly contested—that the earth is getting warmer, that human activity is playing a significant role, and that warming will gather pace unless we change our ways rather radically—to what extent do they rest on computer modeling and how much on observational science?

Our knowledge that the earth is warming does not come from modeling at all. It comes from a very wide variety of sources and is incontrovertible—"unequivocal," the IPCC said in its last assessment. The main sources are temperature readings, melting of Arctic sea ice and glaciers, snow extent, rise in sea level, and ocean temperatures.

What about the human role?

We can see that greenhouse gas concentrations are growing in time and that temperatures are rising, and we postulate a causal relationship. But we need computer models to test and prove that relationship. We run the models with the observed changes in the atmosphere and see what temperatures are generated, then remove the gas forcings, and compare the results. Though there's some natural variability, of course, since 1970 human-induced warming has clearly emerged from the noise.

As for prediction of future temperatures and other effects of climate change, there we rely entirely on models.
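The with-and-without-forcing comparison Trenberth describes can be illustrated with a toy example (entirely synthetic numbers, nothing like a real climate model): generate the same "natural variability" twice, add a greenhouse-style ramp to one copy, and compare post-1970 trends.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2011)

# Toy natural variability: random year-to-year noise (degrees C anomaly).
natural = rng.normal(0.0, 0.1, years.size)
# Toy greenhouse forcing: a ramp of 0.008 degC/yr beginning around 1950.
forcing = 0.008 * np.clip(years - 1950, 0, None)

run_with_forcing = natural + forcing
run_without_forcing = natural        # same "run", forcing removed

def trend_per_decade(t, y):
    """Least-squares linear trend, in degC per decade."""
    return 10 * np.polyfit(t, y, 1)[0]

recent = years >= 1970
print(trend_per_decade(years[recent], run_with_forcing[recent]))
print(trend_per_decade(years[recent], run_without_forcing[recent]))
```

Because the two runs share identical noise, their post-1970 trends differ by exactly the imposed 0.08 degC per decade; in real attribution studies the comparison is between ensembles of full climate-model runs, where the forced signal must emerge from genuinely independent noise.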

What about paleoclimatic research—ice science and such?

It plays a smaller part. It depends on use of proxies to substitute for instrumental readings, and when you go back further than 500 years, the number of such proxies drops and data get blurrier. As you go back in time annual resolution is no longer possible. We also know much more about forcings in the modern period: We have information on solar activity going back to Galileo, and we know quite a bit about volcanic activity, which affects the composition of the atmosphere significantly.

Can you say something about the widespread belief that solar activity somehow accounts for the temperature changes we’ve seen in recent decades?

That’s easily disproven. It’s nonsense. Since 1979 we've had spacecraft measuring total solar irradiance, and there's been no change—if anything the sun has cooled slightly. There's nothing in the record that indicates that the sun is responsible for any of the warming in this period.

And further back?

The sun did get somewhat more active early in the last century and probably did account for some of the warming until about 1950. The best current estimate is that it explained about one tenth of one degree Celsius of the warming, which totaled about 0.3 C from 1900 to 1950. There was a solar minimum in the early 1700s, which probably was partly behind the so-called Little Ice Age. Going much further back, the alternation of ice ages and interglacial periods was driven by subtle changes in the orbital relationships of sun and earth—the Milankovitch cycles. We learned from the study of them that small changes in solar irradiation can generate huge changes in climate. In fact, the changes in radiative forcing from the rise in greenhouse gas concentrations in the atmosphere are much larger than the changes connected with the Milankovitch cycles.

For the record, are you the person who Dave D says messed with the Australian temperature record?

I’m a native of New Zealand and have had nothing to do with the Australian record.

As a person who has contributed a great deal of time to IPCC work, do you feel that lead scientists should be directly compensated by the IPCC?

No, that would only make the process even more vulnerable to the criticism that it is somehow self-serving.

Renaissance Hiccups: What Do Recent Nuclear Reactor Incidents Tell Us?

The nuclear proponents will say that the system is working. Two separate incidents, at the Vermont Yankee and Indian Point nuclear plants, resulted in shutdowns at both sites, but authorities in both cases were confident that the problems posed no threat to the public. And hey, that's the idea, right? Catch things before they really do become a problem. Let's take a look.

The Vermont Yankee plant, in Vernon, Vermont, closed down because a routine check found a leak from a welded-over pipe in the feedwater system, a closed loop that brings water to cool the reactor. But this wasn't just regular water dripping at about 60 drops per minute from the pipe; it was radioactive water. According to an update from Entergy, which owns the plant (and whose Vermont Yankee website sits at a somewhat ironically chosen domain), the pipe was in a section of the system that could not be repaired while the plant remained active, so the company made a "conservative decision" to pull it offline. Since Sunday, the pipe has apparently been repaired, and the reactor was being powered back up and reconnected to the grid on Wednesday.

Meanwhile, in Buchanan, New York, one of two main transformers exploded on Sunday evening and caused an automatic shutdown of the second of three reactors at Indian Point (pictured). Importantly, these transformers are located well away from any area containing nuclear material, so the all-clear report that sounded just a few hours later was undoubtedly justified.

In a statement from -- oh look, Entergy owns this one too! (And Indian Point has a similarly hilarious website.) Anyway, the company claims everything is now functioning as normal; nothing to see here, apparently.

And yes, neither of these incidents seems to have been particularly dangerous, and the response systems in place do appear to be working as they should. The bigger issue here is that these aren't necessarily isolated problems. Vermont Yankee became the first plant in more than two decades to be shuttered by the public (well, as of 2012, at least), when the state senate voted against renewing its license. The reason? Radioactive tritium leaks. Entergy reported last month that tritium levels are low and that there is no danger of water contamination, but the problem highlights the fact that our nuclear infrastructure is not getting any younger.

Whether or not the next incident actually is a threat is almost beside the point; multi-billion dollar loan guarantees will be a hard political sell if the public keeps hearing about radioactive water leaks and explosions. If the old plants can't keep quiet for a while, the nuclear renaissance might be dead in the water.

(Image via Daniel Case)

New York State's Climate Plan Contrasts with California's

Six years ago the New Yorker magazine caused rather a stir with an article in which writer David Owen described how he and his wife had moved to a utopian green community where they got by living in just 700 square feet, with no dishwasher, garbage disposal, or car, and a monthly electric bill that came to about one dollar. That utopian community was Manhattan.

Yesterday New York State issued an interim greenhouse gas reduction plan, its (outgoing) governor David Paterson having pledged last year to put the state on a path such that its emissions would be 80 percent lower in 2050 than in 1990. That's a weaker goal than California's, which targets 2020, a timeframe in which leaders can be held accountable for their results. The philosophy and approach behind New York's plan also is strikingly different from California's. Yet the two states also have two big things in common: Relatively low per capita energy consumption and emissions, because of their highly urbanized "green" lifestyles, and populaces and political leaderships that are firmly dedicated to combating global warming.

At some risk of oversimplification, it can be said that California's commitment is tightly bound up with its self-perception as a high-tech state that stands to benefit in the long run from driving development of energy-saving, carbon-reducing technologies, while New York is more focused on helping avoid the local ill effects to be expected from climate change.

Those effects, as spelled out in a fact sheet that accompanied release of the interim report, could include a temperature rise by 2080 of 5 to 7.5 degrees Fahrenheit, heavier rainfall and flooding, more frequent and severe heat waves, and a rise in the level of coastal waters and the Hudson River estuary of 12-55 inches.

The report puts the emphasis squarely on improving the efficiency of buildings, which account for about 40 percent of the state's emissions, and transportation, where New York has been a pioneer in electrifying public transport, notably buses. It also calls for adopting a more aggressive renewable portfolio standard, to encourage lower-carbon generation, and for "expansion" of the Regional Greenhouse Gas Initiative, which envisions a regional carbon trading system. It characterizes its proposals for buildings and transport as "no regrets" options--that is, policies that would make good sense even in the absence of climate change.

The problem with the plans being made by New York, obviously, is that the state can't do it alone. Avoiding the ill effects of climate change depends on everybody all over the world doing their part. So, while California and New York, Britain, Germany, and Denmark all make good-faith efforts to reduce their emissions, all will be for naught unless they somehow persuade the other big emitters to make equivalent efforts.

IEA Anticipates a Fossil-Dominated Future

The International Energy Agency's annual World Energy Outlook, released yesterday, is receiving relatively little attention in the press even though the alternative energy scenarios the IEA traces are of above-average interest.

Admittedly, the main messages of the report are pretty much what we already all know in our guts: Oil, coal, and natural gas will continue to dominate world energy consumption, whether we conduct business as usual, adopt greener policies, or make heroic efforts to keep carbon concentrations in the atmosphere below 450 ppm so as to prevent temperatures from rising more than 2 degrees Celsius in this century.

What gives the report its bite is its sharp attention to the issue of what a really effective global climate policy would require, a subject on which the IEA and its director Nobuo Tanaka have shown themselves to be true believers. As in its previous annual report, the agency still expects world energy consumption to grow sharply, with fast-developing countries accounting for the lion's share of additional energy demand. Among fossil fuels, reliance on natural gas will grow the most strongly, an estimated 44 percent by 2035--more than a third of that increase being "unconventional" (shale) gas. Though consumption of coal and oil decreases in the advanced industrial countries according to the IEA's intermediate "new policies" scenario--the one it seems to consider most probable--demand for oil and coal in China, India and other developing countries increases by a significantly greater amount.

With expectations for global climate policy much dampened since Copenhagen last December and the U.S. failure to adopt a greenhouse gas reduction bill this year, the notion of reducing governmental subsidies for fossil fuels has been getting a lot of attention of late as an alternative approach. But the IEA survey implicitly casts serious doubt on the plausibility of that approach. Subsidies are by far the highest in just the countries that have oil and gas coming out their ears: Iran, Saudi Arabia, Russia, etc. Are we seriously to believe that governments in such countries are going to risk their popularity or stability by sharply boosting domestic energy prices?

The IEA report makes specifically clear what we already know, namely that really effective climate policy--of a kind that might contain the global temperature rise in this century to no more than 2 additional degrees Celsius--would require a gargantuan effort on the part of just those countries most reluctant to commit themselves to it. In terms of what would be required between now and 2035, China would have to account for 32 percent of CO2 abatement and the United States 18 percent.

As yet, regrettably, there's still no sign that the challenge issued by IEEE Spectrum magazine in November 1999 will be met: Noting that China would surpass the United States as the world's top source of greenhouse gas emissions in the next century, we said that "if countries like the United States want to mitigate risks of climate change, after cleaning up for themselves and getting their own houses in order, the next-best thing they can do is help China and India do the same."


Toward a Non-Reflecting, Self-Cleaning Solar Panel

Continuing on yesterday's improving-solar-panels beat, today we learn of a new process that can quickly create a solar surface that barely reflects any light and can clean itself. And the innovation comes from sunny Finland, of course.

Researchers at Aalto University (a recently formed conglomeration of three older universities) created a new method -- using deep reactive ion etching -- for fabricating a pyramid-shaped nanostructure on a silicon surface. The silicon wafer, once etched, can then be used to create a stamp, making further wafer fabrication an easy additional step.

Generally, the smooth silicon surfaces used in solar cells reflect a lot of the light that hits them, lowering their efficiency. The shaped surfaces, though, barely reflect any light at all.

Water and particle accumulation on solar cell surfaces also increase the reflectivity, so the researchers went a step further. They coated the surface with a "low surface energy fluoropolymer," which made it ultrahydrophobic. Water droplets that hit a solar cell with such a coating would quickly roll off, carrying dust and other particles with them. There are other strategies for dust removal on solar cells -- including one that was developed for use on Mars, of all places -- but no matter what the method, it's clear that keeping a cell clean will boost its efficiency by an enormous amount.

According to the researchers' paper, which was published in the journal Advanced Materials: "High-throughput fabrication of low-cost self-cleaning surfaces, which suppress the reflection of light over a wide spectral range, is expected to have applications ranging from chemical analysis of drugs and biomolecules to photovoltaics."

(Image via Aalto University)

