EnergyWise

Could Oil Tankers Be New Reactor Market?

With rising fuel prices, pending international limits on sulfur emissions, and concerns about greenhouse gas emissions in mind, the British shipping consultancy Lloyd's Register has launched a study of whether oil tankers should be nuclear-powered. At present, reactors have been used almost exclusively to power military ships, starting with the famous U.S.S. Nautilus submarine. In civilian shipping, with the exception of one Russian container ship, nuclear propulsion is used only for icebreakers. Vince Jenkins, global marine risk adviser at Lloyd's, says clients of the firm are showing interest in alternative propulsion technologies that would cut carbon emissions. "Nuclear power is the only technology that can replace carbon emissions entirely," Jenkins told the Financial Times.

Two for the Price of One: Singlet Fission and Improved Solar Cells

New research by investigators at the National Renewable Energy Laboratory suggests an interesting way of improving the output of solar photovoltaic cells. Singlet fission is a process in which the absorption of a single photon can produce two electron-hole pairs, thereby potentially doubling the output of a solar cell.

The research, published in the Journal of the American Chemical Society, involved the molecule 1,3-diphenylisobenzofuran. In an e-mail, one of the investigators told me that singlet fission is closely related to another process called multiple exciton generation, previously identified as another possibility for improving solar PV output. MEG occurs in quantum dots, though, rather than in the chromophore molecules used in singlet fission, according to Dr. Justin Johnson of NREL.

"There is a lot of fundamental research yet to be done to understand singlet fission before it could be useful in solar energy," Johnson told me. He guessed that commercial-scale use of the idea in solar technology is at least five years off, but added that the idea could be applied right away for two types of solar cell: "One is an organic photovoltaic designs (e.g., pentacene/fullerene systems) and the other is dye-sensitized solar cells (i.e., a molecular dye attached to nanocrystalline titanium dioxide)." Johnson continued:

"The efficiency gain arises from the fact that without singlet fission these configurations waste energy to heat during the steps after light absorption and before charge collection. In many cases, the two excitons produced by singlet fission can both be harvested in the same way that one exciton is harvested in a conventional device but without the lost heat, leading to a factor of two increase in efficiency. This is just one scenario, and the exact details of how the device is configured must be taken into account to determine the actual expected enhancement."

He also said that though the molecules used in the recently published work can be difficult to work with, his group is now starting to use easier materials that could be mass produced and would remain stable for at least 10 years, meaning the eventual cost of such solar cells wouldn't be much higher than currently available PV technology.

(Image via NREL)

Coal and Cancun

To a first approximation, the political struggle over climate policy can be reduced to the future of coal. The Cancun climate talks are going nowhere because of a stalemate between the United States and China, which each account for about a quarter of the world's greenhouse gas emissions. China gets about three quarters of its electricity from coal, the United States close to half, and neither country is willing to confront its coal industry head-on.

Countries like the UK and Germany, which have been systematically winding down their coal industries, have cut their greenhouse gas emissions sharply and are inclined to stick with the Kyoto formula--first the industrial countries reduce their emissions, then the developing countries. That principle of "differentiated responsibilities" is considered sacrosanct in the Third World. But the United States wants the fastest developing countries, China in particular, to join the industrial countries in emissions cuts.

That in essence is why Cancun is deadlocked, just as Copenhagen was this time last year.

So insofar as it's a question globally of whether to systematically reduce carbon emissions by reducing reliance on coal, Big Coal is winning the Big Political Battle. On the ground, however, where it's a question of costs, resources, and technology, coal may be starting to lose. The two biggest factors in play? The revolution in unconventional gas, which already is transforming the future of fossil fuels in the United States and soon may have a similar impact in China. And the world market in coal, which could be undergoing the same kind of transition we have witnessed in world oil during the last decade.

Speaking last week at Platts' Global Energy Outlook Forum in New York City, reporter Bill Holland of Platts Gas Daily cited two recent reports--one from Credit Suisse, one from Deutsche Bank--both concluding that if natural gas prices are at $4 per million BTUs, there is no point in building a new coal-fired power plant in the United States and little point in continuing to operate one. As it happens, current U.S. gas prices are roughly $4/mBTU, and futures prices as far out as twelve years are in the vicinity of $5/mBTU and never higher than $6/mBTU. So there's little prospect, Holland concludes, that it will make sense for U.S. utilities to remain as dependent on coal as they are now.
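The logic behind that $4 threshold can be sanity-checked with a rough fuel-cost comparison. The heat rates and the coal price below are illustrative, typical-order assumptions on my part, not figures from the Credit Suisse or Deutsche Bank analyses:

```python
# Fuel cost of generation in $/MWh: fuel price ($/million BTU) times
# heat rate (BTU/kWh), divided by 1000 to convert kWh to MWh.
# Heat rates and the coal price are assumed, typical-order values.

def fuel_cost_per_mwh(price_per_mbtu, heat_rate_btu_per_kwh):
    """Return fuel cost in dollars per megawatt-hour."""
    return price_per_mbtu * heat_rate_btu_per_kwh / 1000.0

gas_cost = fuel_cost_per_mwh(4.0, 7000)    # efficient combined-cycle gas plant
coal_cost = fuel_cost_per_mwh(2.5, 10000)  # typical coal steam plant
print(f"gas: ${gas_cost:.0f}/MWh, coal: ${coal_cost:.0f}/MWh")
```

At these assumed heat rates, $4 gas is already within a few dollars per megawatt-hour of coal on fuel alone, before any pollution-control costs are counted.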

To be sure, some very large coal plants are still being brought online in the United States. But tighter restrictions on emissions of mercury, SO2, and NOx already are in the works, so even if EPA's soon-to-be-released carbon restrictions are relatively lax, operators of coal plants are sure to see their costs rise quite sharply. Credit Suisse estimates that almost a third of current coal-generating capacity in the United States is subject to no controls whatsoever. As regulation firms up and inexpensive "fracked" gas increasingly enters the picture, demand for coal could drop as much as 30 percent and demand for gas rise as much as 15 percent, Credit Suisse estimates.

Natural gas already is making rather sharp inroads into coal generation in the United States, causing domestic demand for coal to drop somewhat. But the global picture is the opposite. China, whose voracious appetite for coal is generally explained in terms of its seemingly inexhaustible supplies, recently has turned into an importer of significant scale. Those who haven't already lost their short-term memory will recall that early in the last decade China suddenly emerged in the world petroleum market as an importer, which turns out to have signaled the run-up in world oil prices that has been the most important single feature of the global economy since mid-decade. Proponents of King Hubbert's "peak oil" theory, which had correctly predicted a peaking in the United States, claimed vindication.

Now similar claims are being made about coal. Writing in a recent issue of Nature, Richard Heinberg and David Fridley argue that economically recoverable reserves in the major producing countries may be much lower than the conventional wisdom has had it, and that China's coal production could peak as soon as 2015. If China's imports continue to grow at current rates, the country will be buying the equivalent of total Asia-Pacific coal exports. Two years ago, the world's entire seaborne trade in coal amounted to barely more than a fifth of what China alone consumes annually.

Hubbert correctly foresaw that U.S. oil production would peak in the early 1970s. In terms of energy content, U.S. coal production peaked at the end of the 1990s, Heinberg and Fridley note.

Will Reality Follow Projections? Renewable Energy Reports Highlight Enormous Potential

As we've mentioned here before, a big part of the renewable energy field seems to run on reports and estimates, with less forward progress toward realizing that potential than one might like. Some recent additions to the pile highlight the fact that though many of the possibilities remain just that -- possibilities -- for the moment, we may be on the cusp of truly breathtaking advances in renewable energy development.

One report from the industry group Solar Energy Industries Association presented at the COP16 climate summit in Cancun estimates that global solar capacity could reach 980 gigawatts by 2020. If realized, this would represent a truly amazing jump from the present: after a record year in 2009, global solar capacity stood at only about 20 gigawatts.
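A quick way to grasp the scale of that projection is the compound annual growth rate it implies. Taking the report's endpoints as roughly 20 GW at the end of 2009 and 980 GW by the end of 2020 gives an 11-year horizon (my reading, not the report's own framing):

```python
# Compound annual growth rate implied by going from 20 GW to 980 GW
# over roughly 11 years: (end/start)^(1/years) - 1.
start_gw, end_gw, years = 20.0, 980.0, 11
cagr = (end_gw / start_gw) ** (1.0 / years) - 1.0
print(f"implied growth: {cagr:.0%} per year")
```

In other words, installed capacity would have to grow by more than 40 percent every single year for over a decade.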

As one maker of solar photovoltaic cells told BusinessWeek, that huge leap isn't exactly an easy thing to accomplish. “The capital needed to manufacture that much capacity is staggering,” said Nancy Hartsoch, vice president of marketing at SolFocus Inc. in California. The report most likely includes completion of huge projects like the Desertec plants in the Sahara.

Moving to the wind sector, a report from the non-profit Environment New Hampshire highlights the lack of real aggressiveness on the part of the United States when it comes to offshore wind. With more than 2 gigawatts already spinning off Europe's coasts, and China's first offshore wind farm (102 MW) now online, the paltry goals of the turbine-less United States seem all the weaker. The National Renewable Energy Laboratory's report from earlier this year showed the enormous potential around U.S. shores; the East Coast alone holds more than 200 gigawatts of offshore wind potential (after taking locations and socioeconomic factors into account). Still, Europe aims to have 40 gigawatts of offshore wind by 2020, compared with only 10 gigawatts in the US. There is hope, though, that the Interior Department's recent announcement of an improved permitting process might kickstart the industry.

Ten gigawatts of offshore wind in the US would add only about 30 percent to the existing wind capacity overall (which accounts for about two percent of the total US electricity generation capacity). To make a dent in emissions reductions goals, that number will have to grow.
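For what it's worth, here is the back-of-envelope arithmetic behind that 30 percent figure, using only the numbers quoted above:

```python
# If 10 GW of offshore wind would add about 30 percent to existing U.S.
# wind capacity, the existing fleet implied by those figures is roughly
# 10 / 0.30, or about 33 GW.
offshore_gw = 10.0
fractional_increase = 0.30
implied_existing_gw = offshore_gw / fractional_increase
print(f"implied existing wind capacity: {implied_existing_gw:.0f} GW")
```

That implied base of roughly 33 GW is consistent with the "about two percent of total U.S. generation capacity" characterization above, given a total U.S. fleet on the order of a terawatt.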

(Image via Environment New Hampshire)

Will Winter Weather Worsen EV Range? Not Hardly.

We’re big fans of Mythbusters and love busting some of our own myths. 

So we decided to test a myth we hear time and time again: electric cars can’t go very far in the winter when in-cabin electric heating zaps the power from the battery pack.

We put on our thermal underwear, grabbed a warm coffee and headed out into unseasonably cold weather with a 2011 Tesla Roadster Sport 2.5 to see just how the sports car coped with ice, snow and a raging cold northerly wind.

As luck would have it our car for the weekend was exactly the same vehicle we borrowed in October, so we were able to draw a good comparison between the two weekends in true Mythbuster style.

As we pulled out of Central London and headed west down one of the many freeways radiating out of the U.K.'s capital, the entire U.K. was under severe weather advisories for heavy snow and extreme cold.

While the temperatures we were set to experience were mild compared with a severe northeastern U.S. winter, we were promised temperatures as cold as 14 °F, with daytime maximums barely creeping above freezing.

But this didn’t phase the Tesla Roadster Sport 2.5. With powerful seat heaters and a fully electric heater providing enough warmth to make the cabin more than cosy, the Tesla forged forward into the encroaching darkness.

About 40 miles before our destination snow started to fall. Initially light, the snowfall became heavier until our car registered an outside temperature of around 25 °F. 

At this point the Tesla, still in Range mode, was predicting enough power for at least another 80 miles. On arrival, the on-board computer predicted a further 40 miles in range mode was possible.

The next day, we took the Tesla on an exhaustive trip designed to give it a thorough workout. First, a 60-mile freeway trip at 70 mph, followed by a further 80-mile meandering route through some of the southwest of England's most challenging roads.

With the temperature below 27 °F for the entire trip, and dropping to an indicated 20 °F while driving through the iconic Cheddar Gorge, our test car didn't put a foot wrong, climbing the twisty, 1,000-foot road with ease.

Even with the best will in the world we just couldn’t make the Tesla Roadster Sport 2.5 lose grip while driving. The Tesla’s traction control made sure it stayed pointed in the right direction, even when we drove on sheet ice.

In fact, the only way we could force the Tesla to slide around was to find a deserted parking lot and turn the traction control off.

We’d planned to film our frigid fun, but it turned out our camera equipment just couldn’t handle the cold and switched off as soon as it was exposed to the extreme windchill. No such problems for the car, however, which kept on providing heat, power and entertainment for a whole weekend.

In total we used just over 140 kilowatt-hours of energy over the weekend, resulting in a massive 450 miles of snow-filled fun.
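Those weekend totals translate directly into an energy-economy figure. The per-mile number below is derived from the trip totals above, not taken from Tesla's specifications:

```python
# Average consumption over the weekend: total energy used divided by
# total distance covered, expressed in watt-hours per mile.
energy_kwh = 140.0
distance_miles = 450.0
wh_per_mile = energy_kwh * 1000.0 / distance_miles
print(f"about {wh_per_mile:.0f} Wh per mile")
```

That works out to roughly 311 watt-hours per mile, cabin heater and all.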

We struggled to find a difference in performance, range or energy consumption between our cold-weekend and our mid-Fall test-drive. Whatever we threw at it, the 2011 Tesla Roadster Sport 2.5 kept going.

Our only problem? A frozen trunk mechanism, after the overnight temperature dipped below 16 °F, which required a few hours of driving to thaw out.

Other than that, we'd have to say that Tesla has managed to change our perception of winter electric vehicle motoring.

Myth Busted.

This article, written by Nikki Gordon-Bloomfield, originally appeared at a content partner of IEEE Spectrum.

The Ideal Wind Farm: Tweaking Turbine Spacing to Improve Output

In the early days of wind energy development, little thought seemed to go into some of the details of how to put together a wind farm. The Altamont Pass farm could serve as the poster child for early missteps: its small, tightly clustered turbines kill more than 4,000 birds per year (including 70 protected golden eagles). More recently, much more thought is going into just how the thousands of turbines the world is building should be spaced.

In a presentation at an American Physical Society meeting this week, Johns Hopkins researcher Charles Meneveau discussed work on an algorithm designed to optimize the placement of turbines in a wind farm. Among the findings -- which are based on computer modeling of the flow of air around spinning turbines -- is that generally we've been placing them too close together.

According to a press release, large turbines (of the five-megawatt variety) should be separated by 15 rotor diameters rather than the seven commonly used today. Turbulence created by the spinning blades muddies the speed and direction of the wind, meaning that turbines placed close together might not be generating as much energy as they could at slightly larger spacings.
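To put those spacings in absolute terms, the calculation below assumes a rotor diameter of about 126 meters, typical of today's 5-megawatt-class machines (my assumption, not a figure from the Johns Hopkins work):

```python
# Turbine-to-turbine distance at 7 versus 15 rotor diameters, for an
# assumed 126 m rotor (typical of 5 MW class machines).
rotor_diameter_m = 126.0
for spacing in (7, 15):
    distance_km = spacing * rotor_diameter_m / 1000.0
    print(f"{spacing} diameters -> {distance_km:.2f} km between turbines")
```

Going from 7 to 15 diameters means spreading turbines from under a kilometer apart to nearly two kilometers apart, which more than quadruples the area each turbine occupies.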

This isn't the first research looking at how to get the most out of a lot of turbines placed close together. Earlier this year, investigators in Spain published a paper in Renewable Energy on an algorithm designed to optimize wind farm arrangement. All of this work is a crucial step in improving wind power's overall viability: the continuing effort to bring the cost-per-kilowatt down into fossil fuel range will make each new wind farm that much easier to build.

(Image via Wikimedia Commons)

Stuxnet Sends Ominous Message

Two months ago the German cybersecurity expert Frank Rieger published a compelling analysis of Stuxnet suggesting it targeted Iranian nuclear facilities, quite possibly the big uranium enrichment complex at Natanz. Two weeks ago the U.S. cybersecurity firm Symantec published an exhaustive analysis that showed beyond any reasonable doubt that Natanz was the main target, though perhaps not the only target. All that is arresting enough. But there's also a larger message, namely that any large networked system--from the smart grid to oil refineries or nuclear reactors--could be vulnerable to malware of similar sophistication.

To quote the summary that concludes the Symantec report: "Stuxnet represents the first of many milestones in malicious code history--it is the first to exploit four operating system vulnerabilities, compromise two digital certificates, and inject code into industrial control systems and hide the code from the operator... Stuxnet has highlighted direct-attack attempts on critical infrastructure are possible and not just theory or movie possibilities... Stuxnet is the type of threat we hope to never see again."

Huge Stockpile of Kazakh Weapons-Grade Plutonium Is Secured

The Kazakh and U.S. governments recently moved about 100 tons of high-grade plutonium from a poorly secured location on the Caspian Sea to a better-secured secret location at the diagonally opposite corner of Kazakhstan. The operation was described last week in a three-part eyewitness report on National Public Radio by Mike Shuster, NPR's longtime roving ace reporter.

What made this maneuver so specially important, Shuster explained, was that the plutonium happened to be of the highest quality for nuclear weapons--"ivory" grade, as people in the trade put it.

For those accustomed to thinking of all plutonium as weapons-grade, the Kazakh situation calls for somewhat more explanation than Shuster deemed appropriate for a general audience. It's well known that uranium can be natural, somewhat enriched for reactors, or highly enriched for weapons, but plutonium is generally thought to be weapons-grade by definition. That's essentially true, but with an important qualification. If the plutonium is produced by neutron bombardment of uranium in a standard power reactor being operated normally, it ends up containing excessive quantities of the higher isotopes (Pu-240 in particular), rendering any weapon made from it prone to predetonation. Therefore, any country wanting to build good and reliable nuclear weapons will not want to make them from plutonium extracted from standard reactor fuel. (This is why Iran's Bushehr reactor, built with Russian help, is essentially irrelevant to its nuclear-weapons program.)
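The distinction can be made concrete. The U.S. Department of Energy conventionally grades plutonium by its Pu-240 content; the thresholds below are the commonly cited ones (weapons-grade below 7 percent Pu-240, fuel-grade from 7 to 19 percent, reactor-grade above that), and the two sample values are illustrative:

```python
# Conventional DOE plutonium grade categories by Pu-240 fraction (percent).
def plutonium_grade(pu240_pct):
    if pu240_pct < 7.0:
        return "weapons-grade"
    if pu240_pct <= 19.0:
        return "fuel-grade"
    return "reactor-grade"

print(plutonium_grade(3.0))   # e.g., material bred briefly in a blanket
print(plutonium_grade(24.0))  # e.g., typical spent power-reactor fuel
```

Material bred in a reactor blanket and removed promptly sits at the low end of the Pu-240 scale, which is what "ivory" grade refers to.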

Even for a general audience, Shuster might have mentioned that the Kazakh plutonium was made in an early breeder reactor, the BN-350, which was customized both to produce electricity and (in the facility seen above) to desalinate water from the Caspian. Breeder-bred plutonium is higher grade, which is one of several reasons why breeders are such a singularly bad idea.

As Tom Cochran explained in a recent interview here, Russia's breeder program ultimately was a failure because the breeder fuel cycle was never "closed." That is to say, the plutonium produced in the breeders was never processed into fresh reactor fuel, to realize the dream of an endlessly self-propagating fuel supply. Thus it came about that Kazakhstan found itself sitting on enough material to make nearly 800 quite satisfactory atomic bombs.

Great Lakes Wind

Four years ago, surveying the U.S. president's excellent strategic plan for climate change technology, I was struck by a detail on a map charting the country's wind resources: It showed, of course, a lot of strong winds through the Great Plains and off the two ocean coasts; but it also showed something less obvious—namely, that the entire surface area of the Great Lakes also was among the country's very windiest regions.

Why not, I wondered, focus wind energy development just there? In contrast to the Great Plains, which are notoriously distant from the country's major load centers, the Great Lakes are smack dab in the middle of the country's highly populated old industrial heartland, a region that happens to get most of its electricity from carbon-intense coal. And surely it might be easier and less expensive to develop offshore wind on the lakes than on the coasts.

It turns out I'm not the only genius having such thoughts. In one of his recent posts at Energy Central, Bill Opalka speculates that regulatory hurdles and opportunities for citizen intervention may be somewhat less daunting when it comes to Great Lakes wind development, by comparison with projects off the ocean coasts. That may be debatable. And certainly the jury's out on whether Great Lakes projects would be technically easier and overall cheaper.

But a lot of people and a growing number of organizations are exploring these questions. Two months ago, public officials, developers, and stakeholders in the Great Lakes Wind Collaborative met at Case Western Reserve University in Cleveland for their third annual meeting. A member of the National Renewable Energy Laboratory told them that the lakes' wind might be a $100 billion market.

Two years ago, Michigan State University's Land Policy Institute estimated that 100,000 wind turbines situated off Michigan's coasts--the state borders four of the five Great Lakes--could provide as much as 321 GW of generating capacity, if towers were sunk at all depths. If towers were built only at depths comparable to those already achieved elsewhere in offshore wind farms, the lakes' wind potential would be closer to 100 GW--still a prodigious amount.

At the end of October, the White House and the Energy Department, together with the lakes collaborative, co-hosted a two-day workshop in Chicago to discuss Great Lakes siting issues in detail. Specific projects are under consideration near Cleveland on Lake Erie, off the eastern shore of Lake Michigan near Pentwater, Mich., and in Lakes Erie and Ontario off New York State's shores.

More on IPCC Reform

As long as we're on the subject of how the Intergovernmental Panel on Climate Change needs to be reformed to be more authoritative, believable, and helpful, we may as well take note of a conference held earlier this week at Columbia University. The panel on IPCC Reform and the Global Climate Challenge brought together scholars from fields ranging from political science to meteorology and produced a remarkable amount of agreement on some key points.

--There's an excessively wide gap between the scientific assessments, on the one hand, and political leaders, policymakers, and members of the general public, on the other. "The IPCC is an assessment of science, by scientists, for scientists," as climatologist Gavin Schmidt put it.

Peter Haas, a specialist on environmental governance, said the assessment reports, besides needing to be more widely seen as legitimate, also should be more timely and pertinent. Only the second of the four assessments reports that have appeared so far was published on the eve of important political decisions being taken, Haas observed.

(Nobody mentioned that the assessment reports also are extremely expensive and bulky, and are never actually seen outside a scientist's office. Why is there not a 175-page summary written for the educated public and published as a paperback?)

--One reason why the IPCC has come under criticism for the way it handles uncertainty is the clumsy way its authors are asked to attach error bars to statements, regardless of whether they have any real basis for evaluating uncertainties. The authors need to source what they know more carefully, and refrain from talking about what they don't know, said Syukuro Manabe, the computer modeling pioneer.

--The separation of science and policy is too sharp, and the analysis of climate impacts and mitigation strategies too weak, as economist Jeffrey Sachs put it. Manabe noted that most of the errors found in the last assessment were in the impacts section--a subject all the more important now that the Kyoto process is being overwhelmed by fast-growing emissions from the BRIC countries.

Suki Manabe, who can reasonably be described as the father of climate modeling,* was saying that if we can't significantly slow global warming, then we'd better start preparing for its consequences, which are going to be far-reaching and serious. We need to understand them much better, he said, but we know for example that water is going to be a huge issue as the world's wet areas get wetter and its dry areas get drier. We're going to have to worry a lot more about things like water transport and desalination, he said.

*Information on his remarkable career can be found in Spencer Weart's book The Discovery of Global Warming, at the climate website Weart maintains at the American Institute of Physics, or in my book Kicking the Carbon Habit, one chapter of which is a profile of Manabe.

