EnergyWise

U.S. Lighting Competition Status

So far there's still just one contestant for the U.S. Energy Department’s L Prize, as DOE announced on Sept. 24: Philips Lighting. The award, authorized by Congress in 2007, seeks to stimulate rapid replacement of two very widely used and very energy-inefficient lamps: the 60-watt incandescent bulb and the PAR 38 halogen. DOE says that if the four-hundred-million-plus 60-W bulbs currently used in the United States were replaced by LED lighting, enough electricity would be saved to illuminate more than seventeen million households, with five million fewer tons of carbon emitted annually.

Philips reportedly has delivered 2000 prototype bulbs to DOE for testing, claiming they meet all contest requirements: the same light output and color as a 60-W bulb while drawing just 10 watts, a 25,000-hour lifetime, etc. However, Congress has been dragging its feet on actually appropriating the money needed to reward the winner or winners, and no additional contestants have entered or even expressed firm interest in competing.

In a way it may not matter. A company like Philips doesn't really need the $10 million that would go for the 60-W-incandescent replacement or the $5 million for a successor to the PAR 38. And even if there never is another competitor, the award is made to the first company over the line, so Philips can still win fair and square. What really matters, points out DOE's Christina Kielich in an e-mail, is that 27 U.S. companies have partnered with the L Prize program to drive the winning product or products out into the marketplace.

"For example," she says, "PG&E and Southern California Edison might elect to do a bill stuffer, rebate, or other incentive to get the product into the hands of millions of electricity customers. The partners are ramping up plans for field testing next spring, so a number of them will have hands-on experience too."

Latest IEA Energy Report

The latest annual report from the International Energy Agency is well worth consulting directly: Though it's received wide coverage in the general press, that coverage tends to be slanted and unbalanced by comparison with the IEA's own executive summary or even just the excellent press release the agency issued with the report.

The IEA sees itself frankly as setting the stage for the Copenhagen climate conference next month, and contrasts what it calls a reference scenario--what's generally called business as usual--with a 450 scenario. In the reference scenario, temperatures could rise by as much as 6 degrees Celsius by comparison with pre-industrial times, whereas the scenario in which carbon concentrations in the atmosphere are limited to 450 ppm would keep the cumulative rise to about 2 degrees.

In the reference scenario, global energy demand increases 40 percent in the two decades to 2030 and fossil fuels continue to dominate supply, with 90 percent of the increase in world demand occurring in the fast-industrializing less-developed countries. In the 450 scenario, fossil fuel demand peaks in 2020, and carbon emissions in 2030 are slightly lower than they were in 2007. Improved energy efficiency would account for more than half the carbon abatement in the 450 scenario, but greater reliance on zero-carbon electricity generation also would play a big role: renewables would account for 37 percent of electric power production in 2030, nuclear reactors 18 percent, and clean coal--coal with carbon capture and storage--5 percent.

The IEA estimates that these adjustments would cost the world $10.5 trillion over two decades, but these expenditures would be largely offset by a variety of benefits, including savings in public health costs. "The challenge for climate negotiators [at Copenhagen] is to agree on instruments that will give the right incentives to ensure that the necessary investments are made and on mechanisms to finance those investments," said IEA Executive Director Nobuo Tanaka.


Alternative Energy Plan for China

An international task force is presenting today, Nov. 12, a proposed low-carbon five-year plan to the China Council for International Cooperation on Environment and Development in Beijing. Sir Gordon Conway, co-chairman of the task force, describes some of its findings in today's Financial Times. He notes that the Chinese leadership is acutely concerned about the climate problem because of the country's vulnerability to adverse warming effects, its fear of being locked into outdated technologies that will be a liability in a lower-carbon world, and indeed a desire to be a global leader in developing green technology.

The CCICED task force outlines three carbon-emissions scenarios, and the pessimistic ones are more than a little disconcerting. The business-as-usual scenario has China emitting 13 billion tons of carbon per year in 2050, which is almost twice as much as the whole world is emitting right now. A lower-carbon scenario reduces the country's emissions to 9 billion tons by 2050, which is still almost 30 percent higher than the world's today. A third "enhanced low-carbon scenario" gets China's annual emissions down to 5 billion tons per year by mid-century, 30 percent below the present-day world's.
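Those comparisons are mutually consistent if today's global emissions are taken as roughly 7 billion tons of carbon per year, the figure cited elsewhere in these posts; a minimal sanity-check sketch (the 7-Gt baseline is an assumption on my part):

```python
# Quick consistency check of the CCICED scenario comparisons, assuming
# present-day world emissions of about 7 billion tons of carbon per year.
world_now = 7.0  # Gt C/year, assumed baseline

scenarios = {
    "business as usual": 13.0,   # "almost twice" today's world total
    "low-carbon": 9.0,           # "almost 30 percent higher"
    "enhanced low-carbon": 5.0,  # "30 percent below"
}

for name, gt in scenarios.items():
    print(f"{name}: {gt} Gt C/y = {gt / world_now:.2f}x today's world total")
```

The ratios come out to about 1.86, 1.29, and 0.71, matching the "almost twice," "almost 30 percent higher," and "30 percent below" characterizations.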

According to Conway, "the Chinese believe significant reductions can be achieved by decoupling growth from greenhouse gas emissions, as Sweden has done." Their plan, he says, "is to reduce energy consumption per unit of GDP by 75-85 percent by 2050," by means of industrial restructuring and efficiency gains. The energy mix will become steadily greener, with 50 percent of power coming from low-carbon sources by 2030 and all new electricity generation being low-carbon by 2050. But even so, if the enhanced low-carbon scenario is to be achieved, "it will require innovation and technology sharing on a global scale."

It may be startling to hear--even hard to believe--that China's leaders are considering policy changes of this magnitude. But a decade ago, when I was privileged to spend ten days in China looking into the dilemmas posed by coal-fired power generation for an IEEE Spectrum special report (November-December 1999), I was startled by the openness with which Beijing officials were willing to address concerns. I would not have been the first to remark on the country's anomalies: a communist government getting set to restructure its electricity system and introduce competition, as if Margaret Thatcher herself were in charge; mid-level officials discussing sensitive and embarrassing issues of pollution and public health in a more relaxed manner than a reporter typically finds in Washington, Paris, or London.

This doesn't mean the Chinese will necessarily succeed in meeting carbon goals, of course. But it would be a mistake to assume they are insincere in setting such goals, or that they are just humoring other countries' negotiators.

A Negative Warming Feedback

A paper this fall in Wiley's Global Change Biology describes a negative feedback from global warming—that is to say, a feedback that makes things better rather than worse. The article in a relatively obscure scientific journal caught the notice of a news service reporter rather belatedly, and that report got my attention later still, but I'm passing it along because so much of the news from the climate front is bad while this—however modest—is a jolly good thing by comparison.

Worst-case-scenario thinking about climate change tends to assume that things will be even more dire than models predict because of unrecognized, undiscovered, or underestimated positive feedbacks. There are two notable examples, both big: the feedback from reflective ice in the Arctic and Antarctic flashing to absorptive blue water, as warming takes place; and the feedback from melting permafrost, which could release vast amounts of methane into the atmosphere. (Radiative absorption was underestimated in early climate models, and methane release is hard to predict in current ones.)

But in a September issue of Global Change Biology, members of the British Antarctic Survey report their discovery that as Antarctic ice has melted, large blooms of marine plants called phytoplankton have been flourishing in the newly exposed waters. The phytoplankton have the potential to absorb roughly 3.5 million metric tons of carbon per year, equivalent to the carbon "sink" represented by 6,000 to 17,000 hectares of tropical rain forest. Lead author Lloyd Peck conceded that this is a small number when compared with the 7 billion tons of carbon ejected by human activity into the atmosphere each year. But he said that as more Antarctic ice melts, the new phytoplankton blooms "have the potential to be a significant biological sink for carbon."
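The scale mismatch Peck concedes is easy to quantify from the two figures just cited; a back-of-envelope sketch:

```python
# Compare the new phytoplankton carbon sink against annual human
# emissions, using the figures reported above.
sink = 3.5e6       # metric tons of carbon absorbed per year
emissions = 7e9    # metric tons of carbon emitted by human activity per year

print(f"Sink as a share of annual emissions: {sink / emissions:.3%}")  # 0.050%
```

Small indeed, which is why the significance lies in the potential for the blooms to grow as more ice retreats, not in today's number.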

France's Nuclear Reputation Takes More Hits

The French nuclear industry is accustomed to getting high marks from critical observers, whether trade journalists, business school scholars, or historians like Gabrielle Hecht, author of The Radiance of France. The country’s standardized method of designing and building reactors, its national commitment to this one technology in particular, and its well-known traditions of scientific management and rational administration have all come in for some of the credit. But the French industry’s lustrous reputation has been tarnished of late by a variety of developments and allegations, starting with critical comments by a French expert visiting the United States.

Even as the national French utility EDF was completing a deal with Maryland’s Constellation to consolidate a marketing foothold in the United States, there came the news on Nov. 2 that French, British, and Finnish regulators had told the French nuclear contractor Areva to adopt additional safety measures for its EPR reactor: Specifically, the regulators complained that emergency control systems are not sufficiently distinct from routine control systems. The top French regulator said there is "no certainty it will be possible to prove an acceptable level of safety based on the current [EPR] architecture."

The news of EPR safety concerns coincided with the disclosure that 19 of France's 58 reactors are off-line for maintenance and refueling. French Prime Minister Francois Fillon publicly rebuked the country's national utility EDF for allowing this situation to develop and leaving the country in a position where it would have to import electricity during the winter for the second year running. Naturally the combined news sent Areva's stock into a small slide.

To keep things in perspective, Areva and EDF are among the few nuclear players in the world that are actually building nuclear power plants and that have concrete plans to build more. Areva is well advanced with construction of EPR reactors in France and Finland, though both projects are rather badly behind schedule, and it has started to build a pair in China. It has rather specific plans to build four in England and two more in China, and somewhat vaguer hopes of building six in India and four in the United States.

Photovoltaic Grid Parity

In the world of solar energy, "grid parity" generally refers to the point in time when photovoltaic electricity—whether centrally generated or distributed—will be competitive with other sources of electricity. A recent report by researchers at the Lawrence Berkeley National Laboratory finds that the installed cost of photovoltaic systems declined by more than 30 percent from 1998 to 2008, from $10.80 per watt to $7.50/W. That may sound like very encouraging news but in fact is not, however you look at it.

According to one eye-catching allegation, there's a kind of Moore's law in photovoltaics, which holds that costs come down by 20 percent with every doubling of installed capacity. Rates of installation have varied in the last decade, of course, both year to year and world region to world region. What is more, the very idea of a photovoltaic Moore's law is a bit slippery—and has tripped me up at least once. But if one postulates conservatively that the installed PV base has doubled roughly every two and a half years in the last decade, then average photovoltaic costs should have come down by close to 60 percent since 1998, not 30 percent.
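The arithmetic behind that "close to 60 percent" figure is simple compounding under the postulated doubling time:

```python
# "PV Moore's law": a 20 percent cost reduction per doubling of installed
# capacity, with the base assumed to double every 2.5 years over a decade.
years, doubling_time = 10, 2.5
doublings = years / doubling_time    # four doublings in the decade
cost_multiplier = 0.8 ** doublings   # each doubling leaves 80% of the cost
print(f"Implied cost decline over the decade: {1 - cost_multiplier:.0%}")  # 59%
```

Four doublings at 20 percent each leave about 41 percent of the original cost, i.e. a decline of roughly 59 percent, against the 30 percent actually observed.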

Generally speaking, grid parity—the point where photovoltaic electricity could compete without subsidies with electricity generated from coal, natural gas, wind, or nuclear—is put at $1/W. That may be a somewhat too demanding standard, considering that photovoltaics work best on rooftops or integrated into construction material, so that electricity is consumed at the point of production, eliminating transmission and distribution costs. But even if grid parity were put at $2/W and installation costs declined at a rate of 30 percent per decade from the currently estimated $7.50/W, it would take PV electricity until roughly mid-century to become economically competitive.
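That "roughly mid-century" estimate follows from a straightforward extrapolation; a sketch, assuming the decline keeps compounding at 30 percent per decade from the 2008 figure:

```python
import math

# How long until installed cost falls from $7.50/W to $2/W if costs
# decline 30 percent per decade (i.e. a factor of 0.70 per decade)?
start, target, decade_factor = 7.50, 2.00, 0.70
decades = math.log(target / start) / math.log(decade_factor)
print(f"{decades:.1f} decades, i.e. around the year {2008 + round(decades * 10)}")
```

The answer is about 3.7 decades, landing in the mid-2040s, hence "until roughly mid-century."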

Energy planners with the European Union expect photovoltaic grid parity to be reached around 2015 in Europe's southernmost countries, such as Spain and Portugal—at least when PV materials are used in solar concentrating systems. But is there any basis for expecting conventional PV to become competitive that soon?

There's not. Not only are photovoltaic manufacturing costs not in the ballpark right now, they're so far from the ballpark that there's no way of knowing whether they'll ever be in the ballpark.

Having said that, let me introduce a major qualification. Installation costs are not the best way of evaluating the cost-effectiveness of PV or any other electricity generating system. The generally accepted standard measure of generating costs is the levelized busbar cost—that is, the cost of electricity at the point where it is fed into the grid, taking all cost factors into account (investment, financing, operation, maintenance, and so on). Ideally, to evaluate claims made about grid parity or a PV Moore's law, we would want to measure performance in terms of levelized costs and in the same units we all see on our monthly utility bills, namely cents per kilowatt-hour.

As it happens, however, there are at least two insurmountable obstacles to our actually doing that. One is that photovoltaic electricity is too new and its cost elements are changing too fast for anybody to make reliable estimates of its average levelized costs over time. Just as importantly, operators of PV plants are often very cagey about how much it is costing them to run the installations (a problem I ran into earlier this year, when I tried to assess the biggest new little plant in the East, on the Pennsylvania-New Jersey border).

That leaves us for better or worse with estimated installed costs. The good thing about using them as a metric is that the size and cost of any given PV plant is generally a matter of public record. So there are a lot of trade groups and energy monitoring organizations that add up the wattage and costs every year, on a global basis. The bad thing is that it would be a full-time job to figure out whose estimates are most reliable. Frankly, I tend to dip around somewhat randomly, following my instincts.

For example, Daniel Yergin's Cambridge Energy Research Associates estimated aggregate world PV installation costs at about $7/W in 2004, wind at a bit under $1/W. According to Marketbuzz/Solarbuzz, average world PV installation costs came to about $6.20/W in 2008, four years later. That's an improvement of roughly 11 percent in four years, much too slow a rate of improvement to give us grid parity any time in the foreseeable future.
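For what it's worth, the implied rate is easy to reproduce (the exact percentage depends on how the two rounded estimates are taken):

```python
# Improvement in average world PV installation cost between the two
# estimates cited: ~$7/W (CERA, 2004) vs ~$6.20/W (Solarbuzz, 2008).
cost_2004, cost_2008 = 7.00, 6.20
improvement = (cost_2004 - cost_2008) / cost_2004
print(f"Improvement over four years: {improvement:.1%}")  # 11.4%
```

Annualized, that is under 3 percent per year, nowhere near the pace the "PV Moore's law" would suggest.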


A Note on Coal Pollution Fatalities

Some readers of an earlier post have complained about the provenance of an estimate that ten thousand U.S. citizens die yearly from exposure to coal pollution. A New York Times report about a new National Academy of Sciences study--the subject of my post--had said that coal and automotive pollution were about equally responsible for causing $120 billion in economic damage each year and 20,000 deaths. But the number 20,000 (or 10,000 coal, 10,000 automotive) does not appear in the Academy study itself, which is what prompted me to call Maureen Cropper, who co-chaired the expert panel that did the review.

I did not discuss Cropper's views in detail in my post, but she confirmed that the numbers 20,000 and 10,000 do not appear in the NAS report and said, consistent with the Times, that because 96 percent of the economic damage from coal plant pollution is attributable to premature deaths, one can in fact divide the total damage by $6 million--the value attributed to each lost life--to obtain the number of fatalities.
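In other words, the Times's figure can be reconstructed from the study's own numbers; a sketch of the implicit arithmetic (assuming, as the Times reported, that the $120 billion splits about evenly between coal and autos):

```python
# Back out the implicit fatality count: damage attributed to coal, times
# the share of damage due to premature deaths, divided by the value
# assigned to each statistical life.
total_damage = 120e9            # dollars of damage per year, coal + autos
coal_damage = total_damage / 2  # assumed even split with automotive pollution
death_share = 0.96              # fraction of coal damage from premature deaths
value_per_life = 6e6            # dollars per statistical life

deaths = coal_damage * death_share / value_per_life
print(f"Implied coal-pollution deaths per year: {deaths:,.0f}")  # 9,600
```

That comes to about 9,600 deaths per year from coal alone, which rounds to the 10,000 figure in question.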

To put that the other way around, Cropper said in effect that estimated economic damage is based on estimated yearly premature deaths from coal pollution and from automotive pollution of about 10,000 per year each, but that those numbers are implicit, not explicit.

At least one reader of my post was surprised that sickness does not account for a larger share of estimated economic damage, and so was I, considering that hundreds of thousands of people are hospitalized each year for asthma and a variety of other pulmonary disorders. Cropper said in effect that as many of those hospitalized will end up among the prematurely dead, the cost of their hospitalizations is in effect a part of the damage attributed to their deaths. But she conceded that the costs of morbidity may be somewhat under-estimated in the study.

Chinese Eye Fast Breeders

The Wall Street Journal reports that China plans to design and build an 800-MW fast breeder reactor to come online around 2020. The rationale is essentially the same as usual: concern that the country will not have enough uranium to fuel its growing reactor fleet, and a desire to stretch fuel by transmuting spent uranium into new plutonium. The trouble is, every country that has tried to do this by building breeders has failed: France, which runs what is by most accounts the world's most sophisticated nuclear industry, built a commercial-scale fast breeder only to close it down after a few troubled years. Japan's smaller-scale demonstration reactor suffered coolant leaks. A large Soviet breeder is believed to have caught fire.

The essential problems with breeders are easily stated. The fuel consists essentially of bomb-grade plutonium, which is highly reactive and very volatile. Cooling the reactor and transferring its heat to turbines requires the use of liquid sodium, which is flammable and therefore must be even more rigorously contained than the water coolant used in conventional reactors.

In the very worst case, if there were a loss of liquid sodium coolant and the breeder's core started to melt, the reactor fuel could reconfigure itself into a critical mass and suffer what's called in expert jargon a "prompt critical burst"--that is to say, in plain English, a nuclear explosion.

I wish I could say to the Chinese, "Good luck," but actually I just hope they'll come to their senses and change their minds. One of the first things Jimmy Carter--a nuclear engineer--did after taking office was cancel the U.S. breeder program. He was right.

Earlier this year, a subsidiary of the Russian company Rosatom completed construction of a small 25 MW Experimental Fast Reactor in Beijing, fellow EnergyWise contributor Peter Fairley reported. It was to be loaded with fuel this summer, to test and demonstrate the basic technology of neutron capture. In a fast breeder reactor, fast neutrons emitted by fissioning plutonium are captured by non-fissile uranium-238, typically in a "blanket" surrounding the plutonium core. The U-238 transmutes to plutonium-239, which can be recovered and recycled as fresh fuel. Hence the claim that breeders produce more fuel than they consume.

Besides being hard to control, which affects design economics and public confidence, breeders suffer from the disadvantage that their fuel cycle depends on transportation of weapons-grade plutonium, an enticing target for terrorists. This was the main publicly stated consideration in Carter's decision to ditch the U.S. breeder program in April 1977.

Greenhouse Gas Emissions from Food

Proper estimation of GHG emissions from food production and consumption is notoriously complicated: livestock emit methane, a much more potent greenhouse gas than carbon dioxide; commercial meat production is disproportionately grain-intensive, so the more meat is eaten, the greater the climate burden in terms of fertilizer and fuel inputs; and finished food products, whether meat or grain, have to be transported, packaged, and sold, involving still further burdens. Several years ago, a University of Chicago study found that in a typical household, GHG emissions connected with food can be as important as those associated with the home's car or cars.

A new report in World Watch Magazine argues that a previous study done under the auspices of the Food and Agriculture Organization radically underestimated the emissions from meat and poultry production. The FAO estimated in 2006 that such emissions amount to about 18 percent of total world GHG emissions; but World Watch puts them at 51 percent.

Could Mechanics Best Power Electronics in EVs?

Could smarter mechanical transmissions knock power electronics out of wind turbines, providing a cheaper and more efficient means of coupling the variable energy from ever-shifting winds to the regular waveform of AC power on the grid? They could, according to my reporting in MIT's TechReview today on Viryd Technologies' bid to exploit continuously variable transmissions (CVTs). If mechanics reclaiming territory ceded to electronics sounds like a technological step backwards, here's a more heretical corollary: the same CVTs could also squeeze the power electronics out of electric vehicles (EVs).

That's the argument put forward by Rob Smithson, CTO for Viryd parent company Fallbrook Technologies and one of the inventors of its clever CVT (dubbed NuVinci in a tip-of-the-hat to the Italian polymath who first dreamed up the CVT concept). "If you look at cost in large car-replacement type EVs today, the cost gets dominated by the battery pack and the motor controls. There’s an opportunity to knock out one of those two with an infinitely variable transmission," insists Smithson.

Most EV designs today, says Smithson, rely on the electric motor to meet the entire dynamic performance envelope of the vehicle, from vehicle speed to torque demand -- a feat made possible by hefty power electronics. Swap in a CVT to handle the vehicle speed, however, and the electric motor can operate as a fixed-speed, variable-torque device. "When that happens there’s a tremendous opportunity there to simplify your power electronics and a lot of the attendant cost that goes with that," he says. For more details, see Fallbrook's white paper on increased power, speed, and range observed in a NuVinci-equipped electric scooter.

Smithson is well aware that his proposition will sound heretical to many EV designers ("I’m looking forward to my turn at being burned at the stake," he told me with a chuckle). But an EV source I trust says Smithson could be pardoned. Ed Benjamin, an expert in light electric vehicle and bicycle technology and managing director of Benjamin Consulting, agrees that CVTs have great potential in EVs. "If a CVT was light, had a wide range, worked well and did not lose much energy - it could greatly improve the performance of Light Electric Vehicles, extending the capability of the drive system and extending battery / range," says Benjamin.

Benjamin adds that Fallbrook's NuVinci CVT is the best CVT he has seen. "It is an impressive device. Ingenious, clean, works well. Not too heavy, does not lose a lot of energy," he says. At the same time, he notes that many attempts to engineer CVTs have failed in the past and, "often they are just a hair away from being right."

I'd call it a story worth following. Fallbrook has already commercialized its CVT in high-end bicycles, and says it is developing applications for power transmission in electric vehicles as well as auxiliary power generation in military vehicles and optimization of vehicle air conditioning, which accounts for nearly one-tenth of U.S. annual fuel consumption. And then there are the wind turbines I covered for TechReview today, which are arguably the toughest application of all. Automobiles are designed for something like 5,000 hours of lifetime operation, whereas wind turbines must run more like 80,000 hours.

For those who still doubt the feistiness of mechanical engineers (and their EE sympathizers!) in challenging the trend toward digital power control and transmission, check out the back-to-the-future example of GE's variable frequency transformers. We covered this adaptation of transformers for coupling non-synchronous power grids in 2007 in "Power Transmission Without the Power Electronics".

