Energywise

2013 Renewable Energy Recap: A Year of Record Setters and Energy Storage Momentum

Looking back is substantially easier than predicting the future. One year ago, I wrote the following: "This time, though, I am more confident: in 2013 the first offshore turbine in U.S. waters will start spinning. (Probably.)"

I wasn't wrong! (Technically.) In June, a modest 20-meter-tall (65-foot) floating turbine began feeding power to the grid from a harbor in Maine. It wasn't much, and the big offshore wind farms all gunning for first place remain tied up in pre-construction quagmires, but it was a turbine, and it was offshore. Cause for celebration! And more generally, offshore wind does appear poised to actually make a leap; early efforts by Cape Wind in Nantucket Sound suggest that long-cursed project may qualify for a tax credit based on construction starting by the end of this year. And a number of other big wind farms, particularly off the coasts of Rhode Island, New Jersey, and Virginia, may soon get under way. In spite of the positive signs for the industry, though, I have learned my lesson: I will make no promises of spinning offshore turbines in 2014. We'll just have to wait and see.

Back onshore, 2013 was marked by a steady march toward practical, utility-scale energy storage, as well as a series of short-lived record setters in solar and wind generation. One after another, big concentrating solar thermal plants claimed largest-in-the-world status: Abu Dhabi's Shams 1, Arizona's Solana (more on that one in a moment), and finally California's Ivanpah plant. CSP has been considered the most viable way to bet big on solar—and these 100-plus-megawatt plants seem to back that up—but the ever-dropping prices on photovoltaics have slowed some of CSP's momentum, and perhaps delayed some of the grandest of desert solar plans.

Wind also went big this year, with the United Kingdom's London Array switching on to become the world's biggest offshore wind farm, beating out the Walney site in the Irish Sea. Big wind plans abound, both on and offshore, though the London Array's planned full-gigawatt capacity may be tough to beat any time soon.

But generating all this clean solar and wind energy is only one aspect of renewable power. Intermittency and a lack of dispatchability have long plagued efforts to scale renewables, and 2013 was the year that energy storage really began to take the spotlight. California now has the country's first energy storage mandate, a law requiring storage capacity that can output 1325 megawatts by the end of 2020, with 200 MW by 2015. How the state will achieve this is currently up in the air, but it should have several options: the year was full of news about new approaches to storing power.

At the Advanced Research Projects Agency-Energy (ARPA-E) summit in February, storage projects dominated the exhibit hall floor. Ideas ranged from improved flywheels to iron-flow batteries, but more established approaches are likely to win out for the foreseeable future: improved lithium-ion batteries and old ideas like compressed air energy storage. Several compressed air companies are starting to ship units, or soon will, that can help wind farms make the most of their capacity. In the U.K., the largest such pilot project yet will test a huge Li-ion battery installation in Bedfordshire.

And power plants are starting to include storage from the outset as well. That Solana plant in Arizona, built by Abengoa Solar, incorporates molten salt storage that lets the plant produce power for six hours after the sun goes down.

All of this sounds like great news for renewables, and it is. But when facing the magnitude of the climate change challenge, a few gigawatts here or there aren't remotely enough. A report from the International Energy Agency laid out the problem in a nutshell: the overall share of energy attributable to coal, oil, and gas has not changed one smidge since the late 1980s. Renewables will need to grow at a staggering pace in order to make a significant difference in emissions.

We'll check back next year to see how the effort is going.

Global E-Waste Will Jump 33 Percent in the Next Five Years

All of those cell phones, computers, tablets, toys, and toaster ovens really add up. Not only are consumers gathering more and more electrical and electronic equipment, but we are also tossing much of it in the trash.

Piles of electronic waste, or e-waste, are rising rapidly across the globe, according to a new study by the Solving the E-Waste Problem (StEP) Initiative. The United States and China were responsible for nearly half the world's total market volume of e-waste (anything with a battery or an electrical cord) in 2012.

The figures per capita, however, are far different. China generates about 5.4 kilograms per person, compared with 29.8 kilograms per person in the United States. That is the highest figure among major countries, though a handful of smaller places, including Singapore, the United Arab Emirates, Qatar, Switzerland, Hong Kong, and Luxembourg, rank higher still.

The StEP Initiative, which is a partnership of United Nations organizations, industry, governments and non-governmental organizations, wants to help countries better understand their e-waste streams and help build legislation and processes to deal with it. There is a great deal of variability in the effectiveness of e-waste programs between and within countries. 

"Although there is ample information about the negative environmental and health impacts of primitive e-waste recycling methods, the lack of comprehensive data has made it hard to grasp the full magnitude of the problem," Ruediger Kuehr, executive secretary of the StEP Initiative, said in a statement. "We believe that this constantly updated, map-linked database showing e-waste volume by country together with legal texts will help lead to better awareness and policy making at the public and private levels."

The StEP Initiative’s interactive map has details on each country’s e-waste numbers and regional or federal rules about how to dispose of the waste. The timing is critical as global e-waste is expected to rise 33 percent to about 72 million tons per year by 2017.
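For a rough sense of how those figures hang together, the per-capita and growth numbers above can be cross-checked with some simple arithmetic. This is only a sketch: the U.S. population figure and the assumption that the 33 percent rise is measured against a 2012 baseline are mine, not numbers from the StEP study.

```python
# Rough consistency checks on the e-waste figures. The population estimate and
# the baseline-year assumption are illustrative, not from the StEP study, and
# the metric/short ton distinction is ignored.

US_POPULATION = 316e6     # assumed U.S. population, circa 2013
US_PER_CAPITA_KG = 29.8   # per-capita figure cited above

us_total = US_POPULATION * US_PER_CAPITA_KG / 1000  # metric tons per year
print(f"Implied U.S. e-waste: ~{us_total / 1e6:.1f} million tons per year")

# If the global total reaches about 72 million tons in 2017 after a 33 percent
# rise, the implied current total is roughly:
global_2017 = 72e6
global_now = global_2017 / 1.33
print(f"Implied current global e-waste: ~{global_now / 1e6:.0f} million tons per year")
```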

Another new study [PDF] by MIT and the U.S. National Center for Electronics Recycling took a deeper dive into the e-waste generated in the United States. The researchers found that about two-thirds of the used electronics generated in 2010 were collected, yet only about 8.5 percent of those collected units were exported, a figure the researchers say is likely an underestimate. It also does not account for most of the components that are exported after initial recycling in the United States.

One of the problems with exports is that there is no explicit tracking of products as they move into secondary markets. And the country something is first exported to is often not its final resting place. A smartphone may first go to Hong Kong, but then be pulled apart, with components going to other countries in the region.

Different technologies also take vastly different paths out of the United States. Larger items, such as televisions and monitors, are more likely to end up in Mexico, Venezuela, or Paraguay, while mobile phones are more likely to go to Asia or Latin America. Overall, Latin America and the Caribbean are the most common destinations, followed by Asia, with Africa the least common destination for North American e-waste.

The studies help provide a clearer picture of the fate of global e-waste, but much more can be done. “We cannot possibly manage complex, transboundary e-waste flows until we have a better understanding of the quantities involved and the destinations. This research is an important first step in that direction,” Joel Clark, founder of the Materials Systems Laboratory at MIT, said in a statement. The researchers recommend creating trade codes for electronic products to enable better tracking, gaining more access to shipment-level trade data, and securing more complete reporting of re-export destinations.


A Second Big Boost for Modular Nuclear Reactors

Illustration: NuScale
NuScale’s modular reactor can self-cool indefinitely in the event of an outage.

It's long been recognized that nuclear energy will achieve its full potential only if much smaller, inherently safer reactors are developed, so as to be an attractive option in a much wider range of situations. A variety of interesting concepts for compact modular reactors have emerged in the last decade, and now some of them are starting to attract real money. This week, the U.S. Department of Energy announced it would award one such developer, NuScale Power of Corvallis, Oregon, up to US $226 million to support design work.

This was the second such DOE grant. Last November, the Energy Department made the first award under its $452 million modular reactor program to Babcock & Wilcox, to support its mPower concept. The mPower project is considered to be “a step ahead of NuScale’s because it has a preliminary agreement with a customer, the Tennessee Valley Authority,” according to Matthew Wald of the New York Times.

In the NuScale concept, as described in Spectrum’s round-up on modular reactors two years ago, “the nuclear fuel assemblies sit inside a long core vessel, which in turn is housed in a secondary containment vessel immersed in water. Unlike conventional light-water reactors, which require large pumps to circulate water through the core, the NuScale reactor is based on convection.”

The United States, having given birth to a handful of innovative ideas for small reactors, seems to be well ahead of the rest of the world in this particular technology. But it does not have the field all to itself. Russia's reactor company has developed a small floating nuclear power plant that appears to be on the threshold of commercial application, most likely initially in offshore oilfield settings.

Battery Startup Envia Is Accused of Fraud

Early last year the Bay Area startup Envia Systems momentarily got a lot of attention with claims that it had developed a lithium-ion battery with three times the energy density, and perhaps half the cost, of standard lithium-ion batteries on the market. (We advised caution at the time.) Now Envia stands accused by three former executives of having stolen intellectual property, of misrepresenting itself to investors … and of firing them when they started to raise embarrassing questions.

Dana Hall of the San Jose Mercury News reported the rather sensational developments concerning Envia a week ago, noting that the company had received $4 million in funding from ARPA-E and an investment from GM, which hoped to license the technology if it panned out. Subsequently, GigaOm's Katie Fehrenbacher picked up on the allegations about Envia and discussed them in some detail.

This is not the first time Envia has stood accused of misrepresenting itself. Last year the tech business blogger John Petersen pointed out that the company had doctored a chart from Lux Research, to make itself stand out more from competitors. Interestingly, however, Lux itself is reserving judgment about the new charges against Envia.

Lux notes that both Envia and the company it allegedly stole IP from were trying to further develop a technology pioneered at Argonne National Laboratory (ANL). "Both NanoeXa [the alleged victim company] and Envia had tried to improve upon ANL’s technology, but only Envia successfully patented the work," notes Lux. "The courts will have to determine whether any stolen IP played into these developments, but it’s certainly not a given that NanoeXa’s claims are true."

Of course we are no more able than Lux to pass judgment on the details of the case. However, we stand by our previous warning that any supposedly revolutionary development in storage technology should be treated with extreme caution.

APS Argues to Extend Lifespan of Nuclear Reactors to 80 Years

There are no technical barriers to running some nuclear plants for up to 80 years, according to a new report from the American Physical Society.

The study, which advocates keeping many of the approximately 100 reactors in the U.S. running for at least 60 years, argues that the tradeoff between nuclear and gas is not necessarily a bargain the country should be making.

Nuclear power plants provide about 100 gigawatts of generating capacity, which supplies about 20 percent of the nation's electricity. If that contribution starts to wane after 2030 as reactors close, there could be an energy shortfall, according to the APS.
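A quick back-of-the-envelope check shows how those two figures fit together. The sketch below assumes a roughly 90 percent fleet capacity factor and about 4,000 terawatt-hours of total annual U.S. generation; neither value comes from the APS report.

```python
# Back-of-the-envelope check: does ~100 GW of nuclear capacity really map to
# ~20 percent of U.S. electricity generation?
# Assumed values (not from the APS report): 90% fleet capacity factor and
# ~4,000 TWh of total annual U.S. generation.

nuclear_capacity_gw = 100
capacity_factor = 0.90          # assumed fleet average
hours_per_year = 8760
total_us_generation_twh = 4000  # assumed, approximate

nuclear_generation_twh = nuclear_capacity_gw * capacity_factor * hours_per_year / 1000
share = nuclear_generation_twh / total_us_generation_twh

print(f"Nuclear generation: ~{nuclear_generation_twh:.0f} TWh per year")
print(f"Share of total generation: ~{share:.0%}")  # roughly 20 percent
```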

“Nuclear power plants provide the nation with a source of clean energy at a time when renewables such as solar and wind are not yet ready to fill the potential gap in the nation’s base power needs created by the loss of nuclear power,” Roy Schwitters, lead author of the APS report, said in a statement.

Some experts in the renewable industry would challenge the claims that solar and wind cannot provide a sizeable portion of the energy that would otherwise go missing after 2030. Energy efficiency could also play a role, as it did when the San Onofre Nuclear Generating Station was taken offline in 2012 after radioactive steam leaks were detected.

According to the Energy Information Administration, energy use per household is expected to decrease through 2040, but overall use is expected to grow over the same period as more homes run air conditioning and electronic gadgets. Still, aggressive energy efficiency measures could temper those projections.

Renewables and efficiency aside, as coal and nuclear plants are retired they are largely being replaced by natural gas-fired power plants. Substituting natural gas for coal, which still accounts for the largest share of generation, might be the cheapest way for the U.S. to meet its greenhouse gas emissions targets through 2030, according to an NREL report.

The APS, however, argues that nuclear power remains a lower-emissions option than natural gas plants and that extending the lives of nuclear plants is “both complex and urgent.” Utilities also need clear government policy because they, by necessity, plan decades, not just years, in advance. The organization calls for more nuclear research—to improve understanding of how to maintain existing nuclear plants and build new ones, and to make clear just how long operating licenses can safely be renewed. Some nuclear experts argue that the United States should be leading the world in nuclear research, particularly in areas such as small nuclear reactors.

Opponents of nuclear power do not see it, advances notwithstanding, as the solution. "This is not a future technology. It’s an old technology, and it serves a useful purpose. But that purpose is running its course," Gregory Jaczko, who was chairman of the U.S. Nuclear Regulatory Commission (NRC) at the time of the Fukushima Daiichi accident, told IEEE Spectrum earlier this year.

The NRC has already issued extensions to many nuclear plants, raising their retirement age to 60 years, and is evaluating the possibility of some plants operating for 80 years. But it's unclear whether utilities will want to spend the money required to keep the assets up to date.

There are two research programs addressing the five main challenges to long-term operation: primary system metals and piping; concrete and containment structures; electrical cables; the reactor pressure vessel; and buried piping. “These programs have not uncovered any technical show-stoppers that would prevent the renewal of licenses from 60 to 80 years,” the study authors wrote, adding that more research is needed.

There are advances in monitoring aging nuclear plants, including techniques such as acoustic and ultrasonic monitoring, but the upgrades identified by that monitoring can cost up to $1 billion over a 40-to-60-year extension—money that some utilities might rather spend on new gas-fired plants than on controversial nuclear ones.

There is one more issue that could also hamper plans for new nuclear plants or for extending the lives of existing ones beyond 60 years: water. Nuclear plants are among the thirstiest options for electricity generation, and just this past summer a heat wave threatened to shut down a nuclear power plant in Plymouth, Mass., because the temperature of the water drawn from Cape Cod Bay exceeded the limit set by the NRC.


Photo: Dominion Energy, APS

Toyota Licenses Wireless Charging Tech from WiTricity

You drive home in your electric car, enter your garage, and step out of the car holding your briefcase in one hand and groceries in the other. Wouldn't it be nice if you could charge the car without physically plugging in?

Toyota thinks so. Wireless power startup WiTricity announced yesterday that Toyota has licensed inductive charging technology from the MIT spin-off and that the carmaker will build wireless power capture devices into future vehicles. Toyota invested in WiTricity two years ago.

The idea of wireless charging isn't new: GM’s ill-fated EV1 was charged using an inductive paddle. And although wireless charging for EVs or consumer electronics is far from commonplace, advances in the past decade show that the technology is maturing and that manufacturers are committed to building it into their products.

Earlier this year, Satoshi Ogiso—one of the engineers who headed development of the first Prius—said Toyota will begin verifying a wireless charging system next year in the U.S., Europe, and Japan. Nissan, which makes the all-electric Leaf, is working on a wireless charging system of its own and told reporters last year that it intends to offer it as an option on a 2015 model year Infiniti. Daimler and Volvo are also working on wireless charging, and Bosch already sells a wireless charging system for the Leaf and Chevy Volt.


Top U.S. Companies, Following Government Lead, Are Pricing Carbon

The lead story in yesterday’s New York Times reported the somewhat startling news that “more than two dozen of the nation’s biggest corporations, including the five major oil companies, are planning their future growth on the expectation that the government will force them to pay a price for carbon pollution as a way to control global warming.” The Times tended to focus on what it took to be the political implications of its news, and treated its discovery as further evidence of a growing split between the Republican Party’s business-oriented establishment, which tends to go with the flow, and its much more obstructive Tea Party base.

The Times report is weaker in elucidating the immediate implications of corporate carbon pricing for future business and technology. The one concrete example it provides of how carbon pricing affects long-term corporate planning is ExxonMobil's getting into the U.S. natural gas business in a big way. Because natural gas is somewhat less carbon-intense than oil (and of course much less so than coal), in a future where the cost of carbon emissions is factored into fossil fuel prices, gas will be even cheaper relative to oil than it otherwise would be.

But the implications of corporate carbon pricing are much broader than that limited example might suggest: If the nation's biggest companies are betting that they will have to pay a price for emitting carbon in the not-too-distant future, that probably means that much of the investor community is increasingly operating on the same assumption. In turn, developers of all low-carbon, zero-carbon, and energy conservation technologies—from wind and solar generation to green building techniques and demand-response software applications—should be getting a boost.

The Times might have mentioned that the U.S. government already sets a price on carbon. Corporate planners undoubtedly have an eye on the government's methodology and probably adopt schedules of estimated future carbon costs that are quite similar to the government's. According to a fact sheet from the U.S. Environmental Protection Agency, the EPA and other federal agencies, such as the Department of Transportation, take what they call the “social cost of carbon” into account in various rule-making proceedings. Estimated future carbon costs are spelled out in schedules, which are used to calculate the benefits of, say, making cars more fuel-efficient or coal-fired power plants less polluting.
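To make the mechanics concrete, here is a minimal sketch of how such a schedule turns avoided emissions into a monetized, discounted benefit. The carbon prices, avoided-emissions figures, and discount rate below are illustrative placeholders, not the government's actual numbers.

```python
# Sketch of a social-cost-of-carbon benefit calculation for a hypothetical rule.
# The SCC schedule, avoided emissions, and discount rate are illustrative only.

def discounted_carbon_benefit(avoided_tons_by_year, scc_by_year, discount_rate, base_year):
    """Sum the monetized value of avoided CO2 emissions, discounted to the base year."""
    total = 0.0
    for year, tons in avoided_tons_by_year.items():
        value = tons * scc_by_year[year]              # dollars in that year
        total += value / (1 + discount_rate) ** (year - base_year)
    return total

# Hypothetical example: a fuel-economy rule that avoids 1 million tons of CO2
# per year from 2015 through 2017, valued against an assumed rising SCC ($/ton).
avoided = {2015: 1_000_000, 2016: 1_000_000, 2017: 1_000_000}
scc = {2015: 37.0, 2016: 38.0, 2017: 39.0}

benefit = discounted_carbon_benefit(avoided, scc, discount_rate=0.03, base_year=2015)
print(f"Discounted carbon benefit: ${benefit:,.0f}")
```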

A “technical support document” issued last May by the U.S. government’s Interagency Working Group on the Social Cost of Carbon, Technical Update of the Social Cost of Carbon for Regulatory Impact Analysis Under Executive Order 12866, describes the conceptual framework and methodology in more detail.

Photo: David Parsons/Getty Images

Cars Could Become Flood Predictors

If you often feel that your morning commute in the car is wasted time, a new initiative out of Germany could transform your daily drive into a citizen science experiment that may help predict localized floods and droughts with more precision.

Researchers at the University of Hanover in Germany equipped cars with GPS and devices to measure rainfall, dubbing the vehicles RainCars. Although there are far more measurement variables to account for with a moving car than with a stationary rain gauge, the multitude of cars on the road could ultimately provide better accuracy at a hyper-local level. The research is published in the journal Hydrology and Earth System Sciences.

“If moving cars could be used to measure rainfall the network density could be improved dramatically,” project leader Uwe Haberlandt said in a statement. Earlier research had shown that data from the wiper speed on many cars—which is an indication of rainfall rate—could provide better measurements of spatial precipitation than a handful of very accurate devices.

To build on earlier work, the researchers set up a lab experiment with a rain simulator and cars with different wiper systems. First, someone in the car manually adjusted the wipers based on visibility. “Front visibility is a good indicator for rainfall intensity,” Ehsan Rabiei, the paper’s lead author, said in a statement. However, human variability in choosing wiper speed isn’t the most reliable indicator of rainfall.

The next tests used newer vehicles equipped with two types of optical sensors that measure rain accumulation and then automatically adjust the wiper action as needed. The sensors use a system of infrared laser beams to continuously detect the rain falling on the device and therefore offer more accurate measurements than fickle human behavior.
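A rough idea of how per-car readings like these (wiper intervals or sensor-derived rain rates) might be turned into a local rainfall estimate is sketched below. The calibration table, noise level, and averaging scheme are invented for illustration; they are not taken from the Hanover study.

```python
# Illustrative sketch: converting per-car wiper readings into a rainfall estimate
# and averaging across many cars in one area. The calibration table and the
# simulated noise are hypothetical, not values from the RainCars experiments.
import random

# Hypothetical calibration: seconds between wiper sweeps -> rainfall rate (mm/h)
CALIBRATION = {8.0: 1.0, 4.0: 4.0, 2.0: 10.0, 1.0: 25.0}

def estimate_rain_rate(wiper_interval_s):
    """Return the calibrated rain rate for the closest known wiper interval."""
    closest = min(CALIBRATION, key=lambda interval: abs(interval - wiper_interval_s))
    return CALIBRATION[closest]

def area_estimate(wiper_intervals):
    """Average the noisy single-car estimates from cars passing through one area."""
    estimates = [estimate_rain_rate(w) for w in wiper_intervals]
    return sum(estimates) / len(estimates)

# Simulated readings from 20 cars in moderate rain (wiper intervals near 2 s).
readings = [random.gauss(2.0, 0.4) for _ in range(20)]
print(f"Estimated rain rate: {area_estimate(readings):.1f} mm/h")
```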



Betting Long on EVs

Most of the recent news about electric and hybrid-electric vehicles has not been encouraging: three Tesla lithium-ion battery fires; sharply scaled-back projections for EV sales on the part of the world's leading manufacturer, Renault-Nissan; the long-awaited Fisker bankruptcy filing, listing celebrity investors like Leonardo DiCaprio and Don Cheadle, Marc Andreessen and John Doerr, Al Gore and the vice president's son Robert Hunter Biden.

It does not bode well for the EV's immediate prospects when some of the most prominent electric car enthusiasts in Hollywood, venture capital, and politics get badly burned. Last month, the automotive industry's most enthusiastic proponent of electrics, Renault-Nissan's Carlos Ghosn, admitted to the Financial Times that the company's EV sales are running far behind projections. It might take Renault-Nissan until 2020-21, rather than 2016, to sell 1.5 million electrics, he said.

To help put matters in perspective, GigaOm's Katie Fehrenbacher has reminded her readers that Tesla's current car has basically been a triumph and that the company has at least a fighting chance of coming out next with an affordable mainstream EV. "Tesla’s Model S has been a big success, and it’s part of a long term evolution that will allow Tesla to eventually deliver its third-generation, mainstream electric car,” wrote Fehrenbacher. "Finding success with that monumental project will be even more difficult than it was with the Model S, yet it will be even more of a revolutionary step toward moving the world off of gasoline-powered cars."

But what if Tesla does not beat the odds? What about the current lackluster sales of the Volt and Leaf? Is it possible that the recent flurry of interest in EVs will turn out to be transitory and that, once again, we will see them drop off the radar screen? In the end, will the Leaf, Volt, and Tesla suffer the same ignominious fate as General Motors' ill-starred EV1?


Belgium Claims World’s Largest Offshore Wind Turbine

The largest offshore wind turbine on the planet is now spinning off the coast of Belgium at the Belwind site. Alstom produced the 6-megawatt Haliade turbine and installed it off the Ostend harbor last weekend.

The blades stretch out more than 73 meters and the turbine towers more than 100 meters above sea level. The turbine does not have a gearbox but instead uses a permanent-magnet generator. Fewer mechanical parts means less maintenance and higher reliability, according to Alstom.

The size and mechanical configuration will allow the turbine to produce about 15 percent more power than existing offshore turbines and to supply electricity to about 5000 households.
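That household figure is plausible on a rough calculation. The sketch below assumes a capacity factor of about 40 percent for a modern offshore site and average household consumption of roughly 3.5 megawatt-hours per year; neither figure comes from Alstom.

```python
# Rough plausibility check of the "about 5000 households" claim for a 6 MW turbine.
# The capacity factor and household consumption figures are assumptions.

rated_power_mw = 6
capacity_factor = 0.40            # assumed for a modern offshore site
hours_per_year = 8760
household_use_mwh_per_year = 3.5  # assumed European average

annual_output_mwh = rated_power_mw * capacity_factor * hours_per_year
households = annual_output_mwh / household_use_mwh_per_year

print(f"Annual output: ~{annual_output_mwh:,.0f} MWh")
print(f"Households supplied: ~{households:,.0f}")  # on the order of 5000 to 6000
```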

The Haliade 150 was initially tested at the Le Carnet site in France. Alstom is now building factories to construct the 6-megawatt turbines on a commercial scale.

While Alstom claims the largest offshore wind turbine for the time being, the title of world's largest offshore wind farm currently goes to the London Array off the coast of Kent. It boasts 630 megawatts of capacity and can power half a million homes.

Offshore wind farms may seem to be constantly vying for bigger and better, but some projects are seeing setbacks. One wind farm, whose turbines would have sat atop towers twice as tall as Alstom’s Haliade 150 and produced nearly twice as much energy as the London Array, has been shelved.

German energy giant RWE was planning the mega wind farm, the Atlantic Array, in the Bristol Channel in the United Kingdom. In a press release, RWE Innogy's director of offshore wind, Paul Cowling, said the project was cancelled because of technological challenges and market conditions.

The latter may have been the larger factor, according to the BBC. Sources told the BBC that the project was having trouble getting financing. The project would have reportedly cost about £4 billion ($6.5 billion). There was also some environmental opposition, since the array would have sat just 13.5 kilometers from a nature reserve.

RWE said that the technical challenges included “substantially deeper water” than initially thought and adverse seabed conditions. With the Atlantic Array a no-go, RWE is focusing on completing the world’s second largest offshore wind farm: a 576-megawatt array known as Gwynt-Y-Mor off the northern coast of Wales.


UPDATE (December 2, 2013):

Alstom's Haliade 150 may hold the title of largest wind turbine, but that claim is likely to be short-lived. Various other 6-MW offshore turbines are coming to market. “But at the moment our turbine installed at Belwind site is the biggest and the most efficient installed offshore, because of the size of its rotor,” says Stephanie Roux, spokesperson for Alstom Renewable Power. Most other offshore turbines of similar size are prototypes or not operating at full commercial capacity, according to Alstom. But a 7 MW offshore wind turbine is already being tested in Scotland.

Photo: Alstom
