Energywise

Huge New Solar Thermal Plant Can Keep Running for Six Hours After Sun Goes Down

The Ivanpah plant in the Mojave may have recently snatched away the title of "world's largest," but Abengoa Solar's Solana plant in the desert near Gila Bend, Arizona, still has its share of superlatives. At 280 megawatts, Solana is one of the largest plants using parabolic mirrors in the world, and it is undoubtedly the largest to use substantial thermal storage to keep the juice flowing for hours after the sun goes down. Intermittency is still among the most common complaints about industrial-scale renewable energy, so proving that this storage technology can work is a huge step for the solar industry.

Abengoa announced on Wednesday that the Solana plant "passed commercial operation tests." The first of these involved running the plant's generator at full power while also ramping up the thermal storage system. Next, after letting the solar part of the plant stop once the sun was down, operators fired up the generator and produced electricity for six full hours using only the thermal storage system. Intermittency, you matter not here.


Tesla's Lithium-Ion Battery Catches Fire

Tesla seems to make news at every turn. Most of this year's headlines about Tesla's Model S have been high praise, but this week, one Model S was in the spotlight for another reason: catching fire.

On Tuesday, a Tesla driver in Washington State drove over some metal debris on the highway, according to a report in The New York Times. The driver turned off the freeway and then the car caught fire. Elizabeth Jarvis-Shean, a spokesperson from Tesla, confirmed that one of the 16 modules that make up the Model S battery pack caught fire after direct impact with a large metal object. Earlier this year, the U.S. National Highway Traffic Safety Administration gave the Tesla Model S a 5-star safety rating (its highest) in all categories.

The video below captures the fire, in which a man in a passing car exclaims, "oh, that's a Tesla dude!" (Warning: video contains coarse language.)

“No one was injured, and the sole occupant had sufficient time to exit the vehicle safely and call the authorities,” Jarvis-Shean said in a statement on Wednesday. “Subsequently, a fire caused by the substantial damage sustained during the collision was contained to the front of the vehicle thanks to the design and construction of the vehicle and battery pack.”

After the fire was extinguished by the fire department, it reignited and was “difficult to extinguish,” a fire department official told The New York Times.

Lithium-ion batteries, the primary choice for electric vehicles, are known for their potential to catch fire, although such incidents are rare. Last year, a Chevy Volt caught fire a few days after being crash tested. The problems are not limited to cars: earlier this year, Boeing’s 787 Dreamliner fleet was grounded after fires in the planes’ lithium-ion batteries.

Although fires involving lithium-ion batteries receive the bulk of the headlines, there are far more fires involving gas stations and conventional combustion engines each year. Between 2004 and 2008, the U.S. National Fire Protection Association reported an average of about 5000 fires per year in and around gas stations. From 2006 to 2010, there were roughly 152 000 automobile fires annually, on average.

Even if fires in electric vehicles are relatively rare, they are still a black mark on an emerging industry. Tesla was named the Motor Trend 2013 Car of the Year, but its stock was trading at about $173 on Thursday after opening at $190.15 on Wednesday.

Tesla said it is investigating the fire.


Photo: AJ Gill/YouTube

What Is the Actual Status of Carbon Capture Technology?

The U.S. Environmental Protection Agency's (EPA's) long-expected rules for carbon emissions from new coal-fired power plants have thrown into sharp relief the question of whether carbon capture and sequestration or storage (CC&S) can be described as "demonstrated" and ready for commercialization. This is because the new regulations declare, in effect, that no new coal plant can be built without CC&S.

The EPA is reported to have alluded to four CC&S plants and projects in connection with its coal regs—in California, Mississippi, Texas, and Saskatchewan, Canada—though I cannot find that statement in the press release, technical backgrounders or detailed report on regulatory impacts issued with the new coal regs on 20 Sept. But surely the most substantial and furthest along of the U.S. projects is the IGCC plant that Southern Company is building in Kemper, Mississippi [photo].

Integrated gasification combined-cycle technology involves coal gasification and then the separation of carbon dioxide from other flue gases, including those containing nitrogen. Can Kemper be considered a demonstration of a technology ready for commercialization? By no stretch of the imagination. The plant is being built with a $270 million contribution from the U.S. Department of Energy. And Southern Company has issued a statement saying the Kemper technology "cannot be consistently replicated on a national scale."

DOE originally hoped to see IGCC tested and demonstrated at a plant to be built near Tampa, Florida, but the local utility lost interest. The Energy Department thereupon helped get the demonstration transferred to Mississippi. The government subsequently changed horses mid-race. FutureGen, which was to have sponsored the major U.S. demonstration of IGCC, collapsed toward the end of George W. Bush's presidency. The Obama administration decided early on to resuscitate the project, but switched to an alternative technology known as oxycombustion or oxyfiring. (In oxyfiring, nitrogen is removed from air pre-combustion, which simplifies separation of carbon dioxide post-combustion.) The change suggested that IGCC was suffering some loss of confidence at Obama's DOE as well.

Outside North America, the important work being done on CC&S is also pre-commercial. Norway and its national oil company, Statoil, which have declared in the past that depleted North Sea oil fields contain enough storage capacity to sequester all the carbon dioxide emitted from European power plants, operate a major R&D facility at Mongstad. On 23 Sept., however, the government and Statoil, while reaffirming their commitment to research there, pulled the plug on upgrading Mongstad to commercial scale.

Sweden's national utility Vattenfall has been testing oxycombustion technology at a power plant it owns in eastern Germany, at Schwarze Pumpe. As described in an MIT summary, "Vattenfall announced in November 2009 that it was achieving nearly 100 percent CO2 capture at Schwarze Pumpe. As of the beginning of June 2010, Schwarze Pumpe has now been in operation over 6500 hours during the last one and a half years… Vattenfall is continuously rebuilding and developing this unit." Vattenfall plans to continue operations for ten years.

All that is well and good, but of course it does not add up to a demonstrated technology ready for commercial deployment. Five years ago, Spectrum declared Schwarze Pumpe a "winner," not because it necessarily would lead to a usable technology, but because Vattenfall at least was trying seriously to find one.

Photo: XTUV0010/Wikipedia




DOE Maps Path to Huge Cost Savings for Solar

The price of a solar photovoltaic module has dropped dramatically over the last few years. But the cost of making the panels isn't the only thing that needs to come down to reach ideal price points: so-called "soft costs" represent half or more of the price of most solar installations. These costs include permitting, labor, inspection, interconnection (if you're going grid-connected, at least), and others, and the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) thinks we can cut them down to size as well.

In a new report, NREL maps out a way to bring soft costs down from $3.32/watt in 2010 for a 5-kilowatt residential system to $0.65/watt in 2020. For small commercial systems below 250 kW, the report suggests a drop from $2.64/watt in 2010 to $0.44/watt in 2020. These soft cost reductions would allow the U.S. to reach the Department of Energy's SunShot Initiative goals of $1.50/watt and $1.25/watt for residential and commercial installations, respectively.

But first, the bad news: if the current trajectory of soft costs continues, those SunShot goals will not be met. Achieving the extra cost reductions necessary to get there won't be trivial, especially for residential installations. An additional $0.46/watt must be cut beyond the current trajectory, a sizable amount when the total target is $0.65/watt. Financing and customer acquisition costs are likely to fall that far without much intervention, while permitting and interconnection will need a push. That push could take the form of streamlined inspection processes and a standardized permitting fee substantially lower than what most jurisdictions charge today. The average permitting fee now, though it varies widely across jurisdictions, is $430; NREL suggests bringing it to $250 across the board.

Commercial systems, meanwhile, need only $0.11/watt in cuts beyond the current trajectory to achieve the SunShot goals. Labor costs may come down more easily than for residential systems; the report suggests that universal adoption of integrated racking, in which modules arrive at a site already assembled and ready for installation, is one way to push costs in the right direction.
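The report's cost gaps are easy to sanity-check. In the sketch below, the 2020 "current trajectory" endpoints are inferred from the stated shortfalls, so treat this as back-of-envelope arithmetic rather than NREL's own modeling:

```python
# NREL soft-cost figures (USD per watt) cited in the report
res_2010, res_target = 3.32, 0.65   # residential: 2010 actual -> 2020 goal
com_2010, com_target = 2.64, 0.44   # small commercial: 2010 actual -> 2020 goal

# Shortfalls beyond the business-as-usual trajectory
res_gap, com_gap = 0.46, 0.11

# Implied 2020 endpoint of the current trajectory
res_trajectory_2020 = res_target + res_gap   # about 1.11 $/W
com_trajectory_2020 = com_target + com_gap   # about 0.55 $/W

# Total reduction each target demands relative to 2010
for name, start, target in (("Residential", res_2010, res_target),
                            ("Commercial", com_2010, com_target)):
    cut = start - target
    pct = (1 - target / start) * 100
    print(f"{name}: cut {cut:.2f} $/W ({pct:.0f}% below 2010)")
```

Both targets amount to roughly an 80 percent reduction in soft costs over the decade, which puts the size of the challenge in perspective.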

In general, soft costs are increasingly recognized as perhaps the primary barrier to bringing solar prices down into the truly competitive range. And that seems to go for the manufacturing of solar panels as well as for installations: a recent paper in Energy and Environmental Science compared the costs of solar manufacturing in China and the U.S. and found that soft costs, including labor and the supply chain, account for the biggest differences. If the U.S. wants to keep up with the world's biggest solar manufacturer, working on those costs unrelated to materials is a good place to start. And it had better hurry: the cost of building a PV module at major companies in China is projected to drop all the way to $0.36/watt by 2017, according to one recent report. With module prices continuing that sort of decline, focusing on the soft side of solar is becoming more and more important.

Photo: Tim Boyle/Bloomberg/Getty Images

Is Energy Efficiency the Most Popular In-Home Automation?

A new study from the Consumer Electronics Association (CEA) found that energy efficiency technologies are the most popular home automation options in American homes.

Programmable and/or smart thermostats beat out home security and entertainment automation for the top honor, with 47 percent of households saying they had at least one.

The findings, which come from an online survey of about 1000 people, would seem to be a win for energy efficiency. But most of the homes had programmable thermostats, which are often used incorrectly, if at all.

One study from Lawrence Berkeley National Laboratory [PDF] found that 89 percent of survey respondents rarely or never used the thermostat to set a weekday or weekend program; 70 percent of the thermostats were not programmed at all.

Programmable thermostats have been around for more than 30 years, but a new generation of smart thermostats that connect with smartphones and the Internet makes programming far easier. Not only are the interfaces easier to use, but some have algorithms that learn a household's thermal characteristics and daily patterns to help fine-tune settings.

The energy savings for software-based, digital thermostats range from about 15 to 30 percent. But such smart thermostats are still in the minority, with only 12 percent of CEA respondents saying they had one, and even then it was often in conjunction with older thermostats in the home. When old-school programmable thermostats are taken out of consideration, automated home security becomes the most popular technology choice. 

The survey found that saving money was a key motivator when it comes to energy efficiency products, but most people don’t save anything with their programmable thermostats, and the smart thermostat market has been emerging slowly in the past few years despite the potential savings. Energy efficiency has historically been a hard sell, even if it makes sense financially over the long run.

The first generation of two-way digital smart thermostats was often sold through utility channels and the cost was too high. But with the proliferation of smartphones and lower costs, smart thermostats have started to catch on.

The recent popularity has also come from smart-thermostat makers partnering with other players, particularly home security companies and service providers, both of which want to provide end-to-end home automation services for a monthly fee.

Even Nest Labs, which has a smart thermostat that works on a proprietary system, has partnered with some utilities and just launched an application programming interface (API) for developers that want to create smart home apps that run on top of the thermostat. It is also reportedly developing a smoke detector as its second connected home product.

It’s hard to say if a smart smoke detector is the next wave in home automation, and the CEA survey gave mixed signals about which segment will grow fastest. The study found that safety and security are the building blocks of home automation packages, with climate, lighting, and appliance controls viewed as “very desirable” but not critical. But the survey also found that smart and programmable thermostats were the items for which people expressed the highest purchase intent within the next two years.

Even though connected home offerings are growing, and everyone from your cable provider to your local big box store is selling them, CEA found most consumers still don’t think they have a need for the products. But for products that can provide convenience, peace of mind and a cool factor—just as a smartphone does—the sky could be the limit.


Photo: Randi Klett

Production of Solar Panels Outpaced Investments Last Year

Worldwide photovoltaic (PV) solar panel production rose 10 percent in 2012 despite a 9 percent drop in investment, reports the European Commission (pdf). The numbers are imprecise, because solar panel makers use different types of production and sales figures, but the Commission authors estimate that producers added between 35 GW and 42 GW of PV capacity in 2012. The growth follows several years in which European governments have trimmed subsidies to solar power, prompting many private investors to shy away from the sector and driving some companies to bankruptcy.

Something about solar is special, though: investment in PV capacity still made up over half (57.7 percent) of new renewable energy investments, for a total of $137.7 billion, and analysts predict further growth through 2015. Part of the reason for investment's lag behind production is that producers added so much production capacity during the pre-recession subsidy boom that they need less capital investment to sustain high production levels. Making the hardware isn't the hard part.
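The report's own figures imply the size of the overall renewable-investment pool. A one-line check, treating the 57.7 percent share and the $137.7 billion as exact:

```python
pv_investment = 137.7   # billion USD into PV capacity in 2012
pv_share = 0.577        # PV's share of new renewable energy investment

total_renewables = pv_investment / pv_share
print(f"Implied total renewable investment: ${total_renewables:.0f} billion")
# prints roughly $239 billion
```

That is, roughly $239 billion flowed into new renewable capacity overall, with PV taking the majority.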

Indeed, a recent Energy and Environmental Science study found that "soft" costs such as supply-chain efficiencies and regulatory barriers made up more of the difference in production costs between regions than hardware production costs did. The authors predicted that the right business management and regulatory boosts could enable U.S. manufacturers to match China's costs. The EC report also shows optimism for PV in the United States: it figures U.S. PV capacity grew from 3.4 GW to 7.7 GW in 2012, more than doubling in response to a mix of legislative mandates and tax credits.

Courtesy European Commission Joint Research Center

Most of the rest of the growth comes from Asia, where governments are still in the first flush of support for solar energy. The EC report expects new guaranteed prices for solar power there, much like those that drove Europe's own solar boom in the mid-2000s. In Australia, about 10 percent of homes already have PV systems.

That doesn't mean the sun is setting on solar in Europe, though. After a pilot run near sunless London, Ikea announced that it would offer PV panels at all its United Kingdom stores. The firm figures consumers can earn £770 ($1247) a year between subsidies and savings on conventional electricity bills. Upfront costs are at least £5700, but typical panels last decades and should amortize installation costs within a little over 7 years. That should make up for some of the UK's gray days.
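The payback claim above is easy to verify with a simple undiscounted calculation, one that ignores panel degradation, subsidy changes, and the time value of money:

```python
upfront_cost = 5700.0    # GBP, minimum quoted installation cost
annual_return = 770.0    # GBP per year in subsidies plus bill savings

payback_years = upfront_cost / annual_return
print(f"Simple payback: {payback_years:.1f} years")  # prints 7.4 years
```

That matches the "little over 7 years" figure, and anything the panels produce over their remaining decades of life is gain.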


Real-Time Monitoring Yields Payoffs in Power Distribution

A key smart grid concept is the notion that the sharp traditional distinction between electrical transmission and distribution will blur as local feeder systems become much more flexible and controllable, and as power is fed back into the grid from distributed generation and storage assets. Making distribution systems two-way and flexible represents an enormous engineering challenge, of course. But as it is met and surmounted, consumers will see tangible benefits.

Writing in a recent issue of Power Electronics, John McDonald, director of technical strategy and policy development for GE Energy Management in Atlanta, provides two vivid examples of the new challenges in electricity distribution. In San Diego, where well-to-do homeowners have installed photovoltaic arrays in large numbers along the coast, power output tends to spike around noon, when the morning fog has burned off. Since air conditioning is generally not required in the breezy homes, service panels usually are limited to 100 amps, so that a single service transformer feeds as many as 20 houses. “All these factors contribute to spikes in voltage variability when PV output is high, load is low, voltage regulation is electromechanical and, possibly, there's high impedance on the feeder," a source with Sempra Energy told McDonald, a designated IEEE smart grid expert.

In another example involving Sempra's San Diego Gas & Electric, the utility is required to accommodate the needs of a large avocado farm, inland, where large PV arrays have been installed to help offset the power needs of the irrigation system. There too, output rises in the middle of the day, after fog has burned off, but the irrigation system is run mainly in the early morning. So, with net metering, "the farm injects substantial amounts of power onto the feeder, creating a voltage differential that results in reverse power flows," the Sempra Energy source told McDonald. "That, in turn, leads to the maximum [transformer] tap, then lockout, of an upstream voltage regulator and has even affected voltage on the primary distribution line."

Part of the solution, for SDG&E and for other utilities, lies in the use of dynamic VAR devices assisted by power electronics—solid-state devices that can modify current characteristics. (McDonald notes that in Europe, where government policies encourage high PV penetration, inverters that provide reactive power compensation are widely required.) Greater voltage variability in distribution systems can sorely stress transformers, and so "major vendors are doing R&D on prototype LTCs [load tap changers] employing power electronics," writes McDonald.

Generally, it is proving helpful to gather information from the full range of smart sensors in a distribution system—from protective relays and feeder controllers to smart meters at homes—to monitor in real time how voltage is being affected by distributed assets. In Arizona, GE has helped a utility design and install a data collection system parallel to the SCADA system to detect and record voltage variability.

It's a business in which not just big established companies like GE, Oracle, SAP, and IBM are active, but also a host of hard-charging startups like Opower in San Francisco and Space-Time Insight in San Mateo, California. Space-Time, having developed systems to collect and process data from a wide variety of sources, has obtained contracts to enhance network intelligence at the Sacramento Municipal Utility District (SMUD), Ontario's Hydro One, and the Czech Republic's CEZ. The general expectation, as Rebecca Smith put it in a recent Wall Street Journal roundup, is that the improved power system intelligence will yield not only greater reliability but also more timely and efficient repairs.

Photo: iStockphoto

Latest IPCC Climate Assessment Reaffirms Previous Findings

The Intergovernmental Panel on Climate Change (IPCC) released a new assessment of climate science this morning, its fifth comprehensive evaluation of the basic physics. In light of the controversies that dogged its work and the doings of some major contributors—an error that found its way into the last assessment concerning the rates at which Himalayan glaciers are melting; the "climategate" kerfuffle over leaked e-mails—perhaps the most significant aspect of the new report is simply its reaffirmation of previous findings.

Of course, the report is being spun slightly differently depending on the ideological proclivities of publishers and, no doubt, the personal interests and obsessions of particular writers. The Wall Street Journal, in its treatment today, stresses a moderate lowering of the IPCC's projected warming range in this century by comparison with 1850-1900: to 1.5-4 °C (not 2 °C, as the Journal misleadingly put it), from 2-4.5 °C previously. The New York Times mentions the IPCC's lowering its estimate of the warming to be expected from a doubling of greenhouse gas concentration, to 2.7-8.1 degrees Fahrenheit from 3.6-8.1, but it emphasizes the limit put on the amount of CO2 that can be spewed into the atmosphere if warming is to be held at 3.6 °F: That limit is 1 trillion tons of carbon in toto—roughly half of which has already been emitted since the beginning of the industrial revolution in 1750, with the second half due to be emitted by 2040 if business continues as usual.

The IPCC assessment, even in the 36-page summary for policymakers, does not make for easy reading and leaves a good deal open to interpretation. Almost every statement is so carefully hedged with probability estimates ("very likely," "likely," "virtually certain," "might," "cannot be made with confidence," "very unlikely," and so on) that one easily loses sight of the substantive statements actually being made.

Still, some of the most dramatic statements in the report are made flatly, without any qualification whatsoever. Atmospheric concentrations of carbon dioxide, methane, and nitrous oxide have all increased sharply since 1750 and now exceed pre-industrial levels by about 40 percent, 150 percent, and 20 percent respectively. Indeed, their levels "now substantially exceed the highest concentrations recorded in ice cores during the past 800,000 years.”
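Those percentage increases map onto familiar concentration numbers. Using commonly cited pre-industrial baselines (roughly 280 ppm for CO2, 720 ppb for methane, and 270 ppb for nitrous oxide; these are well-known approximations assumed here, not figures from the report itself), a quick check:

```python
# Commonly cited pre-industrial baselines (approximate, assumed values;
# not taken from the IPCC summary itself)
baselines = {"CO2 (ppm)": 280.0, "CH4 (ppb)": 720.0, "N2O (ppb)": 270.0}
# Increases since 1750 cited above: 40%, 150%, and 20% respectively
increases = {"CO2 (ppm)": 0.40, "CH4 (ppb)": 1.50, "N2O (ppb)": 0.20}

for gas, base in baselines.items():
    today = base * (1 + increases[gas])
    print(f"{gas}: {base:.0f} pre-industrial -> about {today:.0f} today")
```

The implied present-day levels (around 392 ppm CO2, 1800 ppb methane, 324 ppb nitrous oxide) line up with instrumental measurements at the time of the report.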

The IPCC estimates that the average rate of ice loss from glaciers around the world was 275 gigatons per year in 1993-2009, as compared with 226 Gt/yr over the longer 1971-2009 period; Greenland ice sheet melting has "very likely" increased "substantially," to 215 Gt/yr in 2002-2011 from 34 Gt/yr in 1992-2001. Antarctic ice loss "likely" went from 30 Gt/yr in 1992-2001 to 147 Gt/yr in 2002-2011.
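Converted to growth factors, the polar figures above are stark; a quick calculation on the decade-to-decade rates:

```python
# Ice-loss rates from the IPCC summary, in gigatons per year
rates = {
    "Greenland":  {"1992-2001": 34.0, "2002-2011": 215.0},
    "Antarctica": {"1992-2001": 30.0, "2002-2011": 147.0},
}

for region, decades in rates.items():
    early, late = decades["1992-2001"], decades["2002-2011"]
    print(f"{region}: loss rate grew {late / early:.1f}x between decades")
```

Greenland's loss rate rose more than sixfold and Antarctica's roughly fivefold between the two decades, which is why the hedged language ("very likely," "substantially") still carries weight.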

The IPCC minimizes the significance of the slowed warming registered over the last ten years, saying it is well within the bounds of normal decadal variability. As for future warming (treated in Section E1 on p. 15 of the summary for policymakers), it is likely to exceed 1.5 or 2 °C in various modeled scenarios, is "unlikely" to exceed 4 °C in all but one of the models, but is "about as likely as not" to exceed 4 °C in the most pessimistic model.

Considering that estimated warming has been in the range of 1.5-4.5 °C since the dawn of computerized climate modeling, one might wonder why so many scientists have gone to the trouble, at considerable expense, of once again subjecting projections to every reasonable kind of probing they can think of only to conclude the original estimates were about right all along. And indeed some reputable authorities are wondering exactly that. In an editorial last week, Nature magazine suggested that it is time for the IPCC to stop doing its regular, comprehensive assessments and instead do focused reports on topics of urgent concern, like the report it issued last year on weather extremes.

I'm tempted to agree. The repetition of so many well-known findings, together with the endless qualification of each finding in terms of probability, tends to deaden rather than awaken interest. Then too there is the inherent caution of the IPCC, given that all reports must be produced by consensus, a conservatism that no doubt was heightened this time around by the climate mini-scandals of the past few years. But what if there were suddenly to be an abrupt change in mainstream assessment of climate prospects, either for better or worse? In the absence of the IPCC's regular assessments, there would be no authoritative and generally respected procedure for validating the change and alerting policymakers to its implications.

Photo: iStockPhoto

World’s Largest Solar Thermal Plant Syncs to the Grid

The Ivanpah Solar Electric Generating System delivered its first kilowatts of power to Pacific Gas and Electric (PG&E) on Tuesday.

The world’s largest solar thermal plant, located in the Mojave Desert, sent energy from its Unit 1 station to PG&E, which provides power to parts of Northern California. When the plant is fully operational later this year, it will produce 377 megawatts. Two of the plant's three units will supply energy to PG&E and the other will deliver power to Southern California Edison.

"Given the magnitude and complexity of Ivanpah, it was very important that we successfully complete this milestone showing all systems were on track," Tom Doyle, president of NRG Solar, one of the plant’s owners, said in a statement.

The massive project spans more than 1400 hectares of public land and will double the amount of commercial solar thermal energy available in the United States. There are other large concentrated solar power (CSP) projects in the Middle East and Spain, but most of the growth in solar in the United States has come from photovoltaic (PV) panel projects, which have come down considerably in price in recent years.

Even with the proliferation of cheap solar PV, there is other value in CSP projects, which use fields of mirrors to concentrate sunlight and create steam that drives turbines. A study earlier this year from the National Renewable Energy Laboratory (NREL) found that a concentrated solar facility would be particularly useful for providing short-term capacity when other operators are offline or as a peaker plant when demand is highest. And steam turbines, unlike intermittent wind and solar PV, offer a steady power supply that operators can turn on or off or fine-tune on demand.

Google, which has invested heavily in renewable energy projects—including $168 million it put into Ivanpah—also sees value in CSP. “At Google we invest in renewable energy projects that have the potential to transform the energy landscape. Ivanpah is one of those projects,” Rick Needham, director of Energy and Sustainability at Google, said in a statement. In addition to generation, Google's investments in wind and solar include a solar financing company and the Atlantic Wind Connection project.

And it just wouldn't be an energy project without some criticism. Ivanpah's creators have been chided for the plant's potential to transform the physical landscape—especially its impact on the desert ecosystem and desert tortoises in particular. But some environmentalists see the risk as an acceptable one if utility-scale solar installations are replacing coal-fired power plants. California has a goal to get 33 percent of its electricity from renewables by 2020.

Photo Credit: Brightsource Energy

Counting the Sins of China's Synthetic Gas

Heavy water use, threats of tainted groundwater, and artificial earthquakes are but a sampling of the environmental side effects that have tarnished North America's recent boom in natural gas production via hydraulic fracturing, or fracking. No surprise, then, that in European countries such as the U.K. that are looking to frack for cheap domestic gas, the environmental protesters often arrive ahead of the drill rigs.

But countries seeking fresh gas supplies could do far worse than fracking. So say Duke University researchers who, in today's issue of the research journal Nature Climate Change, shine a jaundiced spotlight on China's plans to synthesize natural gas from coal. Nine synthetic gas plants recently approved by Beijing would increase the annual demand for water in the country's arid northern regions by over 180 million metric tons, the Duke team concluded, while emissions of carbon dioxide would entirely wipe out the climate-cooling impact of China's massive wind and solar power installations.

"At a minimum, Chinese policymakers should delay implementing their synthetic natural gas plan to avoid a potentially costly and environmentally damaging outcome," says Chi-Jen Yang, a research scientist at Duke's Center on Global Change and the study's lead author, in a statement issued yesterday.

Synthetic gas plants use extreme heat and pressure to gasify coal, producing a combination of carbon monoxide and hydrogen. Steam and catalysts are then added to convert those gases to methane to produce a pipeline-ready substitute for natural gas.

It takes a whole lot of steam: According to Duke's estimates, China's synthetic gas plants will consume up to 100 times as much water (per cubic meter of gas) as shale gas production through fracking.

Relative greenhouse impact is harder to pinpoint because fracking's climate footprint remains controversial. Recent U.S. Environmental Protection Agency and industry studies dispute earlier results suggesting that fracked wells leak more methane—a potent greenhouse gas—than conventional wells.

What is certain, say Yang and his colleagues, is that synthetic gas production will be carbon intensive relative to conventional gas. Burning conventional natural gas to produce power releases a third to a half as much carbon as burning coal, but generating power from synthetic gas will be 36 to 82 percent dirtier than using coal-fired plants.
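The two ranges quoted above can be combined into a rough comparison. This normalizes everything to coal and takes the article's figures at face value; it is a back-of-envelope sketch, not the Duke study's own methodology:

```python
# Relative CO2 emissions per unit of electricity, normalized to coal = 1.0
coal = 1.0
conventional_gas = (coal / 3, coal / 2)    # a third to a half of coal's emissions
synthetic_gas = (coal * 1.36, coal * 1.82)  # 36 to 82 percent dirtier than coal

# Bound the penalty for choosing synthetic gas over conventional gas
best_case = synthetic_gas[0] / conventional_gas[1]   # 1.36 / 0.5
worst_case = synthetic_gas[1] / conventional_gas[0]  # 1.82 / (1/3)
print(f"Synthetic gas emits {best_case:.1f}x to {worst_case:.1f}x "
      f"as much CO2 as conventional natural gas")
```

Even in the best case, the coal-to-gas route carries several times the carbon footprint of simply burning natural gas, which is the crux of the Duke team's objection.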

Capturing and storing CO2 emissions could slash the climate costs, and China may have the technology to do it. Last year, Chinese power firm Huaneng started up the world's most advanced coal gasification power plant, which sports equipment to efficiently extract carbon waste from gasified coal. Similar technology could potentially enable China's synthetic gas plants to capture and sequester their CO2 instead of sending it up the stack.

Of course, such equipment raises construction and operating costs. Duke's team clearly doubts that Beijing will make synthetic gas producers go there.

Photo: David Gray / Reuters

