Energywise

Is Energy Efficiency the Most Popular In-Home Automation?

A new study from the Consumer Electronics Association (CEA) found that energy efficiency technologies are the most popular home automation options in American homes.

Programmable and/or smart thermostats beat out home security and entertainment automation for the top honor, with 47 percent of households saying they had at least one.

The findings, which come from an online survey of about 1000 people, would seem to be a win for energy efficiency. But most of the homes had programmable thermostats, which are often used incorrectly, if at all.

One study from Lawrence Berkeley National Laboratory [PDF] found that 89 percent of survey respondents rarely or never used the thermostat to set a weekday or weekend program. Seventy percent of the programmable thermostats were not set at all.

Programmable thermostats have been around for more than 30 years, but a new generation of smart thermostats that connect with smartphones and the Internet make programming far easier. Not only is the interface easier to use but some have algorithms that can learn your household thermal characteristics and daily patterns to help fine-tune settings.
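To make that concrete, here's a toy sketch, in Python with made-up numbers, of the simplest form of that pattern learning: averaging a household's manual setpoint changes by hour of day to build an automatic program. (Real products use far more elaborate models.)

```python
from collections import defaultdict
from statistics import mean

def learn_schedule(overrides):
    """Average the user's manual setpoint changes by hour of day to
    build an automatic program (a stand-in for real learning logic)."""
    by_hour = defaultdict(list)
    for hour, setpoint_c in overrides:
        by_hour[hour].append(setpoint_c)
    return {hour: round(mean(temps), 1) for hour, temps in by_hour.items()}

# A few days of (hour, setpoint in °C) overrides: cooler at night,
# warmer when the household wakes at 7 a.m.
observed = [(23, 17.0), (23, 16.5), (7, 21.0), (7, 21.5), (7, 21.0)]
print(learn_schedule(observed))  # → {23: 16.8, 7: 21.2}
```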

The energy savings for software-based, digital thermostats range from about 15 to 30 percent. But such smart thermostats are still in the minority, with only 12 percent of CEA respondents saying they had one, and even then it was often in conjunction with older thermostats in the home. When old-school programmable thermostats are taken out of consideration, automated home security becomes the most popular technology choice. 

The survey found that saving money was a key motivator when it comes to energy efficiency products, but most people don’t save anything with their programmable thermostats, and the smart thermostat market has been emerging slowly in the past few years despite the potential savings. Energy efficiency has historically been a hard sell, even if it makes sense financially over the long run.

The first generation of two-way digital smart thermostats was often sold through utility channels and the cost was too high. But with the proliferation of smartphones and lower costs, smart thermostats have started to catch on.

The recent popularity has also come from smart-thermostats makers partnering with other players, particularly home security companies and service providers, both of which want to provide end-to-end home automation services for a monthly fee.

Even Nest Labs, which has a smart thermostat that works on a proprietary system, has partnered with some utilities and just launched an application programming interface (API) for developers that want to create smart home apps that run on top of the thermostat. It is also reportedly developing a smoke detector as its second connected home product.

It’s hard to say if a smart smoke detector is the next wave in home automation, and the CEA survey gave mixed signals in terms of what segment will grow the fastest. The study found safety and security are the building blocks of home automation packages, with climate, lighting and appliance controls viewed as “very desirable” but not critical. But the survey also found smart and programmable thermostats were the items that people expressed the highest purchase intent for within the next two years.

Even though connected home offerings are growing, and everyone from your cable provider to your local big box store is selling them, CEA found most consumers still don’t think they have a need for the products. But for products that can provide convenience, peace of mind and a cool factor—just as a smartphone does—the sky could be the limit.


Photo: Randi Klett

Production of Solar Panels Outpaced Investments Last Year

Worldwide photovoltaic (PV) solar panel production rose 10 percent in 2012 despite a 9 percent drop in investment, reports the European Commission (pdf). The numbers are imprecise, because solar panel makers use different types of production and sales figures, but the Commission authors estimate that producers added between 35 GW and 42 GW of PV capacity in 2012. The growth follows several years in which European governments have trimmed subsidies to solar power, prompting many private investors to shy away from the sector and driving some companies to bankruptcy.

Something about solar is special, though: investment in PV capacity still made up over half (57.7 percent) of new renewable energy investments, for a total of $137.7 billion, and analysts predict further growth through 2015. Part of the reason for investment's lag behind production is that producers added so much production capacity during the pre-recession subsidy boom that they need less capital investment to sustain high production levels. Making the hardware isn't the hard part.

Indeed, a recent Energy and Environmental Science study found that "soft" costs such as supply-chain efficiencies and regulatory barriers made up more of the difference in production costs between regions than hardware production costs. They predicted that the right business management and regulatory boosts could enable U.S. manufacturers to match China's costs. The EC report also shows optimism for PV in the United States: it figures U.S. PV capacity grew from 3.4 GW to 7.7 GW in 2012, more than doubling in response to a mix of legislative mandates and tax credits.

Courtesy European Commission Joint Research Center

Most of the rest of the growth comes from Asia, where governments are still in the first flush of support for solar energy. The EC report expects new guaranteed prices for solar power there, much like the prices which drove Europe's own solar boom in the mid-2000s. In Australia, about 10 percent of homes already have PV systems.

That doesn't mean the sun is setting on solar in Europe, though. After a pilot run in near-sunless London, Ikea announced that it would offer PV panels at all its United Kingdom stores. The firm figures consumers can earn £770 ($1247) a year between subsidies and savings on conventional electricity bills. Upfront costs are at least £5700, but typical panels last decades and should amortize installation costs in a little over 7 years. That should make up for some of the UK's gray days.
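Taking the article's figures at face value, the payback arithmetic is straightforward; this sketch ignores financing, panel degradation, and future subsidy changes:

```python
# Simple payback on Ikea's UK offer, using the article's figures.
upfront_cost = 5700.0   # £, minimum installed price
annual_benefit = 770.0  # £/yr, subsidies plus avoided electricity bills

payback_years = upfront_cost / annual_benefit
print(f"{payback_years:.1f} years")  # → 7.4 years
```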


Real-Time Monitoring Yields Payoffs in Power Distribution

A key smart grid concept is the notion that the sharp traditional distinction between electrical transmission and distribution will blur as local feeder systems become much more flexible and controllable, and as power is fed back into the grid from distributed generation and storage assets. Making distribution systems two-way and flexible represents an enormous engineering challenge, of course. But as it is met and surmounted, consumers will see tangible benefits.

Writing in a recent issue of Power Electronics, John McDonald, director of technical strategy and policy development for GE Energy Management in Atlanta, provides two vivid examples of the new challenges in electricity distribution. In San Diego, where well-to-do homeowners have installed photovoltaic arrays in large numbers along the coast, power output tends to spike around noon, when the morning fog has burned off. Since air conditioning is generally not required in the breezy homes, service panels usually are limited to 100 amps, so that a single service transformer feeds as many as 20 houses. “All these factors contribute to spikes in voltage variability when PV output is high, load is low, voltage regulation is electromechanical and, possibly, there's high impedance on the feeder," a source with Sempra Energy told McDonald, a designated IEEE smart grid expert.

In another example involving Sempra's San Diego Gas & Electric, the utility is required to accommodate the needs of a large avocado farm, inland, where large PV arrays have been installed to help offset the power needs of the irrigation system. There too, output rises in the middle of the day, after fog has burned off, but the irrigation system is run mainly in the early morning. So, with net metering, "the farm injects substantial amounts of power onto the feeder, creating a voltage differential that results in reverse power flows," the Sempra Energy source told McDonald. "That, in turn, leads to the maximum [transformer] tap, then lockout, of an upstream voltage regulator and has even affected voltage on the primary distribution line."

Part of the solution, for SDG&E and for other utilities, lies in use of dynamic VAR devices assisted by power electronics—solid state devices that can modify current characteristics. (McDonald notes that in Europe, where government policies encourage high PV penetration, inverters to provide reactive power compensation are widely required.) Greater voltage variability in distribution systems can sorely stress transformers, and so "major vendors are doing R&D on prototype LTCs [load tap changers] employing power electronics," writes McDonald.

Generally, it is proving helpful to gather information from the full range of smart sensors in a distribution system—from protective relays and feeder controllers to smart meters at homes—to monitor in real time how voltage is being affected by distributed assets. In Arizona, GE has helped a utility design and install a data collection system parallel to the SCADA system to detect and record voltage variability.
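The core of such monitoring boils down to a few lines: compare each reading against the allowed service band and flag excursions. The sketch below, with illustrative readings and the standard ±5 percent band for 120 V service, is not GE's system, just the idea:

```python
NOMINAL_V = 120.0  # nominal service voltage

def flag_excursions(readings, tolerance=0.05):
    """Return readings outside the ±5 percent service band
    (114-126 V for 120 V nominal)."""
    lo, hi = NOMINAL_V * (1 - tolerance), NOMINAL_V * (1 + tolerance)
    return [v for v in readings if not lo <= v <= hi]

feeder = [119.8, 121.2, 127.1, 118.0, 113.5]  # a midday PV spike at 127.1 V
print(flag_excursions(feeder))  # → [127.1, 113.5]
```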

It's a business in which not just the big established companies like GE, Oracle, SAP and IBM are active, but also a host of hard-charging startups like Opower in San Francisco and Space-Time Insight, in San Mateo, California. Space-Time, having developed systems to collect and process data from a wide variety of sources, has obtained contracts to enhance network intelligence in the Sacramento Municipal Utility District (SMUD), Ontario's Hydro One, and the Czech Republic's CES. The general expectation, as Rebecca Smith put it in a recent Wall Street Journal roundup, is that the improved power system intelligence will yield not only greater reliability but more timely and efficient repairs.

Photo: iStockphoto

Latest IPCC Climate Assessment Reaffirms Previous Findings

The Intergovernmental Panel on Climate Change (IPCC) released a new assessment of climate science this morning, its fifth comprehensive evaluation of the basic physics. In light of the controversies that dogged its work and the doings of some major contributors—an error that found its way into the last assessment concerning the rates at which Himalayan glaciers are melting; the "climategate" kerfuffle over leaked e-mails—perhaps the most significant aspect of the new report is simply its reaffirmation of previous findings.

Of course, the report is being spun slightly differently depending on the ideological proclivities of publishers and, no doubt, the personal interests and obsessions of particular writers. The Wall Street Journal, in its treatment today, stresses a moderate lowering of the IPCC's projected warming range in this century by comparison with 1850-1900: to 1.5-4 degrees Celsius (not 2 degrees, as the Journal misleadingly put it), from 2-4.5 °C previously. The New York Times mentions the IPCC's lowering its estimate of the warming to be expected from a doubling of greenhouse gas concentration, to 2.7-8.1 °F (1.5-4.5 °C) from 3.6-8.1 °F (2-4.5 °C), but it emphasizes the limit put on the amount of CO2 that can be spewed into the atmosphere if warming is to be held to 2 °C (3.6 °F): That limit is 1 trillion tons of carbon in toto—roughly half of which has already been emitted since the beginning of the industrial revolution in 1750, with the second half due to be emitted by 2040 if business continues as usual.
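The mix of numbers is easier to follow with the units untangled; the ranges quoted in newspaper coverage appear to be Fahrenheit conversions of the IPCC's Celsius figures. A quick check:

```python
def c_to_f_delta(deg_c):
    """Convert a temperature *difference* (not an absolute reading)
    from Celsius to Fahrenheit."""
    return deg_c * 9 / 5

# The IPCC's climate-sensitivity ranges, in °C per CO2 doubling:
prior_range_c = (2.0, 4.5)
ar5_range_c = (1.5, 4.5)

print([c_to_f_delta(x) for x in prior_range_c])  # → [3.6, 8.1]
print([c_to_f_delta(x) for x in ar5_range_c])    # → [2.7, 8.1]
print(c_to_f_delta(2.0))  # → 3.6, i.e. the 2 °C target in °F
```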

The IPCC assessment, even in the 36-page summary for policymakers, does not make for easy reading and leaves a good deal open to interpretation. Almost every statement is so carefully hedged with probability estimates—"very likely," "likely," "virtually certain," "might," "cannot be made with confidence," "very unlikely," and so on—that one easily loses sight of the substantive statements actually being made.

Still, some of the most dramatic statements in the report are made flatly, without any qualification whatsoever. Atmospheric concentrations of carbon dioxide, methane, and nitrous oxide have all increased sharply since 1750 and now exceed pre-industrial levels by about 40 percent, 150 percent, and 20 percent respectively. Indeed, their levels "now substantially exceed the highest concentrations recorded in ice cores during the past 800,000 years.”

The IPCC estimates that the average rate of ice loss from glaciers around the world was 275 gigatons per year in 1993-2009, as compared to 226 Gt/yr in the longer 1971-2009 period; Greenland ice sheet melting has "very likely" increased "substantially" to 215 Gt/yr in 2002-2011 from 34 Gt/yr in 1992-2001. Antarctic ice loss "likely" went from 30 Gt/yr in 1992-2001 to 147 Gt/yr in 2002-2011.

The IPCC minimizes the significance of slowed warming registered in the last ten years, saying it is well within the bounds of normal decadal variability. As for future warming (treated in Section E1 on p. 15 of the summary for policymakers), it is likely to exceed 1.5 or 2 °C in various modeled scenarios, is "unlikely" to exceed 4 °C in all but one of the models, but is "about as likely as not" to exceed 4 °C in the most pessimistic model.

Considering that estimated warming has been in the range of 1.5-4.5 °C since the dawn of computerized climate modeling, one might wonder why so many scientists have gone to the trouble, at considerable expense, of once again subjecting projections to every reasonable kind of probing they can think of only to conclude the original estimates were about right all along. And indeed some reputable authorities are wondering exactly that. In an editorial last week, Nature magazine suggested that it is time for the IPCC to stop doing its regular, comprehensive assessments and instead do focused reports on topics of urgent concern, like the report it issued last year on weather extremes.

I'm tempted to agree. The repetition of so many well-known findings, together with the endless qualification of each finding in terms of probability, tends to deaden rather than awaken interest. Then too there is the inherent caution of the IPCC, given that all reports must be produced by consensus, a conservatism that no doubt was heightened this time around by the climate mini-scandals of the past few years. But what if there were suddenly to be an abrupt change in mainstream assessment of climate prospects, either for better or worse? In the absence of the IPCC's regular assessments, there would be no authoritative and generally respected procedure for validating the change and alerting policymakers to its implications.

Photo: iStockPhoto

World’s Largest Solar Thermal Plant Syncs to the Grid

The Ivanpah Solar Electric Generating System delivered its first kilowatts of power to Pacific Gas and Electric (PG&E) on Tuesday.

The world’s largest solar thermal plant, located in the Mojave Desert, sent energy from its Unit 1 station to PG&E, which provides power to parts of Northern California. When the plant is fully operational later this year, it will produce 377 megawatts. Two of the plant's three units will supply energy to PG&E and the other will deliver power to Southern California Edison.

"Given the magnitude and complexity of Ivanpah, it was very important that we successfully complete this milestone showing all systems were on track," Tom Doyle, president of NRG Solar, one of the plant’s owners, said in a statement.

The massive project spans more than 1400 hectares of public land and will double the amount of commercial solar thermal energy available in the United States. There are other large concentrated solar power (CSP) projects in the Middle East and Spain, but most of the growth in solar in the United States has come from photovoltaic (PV) panel projects, which have come down considerably in price in recent years.

Even with the proliferation of cheap solar PV, there is other value in CSP projects, which use fields of mirrors to concentrate sunlight on central towers, where the heat raises steam to drive turbines. A study earlier this year from the National Renewable Energy Laboratory (NREL) found that a concentrated solar facility would be particularly useful for providing short-term capacity when other operators are offline or as a peaker plant when demand is highest. And steam turbines, unlike intermittent wind and solar PV, offer a steady power supply that operators can turn on or off or fine-tune on demand.

Google, which has invested heavily in renewable energy projects—including $168 million it put into Ivanpah—also sees value in CSP. “At Google we invest in renewable energy projects that have the potential to transform the energy landscape. Ivanpah is one of those projects,” Rick Needham, director of Energy and Sustainability at Google, said in a statement. In addition to generation, Google's investments in wind and solar include a solar financing company and the Atlantic Wind Connection project.

And it just wouldn't be an energy project without some criticism. Ivanpah's creators have been chided for the plant's potential to transform the physical landscape—especially its impact on the desert ecosystem and desert tortoises in particular. But some environmentalists see the risk as an acceptable one if utility-scale solar installations are replacing coal-fired power plants. California has a goal to get 33 percent of its electricity from renewables by 2020.

Photo Credit: Brightsource Energy

Counting the Sins of China's Synthetic Gas

Heavy water use, threats of tainted groundwater, and artificial earthquakes are but a sampling of the environmental side effects that have tarnished North America's recent boom in natural gas production via hydraulic fracturing or fracking. No surprise then that in European countries such as the U.K. that are looking to frack for cheap domestic gas, the environmental protesters often arrive ahead of the drill rigs.

But countries seeking fresh gas supplies could do far worse than fracking. So say Duke University researchers who, in today's issue of the research journal Nature Climate Change, shine a jaundiced spotlight on China's plans to synthesize natural gas from coal. Nine synthetic gas plants recently approved by Beijing would increase the annual demand for water in the country's arid northern regions by over 180 million metric tons, the Duke team concluded, while emissions of carbon dioxide would entirely wipe out the climate-cooling impact of China's massive wind and solar power installations.

"At a minimum, Chinese policymakers should delay implementing their synthetic natural gas plan to avoid a potentially costly and environmentally damaging outcome," says Chi-Jen Yang, a research scientist at Duke's Center on Global Change and the study's lead author, in a statement issued yesterday.

Synthetic gas plants use extreme heat and pressure to gasify coal, producing a combination of carbon monoxide and hydrogen. Steam and catalysts are then added to convert those gases to methane to produce a pipeline-ready substitute for natural gas.

It takes a whole lot of steam: According to Duke's estimates, China's synthetic gas plants will consume up to 100 times as much water (per cubic meter of gas) as shale gas production through fracking.
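The methanation step described above can be sanity-checked with a toy mass balance (standard molar masses; complete conversion is an idealization, not a figure from the Duke study):

```python
# Toy mass balance for the methanation step: CO + 3 H2 -> CH4 + H2O.
# Molar masses in g/mol.
M = {"CO": 28.01, "H2": 2.016, "CH4": 16.04, "H2O": 18.02}

def methane_from_co(kg_co):
    """kg of methane from kg_co of carbon monoxide, assuming complete
    conversion and enough hydrogen (1 mol CO yields 1 mol CH4)."""
    kmol_co = kg_co / M["CO"]
    return kmol_co * M["CH4"]

print(round(methane_from_co(1000.0)))  # → 573 kg CH4 per tonne of CO
```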

Relative greenhouse impact is harder to pinpoint because fracking's climate footprint remains controversial. Recent U.S. Environmental Protection Agency and industry studies dispute earlier results suggesting that fracked wells leak more methane—a potent greenhouse gas—than conventional wells.

What is certain, say Yang and his colleagues, is that synthetic gas production will be carbon intensive relative to conventional gas. Burning conventional natural gas to produce power releases one-half to one-third as much carbon into the atmosphere as burning coal, but generating power from synthetic gas will be 36 to 82 percent dirtier than coal-fired generation.

Capturing and storing CO2 emissions could slash the climate costs, and China may have the technology to do it. Last year, Chinese power firm Huaneng started up the world's most advanced coal gasification power plant, which sports equipment to efficiently extract carbon waste from gasified coal. Similar technology could potentially enable China's synthetic gas plants to capture and sequester their CO2 instead of sending it up the stack.

Of course adding such equipment adds to construction and operating costs. Duke's team clearly doubts that Beijing will make synthetic gas producers go there.

Photo: David Gray / Reuters

GE to Muscle into Fuel Cells with Hybrid System

General Electric is working on an efficient distributed power system that combines proprietary fuel cell technology with its existing gas engines [like the one in the photo].

The company's research organization is developing a novel fuel cell that operates on natural gas, according to Mark Little, the director of GE Global Research and chief technology officer. When combined with an engine generator, the system can convert 70 percent of the fuel to electricity, which is more efficient than the combined cycle natural gas power plants powering the grid.

The fuel cell will generate electricity from reformed natural gas, or gas that's treated with steam and heat to make hydrogen and carbon monoxide, he says. Residual gases from the fuel cell process—a "synthesis gas" that contains carbon monoxide and hydrogen—would then be burned in a piston engine to generate more electricity. The waste gas that comes from the fuel cell needs to be specially treated but “we know we can burn these things. They’re well within the fuel specs of our current engine,” Little says.
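The claimed 70 percent works out if the engine recovers a good share of the energy left in the syngas. The split below is purely an assumed illustration, not GE's figures:

```python
# Back-of-envelope for the hybrid's overall efficiency. Every number
# here is an assumption for illustration; only the ~70 percent overall
# target comes from GE's statements.
fuel_energy = 100.0        # units of natural gas energy in
fc_electric = 50.0         # electricity from the fuel cell (assumed)
residual_to_engine = 40.0  # chemical energy left in the syngas (assumed)
engine_eff = 0.50          # piston-engine conversion efficiency (assumed)

total_electric = fc_electric + residual_to_engine * engine_eff
print(total_electric / fuel_energy)  # → 0.7 overall
```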

This distributed power system could provide electricity to a small industrial site or a data center, for example. It would replace diesel generators that are often used to power remote locations or bring electricity to places without a central grid. 

GE sells engines from two companies it acquired, Austria-based Jenbacher and Wisconsin-based Waukesha. It has done its own research on solid oxide fuel cells, and in 2011, it invested in Plug Power, which makes fuel cells for homes and small businesses. But Little indicated that this distributed power system will use new fuel cell technology invented by GE and configured to work in tandem with GE's engines. “We have a real breakthrough in fuel cell technology that we think can enable the system to be distributed and yet work at a very high efficiency level,” he says.

Commercial customers are showing more interest in stationary fuel cells and natural gas generators because they can provide back-up power and potentially lower energy costs. GE's system, which is still a few years away from commercial availability, will be aimed at customers outside of the United States, Little says. Because the United States has relatively cheap natural gas, the combined power generation unit is unlikely to be cost competitive with grid power there. However, the price for natural gas in many other countries is more than double that in the United States and the hybrid power generation unit will “compete beautifully,” Little says.

GE's hybrid fuel system is just one of many research efforts the conglomerate has underway to take advantage of unconventional oil and natural gas drilling. Among the projects now being considered at a planned research center in Oklahoma is a way to use liquid carbon dioxide as the fluid to fracture, or frack, wells, rather than a mixture of water and chemicals. The company is developing a hybrid locomotive engine that can run on both diesel and natural gas. And it is working on small-scale liquid natural gas fueling stations that could be placed along railroad lines.

In another effort, GE is developing sensors and software to make oil and gas wells smarter. Researchers are working on different types of photonic sensors that are able to withstand very high heat and pressure. These would be better than electronic sensors for gathering flow and fluid composition data within wells, according to GE researchers.

Image credit: GE

Can Rotterdam's Port Become a Virtual Power Plant?

The energy sector is critical to Rotterdam’s economy, but the port city has aggressive plans to cut its carbon dioxide emissions in half by 2025.

The city takes in imports of oil, coal, biomass, and natural gas that are used across Northwest Europe. It is not just a stopover, but also a major refinery hub for the region. Even though Rotterdam relies heavily on the fossil fuel industry, it is increasingly focused on how to leverage renewables and existing assets to power its own port.

Rotterdam is partnering with General Electric [PDF] to develop a smart grid that can act as a virtual power plant (VPP), which would integrate thermal and renewable power production with flexible users in a centrally controlled system that would act as a single power plant. The city has been working with GE in the past few years to reduce emissions, improve water management and increase energy efficiency.

A virtual power plant takes energy efficiency and demand-side management to another level. It can be thought of as a sophisticated microgrid cluster, in which digital measurement and monitoring equipment on distributed resources can respond to the needs of the grid in real time. For example, many of the large industrial plants in the port produce their own electricity and heat, which can be sold into the grid when wind or solar production falls. There may also be more traditional generation, such as a coal-fired power plant or combined heat and power.

“Within a VPP, the electricity use of one part can be coordinated with the production of electricity in another part. A harbor, where many companies produce and consume electricity at a limited distance from each other, should be a suitable location to test and implement such a VPP,” Daan Six of Belgian research organization VITO said in a report on the potential of a VPP in Rotterdam.

A virtual power plant usually responds in real time to changing electricity rates. Depending on the cost of electricity, a large industrial customer may sell some power back to the grid or provide grid-balancing services like frequency regulation, which is a bigger challenge with intermittent wind and solar than with steady thermal generators.

A dynamic microgrid with various ways to produce and curb kilowatts can lead to cleaner energy use, especially if fossil-fueled peaking power plants can be avoided by consumers curbing their energy use. But a virtual power plant is not necessarily a replacement for fossil fuel-fired plants. An industrial customer might turn to backup generators that run on diesel, for instance, when the price signal is too high to take power from the grid.
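At its core, the dispatch decision each site faces is a price comparison. The sketch below uses invented sites and prices, not anything from the Rotterdam project:

```python
def dispatch(sites, grid_price):
    """sites: list of (name, marginal cost per MWh, flexible MW).
    Each site sells its flexible capacity when the grid price beats
    its own cost of generating, and buys from the grid otherwise."""
    actions = {}
    for name, marginal_cost, flexible_mw in sites:
        if grid_price > marginal_cost:
            actions[name] = ("sell", flexible_mw)
        else:
            actions[name] = ("buy", flexible_mw)
    return actions

port = [("refinery_chp", 60.0, 25.0), ("cold_store", 90.0, 5.0)]
print(dispatch(port, grid_price=75.0))
# → the refinery CHP exports 25 MW; the cold store keeps buying
```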

“Rotterdam is certainly one of those global conglomerates of industry in a very tight space and, because of the petrochemical and other activity there, with incredibly high energy demands,” GE’s Stephen Burdis told PortStrategy. “That is one of the drivers behind the project.”

The virtual power plant project is part of a larger energy restructuring in Rotterdam. E.ON and GDF Suez are constructing coal/biomass power stations that will decrease the carbon footprint compared to a coal-only power plant. Some refineries in the port are already capturing carbon dioxide and providing it to greenhouse growers. Steam waste heat is being captured for district heating and the port has plans to double its capacity for wind energy.

The efficiency efforts in Rotterdam are part of a broader effort in the North Sea region, E-harbors, which aims to maximize the use of renewable energy for transportation and electricity consumption.


Photo: Mercator Media

EPA Issues Regulations for New Coal-Fired Plants

As expected, at the end of last week the U.S. Environmental Protection Agency (EPA) released regulations for carbon emissions from new coal-fired and gas-fired power plants, under authority of the Clean Air Act and a key Supreme Court decision upholding the agency's authority to regulate greenhouse gas emissions. As expected, the EPA proposes to limit emissions from new coal plants to about 500 kilograms (1100 pounds) per megawatt-hour, and from gas plants, to 450 kg (1000 lbs) per MWh.

The EPA originally aimed to set a single standard for both coal and gas plants, at the lower limit, but under pressure from industry, it slightly eased the limit for coal. It is a minor concession. Average emissions today from a newly built, state-of-the-art coal plant are around 800 kg (1800 lbs) per MWh, according to GigaOm's Katie Fehrenbacher. The EPA's regulations are being universally interpreted to imply, therefore, that no new coal plant can be built in the United States unless it provides for carbon capture and storage (CCS).
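The metric figures are rounded conversions of the EPA's pound-denominated limits; a quick check (1 lb ≈ 0.4536 kg):

```python
# Converting the EPA's pound-denominated limits to kilograms.
LB_TO_KG = 0.453592

for lb_per_mwh in (1100, 1000, 1800):
    print(lb_per_mwh, "lb/MWh ->", round(lb_per_mwh * LB_TO_KG), "kg/MWh")
# → 1100 lb/MWh is ~499 kg; 1000 lb is ~454 kg; 1800 lb is ~816 kg
```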

The immediate impact of the regs, also by universal agreement among industry analysts and commentators, will be nil. That is because no new coal-fired power plants are being built anyway. The reason generally given is competition from much cheaper and much cleaner natural gas. But there is a second, just as important reason. Starting about a decade ago, a nationwide anti-coal campaign erupted at the environmentalist grassroots, which means that any company that proposes to build a new coal-fired plant has to go through the procedural trials previously reserved for those proposing a new nuclear plant.

That radical change in the country's political chemistry was a main factor, for example, in the dramatic decision by Texas's top utility in 2007 to ditch plans for a fleet of new coal plants and—this anyway was the initial plan—to go nuclear instead. Much sooner than most industry experts and political commentators had expected, the message had penetrated to the grassroots that coal plants are the country's single most concentrated source of greenhouse gas emissions (as EPA put it in its press release announcing the new regulations last Friday), and a major cause of respiratory disease as well.

To say that the new regs were generally expected and will have little or no immediate impact is not to say, actually, that they necessarily will be upheld when challenged in court. The EPA is required to demonstrate that the new carbon limits are achievable by means of current demonstrated and economically viable technology, that is to say, CCS. Right now, only scattered pilot CCS plants exist, and industry shows no inclination whatsoever to build a large plant equipped with CCS on a commercial basis, either in the United States or abroad. Challenges to the regs will likely climb all the way through the legal appeals process, eventually reaching the Supreme Court. In the end, industry may well win.

In the meantime, regulatory uncertainty will be one more reason for energy companies not to build new coal plants on any basis. But that uncertainty is but one of three conditions, any one of which is sufficient to virtually guarantee there will be no new coal-based generation. For a real renaissance in coal-fired power to occur, not only would gas prices have to rise sharply, but there would have to be a fundamental change in environmentalist opinion.

Photo: Luke Sharrett/Bloomberg/Getty Images


Rooftop Solar Faces Growing Opposition from Utilities

Although solar energy is still a midget among U.S. energy sources, its rapid growth from a small base is beginning to make some of the big players nervous. Regulated utilities in a number of states—Arizona, California, Colorado, Idaho and Louisiana—have started to complain about the various benefits for photovoltaics (PV), says Marc Gunther in an article appearing on Yale's environment360 website. Gunther, a contributing editor at Fortune, describes the position of PV in the U.S. energy mix as "puny" or "a mere blip," inasmuch as it accounted for barely one-tenth of 1 percent of U.S. electricity last year. (Coal delivered 37 percent and natural gas 30 percent.) Yet rooftop PV installations jumped nearly 50 percent last year, enough to make some incumbents seriously nervous.

Critics of solar incentives object to the whole panoply of state and Federal subsidies favoring PV, but they particularly object to aspects of "net metering," the requirement that utilities allow distributed generators like owners of rooftop arrays to sell electricity back into the grid. The subject of net metering is a complicated one. In the United Kingdom, which to a great extent inspired the injection of free-market principles into electric power systems, net metering is not generally allowed or encouraged. There, the industry has persuaded regulators that with net metering, distributed generators become, in effect, free riders—they benefit from selling into the grid without bearing their fair share of paying for its maintenance. In the extreme case, a household that always produced excess energy and never bought power from the local utility might pay nothing to support the grid.

In the United States, net metering was required by 2005 Federal energy legislation, but details of implementation vary drastically from state to state. A key issue is whether utilities are required to pay customers selling solar electricity into the grid at wholesale or retail electricity rates. As Rick Tempchin, executive director for retail energy services at the Edison Electric Institute in Washington, D.C. has put it, “Paying credits at the full retail rate costs the utility money because that cost will be higher than the cost that the utility actually avoids by purchasing the distributed generation power. For example, in centralized markets, a utility can buy all of its power needs at the wholesale rate. This rate will always be less than the full retail rate it would have to pay to buy the same power from a customer.”
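Tempchin's objection reduces to simple arithmetic. Every number below is an illustrative assumption, not an EEI figure:

```python
# Retail-vs-wholesale credit gap for a month of exported rooftop solar.
retail_rate = 0.12     # $/kWh the customer normally pays (assumed)
wholesale_rate = 0.05  # $/kWh on the wholesale market (assumed)
exported_kwh = 300     # energy the household sells back (assumed)

retail_credit = exported_kwh * retail_rate
wholesale_cost = exported_kwh * wholesale_rate
gap = retail_credit - wholesale_cost
print(f"credit ${retail_credit:.2f} vs wholesale ${wholesale_cost:.2f}; "
      f"gap ${gap:.2f}")  # → credit $36.00 vs wholesale $15.00; gap $21.00
```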

In some ways, the debate over net metering is rather closely analogous to arguments that have raged in U.S. education policy over school vouchers and, more recently, charter schools. There too, critics complain that allowing parents to take their children out of public schools at public expense results in less revenue for maintenance of school infrastructure. As in the education debates, political libertarians, including some Tea Party members, tend to support net metering because it permits and encourages individuals to produce their own power. In Arizona, where the debate is unusually heated, no doubt because PV is closest to commercial competitiveness there, net metering advocates have hired Barry Goldwater Jr., the son of the late arch-conservative Arizona senator, to promote their cause. The pro-coal, anti-environmentalist Koch brothers, on the other hand, have put money into Arizona to support a website that opposes net metering.

Photo: Andy Cross/The Denver Post/Getty Images

