Energywise

A Smart Phone Uses as Much Energy as a Refrigerator?

When you plug your smart phone into the wall, it draws a negligible amount of energy compared with other household energy users such as your set-top box or refrigerator.

But add in the electricity it takes to move data across networks to deliver, say, an hour of video each week to your smart phone or tablet, and over a year that usage consumes more electricity than two new Energy Star refrigerators do.
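
To see how the arithmetic works, here is a minimal back-of-envelope sketch. The data volume, network energy intensity, and refrigerator consumption below are assumed round numbers for illustration, not figures taken from the paper; with these inputs the annual network total lands in the same ballpark as two refrigerators.

```python
# Back-of-envelope sketch of the phone-vs-refrigerator comparison.
# All constants below are assumed illustrative figures, not values
# taken from the cited paper.

HOURS_VIDEO_PER_WEEK = 1.0    # hour of streamed video per week
GB_PER_HOUR_VIDEO = 2.5       # assumed data volume of an hour of video
KWH_PER_GB_WIRELESS = 5.0     # assumed network energy intensity (kWh/GB)
FRIDGE_KWH_PER_YEAR = 350.0   # roughly a new Energy Star refrigerator

data_per_year_gb = HOURS_VIDEO_PER_WEEK * 52 * GB_PER_HOUR_VIDEO
network_kwh_per_year = data_per_year_gb * KWH_PER_GB_WIRELESS

print(f"Data delivered:    {data_per_year_gb:.0f} GB/year")
print(f"Network energy:    {network_kwh_per_year:.0f} kWh/year")
print(f"Two refrigerators: {2 * FRIDGE_KWH_PER_YEAR:.0f} kWh/year")
```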

And though phones and other electronics and appliances are becoming ever more efficient, those efficiency gains are outpaced by the proliferation of such devices around the world.

A new paper, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power” [PDF], investigates the energy draw of information and communications technology (ICT) and how it is coming to dwarf what we traditionally think of as energy hogs in the home. The paper was commissioned by the U.S. National Mining Association and the American Coalition for Clean Coal Electricity.

The global ICT ecosystem uses about 1 500 terawatt-hours of electricity annually, equal to the electricity used by Japan and Germany combined. That figure will only increase, especially as cloud services delivered over wireless networks overtake wired connections.


Blackout Threat Unmitigated a Decade After the Northeast Went Dark

The Northeast Blackout struck seven U.S. states and Ontario ten years ago today, prompting mandatory standards to prevent such a cascading power outage from happening again. Then, two years ago, it did. Arizona, California and Mexico’s Baja California took the hit in 2011, but the story was much the same. In both cases, inadequate information and planning and human error left the power grid unstable enough that a single downed power line unleashed an electrical tsunami that swamped neighboring lines and darkened millions of homes and businesses.

Power system experts who study blackouts say that they see a similar pattern in most cascading outages. They cite other recent notables, such as Western Europe in 2006, Brazil in 2009, and twice in India last year. The commonality is evidence to the experts that cascading failures are a dangerous facet of modern power grids that remains all but impossible to predict or prevent. “Large blackouts are likely to recur at regular intervals,” says Ian Dobson, a cascading failures expert and electrical and computer engineering professor at Iowa State University.

Worse still, bigger and more frequent blackouts may be coming. The Northeast Blackout, the worst in U.S. history, shut off 61 800 megawatts (MW) of power consumption. But a 100-year blackout—one so large it has just a 1 percent chance of occurring in any given year—would be three times bigger than the 2003 event. According to University of Vermont electrical engineering professor Paul Hines and colleagues at Carnegie Mellon University, who made the calculation with algorithms used in natural disaster planning, such a blackout would interrupt 186 000 MW, or roughly one quarter of all electrical service in the continental U.S.
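
The return-period logic behind a "100-year" estimate can be sketched in a few lines. The blackout sizes below are synthetic sample data, and a simple power-law tail fit stands in for the more sophisticated natural-disaster methods Hines and colleagues used; the point is only to show how an annual exceedance curve gets inverted at the 1 percent level.

```python
import numpy as np

# Suppose these are the largest blackouts observed over 50 years (MW).
# These sizes are made-up sample data, not the actual outage record.
years_observed = 50
sizes_mw = np.array([61800, 28000, 12000, 9000, 6000,
                     4500, 3500, 2800, 2200, 1800], dtype=float)

# Empirical annual exceedance rate: events at least this big per year.
sizes_sorted = np.sort(sizes_mw)[::-1]
exceed_per_year = np.arange(1, len(sizes_sorted) + 1) / years_observed

# Fit a power-law tail, log(rate) = a + b*log(size), then invert at 0.01/yr.
b, a = np.polyfit(np.log(sizes_sorted), np.log(exceed_per_year), 1)
size_100yr = np.exp((np.log(0.01) - a) / b)
print(f"Estimated 100-year blackout: {size_100yr:,.0f} MW")  # ~200 000 MW here
```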

Meanwhile, more frequent blackouts are also likely, a result of the increasing incidence of extreme weather such as thunderstorms, hurricanes, and blizzards predicted by climate models. Weather-related outages are already on the rise, according to a report this week from the White House Council of Economic Advisers analyzing U.S. Department of Energy (DOE) statistics on outages affecting 50 000 or more power customers. The report identifies more than 80 such weather-related outages per year, on average, from 2008 to 2012, more than double the average frequency observed during the previous five years.

The White House also cites efforts underway to improve grid reliability and resilience. After the 2003 blackout, Congress mandated that utilities comply with grid maintenance and operating standards set by the North American Electric Reliability Corporation, an industry-funded non-profit. As a result, utilities do things like trim trees under power lines more often. And $4.5 billion in economic stimulus funding allocated for grid upgrades accelerated the deployment of advanced grid technologies, such as sensors called phasor measurement units that give controllers a real-time picture of power flows across the grid.  

Jeff Dagle, chief electrical engineer at the DOE’s Pacific Northwest National Laboratory and a member of the task force that investigated the 2003 blackout, points to a speed-up in state-estimation software. Operators rely on such modeling to understand how their grid is behaving and foresee the impact of losing key components, so faster models enable faster evasive actions. “Ten years ago these commonly took 15 minutes. Today many would deem five minutes to be slow,” says Dagle.

The problem, say Dagle and other grid experts, is that power lines and power stations are aging at the same time that increasing levels of renewable power generation are straining the grid. “We’re putting more demands on the system,” says Dagle. In the calculus of improvements, degradation, and increasing demands, he says “it’s really hard to know [whether] we’re more or less reliable today.”

At the same time, it is unclear whether the "smart" upgrades touted by the White House will help or hurt. That’s because tools to quantify the risk of cascading failures are still immature. And until the risk can be accurately measured, grid engineers can’t know precisely how to reduce it.

Dobson says the research is coming now that the power engineering community increasingly recognizes cascading failures as a distinct and recurring problem—a concept that still elicited protests from power engineers in the aftermath of the 2003 blackout. Dobson cites work by the IEEE Power & Energy Society’s Cascading Failures Working Group, of which he and Hines are members.

One novel method he unveiled last year assesses the likelihood of cascading outages by counting the number of subsequent lines that trip off each time a power line goes down on a given grid. Using a database from the Bonneville Power Administration (the only U.S. utility whose line-trip data is publicly available), Dobson showed he could quantify the likelihood that a line outage would propagate and how far it would go, on average. But the tool remains too weak to identify trends or evaluate specific engineering solutions. “The events are rare so it takes years of data to get a reasonable estimation,” says Dobson.
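
The core counting idea treats a cascade as a branching process: group each cascade's line trips into generations, estimate the average number of "child" trips per "parent" trip, and from that propagation rate infer how far a typical cascade spreads. A minimal sketch on toy data (the cascade records below are invented for illustration, not Bonneville's):

```python
# Toy illustration of estimating cascade propagation from line-trip records.
# Each cascade is a list of generation sizes: the initiating trips, then the
# trips they triggered, and so on.

cascades = [
    [1, 1],   # one initiating trip, which triggered one more
    [1],      # a trip that did not propagate
    [2, 1],   # two simultaneous trips, followed by one more
    [1],
]

parents = sum(sum(c[:-1]) for c in cascades)   # trips that could spawn others
children = sum(sum(c[1:]) for c in cascades)   # trips in later generations
lam = children / parents                       # propagation rate, lambda

# For a subcritical branching process (lambda < 1), the expected number of
# line trips per initiating trip is 1 / (1 - lambda).
print(f"lambda = {lam:.2f}")
print(f"expected cascade size = {1 / (1 - lam):.1f} lines")
```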

Until such tools are mature, it will be difficult to target funds to those upgrades most likely to reduce blackout risk, say Dobson and others. Even grid upgrades underway could be threatened if their impact on blackout risk is poorly understood. “If newly-introduced technology gets blamed for a future blackout there could be pressure to run it below capacity, or even shut it off entirely. That would be quite a pity if we could have mitigated the failure risk instead,” says Dobson.

Photo: Spencer Platt/Getty Images

IBM Launches Advanced Renewable Forecasting Tool

Weather has always been an important factor in planning for the needs of the electrical grid, but lately it has become even more crucial with the proliferation of grid-scale wind and solar resources.

To meet the needs of utilities that are installing large amounts of renewable energy, IBM just launched software that brings together data analytics and cutting-edge weather prediction. The Hybrid Renewable Energy Forecasting tool, or HyRef, incorporates cloud imaging technology, sky-facing cameras, and operational and environmental sensors to build customized models of renewable energy output.

“If you can’t do the weather forecast better, you’re done,” says Lloyd A. Treinish, chief scientist of IBM's Deep Thunder program, which aims to improve local weather forecasting through the use of high-performance computing.

Wind farms, for example, have long had sensors on their turbines, but those sensors are used to monitor the machines rather than the weather, because the data has been too contaminated to be of use, Treinish says.

IBM took its expertise in renewables and weather forecasting, including the micro-forecasts used in Deep Thunder, to develop HyRef. One of the biggest challenges was taking inputs from the front of the turbine blades rather than from behind them. Taking measurements from the front of the blade increased the acoustic noise, which had to be filtered out to get an accurate reading.

Once IBM had data that was free of contamination, it built a statistical model that could drill down to the individual turbine scale and provide forecasts in 15-minute increments, as much as a month in advance. “With accuracy and precision, you have that much better information so then you can have inputs that offer far greater fidelity for power output,” says Treinish.
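
IBM hasn't published HyRef's internals, so as a stand-in, here is a minimal sketch of the statistical half of the problem: fitting a simple AR(1) model to one turbine's 15-minute power readings and rolling it forward. A production system would blend such statistics with numerical weather forecasts and camera-derived sky data; the readings and function below are purely illustrative.

```python
import numpy as np

def ar1_forecast(power_kw: np.ndarray, steps: int) -> np.ndarray:
    """Forecast `steps` future 15-minute intervals from one turbine's history."""
    x, y = power_kw[:-1], power_kw[1:]
    # Ordinary least squares fit of y = c + phi * x
    phi = np.dot(x - x.mean(), y - y.mean()) / np.dot(x - x.mean(), x - x.mean())
    c = y.mean() - phi * x.mean()
    out, last = [], power_kw[-1]
    for _ in range(steps):
        last = c + phi * last   # each step relaxes toward the historical mean
        out.append(last)
    return np.array(out)

# Two hours of made-up 15-minute power readings for a single turbine (kW).
history = np.array([820, 790, 760, 700, 650, 640, 660, 710], dtype=float)
print(ar1_forecast(history, steps=4))   # the next hour, in 15-minute steps
```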

The first client to use the software, Jibei Electricity Power Company Limited—a subsidiary of the State Grid Corporation of China—hopes to increase its integration of renewable power generation by 10 percent. The utility is part of State Grid’s Zhangbei 670-megawatt demonstration project, the world’s largest utility-scale renewable power plant, which combines wind and solar with energy storage and transmission.

“Clients keep telling us forecasts aren’t good enough,” Treinish says of current weather forecasts for renewables. IBM’s approach is to customize the software for each utility’s needs, whether that’s a large-scale solar array or offshore wind farm. In the case of Jibei, the utility is interested in day-ahead forecasting for its wind resources.

“We have some of the most high-resolution weather modeling software, and we bring that to bear,” says Treinish. IBM also relishes the multidisciplinary challenge of building better forecasting tools, which combine expertise from mathematics, computing, atmospheric physics, and other sciences.

Much of the research for HyRef was borrowed from solving other weather-related challenges, such as flood forecasting for cities or outage detection for grid operators. “We’re data scavengers,” says Treinish. “We’ll use whatever we can use to make the forecast better.”

The interest in HyRef, so far, has been in areas around the globe that have a high penetration of renewable energy and are already having intermittency problems, according to Stephen Callahan, a partner in IBM’s global business services for the energy and utilities industry. For some clients, the goal is not just better forecasting for the individual utility, but bringing that accuracy to market at the wholesale level so that the economic value of renewable energy can be priced more effectively. “That is the threshold we’re all working to get to,” says Callahan.

Photo: Yagi Studio/Getty Images

Detecting and Correcting Methane Leakage: Is a Technical Fix Ahead?

It's fashionable and sensible in the complicated domains where technology and policy intersect to be suspicious of any narrowly conceived solution. "There's no technical fix," goes the usual refrain (which I certainly have voiced plenty of times myself). In the case of methane leakage from natural gas production and distribution systems, however, there really may be a combination of technical fixes on the horizon. Because of methane's high global warming potential relative to carbon dioxide, detecting and correcting methane leakage is going to be very important in the years ahead.

Two and a half years ago, the Maryland company Earth Networks announced it would start building a network of sensors to directly monitor greenhouse gas emissions on a regional basis, working with the Scripps Institution of Oceanography and others. Each processing unit consists of a US $50 000 box, along with elaborate calibration software that Earth Networks has developed with scientific partners at Scripps, NIST, and NOAA, and connects by a tube to sensors installed 100 meters up an existing tower. The initial plan called for putting about 100 units in the United States, 25 in Europe, and 25 elsewhere in the world.

Earth Networks CEO Robert Marshall says that 28 to 30 units are now installed, mostly in the U.S. Northeast but also in the Los Angeles area and a few other places. One of the company's units has taken over the famous job of monitoring global CO2 atop Hawaii's Mauna Loa and, several months ago, produced the definitive measurement of the Keeling curve's crossing of the 400-ppm threshold.

Earth Networks, originally known as AWS Convergence Technologies, makes its living primarily by delivering very finely grained weather forecasts and severe weather alerts. Its well-known trademarked product, WeatherBug, is widely used by clubs, schools, and planners of major sports and entertainment events. Before the terrible Moore, Oklahoma, tornado last spring, the company's networks detected dense in-cloud lightning, an early warning signal of the catastrophic twister. As for the greenhouse gas monitoring network that Earth Networks is installing, this work is being done at present on what you might call a pro bono basis—or, if you prefer, as a speculative venture anticipating high future demand for the new service.

These days, says Marshall, there is little demand for GHG monitoring at the national level because there is no real issue of national compliance at present. The system established by the Kyoto Protocol in 1997 has lost traction, and the world awaits a new system of mandatory GHG cuts, to be formulated at a conference in Paris at the end of 2015. But in the meantime, Marshall points out, two regional GHG reduction programs have been established in the United States—in the Northeast and in California—and many cities here and overseas are adopting objectives that will require monitoring.

As the GHG detection networks get built out, they will be able to determine how much methane is escaping from major fracking fields and from aging urban gas distribution systems, to name the two most important situations giving rise to acute concern. At present, reports of emissions from gas fields are often based on one-day spot checks done by aircraft flyovers. "There's nobody else out there doing what we do," says Marshall. "Permanently installing sensors that take data continually and can monitor emissions from an entire gas field, as opposed to just individual wells."

Once the Earth Networks GHG detectors are able to provide alerts to situations where methane leakage is high, then newly developed portable monitors can be used to pinpoint the exact spots where leaks are highest, so that corrective measures can be taken. One such portable device, a "gasbot" developed in Sweden, was described in a recent post here; another, described in a recent New York Times article, was developed by instrument maker Picarro, the same company that makes the Earth Networks GHG boxes. Thus, the region-wide and portable monitoring devices have complementary roles to play so that, to the extent methane leakage turns out to be a really serious problem, it may also turn out to be a fixable problem.

Photo: Earth Networks

Another Very Strong Year for U.S. Wind

Researchers at the Lawrence Berkeley National Laboratory (LBL) have issued their latest authoritative report on the status of U.S. wind energy. Newly installed turbine capacity increased a whopping 90 percent in 2012, driven in large part by subsidies that were expected to expire (but did not); total wind capacity climbed to 60 gigawatts, the equivalent in terms of expected energy production of roughly a quarter of installed nuclear power.
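
That nuclear comparison is easy to sanity-check with round numbers. The capacity factors and the roughly 100 gigawatts of installed U.S. nuclear capacity below are assumed ballpark figures, not values from the LBL report:

```python
# Rough check of the "quarter of installed nuclear" comparison, using assumed
# round-number capacity factors (~30 percent for wind, ~90 percent for U.S.
# nuclear) and roughly 100 GW of installed U.S. nuclear capacity.

wind_gw, wind_cf = 60, 0.30
nuclear_gw, nuclear_cf = 100, 0.90

wind_avg = wind_gw * wind_cf          # average output, GW
nuclear_avg = nuclear_gw * nuclear_cf

print(f"Wind average output:    {wind_avg:.0f} GW")
print(f"Nuclear average output: {nuclear_avg:.0f} GW")
# About one-fifth to one-quarter, depending on the factors assumed.
print(f"Ratio: {wind_avg / nuclear_avg:.0%}")
```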

Despite plummeting natural gas prices and wide switching from coal to gas generation, the expansion of wind outstripped gas last year in terms of capacity, though not in terms of expected energy production. The United States narrowly edged out China as world leader in wind capacity additions and left Germany, a one-time world market leader, in the dust.

When wind is considered as a percentage of total electricity consumption, the United States still ranks only twelfth, with Denmark in first place, Germany in fifth, and the UK in eighth. But the scope for potential U.S. expansion is considerable. The United States still has no offshore wind turbines installed, an area where Denmark, Germany, and the UK have led the way.

Many findings in the LBL report indicate a mature technology capable of standing on its own two feet, with predictable costs and returns. Capacity factors—the ratio of the energy a turbine actually produces to what it would produce running continuously at rated capacity—have been steady since the beginning of the century at about 30 percent; that is the most important single measure of performance and reliability. Turbine costs came down from about $1.80/watt in the late 1990s to $0.80/W in 2001-02, climbed back up to about $1.60/W a few years later, and now seem to be settling in the vicinity of $1/W. Total project costs, after dropping from more than $3/W in the 1980s to about $1.25/W in 2004-05, seem to be stabilizing near $2/W.

Somewhat counterintuitively, the report found that the biggest economies of scale are registered not by, for example, very large farms of medium-scale turbines, but by small clusters of very large turbines.

Looking both back and ahead, perhaps the most important aspect of wind's impressive performance has been the standard it is setting for solar energy, which also is intermittent and therefore has similar expected capacity factors. At current solar prices, which obviously are artificially low, photovoltaic arrays are being installed at costs similar to wind's—$1/W or less per panel, $2/W or less per project. Only time will tell whether in fact solar also is crossing the boundary to market competitiveness. As of today, according to the environmental reporter for London's Financial Times, PV arrays installed without subsidies account for only a tenth of one percent of total world installations.

Looking to the present and immediate future, the LBL experts expect this year to be somewhat slow for wind, as the project pipeline is rebuilt, but for solid growth to resume in 2014.

Photo: Jodi Jacobson/iStockphoto

Report Counts Up Solar Power Land Use Needs

The National Renewable Energy Laboratory (NREL) released a report [PDF] last week that aimed to quantify exactly how much room solar power requires. Land use and space issues have long been a point of contention when it comes to renewables, with opponents complaining that the huge spaces required for solar and wind aren't worth the effort. The NREL report suggests the acreage required for industrial-scale solar power plants is within the range of previous estimates, and generally doesn't seem off-the-charts outrageous.

“The numbers aren’t good news or bad news,” said Paul Denholm, one of the report's authors, in a press release. “It’s just that there was not an understanding of actual land use requirements before this work.” The report used land use data from 72 percent of all large solar plants installed in the U.S., and found that the total area requirement for a photovoltaic (PV) plant with a capacity between 1 and 20 megawatts is 8.3 acres per MW. For larger PV plants, the total area needed is 7.9 acres per MW, while concentrating solar power (CSP) plants need 10 acres per MW. When weighted by generation rather than capacity, the larger PV plants (3.4 acres per gigawatt-hour per year) and CSP plants (3.5 acres/GWh/year) do a bit better than smaller PV plants (4.1 acres/GWh/year).
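
The capacity-weighted and generation-weighted figures fit together once you back out the implied capacity factor, since 1 MW running at capacity factor cf yields 8.76 × cf GWh per year. A quick sketch of that conversion, using only the numbers above:

```python
# Back out the capacity factor implied by each pair of land-use figures.
# 1 MW at capacity factor cf produces 8 760 * cf MWh = 8.76 * cf GWh per year.

plants = {
    "small PV (1-20 MW)": (8.3, 4.1),    # (acres/MW, acres per GWh/yr)
    "large PV":           (7.9, 3.4),
    "CSP":                (10.0, 3.5),
}

for name, (acres_per_mw, acres_per_gwh_yr) in plants.items():
    gwh_per_yr_per_mw = acres_per_mw / acres_per_gwh_yr
    cf = gwh_per_yr_per_mw / 8.76
    print(f"{name}: implied capacity factor = {cf:.0%}")
```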

This isn't the first time NREL has looked at solar land use, though it is the first time they used a whole lot of actual power plants to figure out the numbers. In the past, they estimated that to power all of the U.S. with solar power, it would require 0.6 percent of all the area in the country. How do the latest numbers stack up with that? To the back of the envelope!

The new report says that a PV plant capable of powering 1 000 homes needs 32 acres. According to the U.S. Census Bureau, there are around 115 million occupied homes in the country. If we just scale up linearly (which is not, of course, how this would actually work), that means 3.68 million acres to power all of them. That's equivalent to 5 750 square miles, or around 0.1 percent of all the land the U.S. has to offer. Not bad!
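
The same arithmetic, spelled out. The rounded U.S. land area is my assumption; everything else comes from the paragraph above, and the result lands at roughly 0.15 percent, the same order as the figure quoted:

```python
# Reproducing the back-of-the-envelope scale-up above.

acres_per_1000_homes = 32
homes = 115e6            # occupied U.S. homes, per the Census Bureau
us_area_sq_mi = 3.8e6    # total U.S. area in square miles (assumed rounding)

acres = acres_per_1000_homes * homes / 1000
sq_miles = acres / 640   # 640 acres per square mile
share = sq_miles / us_area_sq_mi

print(f"{acres / 1e6:.2f} million acres = {sq_miles:,.0f} sq mi "
      f"= {share:.2%} of U.S. area")
```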

Perhaps more relevant is the question of how these land use requirements measure up to other forms of energy. When it comes to renewables, there's no doubt that solar power is far more area-efficient than wind power; an NREL report [PDF] from several years ago found a total requirement of about 84 acres per MW, a far cry from the 10 or so acres that solar seems to max out at. Geothermal energy might be the best of the bunch, though, in the low single digits.

Outside of renewables, things can get a bit complicated. Nuclear power is often considered very area efficient, though mining for uranium could add a complicated factor to that equation. Similarly, coal power plants themselves don't use a ton of space per megawatt generated, but there is little debate on the devastating land use impacts of coal mining. One study looked at what it would take to produce 10 percent and 100 percent of the whole world's power from various sources, and found nuclear and geothermal energy at the very lowest end of area needs, followed by coal, CSP, and natural gas.

Unfortunately, though, we don't yet have government studies similar to the new NREL study that go into each energy source in as much detail. Denholm stressed that doing such studies that use actual, existing plants for coal, nuclear, and natural gas would allow us to more firmly compare which energy sources get us the most bang per acre.

Photo: spg solar / Wikimedia Commons

How Big a Problem is Methane Leakage from Natural Gas Fracking?

By now it should be evident to all that hydraulic fracturing is a disruptive technology in every sense. With natural gas prices still in free-fall because of fracking, competition from gas-fired generation continues to lay waste to plans for new nuclear power—the most recent casualty being Duke's Levy project in Florida—while threatening investment in futuristic green and clean tech. Fracking is the most important single element in the dramatically improved energy position of the United States, and the main factor in the country's much lower greenhouse gas emissions.

The one thing that could slow the gas juggernaut is concern about methane leakage, which, because of CH4's high warming potential relative to CO2, could cancel benefits believed to accrue from conversion of coal to gas generation. There is evidence, however, that concerns about methane leakage may be somewhat exaggerated.


Electric Car Price Wars Heat Up

General Motors announced on Tuesday it will shave US $5 000 off the price of its 2014 Chevrolet Volt, a drop of 13 percent.

The move comes after Nissan cut the price by similar margins for its all-electric Leaf earlier this year, and Ford did the same for its Focus Electric. The Volt could cost as little as $27 495 after federal tax incentives.

The lower costs are due in part to manufacturing efficiencies, Don Johnson, U.S. vice president of Chevrolet sales and service, said in a statement. “We have made great strides in reducing costs as we gain experience with electric vehicles and their components,” he said. “In fact, the Volt has seen an increase in battery range and the addition of creature comforts.”

It may be seen as a pricing war among electric vehicles (EVs), but all of the electric and plug-in hybrid models are also competing with higher-fuel-economy internal combustion engines, efficient diesels, and hybrid offerings, as well as with gas prices holding steady just below $4 per gallon in much of the United States.


UK Launches Europe’s Largest Energy Storage Trial

The largest European energy storage trial is underway in the United Kingdom. The project, which brings together S&C Electric, Samsung SDI, and Younicos, will deploy a 6-megawatt/10 megawatt-hour lithium-ion battery at a primary substation in Bedfordshire to assess the cost effectiveness of energy storage as part of the UK’s Carbon Plan.

The companies claim the storage could save more than US $9 million compared with traditional upgrades, such as replacing lines and transformers. Unlike those in many other regions, utilities in the UK's deregulated market are given incentives for low-carbon operation: they are rewarded for better utilization of their existing assets rather than for simply adding hard assets onto their networks.


How Significant Is Methane Leakage?

Over the past six to eight years, the United States has registered quite dramatic decreases in greenhouse gas emissions, putting the country on track to meet its 2009 Copenhagen pledge (though not its 1997 Kyoto Protocol commitment, which it repudiated). The decreases have primarily been the result of energy companies switching from coal-fired electricity generation to natural gas—partly a spontaneous market reaction to very low U.S. gas prices, and partly a reaction to ever-tightening Federal regulation of coal emissions.

But what if those decreases were only apparent because leakage of methane from gas production, distribution, and use has inflated the supposed benefits of switching from coal to gas? On average, burning gas rather than coal to make electricity cuts carbon dioxide emissions by about 50 percent; the claimed decline in U.S. greenhouse gas emissions since about 2006 is largely based on that assumption. But methane is a much more potent greenhouse gas than CO2—up to two orders of magnitude, or 100 times, more potent. So if there's a lot of methane leakage, the benefits of switching to gas could be nil, and U.S. claims about making big progress in cutting emissions could be wrong.

Several major studies aimed at addressing this key issue are underway. Perhaps the most ambitious in scope is one being conducted by the Environmental Defense Fund (EDF) in conjunction with 85 academic researchers and natural gas companies. The results of the EDF methane leakage study are to be released next year in peer-reviewed science journals. The study covers production, processing, long-distance distribution, local distribution, and the transportation sector.

A preliminary paper on methane leakage by leaders of the EDF study, published in the Proceedings of the National Academy of Sciences, found that if the Environmental Protection Agency's well-to-city leakage estimate of 2.4 percent is about right, then there is indeed a net benefit from switching to gas. In fact, any leakage rate below 3.2 percent will yield a net benefit, say the EDF authors. But it is important to emphasize—you could say crucial—that the issue here is not merely whether switching from coal to gas produces a benefit; the issue is whether it produces a huge climate benefit, as is generally believed.
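
A rough version of that break-even calculation can be sketched with round numbers. The emission factors and the 20-year global warming potential below are my assumptions for illustration, not inputs from the EDF paper; with these values the break-even leakage rate comes out near 3 percent, in the same range as the 3.2 percent the EDF authors report.

```python
# Illustrative break-even calculation under assumed round-number inputs:
# coal at ~1.0 kg CO2/kWh, gas at ~0.5 kg CO2/kWh (the 50 percent cut cited
# above), and a 20-year global warming potential of 86 for methane.

COAL_CO2 = 1.0    # kg CO2 per kWh
GAS_CO2 = 0.5     # kg CO2 per kWh
GWP_CH4 = 86      # 20-year horizon, kg CO2-equivalent per kg CH4

# Burning CH4 yields 44/16 kg of CO2 per kg of CH4, so a gas plant emitting
# GAS_CO2 burns GAS_CO2 * 16/44 kg of methane per kWh generated.
ch4_burned = GAS_CO2 * 16 / 44

def gas_co2e(leak_fraction: float) -> float:
    """Total CO2-equivalent per kWh of gas generation at a given leak rate."""
    leaked_ch4 = ch4_burned * leak_fraction / (1 - leak_fraction)
    return GAS_CO2 + leaked_ch4 * GWP_CH4

# Find the leakage rate at which gas loses its advantage over coal.
leak = 0.0
while gas_co2e(leak) < COAL_CO2:
    leak += 0.0001
print(f"Break-even leakage = {leak:.1%}")   # about 3 percent with these inputs
```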

If the net effect on greenhouse gas emissions of switching from coal to gas is merely modest, then other factors might tip the balance against gas in a comprehensive cost-benefit analysis: factors such as water impacts of fracking (total water resources, drinking water), ramifications for local communities (traffic congestion, air pollution, property values), and long-term investments in other sources of clean energy (from renewables like wind and solar to nuclear energy). This week, Électricité de France (another EDF!) announced it was terminating nuclear work in the United States because of competition from rock-bottom natural gas prices.

The Environmental Defense Fund's methane leakage study is not the only important one underway. A U.S. Environmental Protection Agency assessment of hydraulic fracturing, also due out next year, may address the question of leakage, according to Anthony R. Ingraffea, an engineering professor at Cornell. Writing in the New York Times on Monday, Ingraffea said early drafts of an Energy Department study “suggest that there are huge problems finding enough water for fracturing future wells.”

Taking those studies into account, 2014 looks to be shaping up as the year in which we Americans learn whether we're good guys or not such good guys in the global greenhouse gas reduction drama.

Photo: Shannon Stapleton/Reuters
