Energywise

How Much Recoverable Oil Do We Have?

Oil's availability is of course of immediate concern to every driver, especially at a time when gasoline prices are high once again. The much greater concern, however, is whether we are reaching a limit where oil can no longer be recovered at prices consumers are willing to pay.

If something like that turns out to be true—a scenario that generally goes by the name of "peak oil"—then long-term economic growth may be constrained across the industrial world. At the same time, to look at the brighter side of the picture, long-term carbon emissions may be lower than previously projected.

As it happens, expert opinion is radically divided on this key issue.

A recent report from analysts at Lux Research, "Evaluating New EOR [Enhanced Oil Recovery] Technologies in Oil Industry Mega-projects," proposes that by means of EOR, the industry may be able to tap up to 10.2 trillion barrels of unconventional oil, over and above 1.4 to 1.6 trillion barrels of conventional oil. (Lux puts the number for conventional oil reserves at 1.6 trillion barrels; a year ago, IEEE Spectrum cited an estimate of 1.4 trillion, based on work by Michael Klare.)

Klare, a professor of peace and world security studies at Hampshire College in Massachusetts, seems to be in general accord with Lux's view that the age of oil is far from over. Writing in the Huffington Post, the left-liberal online publication, Klare said that "humanity is not entering a period that will be dominated by renewables. Instead, it is pioneering the third great carbon era, the Age of Unconventional Oil and Gas." According to Lux, EOR techniques can boost recovery of oil in existing fields from an average of 25 percent today to as much as 65 percent. Klare, citing International Energy Agency estimates, says that investment in such techniques will exceed US $22 trillion between now and 2035—three times the investment in renewable technology—and that world demand for oil will grow 26 percent in that period.

An article that appeared in the July 13 issue of Eos (the transactions of the American Geophysical Union) presented a radically different view of things. Taking a more economic view of what it means for oil to be recoverable, scientist James W. Murray and analyst Jim Hansen suggest that oil prices, and with them oil production, have already arrived at the limit of what consumers worldwide are willing to pay. "Global production of crude oil and condensates…has essentially remained on a plateau of about 75 million barrels per day since 2005 despite a very large increase in the price of oil," say Murray and Hansen. (The latter is not to be confused with the famous climate scientist Jim Hansen, of the Goddard Institute for Space Studies at Columbia University.) In effect, they suggest, prices have reached a level where consumers seek alternatives or conserve, rather than pay more; if oil prices go significantly higher, the effect is to plunge the industrial world into recession, lowering demand.

The silver lining, Murray and Hansen suggest, is that expert bodies like the Intergovernmental Panel on Climate Change (IPCC) may have overestimated future carbon emissions from oil combustion. It will be interesting to see, when the next major IPCC assessment appears next month, how it handles that issue.

Where do I stand personally on this immensely important and controversial question? I cannot claim to be an expert, but for what it's worth, my impressions correspond more closely to those of Murray and Hansen than to those of Lux, Klare, and the IEA.

Photo: At Chevron's Kern River oil field in Bakersfield, Calif., U.S., enhanced production technologies such as steam flooding have made it possible to extract oil once considered economically unfeasible to obtain. Ken James/Bloomberg/Getty Images

Solar Panels Return to the White House

It took President Obama three years to return solar panels to the rooftop of the White House, but the real saga began long before he took office.

Back in 2010, former Energy Secretary Steven Chu said that the administration would install between 20 and 50 solar panels. Despite the pledge, however, the White House did not respond to offers for free solar photovoltaic systems from companies such as Sungevity, according to Renewable Energy World.

Now, in 2013, President Obama has found new resolve to discuss climate change and a more resilient energy landscape. Earlier this summer, the president delivered a speech calling for stricter regulations on existing coal-fired power plants, more wind and solar generation on public lands, and more energy-efficient buildings in both the public and private sector.

At the time, he said the changes would start with the federal government, especially in the realm of improving energy efficiency; for instance, he called for new efficiency targets for federal buildings.

Obama is now taking that message back to his own home, installing solar PV as “part of an energy retrofit that will improve the overall energy efficiency of the building,” a White House official told the Washington Post.

Of course, this all goes back much further than Obama's time in office. In 1986, President Ronald Reagan removed the solar panels that Jimmy Carter had installed in 1979. President George W. Bush also put a solar array on a small building on the White House grounds in 2003 to help heat the pool.

“Better late than never—in truth, no one should ever have taken down the panels Jimmy Carter put on the roof way back in 1979,” Bill McKibben, director of the climate group 350.org, told the Washington Post. “But it’s very good to know that once again the country’s most powerful address will be drawing some of that power from the sun.”

Today, solar panels are 97 percent cheaper than they were when Carter was in office, but the U.S. still has far higher soft costs—such as permits for installation and interconnection fees—than some other countries, such as Germany.  

Although the panels are already being installed, there is no word yet on the final panel count or the total energy output. President Obama has pledged that 20 percent of the federal government’s energy use will be powered by renewables by 2020.

Photo: Chuck Kennedy

Supercomputing a Quieter Wind Turbine

Noise created by giant wind turbines is high on the list of barriers to renewable energy deployment, with NIMBY and health complaints threatening or at least delaying a number of projects around the world. But noise also is related to efficiency, and now the research division for turbine manufacturing giant GE says it has figured out how to reduce noise and boost output. Win all around, apparently.

GE worked with Sandia National Laboratories in Albuquerque, New Mexico, to engineer new designs that would reduce noise. They used the Red Mesa supercomputer, whose 500-teraflop speed made it the 10th fastest computer in the world when it began operations in 2010. GE used it to run a high-fidelity Large Eddy Simulation program created at Stanford University "to predict the detailed fluid dynamic phenomena and resulting wind blade noise," according to a GE press release. After three months of continuous runs with the Large Eddy Simulation, the researchers apparently had "valuable insights that were used to assess current engineering design models, the assumptions they make that most impact noise predictions, and the accuracy and reliability of model choices."

That's a bit vague, but the bottom line is unequivocal: a turbine rotor design that's one decibel quieter equates to a 2 percent increase in annual energy yield, GE says. And with 240 gigawatts of wind power forecast to be installed around the world in the next five years, that 2 percent increase could be worth 5 GW.
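
The arithmetic behind that bottom line is easy to check. Here's a minimal sketch using only the figures quoted in this post, with the 240-GW forecast and the 2-percent-per-decibel gain taken as given:

```python
# Back-of-the-envelope check of GE's claim, using only numbers from this post.
db_quieter = 1.0          # rotor design that is 1 decibel quieter
gain_per_db = 0.02        # GE: ~2 percent more annual energy per decibel
forecast_gw = 240.0       # wind capacity forecast worldwide over 5 years

extra_gw = forecast_gw * gain_per_db * db_quieter
print(f"Equivalent extra capacity: {extra_gw:.1f} GW")  # -> 4.8, i.e. ~5 GW
```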

Aside from efficiency, reducing noise could cut down on NIMBY fights when it comes to getting wind projects built, and could possibly allow turbines to be built slightly closer to where people live. Reports of health problems attributed largely to turbine noise (as well as "shadow flicker") remain largely anecdotal, with bigger studies suggesting that conditions such as "wind turbine syndrome" are likely overblown. But it's clear that cutting down on noise would benefit pretty much everybody, whether or not they live near turbines.

Image: GE

A Smart Phone Uses as Much Energy as a Refrigerator?

When you plug your smart phone into the wall, it draws a negligible amount of energy compared with other household electronics such as your set-top box or refrigerator.

But add in the amount of electricity it takes to move data across networks to deliver, say, an hour of video to your smart phone or tablet each week, and it adds up to more energy over a year than two new Energy Star refrigerators consume.
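
To see how a pocket-size device can outdraw a kitchen appliance, it helps to run rough numbers. The sketch below is illustrative only: the per-gigabyte and refrigerator figures are assumptions for the sake of the exercise (published estimates of network energy per gigabyte vary widely), not values from the paper discussed below.

```python
# Illustrative comparison -- all constants are assumptions, not the paper's figures.
gb_per_hour_video = 2.8      # assumed data volume of one hour of streamed video
kwh_per_gb = 5.0             # assumed network + data-center energy per gigabyte
hours_per_week = 1.0

video_kwh_per_year = gb_per_hour_video * kwh_per_gb * hours_per_week * 52
fridge_kwh_per_year = 320.0  # assumed annual draw of one new Energy Star fridge

print(f"video habit: {video_kwh_per_year:.0f} kWh/yr")       # -> ~730 kWh/yr
print(f"two fridges: {2 * fridge_kwh_per_year:.0f} kWh/yr")  # -> 640 kWh/yr
```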

And though phones and other electronics and appliances are becoming ever more efficient, that efficiency does not offset the proliferation of these devices around the world.

A new paper, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power," [PDF] investigates the energy draw of information-communications technologies (ICT) and how they are dwarfing what we traditionally think of as energy hogs in the home. The paper was commissioned by the U.S. National Mining Association and the American Coalition for Clean Coal Electricity.

The global ICT ecosystem uses about 1500 terawatt-hours of electricity annually, which is equal to the electricity used by Japan and Germany combined. That figure will only increase, especially as more traffic shifts to wireless networks, which use far more energy per bit than wired ones.


Blackout Threat Unmitigated a Decade After the Northeast Went Dark

The Northeast Blackout struck seven U.S. states and Ontario ten years ago today, prompting mandatory standards to prevent such a cascading power outage from happening again. Then, two years ago, it did. Arizona, California and Mexico’s Baja California took the hit in 2011, but the story was much the same. In both cases, inadequate information and planning and human error left the power grid unstable enough that a single downed power line unleashed an electrical tsunami that swamped neighboring lines and darkened millions of homes and businesses.

Power system experts who study blackouts say that they see a similar pattern in most cascading outages. They cite other recent notables, such as Western Europe in 2006, Brazil in 2009, and twice in India last year. The commonality is evidence to the experts that cascading failures are a dangerous facet of modern power grids that remains all but impossible to predict or prevent. “Large blackouts are likely to recur at regular intervals,” says Ian Dobson, a cascading failures expert and electrical and computer engineering professor at Iowa State University.

Worse still, bigger and more frequent blackouts may be coming. The Northeast Blackout, the worst in U.S. history, shut off 61 800 megawatts (MW) of power consumption. But a 100-year blackout, one so large that it has only a 1 percent chance of occurring in any given year, would be three times bigger than the 2003 event. According to University of Vermont electrical engineering professor Paul Hines and colleagues at Carnegie Mellon University, who made the calculation with algorithms used in natural disaster planning, such a blackout would interrupt 186 000 MW, or roughly one quarter of all electrical service in the continental U.S.
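
The "100-year" framing comes from natural-disaster planning: an event with annual exceedance probability p has an expected return period of 1/p years. A minimal sketch of that arithmetic, with the tripling factor taken from the Hines result quoted above rather than recomputed:

```python
# Return-period arithmetic borrowed from natural-disaster planning.
p_annual = 0.01                     # 1 percent chance in any given year
return_period = 1 / p_annual        # -> 100 years

mw_2003 = 61_800                    # load lost in the 2003 Northeast Blackout
mw_100yr = 3 * mw_2003              # Hines et al.: ~3x the 2003 event
print(f"100-year blackout: ~{mw_100yr:,} MW")  # -> 185,400, i.e. ~186 000 MW
```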

Meanwhile, more frequent blackouts are also likely, a result of the increasing incidence of extreme weather, such as thunderstorms, hurricanes, and blizzards, predicted by climate models. Weather-related outages are already on the rise, according to a report this week from the White House Council of Economic Advisers analyzing U.S. Department of Energy (DOE) stats on outages affecting 50 000 or more power customers. The report identifies more than 80 such weather-related outages per year, on average, from 2008 to 2012, more than double the average frequency observed during the previous five years.

The White House also cites efforts underway to improve grid reliability and resilience. After the 2003 blackout, Congress mandated that utilities comply with grid maintenance and operating standards set by the North American Electric Reliability Corporation, an industry-funded non-profit. As a result, utilities do things like trim trees under power lines more often. And $4.5 billion in economic stimulus funding allocated for grid upgrades accelerated the deployment of advanced grid technologies, such as sensors called phasor measurement units that give controllers a real-time picture of power flows across the grid.  

Jeff Dagle, chief electrical engineer at the DOE’s Pacific Northwest National Laboratory and a member of the task force that investigated the 2003 blackout, points to a speed-up in state-estimation software. Operators rely on such modeling to understand how their grid is behaving and foresee the impact of losing key components, so faster models enable faster evasive actions. “Ten years ago these commonly took 15 minutes. Today many would deem five minutes to be slow,” says Dagle.

The problem, say Dagle and other grid experts, is that power lines and power stations are aging at the same time that increasing levels of renewable power generation are straining the grid. “We’re putting more demands on the system,” says Dagle. In the calculus of improvements, degradation, and increasing demands, he says “it’s really hard to know [whether] we’re more or less reliable today.”

At the same time, it is unclear whether the "smart" upgrades touted by the White House will help or hurt. That’s because tools to quantify the risk of cascading failures are still immature. And until the risk can be accurately measured, grid engineers can’t know precisely how to reduce it.

Dobson says the research is coming now that the power engineering community increasingly recognizes cascading failures as a distinct and recurring problem—a concept that still elicited protests from power engineers in the aftermath of the 2003 blackout. Dobson cites work by the IEEE Power Engineering Society’s Cascading Failures Working Group, of which he and Hines are members.

One novel method he unveiled last year assesses the likelihood of cascading outages by counting the number of subsequent lines that trip off each time a power line goes down on a given grid. Using a database from the Bonneville Power Administration (the only U.S. utility whose line-trip data is publicly available), Dobson showed he could quantify the likelihood that a line outage would propagate and how far it would go, on average. But the tool remains too weak to identify trends or evaluate specific engineering solutions. "The events are rare so it takes years of data to get a reasonable estimation," says Dobson.
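
One way to picture the approach is as a branching process: each tripped line is a "parent," the trips that follow it are "children," and the average number of children per parent is the propagation rate. For a subcritical process, an initial outage leads on average to 1/(1 - rate) total outages. The sketch below is a toy illustration of that general idea with assumed data, not Dobson's actual estimator:

```python
# Toy sketch of estimating outage propagation from line-trip records.
# Each cascade lists line-outage counts per generation, e.g. [3, 2, 1]
# means 3 initial trips, then 2 more, then 1 more. (Assumed data.)

def propagation_rate(cascades):
    parents = sum(sum(c[:-1]) for c in cascades)   # outages that could propagate
    children = sum(sum(c[1:]) for c in cascades)   # outages they triggered
    return children / parents if parents else 0.0

cascades = [[3, 2, 1], [1, 0], [2, 1, 1], [1, 0], [4, 0]]
lam = propagation_rate(cascades)
mean_size = 1 / (1 - lam)   # expected outages per initial outage (subcritical)
print(f"propagation rate {lam:.2f}, mean cascade size {mean_size:.2f}")
```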

Until such tools are mature it will be difficult to target funds to those upgrades most likely to reduce blackout risk, say Dobson and others. Even grid upgrades underway could be threatened if their impact on blackout risk is poorly understood. “If newly-introduced technology gets blamed for a future blackout there could be pressure to run it below capacity, or even shut it off entirely. That would be quite a pity if we could have mitigated the failure risk instead,” says Dobson.

Photo: Spencer Platt/Getty Images

IBM Launches Advanced Renewable Forecasting Tool

Weather has always been an important factor in planning for the needs of the electrical grid, but lately it has become even more crucial with the proliferation of grid-scale wind and solar resources.

To meet the needs of utilities that are installing large amounts of renewable energy, IBM just launched software that brings together data analytics and cutting-edge weather prediction. The Hybrid Renewable Energy Forecasting tool, or HyRef, incorporates cloud imaging technology, sky-facing cameras, and operational and environmental sensors to build customized models of renewable output.

“If you can’t do the weather forecast better, you’re done,” says Lloyd A. Treinish, chief scientist of IBM's Deep Thunder program, which aims to improve local weather forecasting through the use of high-performance computing.

Wind farms, for example, have had sensors on the turbines, but they're used to monitor the machines and not the weather, because the wind data they gather has been too contaminated to be of use, Treinish says.

IBM took its expertise in renewables and weather forecasting, including its micro-forecasts used in Deep Thunder, to develop HyRef. One of the biggest challenges was taking inputs from the front of the wind blades, rather than behind them. Taking measurements from the front of the blade increased the acoustic noise, which had to be filtered out to get an accurate reading.

Once IBM had data that was free of contamination, it built a statistical model that could drill down to the individual turbine and provide forecasts in 15-minute intervals, up to a month in advance. “With accuracy and precision, you have that much better information so then you can have inputs that offer far greater fidelity for power output,” says Treinish.
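
IBM hasn't published HyRef's internals, but a common baseline for turbine-level forecasting is an empirical power curve: bin historical wind-speed readings against observed output, then apply the curve to forecast wind speeds. A minimal sketch of that general idea, with all data and details assumed (this is not IBM's model):

```python
import numpy as np

# Empirical power curve: bin wind speed against observed output. (Sketch only.)
def fit_power_curve(speeds, powers, bin_width=0.5):
    bins = np.arange(0.0, speeds.max() + bin_width, bin_width)
    idx = np.digitize(speeds, bins)
    return bins, {i: powers[idx == i].mean() for i in np.unique(idx)}

def forecast_power(forecast_speeds, bins, curve):
    idx = np.digitize(forecast_speeds, bins)
    return np.array([curve.get(i, 0.0) for i in idx])

rng = np.random.default_rng(0)
hist_speed = rng.uniform(0, 25, 5000)                     # toy history, m/s
hist_power = np.clip((hist_speed / 12) ** 3, 0, 1) * 2.0  # idealized 2-MW turbine
bins, curve = fit_power_curve(hist_speed, hist_power)

print(forecast_power(np.array([4.0, 8.0, 12.0]), bins, curve))  # MW estimates
```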

The first client to use the software, Jibei Electricity Power Company Limited, a subsidiary of the State Grid Corporation of China, hopes to increase the integration of renewable power generation by 10 percent. The utility is part of State Grid's Zhangbei 670-megawatt demonstration project, the world's largest utility-scale renewable power plant, which combines wind and solar with energy storage and transmission.

“Clients keep telling us forecasts aren’t good enough,” Treinish says of current weather forecasts for renewables. IBM’s approach is to customize the software for each utility’s needs, whether that’s a large-scale solar array or offshore wind farm. In the case of Jibei, the utility is interested in day-ahead forecasting for its wind resources.

“We have some of the most high-resolution weather modeling software, and we bring that to bear,” says Treinish. IBM also relishes the multidisciplinary challenge of building better forecasting tools, which combine expertise from mathematics, computing, atmospheric physics, and other sciences.

Much of the research for HyRef was borrowed from solving other weather-related challenges, such as flood forecasting for cities or outage detection for grid operators. “We’re data scavengers,” says Treinish. “We’ll use whatever we can use to make the forecast better.”

The interest in HyRef, so far, has been in areas around the globe that have a high penetration of renewable energy and are already having intermittency problems, according to Stephen Callahan, a partner in IBM’s global business services for the energy and utilities industry. For some clients, the goal is not just better forecasting for the individual utility, but bringing that accuracy to market at the wholesale level so that the economic value of renewable energy can be priced more effectively. “That is the threshold we’re all working to get to,” says Callahan.

Photo: Yagi Studio/Getty Images

Detecting and Correcting Methane Leakage: Is a Technical Fix Ahead?

It's fashionable and sensible in the complicated domains where technology and policy intersect to be suspicious of any narrowly conceived solution. "There's no technical fix," goes the usual refrain (which I certainly have voiced plenty of times myself). In the case of methane leakage from natural gas production and distribution systems, however, there really may be a combination of technical fixes on the horizon. Because of methane's high global warming potential relative to carbon dioxide, detecting and correcting methane leakage is going to be very important in the years ahead.

Two and a half years ago, the Maryland company Earth Networks announced it would start building a network of sensors to directly monitor greenhouse gas emissions on a regional basis, working with the Scripps Institution of Oceanography and others. Each processing unit consists of a US $50 000 box, along with elaborate calibration software that Earth Networks has developed with scientific partners at Scripps, NIST, and NOAA; the box connects by a tube to an air intake installed 100 meters up an existing tower [photo]. The initial plan called for putting about 100 units in the United States, 25 in Europe, and 25 elsewhere in the world.

Earth Networks CEO Robert Marshall says that 28 to 30 units are now installed, mostly in the U.S. Northeast but also in the Los Angeles area and some other places. One of the company's units has taken over the famous job of monitoring global CO2 atop Hawaii's Mauna Loa, and several months ago it produced the definitive measurement of the Keeling curve's crossing of the 400-ppm threshold.

Earth Networks, originally known as AWS Convergence Technologies, makes its living primarily by delivering very finely grained weather forecasts and severe weather alerts. Its well-known trademarked product, WeatherBug, is widely used by clubs, schools, and planners of major sports and entertainment events. Before the terrible Moore, Oklahoma, tornado last spring, the company's networks detected dense in-cloud lightning, an early warning signal of the catastrophic twister. As for the greenhouse gas monitoring network that Earth Networks is installing, this work is being done at present on what you might call a pro bono basis—or, if you prefer, as a speculative venture anticipating high future demand for the new service.

These days, says Marshall, there is little demand for GHG monitoring at the national level because there is no real issue of national compliance at present. The system established by the Kyoto Protocol in 1997 has lost traction, and the world awaits a new system of mandatory GHG cuts, to be formulated at a conference in Paris at the end of 2015. But in the meantime, Marshall points out, two regional GHG reduction programs have been established in the United States—in the Northeast and in California—and many cities here and overseas are adopting objectives that will require monitoring.

As the GHG detection networks get built out, they will be able to determine how much methane is escaping from major fracking fields and from aging urban gas distribution systems, to name the two most important situations giving rise to acute concern. At present, reports of emissions from gas fields are often based on one-day spot checks done by aircraft flyovers. "There's nobody else out there doing what we do," says Marshall, "permanently installing sensors that take data continually and can monitor emissions from an entire gas field, as opposed to just individual wells."

Once the Earth Networks GHG detectors are able to provide alerts to situations where methane leakage is high, then newly developed portable monitors can be used to pinpoint the exact spots where leaks are highest, so that corrective measures can be taken. One such portable device, a "gasbot" developed in Sweden, was described in a recent post here; another, described in a recent New York Times article, was developed by instrument maker Picarro, the same company that makes the Earth Networks GHG boxes. Thus, the region-wide and portable monitoring devices have complementary roles to play so that, to the extent methane leakage turns out to be a really serious problem, it may also turn out to be a fixable problem.

Photo: Earth Networks

Another Very Strong Year for U.S. Wind

Researchers at the Lawrence Berkeley National Laboratory (LBL) have issued their latest authoritative report on the status of U.S. wind energy. Newly installed turbine capacity increased a whopping 90 percent in 2012, driven in large part by subsidies that were expected to expire (but did not); total wind capacity climbed to 60 gigawatts, the equivalent in terms of expected energy production of roughly a quarter of installed nuclear power.

Despite plummeting natural gas prices and widespread switching from coal to gas generation, the expansion of wind outstripped gas last year in terms of capacity, though not in terms of expected energy production. The United States narrowly edged out China as world leader in wind capacity additions and left Germany, a one-time world market leader, in the dust.

When wind is considered as a percentage of total electricity consumption, the United States still ranks only twelfth, with Denmark in first place, Germany in fifth, and the UK in eighth. But the scope for potential U.S. expansion is considerable. The United States still has no offshore wind turbines installed, a segment in which Denmark, Germany, and the UK have led the way.

Many findings in the LBL report indicate a mature technology capable of standing on its own two feet, with predictable costs and returns. Capacity factors—the ratio of the energy a unit actually generates to what it would generate running around the clock at rated capacity—have been steady since the beginning of the century at about 30 percent; that is the most important single measure of performance and reliability. Turbine costs came down from about $1.80/watt in the late 1990s to $0.80/W in 2001-02, climbed back up to about $1.60/W a few years later, and now seem to be settling in the vicinity of $1/W. Total project costs, after dropping from more than $3/W in the 1980s to about $1.25/W in 2004-05, seem to be stabilizing near $2/W.
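
Capacity factor is also what links the 60 GW of wind to "roughly a quarter" of installed nuclear in the opening paragraph: the comparison is of expected energy, not nameplate capacity. A quick sketch, using the report's 30 percent wind figure and assumed round numbers for the U.S. nuclear fleet:

```python
# Comparing expected annual energy, not nameplate capacity.
HOURS_PER_YEAR = 8760
wind_gw, wind_cf = 60.0, 0.30    # capacity and capacity factor from the report
nuke_gw, nuke_cf = 100.0, 0.90   # assumed round numbers for U.S. nuclear

wind_twh = wind_gw * wind_cf * HOURS_PER_YEAR / 1000
nuke_twh = nuke_gw * nuke_cf * HOURS_PER_YEAR / 1000
print(f"wind ~{wind_twh:.0f} TWh vs nuclear ~{nuke_twh:.0f} TWh; "
      f"ratio {wind_twh / nuke_twh:.2f}")   # -> 0.20, on the order of a quarter
```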

Somewhat counter-intuitively, the report found that the biggest economies of scale are registered not by, for example, very large farms with medium-scale turbines, but by small clusters of very large turbines [photo].

Looking both back and ahead, perhaps the most important aspect of wind's impressive performance has been the standard it is setting for solar energy, which also is intermittent and therefore has similar expected capacity factors. At current solar prices, which obviously are artificially low, photovoltaic arrays are being installed at costs similar to wind's—$1/W or less per panel, $2/W or less per project. Only time will tell whether in fact solar also is crossing the boundary to market competitiveness. As of today, according to the environmental reporter for London's Financial Times, PV arrays installed without subsidies account for only a tenth of one percent of total world installations.

Looking to the present and immediate future, the LBL experts expect this year to be somewhat slow for wind, as the project pipeline is rebuilt, but for solid growth to resume in 2014.

Photo: Jodi Jacobson/iStockphoto

Report Counts Up Solar Power Land Use Needs

The National Renewable Energy Laboratory (NREL) released a report [PDF] last week that aimed to quantify exactly how much room solar power requires. Land use and space issues have long been a point of contention when it comes to renewables, with opponents complaining that the huge spaces required for solar and wind aren't worth the effort. The NREL report suggests the acreage required for industrial-scale solar power plants is within the range of previous estimates, and generally doesn't seem off-the-charts outrageous.

“The numbers aren’t good news or bad news,” said Paul Denholm, one of the report's authors, in a press release. “It’s just that there was not an understanding of actual land use requirements before this work.” The report used land use data from 72 percent of all large solar plants installed in the U.S., and found that the total area required for a photovoltaic (PV) plant with a capacity between 1 and 20 megawatts is 8.3 acres per megawatt. For larger PV plants, the total area needed is 7.9 acres per MW, while concentrating solar power (CSP) plants need 10 acres per MW. When weighted by generation rather than capacity, the larger PV plants (3.4 acres per gigawatt-hour per year) and CSP plants (3.5 acres/GWh/year) do a bit better than smaller PV plants (4.1 acres/GWh/year).
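
The capacity-weighted (acres per MW) and generation-weighted (acres per GWh/year) figures are linked by the plant's capacity factor. A quick sketch of the conversion; the capacity factor below is inferred as the value that reconciles the report's two numbers, not a figure taken from the report:

```python
# Converting acres/MW to acres per GWh/year via capacity factor.
HOURS_PER_YEAR = 8760
acres_per_mw = 7.9         # large PV plants, from the NREL report
capacity_factor = 0.265    # inferred, not from the report

gwh_per_mw_year = HOURS_PER_YEAR * capacity_factor / 1000   # ~2.3 GWh per MW-year
print(f"{acres_per_mw / gwh_per_mw_year:.1f} acres per GWh/year")  # -> ~3.4
```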

This isn't the first time NREL has looked at solar land use, though it is the first time they used a whole lot of actual power plants to figure out the numbers. In the past, they estimated that to power all of the U.S. with solar power, it would require 0.6 percent of all the area in the country. How do the latest numbers stack up with that? To the back of the envelope!

The new report says that a PV plant capable of powering 1 000 homes needs 32 acres. According to the U.S. Census Bureau, there are around 115 million occupied homes in the country. If we just scale up linearly (which is not, of course, how this would actually work), that means 3.68 million acres to power all of them. That's equivalent to 5 750 square miles, or about 0.15 percent of all the land the U.S. has to offer. Not bad!
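
For readers who want to check the envelope, here is the same scale-up in a few lines (the total U.S. land area is an assumed round figure):

```python
# Reproducing the post's back-of-the-envelope scale-up.
acres_per_1000_homes = 32
homes = 115_000_000          # occupied homes, per the Census figure cited above
us_land_sq_mi = 3_800_000    # assumed round figure for total U.S. area

total_acres = acres_per_1000_homes * homes / 1000   # -> 3.68 million acres
total_sq_mi = total_acres / 640                     # 640 acres per square mile
share = 100 * total_sq_mi / us_land_sq_mi
print(f"{total_sq_mi:,.0f} sq mi, ~{share:.2f}% of U.S. land")  # -> 5,750; ~0.15%
```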

Perhaps more relevant is the question of how these land use requirements measure up to other forms of energy. When it comes to renewables, there's no doubt that solar power is far more area-efficient than wind power; an NREL report [PDF] from several years ago found a total requirement of about 84 acres per MW, a far cry from the 10 or so acres that solar seems to max out at. Geothermal energy might be the best of the bunch, though, in the low single digits.

Outside of renewables, things can get a bit complicated. Nuclear power is often considered very area efficient, though mining for uranium could add a complicated factor to that equation. Similarly, coal power plants themselves don't use a ton of space per megawatt generated, but there is little debate on the devastating land use impacts of coal mining. One study looked at what it would take to produce 10 percent and 100 percent of the whole world's power from various sources, and found nuclear and geothermal energy at the very lowest end of area needs, followed by coal, CSP, and natural gas.

Unfortunately, though, we don't yet have government studies that examine other energy sources in as much detail as the new NREL study. Denholm stressed that similar studies of actual, existing coal, nuclear, and natural gas plants would allow us to compare more firmly which energy sources get us the most bang per acre.

Photo: spg solar / Wikimedia Commons

How Big a Problem is Methane Leakage from Natural Gas Fracking?

By now it should be evident to all that hydraulic fracturing is a disruptive technology in every sense. With natural gas prices still in free-fall because of fracking, competition from gas-fired generation continues to lay waste to plans for new nuclear power—the most recent casualty being Duke's Levy project in Florida—while threatening investment in futuristic green and clean tech. Fracking is the most important single element in the dramatically improved energy position of the United States, and the main factor in the country's much lower greenhouse gas emissions.

The one thing that could slow the gas juggernaut is concern about methane leakage, which, because of CH4's high warming potential relative to CO2, could cancel benefits believed to accrue from conversion of coal to gas generation. There is evidence, however, that concerns about methane leakage may be somewhat exaggerated.
