EnergyWise

Silicon Carbide Power Electronics for Making Silicon

When people talk about the efficiency of solar cells, the conversation is usually about the efficiency of converting sunlight into electricity. But there are other opportunities for efficiency, from the manufacturing of cells to the ease of installing them.

Researchers in Germany are working on the manufacturing issue by focusing on the process that produces highly pure crystalline materials such as silicon.


A Better, Safer Flow Battery

Utilities and governments want massive energy storage assets to smooth out the intermittent nature of wind and solar power. One possibility is flow batteries, which use liquids to store and release energy, but these devices have been plagued by low energy density: they supply little energy for their size. Now researchers at Pacific Northwest National Laboratory in Richland, Wash., have developed a new flow battery with more than twice the energy density of other flow batteries.


New Wave System Claims Huge Energy per Ton Potential

Wave energy could become practical using a system that mimics the human heart, say engineers in Sweden. The system allows a wave power generator to produce five times more energy per tonne of device at one-third the cost of competing state-of-the-art technologies, they claim.


EPA Coal Cuts Light Up Washington

A meeting at the U.S. Federal Energy Regulatory Commission’s (FERC’s) Washington headquarters yesterday lived up to expectations that it would be one of the most exciting sessions in the agency’s history. Buttoned-up policy wonks, lobbyists, and power market experts showed up in droves—over 600 registered—to witness a discussion of what President Obama’s coal-cutting Clean Power Plan presaged for the U.S. power grid. The beltway crowd was joined by activists for and against fossil fuels—and extra security.

Inside the proceedings, which focused on the Environmental Protection Agency (EPA) plan’s impact on power grid reliability, protesters against fracking and liquefied natural gas exports shouted “NATURAL GAS IS DIRTY” each time a speaker mentioned coal’s fossil fuel nemesis. Outside, the coal industry-backed American Coalition for Clean Coal Electricity distributed both free hand-warmers and dark warnings that dumping coal-fired power would leave Americans “cold in the dark.”

As expected, state regulators and utility executives from coal-reliant states such as Arizona and Michigan hammered home the ‘Cold in the Dark’ message in their exchanges with FERC’s commissioners. Gerry Anderson, Chairman and CEO of Detroit-based utility DTE Energy, called the Clean Power Plan “the most fundamental transformation of our bulk power system that we’ve ever undertaken.” 

EPA’s critics argue that the plan’s timing is unrealistic and its compliance options are inadequate. Anderson said Michigan will need to shut down, by 2020, roughly 40 percent of the coal-fired generation that currently provides half of the state’s power. That, he said, “borders on unachievable and would certainly be ill-advised from a reliability perspective.” 

EPA’s top air pollution official, Janet McCabe, defended her agency’s record and its respect for the grid. “Over EPA’s long history developing Clean Air Act standards, the agency has consistently treated electric system reliability as absolutely critical. In more than 40 years, at no time has compliance with the Clean Air Act resulted in reliability problems,” said McCabe. 

McCabe assured FERC that EPA had carefully crafted its plan to provide flexibility to states and utilities regarding how they cut emissions from coal-fired power generation, and how quickly they contribute to the rule’s overall goal of lowering power sector emissions by 30 percent by 2030 from 2005 levels. (Michigan has state-verified energy conservation and renewable energy options to comply with EPA’s plan, according to the Natural Resources Defense Council.)

McCabe said EPA is considering additional flexibility before it finalizes the rule, as early as June. EPA would consider, for example, specific proposals for a “reliability safety valve” to allow a coal plant to run longer than anticipated if delays in critical replacement projects—say, a natural gas pipeline or a transmission line delivering distant wind power—threatened grid security. 

As it turned out, language codifying a reliability safety valve was on offer at yesterday’s meeting from Craig Glazer, VP for federal government policy at PJM Interconnection, the independent transmission grid operator for the Mid-Atlantic region. The language represents a consensus reached by regional system operators from across the country—one that is narrowly written and therefore unlikely to give coal interests much relief. “It can’t be a free pass,” said Glazer.

A loosely constrained valve, explained Glazer, would undermine investment in alternatives to coal-fired power, especially for developers of clean energy technologies. “Nobody’s going to make those investments because they won’t know when the crunch time really comes. It makes it very hard for these new technologies to jump in,” said Glazer.

Clean energy advocates at the meeting, and officials from states that, like California, are on the leading edge of renewable energy development, discounted the idea that additional flexibility would be needed to protect the grid. They pushed back against reports of impending blackouts from some grid operators and the North American Electric Reliability Corporation (NERC). Those reports, they say, ignored or discounted evidence that alternative energy sources can deliver the essential grid services currently provided by conventional power plants. 

NERC's initial assessment, issued in November, foresees rolling blackouts and increased potential for "wide-scale, uncontrolled outages,” and NERC CEO Gerald Cauley says a more detailed study due out in April will identify reliability “hotspots” caused by EPA’s plan. At the FERC meeting, Cauley acknowledged that “the technology is out there allowing solar and wind to be contributors to grid reliability,” but he complained that regulators were not requiring them to do so. Cauley called on FERC to help make that happen.

Cleantech supporters, however, are calling on the government to ensure that NERC recognizes and incorporates renewable energy’s full capabilities when it issues projections of future grid operations. They got a boost from FERC Commissioner Norman Bay. The former chief of enforcement at FERC and Obama’s designee to become FERC’s next chairman in April, Bay pressed Cauley on the issue yesterday. 

Bay asked Cauley how he was going to ensure that NERC is more transparent, and wondered whether NERC would make public the underlying assumptions and models it will use to craft future reports. Cauley responded by acknowledging that NERC relied on forecasts provided by utilities, and worked with utility experts to “get ideas on trends and conclusions” when crafting its reliability studies. 

Cauley also acknowledged that those reliability studies were not “entirely open and consensus based” the way NERC’s standards-development process is. And he demurred on how much more open the process could be, telling Bay, “I’ll have to get back to you on that.”

The challenge from Bay follows criticism leveled at NERC in a report issued last week by the Brattle Group, an energy analytics firm based in Boston. Brattle found that compliance with EPA’s plan was “unlikely to materially affect reliability.” 

Brattle’s report concurred with renewables advocates who have argued that NERC got it wrong by focusing too much on the loss of coal-fired generation and too little on that which would replace it: “The changes required to comply with the CPP will not occur in a vacuum—rather, they will be met with careful consideration and a measured response by market regulators, operators, and participants. We find that in its review NERC fails to adequately account for the extent to which the potential reliability issues it raises are already being addressed or can be addressed through planning and operations processes as well as through technical advancements.” 

Will Shuttering Coal Plants Really Threaten the Grid?

Does President Obama’s plan to squelch carbon emissions from coal-fired power plants really threaten the stability of the grid? That politically charged question is scheduled for a high-profile airing today at a meeting in Washington to be telecast live starting at 9 am ET from the Federal Energy Regulatory Commission (FERC). 

Such “technical meetings” at FERC are usually pretty dry affairs. But this one could be unusually colorful, presenting starkly conflicting views of lower-carbon living, judging from written remarks submitted by panelists.


Hoovering Up CO2 with CCS-equipped Biomass Power Plants

Last year's update from the Intergovernmental Panel on Climate Change identified biomass-fired power plants that capture their carbon—and thus sequester atmospheric CO2—as one of the most critical tools available for stabilizing climate change by the end of this century. Last week, researchers at the University of California at Berkeley reported that carbon-capturing bio-power plants could go two steps further, rendering the entire Western North American power grid carbon-negative by 2050.

The idea behind bioenergy with carbon capture and storage, or BECCS, is to capture carbon emissions from a combustion power plant's effluent using the same equipment and methods employed by a few CCS-equipped coal-fired power plants. One such plant, which started up in September in Saskatchewan, is the world's first commercial-scale coal power plant to capture over 90 percent of its carbon. 

But whereas power plants that capture and sequester fossilized carbon can, at best, achieve carbon-neutral performance, BECCS can be carbon-negative. That's because the carbon in the wood and other biofuels they burn was sucked from the atmosphere as the plants grew. Storing that atmospheric carbon underground is tantamount to generating electricity while actually doing Earth's climate a favor. 

Last week's report, in the journal Nature Climate Change, purports to be the first detailed simulation of how BECCS would play out in a particular region. The research team, led by Daniel Kammen, director of the Renewable and Appropriate Energy Laboratory at UC Berkeley, simulated BECCS deployment on the Western power system (which interconnects most of the U.S. and Canada west of the Rockies, plus Mexico's Baja California). Their SWITCH-WECC model is a standard power grid model augmented with information about the location and cost of biomass fuel sources. 

After screening the sustainability of biomass resources available in the region from forestry, agriculture, and municipal wastes, the researchers identified enough biomass to meet between 7 and 9 percent of projected electricity demand for 2050. But they found that pushing BECCS to that level had an outsize impact on total power sector emissions.

By combining BECCS with aggressive deployment of renewable energy and fossil-fuel emission reductions, they projected that grid-wide carbon emissions could be reduced by 145 percent in 2050 relative to 1990 levels. In that scenario, with BECCS providing carbon-negative baseload power to complement solar, wind, hydropower and other renewable installations, overall emissions from the Western N.A. grid come in at -135 megatons of CO2 per year. That's enough to offset all of the emissions from Alberta's unconventional oil drilling, twice over.
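A reduction of more than 100 percent means net emissions dip below zero, and the two figures in the scenario pin down the implied 1990 baseline. A quick back-of-the-envelope check, using only the numbers reported above:

```python
# Sanity check of the BECCS scenario figures: a 145% cut relative to 1990
# leaves 2050 emissions at -45% of the 1990 level.
reduction = 1.45                 # 145% reduction relative to 1990
emissions_2050 = -135.0          # megatons CO2 per year, the study's projection

# emissions_2050 = baseline_1990 * (1 - reduction)
baseline_1990 = emissions_2050 / (1 - reduction)
print(f"Implied 1990 baseline: {baseline_1990:.0f} megatons CO2/year")  # 300
```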

No doubt critics will question the validity and relevance of Berkeley's findings, starting with the alleged carbon benefits. Many critics argue that bioenergy production leads to changes in land use—such as clearing of forests—that can generate large carbon releases and thus undercut the notion of negative emissions. 

Then there is the cost of capturing carbon from power plant emissions. The Saskatchewan coal plant's CCS equipment has been so pricey to install and operate that it may cost more per kilowatt-hour to run than the 12 cents that its operator, SaskPower, gets for selling the electricity it generates. 

In SaskPower's case, it pencils out because they can sell the captured CO2 to a nearby oil and gas operator, which uses it to stimulate oil production in the process of storing the CO2 underground. But the scale of BECCS contemplated by UC Berkeley's study is well beyond what oil markets will support. That means massive cost reductions must be achieved in the decades ahead. 

The third major question facing all future carbon capture and storage operations, whether they capture atmospheric or fossil CO2, is how securely the CO2 can be sequestered underground. Five years ago, one of the world’s largest CCS operations experienced significant surface deformation, raising the specter that rock layers expected to keep injected CO2 underground could fracture. No CO2 escaped from that remote Algerian site, but operators prematurely terminated CO2 injection there, and anxiety over CO2 leakage has paralyzed a number of CCS projects. 

According to the IPCC, these concerns are valid but, at least at present, none appear to be showstoppers. The international scientific body judges the challenge of stabilizing climate to be too large and important to eliminate BECCS from consideration. Berkeley's study is likely to strengthen that argument.

Grids Could Balance Themselves, Says Study

Germany’s huge investment in renewables has cut demand for fossil fuels, but it has also brought a new source of instability to the grid. Intermittent renewables, such as wind and solar power, increase the difficulty of monitoring and balancing the grid as the amount of power being produced fluctuates. Today, that balancing is done through centralized controls from grid operators, but researchers are looking at how a grid might keep itself in trim without such control.


Could Europe's New Grid Algorithm Black Out Belgium?

Two of the big European power grid stories from 2014 were the software-enabled enlargement of the European Union's common electricity market and a spate of nuclear reactor shutdowns that left Belgium bracing for blackouts. Those developments have now collided with revelations that the optimization algorithm that integrates Europe's power markets could potentially trigger blackouts.

The flaw resides, ironically, in a long-anticipated upgrade to Europe's market algorithm that promises to boost cross-border electricity flows across Europe, expanding supplies available to ailing systems such as Belgium’s. Earlier this month, market news site ICIS reported that the upgrade, in the works since the launch of market coupling in 2010, has been delayed once again by European transmission system operators (TSOs).

Europe’s market integration relies on software called the Pan-European Hybrid Electricity Market Integration Algorithm, or Euphemia, which crunches every buy and sell bid submitted to participating national and regional day-ahead power markets. (Their number will grow to 19 when the Italian and Slovenian markets join at the end of this month.) Euphemia matches up buy and sell bids that make optimal use of available transmission capacity and meet total power demand at the lowest overall cost.
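At its core, that matching task is an optimization: serve every zone's demand at the lowest total cost without exceeding interconnector limits. A toy sketch of the idea for just two zones and a single interconnector, with made-up demands and generator costs, and ignoring the block orders, losses, and network constraints the real Euphemia must handle:

```python
def clear_two_zones(demand_a, demand_b, cost_a, cost_b, line_cap):
    """Toy two-zone market clearing: the cheaper zone serves its own demand
    and exports as much as the other zone needs or the line allows.
    Returns (generation_a, generation_b, flow), with flow > 0 meaning A -> B."""
    if cost_a <= cost_b:
        flow = min(line_cap, demand_b)    # cheap zone A exports to B
    else:
        flow = -min(line_cap, demand_a)   # cheap zone B exports to A
    gen_a = demand_a + flow               # zone A balance: gen_a - flow = demand_a
    gen_b = demand_b - flow               # zone B balance: gen_b + flow = demand_b
    return gen_a, gen_b, flow

# 100 MWh demand in cheap zone A (10 euro/MWh), 80 MWh in zone B (30 euro/MWh),
# 50 MWh interconnector: A runs harder and exports up to the line limit.
print(clear_two_zones(100, 80, 10, 30, 50))  # (150, 30, 50)
```

The real algorithm solves a far larger version of this problem across all coupled markets at once, which is what makes the choice of cross-border capacities so consequential.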

Euphemia’s troubled upgrade would change the way that the transmission capacity available between the markets is determined. Currently, TSOs tell Euphemia how much trading will be possible over each border on the following day, based on their best guess of what grid flows will look like. Under “Euphemia 2.0” the algorithm itself calculates the cross-border capacities as it optimizes the following day’s power trades. 

In theory, such so-called flow-based coupling of the markets should boost security of supply across Europe. Since Euphemia can more precisely predict how power will flow across the network (it is, after all, aligning the next day's buy/sell orders), it allocates transmission capacity more aggressively than the national TSOs do.

Less conservative capacity allocations mean a greater potential for imports to tight markets such as Belgium’s. The enhanced liquidity should also reduce power costs. Shadow runs of daily power trades using flow-based optimization suggest that the daily savings to European consumers could be as high as 900,000 euros (US $1 million).

However, flow-based optimization has a glitch: it occasionally leads to what economists call non-intuitive trading. For example, a state with the market’s most expensive power supplies might nevertheless be approved to export power, or vice versa. 

In early 2014 Europe’s TSOs approved a simple work-around that was to go live last November. Euphemia’s software engineers programmed in a patch that would check the trades approved by its optimization and drop any that were non-intuitive.
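The patch can be pictured as a post-processing pass over the optimizer's output: inspect each proposed cross-border trade and drop any that moves power from a higher-priced zone to a lower-priced one. A minimal sketch, with hypothetical zone names and prices (the actual patch lives inside Euphemia and operates on its internal order data):

```python
def drop_nonintuitive_flows(flows, prices):
    """flows: {(exporter, importer): MWh}; prices: {zone: euro/MWh}.
    A trade is 'intuitive' when power moves toward the higher-priced zone."""
    return {
        (src, dst): mwh
        for (src, dst), mwh in flows.items()
        if prices[src] <= prices[dst]  # keep only cheap-to-expensive trades
    }

# France (40 euro/MWh) exporting to pricier Belgium (60) is intuitive and kept;
# Belgium exporting to cheaper Netherlands (45) is non-intuitive and dropped.
flows = {("FR", "BE"): 500, ("BE", "NL"): 120}
prices = {"FR": 40, "BE": 60, "NL": 45}
print(drop_nonintuitive_flows(flows, prices))  # {('FR', 'BE'): 500}
```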

Then Belgium saw three reactors go unexpectedly out of service in rapid succession, knocking out one-third of the nation’s firm domestic power capacity. The context for the automated market system changed almost overnight.

Belgium’s reactor shutdowns left it extremely dependent on power imports, and nervous about the flow-based upgrade and its work-around. Amidst tightened supplies, such as Belgium foresaw during its winter demand peak, Belgian officials feared that Euphemia’s flow-based optimization could leave them short of power. 

In September 2014, the TSOs decided to defer the flow-based upgrade to this March. Their announcement suggested the delay was about not further stressing the Belgians, allowing that, “the implementation of a fundamentally new capacity allocation methodology generally contains a risk, even if thoroughly prepared and tested.”

The Belgians remained nervous, however, and earlier this month the TSOs deferred Euphemia’s upgrade once more. Belgium’s power regulator is now making it clear that it wants further improvements to the software.

“Scenarios considered to be highly improbable when the project started 5 to 7 years ago have now become realistic. We felt that it was necessary to reevaluate the algorithm. They will have to develop a new patch,” says Annemarie De Vreese, spokesperson for Belgium's energy regulator, the Commission de Régulation de l'Électricité et du Gaz (CREG).

To sense how worried Belgium has been about electrical shortages this winter, consider the mobile app and website that Brussels-based TSO Elia created to keep consumers informed about blackout risk. They provide 7-day color-coded forecasts of blackout threat levels, giving advance warning if pre-planned rolling blackouts might be required.

No blackouts have occurred so far, because mild weather has enabled Belgium to meet demand with imports from the Netherlands and France, and because one of its three troubled reactors came back online in December. Belgium would already have endured 25 days of blackouts if this winter had been as cold as the winter of 2010-2011, according to a spokesperson for Belgium’s energy ministry quoted by the news site of Flemish public broadcaster VRT.

According to De Vreese, the tweaks required to make flow-based optimization safe for tight markets could be fairly straightforward. She expects a proposed fix to be available for review next month. The TSOs issued an official statement last week foreseeing sign-off by national regulators by late April. 

De Vreese says the fix is about more than just reassuring Belgium. She says that other regulators, including those in Germany and France, recognize that their states also face a future with less firm domestic power generation. Germany is phasing out nuclear reactors and also seeing conventional power plants shut down as rising solar generation depresses wholesale power prices. And France plans to increase reliance on renewable generation while reducing reliance on nuclear reactors.

In other words, Belgium’s misfortune this winter may be a glimpse of a generally less secure electrical future. “Belgium is a sort of forerunner,” says De Vreese. 

If Euphemia’s designers can get its non-intuitive trading glitch sorted out, flow-based optimization of the European power market may yet be part of the solution.

U.S. Electricity Demand Flat Since 2007

The U.S. economy has grown 8 percent since 2007, but annualized electricity demand growth has been zero over that same period. That’s the first time in recent memory that U.S. energy use has remained flat over a multiyear span during which the economy expanded.
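The decoupling is easy to quantify: with GDP up 8 percent and electricity demand flat, electricity use per unit of GDP has fallen by roughly 7 percent. A quick check of that arithmetic:

```python
gdp_growth = 0.08       # cumulative U.S. GDP growth since 2007
demand_growth = 0.00    # cumulative electricity demand growth, same period

# Electricity intensity = demand / GDP; its relative change over the span:
intensity_change = (1 + demand_growth) / (1 + gdp_growth) - 1
print(f"Electricity intensity change: {intensity_change:.1%}")  # -7.4%
```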

The third annual Sustainable Energy in America fact book from Bloomberg New Energy Finance found that electricity demand growth, which has slowed since 1990, has come to a grinding halt.

“There has been an outright decoupling between electricity growth and economic growth,” the report states. Furthermore, it notes, the U.S. economy has become less energy-intensive.

Although the trend has been going on for nearly a decade, the slowdown or rollback of energy efficiency policies, especially at the state government level, could mean the flat line representing electricity use will start to creep back up in the coming years.

