EnergyWise

Weakness of Indian Nuclear Regulation Manifest in Reactor Accident

It is no secret that India still lacks a politically independent nuclear oversight authority that is well separated from the industry it oversees. The Fukushima nuclear catastrophe was a recent reminder of just how important it is to have independent nuclear oversight, a lesson already driven home a generation before by the serious U.S. accident at Three Mile Island (TMI). The stubborn refusal of India's government to set up the kind of regulatory authority that is so obviously needed means, in effect, that one cannot have real confidence in a nuclear program that could in principle be one of the world's most important.

A telling but little-known and little-discussed example of what can happen under weak regulatory circumstances was a serious accident that took place at India's Narora reactor in March 1993, an incident that "came close to joining Chernobyl and Fukushima in the annals of industrial civilization," as writer Madhusree Mukerjee put it in a recent review of M.V. Ramana's The Power of Promise: Examining Nuclear Power in India (Penguin/Viking, 2012).

In that accident, which Ramana describes and discusses in detail in his book, two blades broke off a steam turbine and ruptured cooling system pipes, leading to a hydrogen leak, a hydrogen fire, and an oil fire. Though operators managed to shut the plant down manually, they then had to abandon the smoke-filled control room for 13 hours, during which time they were "flying blind." Fearing that heat still being generated in the dormant fuel rods might lead to a recriticality, workers heroically climbed atop the reactor vessel to pour reactivity-quenching boron into the core.

All that is disconcerting enough, but what is really disconcerting about the story as Ramana tells it is that the plant's owner-operator and India's regulatory authorities were well aware of the fragile turbine blades and the risk of oil fires well before the accident, yet did nothing to address those concerns. What is more, there is little or no evidence that they did much after the accident, either. Further, less serious incidents followed.

Neither Ramana nor Mukerjee, both Ph.D. physicists who have contributed to Spectrum magazine, discusses the Narora reactors' somewhat unusual design and whether they share a feature that contributed crucially to the Chernobyl catastrophe. The two Narora units are pressurized heavy-water reactors, a kind of hybrid of the standard U.S. pressurized water reactor and Canada's CANDU heavy-water reactor. The CANDU, like the RBMK units at Chernobyl, has in certain operating regimes what is called a "positive void coefficient of reactivity," meaning that if coolant is lost at certain power levels, reactivity escalates sharply rather than falling. It was this feature, and operators' disregard of it, that was the basic cause of the Chernobyl explosions (as I described in detail in the June 1989 issue of MIT's Technology Review).
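
For readers who have not met the term, the quantity at issue can be written down compactly. This is the standard textbook definition, not anything specific to Ramana's account:

\[
\alpha_V = \frac{\partial \rho}{\partial V}
\]

where \(\rho\) is the core's reactivity and \(V\) is the void (steam) fraction of the coolant. If \(\alpha_V > 0\), a loss of coolant inserts positive reactivity and power rises; if \(\alpha_V < 0\), a loss of coolant tends to shut the chain reaction down.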

What Ramana does do additionally, however, is deliver some telling stories about how the two questionable Narora reactors got built in the first place. When it became clear that the regional grid was not really big enough to justify and support construction of a nuclear power plant, Indian planners took inspiration from the story about Mohammed and the mountain, as Ramana nicely puts it: Instead of deciding not to build a reactor too big for the grid, they cooked up harebrained schemes to make the grid much bigger—with considerable technical support from Oak Ridge National Laboratory, let it be said.

From its inception, as Mukerjee spells out in her review, India's Atomic Energy Commission and Department of Atomic Energy (DAE) have reported directly to the prime minister, enabling them to function largely in secrecy. Thus, when it comes to nuclear safety, "DAE never shares its emergency plans with locals," "does not reveal the health records of its workers," "does not even monitor the health of temporary workers," and "never reveals the quantities of radioactive substances released into the environment by accidents or routine operations."
 

Photo: NPCIL

200 000 EV Fast Chargers by 2020?

The number of fast-charging stations for electric vehicles (EVs) will balloon to 200 000 by 2020 from about 2000 today, according to a new report from IHS Automotive.

The explosive growth is already underway, the study finds. The number of DC chargers is expected to triple, to nearly 6000, this year alone. Fast chargers use high-voltage DC power and can charge a car in less than half an hour, instead of the hours it takes with lower-voltage AC chargers.
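
For a sense of scale, the forecast implies roughly a hundredfold increase in seven years. A quick calculation gives the implied compound annual growth rate; the seven-year horizon is my own reading of the dates, not a figure from the report:

# Quick arithmetic on the IHS forecast: what annual growth rate takes the
# global fast-charger count from roughly 2 000 today to 200 000 by 2020?
start, end, years = 2_000, 200_000, 7   # 2013 and 2020 counts from the report; horizon assumed
cagr = (end / start) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.0%}")   # roughly 93 percent per year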


Energy Secretary Elaborates on Obama's Climate Action Plan

If the most notable thing about President Obama's recent climate speech was the way he gave it, then the most telling thing about his energy secretary's elaboration on that speech, yesterday in New York City, was the size of the audience he attracted. On the last Monday of the North American summer, you'd think that anybody with the means would be in the mountains or at the beach. But some 200 people showed up to hear Secretary of Energy Ernest Moniz talk about the U.S. climate action plan at Columbia University, many of them standing around the walls.

Plainly, climate policy matters to members of the more highly educated public.

Moniz's approach to the subject was matter-of-fact and undramatic. Regarding the general issue of human-induced climate change, he said he has been accurately quoted before as having said that he "was not here to debate what's not debatable." As for the impacts of global warming on Gulf oil extraction, power stations, water availability, fuel transportation, and electricity transmission, he drew attention to a July energy department report on climate change and energy infrastructure.

Reminding his audience that the most bang for the buck can be obtained from improvements in energy efficiency, Moniz said that the department is formulating new standards for appliances on a schedule, starting with microwave ovens and halide lighting, to be followed by commercial refrigeration and electric motors.

Taking exception to the charge that the administration is waging a "war on coal," he said that taking action to reduce carbon pollution from coal is required by the president's "all of the above" approach to climate action. He said $8 billion has been earmarked for carbon capture and sequestration (CC&S).

Noting that energy subsidies almost inevitably prompt allusions to the Solyndra bankruptcy, Moniz said that Tesla obtained a high-risk, half-billion-dollar loan from the energy department in mid-2009. Now Tesla, having paid off that loan ahead of schedule, is making a car that's been rated the world's best and safest—one that's attracting, despite its high price, large numbers of orders abroad.

On renewables, Moniz presented slides showing sharp increases in deployment and sharp decreases in installed prices for wind, solar, LED lighting, and batteries. The cost of photovoltaic modules has come down so much, he said, that future solar gains will have to come largely from improvements in non-PV system components—a conclusion spelled out in a recent report from Lawrence Berkeley National Laboratory, "Tracking the Sun VI."

It turns out that "the future is not always ten years away," he said, surveying those dramatic gains.

On the controversial subject of natural gas fracking, the matter of greatest concern to his Columbia University audience, Moniz observed that because of the revolution in gas, recent U.S. reductions in greenhouse gas emissions have been much greater than they otherwise could have been. The country is already halfway to meeting Obama's 2020 pledge for such reductions, and half of that advance is attributable to utilities' switching generation from coal to gas. Still, he said, serious safety and water issues must be effectively addressed: "saying they are manageable is not the same thing as saying they are being managed."

On the next most controversial subject, nuclear waste, he said the administration supports a consensual, dual-track approach involving consolidation of spent fuel in regional dry cask facilities and development of a national repository for long-term geologic storage. Regarding the two new nuclear plants being constructed in South Carolina and Georgia with Federal loan guarantees, Moniz said the department will be watching closely to see if they are built on schedule and on budget.

Photo: Kevin Lamarque/Reuters

How Much Recoverable Oil Do We Have?

Oil's availability is of course of immediate concern to every driver, especially at a time when gasoline prices are high once again. The much greater concern, however, is whether we are reaching a limit where oil can no longer be recovered at prices consumers are willing to pay.

If something like that turns out to be true—a scenario that generally goes by the name of "peak oil"—then long-term economic growth may be constrained across the industrial world. At the same time, to look at the brighter side of the picture, long-term carbon emissions may be lower than previously projected.

As it happens, expert opinion is radically divided on this key issue.

A recent report from analysts at Lux Research, "Evaluating New EOR [Enhanced Oil Recovery] Technologies in Oil Industry Mega-projects," proposes that by means of EOR, the industry may be able to tap up to 10.2 trillion barrels of unconventional oil, over and above 1.4 to 1.6 trillion barrels of conventional oil. (Lux puts conventional oil reserves at 1.6 trillion barrels; a year ago, IEEE Spectrum cited an estimate of 1.4 trillion barrels, based on work by Michael Klare.)

Klare, a professor of peace and world security studies at Hampshire College in Massachusetts, seems to be in general accord with Lux's view that the age of oil is far from over. Writing in the Huffington Post, the left-liberal online publication, Klare said that "humanity is not entering a period that will be dominated by renewables. Instead, it is pioneering the third great carbon era, The Age of Unconventional Oil and Gas." According to Lux, EOR techniques can boost recovery of oil in existing fields from an average of 25 percent today to as much as 65 percent. Klare, citing International Energy Agency estimates, says that investment in such techniques will exceed US $22 trillion between now and 2035—three times the investment in renewable technology—and that world demand for oil will grow 26 percent in that period.
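
That recovery-factor claim is worth translating into barrels. The quick illustration below uses a made-up field size; the oil-in-place figure is hypothetical, not a Lux or IEA number:

# Back-of-envelope on the claim above: lifting recovery in an existing field
# from 25 percent to 65 percent of the oil originally in place.
oil_in_place = 10.0   # billion barrels originally in place (hypothetical field)
recoverable_today = 0.25 * oil_in_place
recoverable_with_eor = 0.65 * oil_in_place
print(f"recoverable now:      {recoverable_today:.1f} billion barrels")
print(f"recoverable with EOR: {recoverable_with_eor:.1f} billion barrels")
print(f"multiple:             {recoverable_with_eor / recoverable_today:.1f}x")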

An article that appeared in the July 13 issue of Eos (the transactions of the American Geophysical Union) presented a radically different view of things. Taking a more economic view of what it means for oil to be recoverable, scientist James W. Murray and analyst Jim Hansen suggest that oil prices, and with them oil production, have already arrived at the limit of what consumers worldwide are willing to pay. "Global production of crude oil and condensates…has essentially remained on a plateau of about 75 million barrels per day since 2005 despite a very large increase in the price of oil," say Murray and Hansen. (The latter is not to be confused with the famous climate scientist Jim Hansen, of the Goddard Institute for Space Studies at Columbia University.) In effect, they suggest, prices have reached a level where consumers seek alternatives or conserve rather than pay more; if oil prices go significantly higher, the effect is to plunge the industrial world into recession, lowering demand.

The silver lining, Murray and Hansen suggest, is that expert bodies like the Intergovernmental Panel on Climate Change (IPCC) may have overestimated future carbon emissions from oil combustion. It will be interesting to see, when the next major IPCC assessment appears next month, how it handles that issue.

Where do I stand personally on this immensely important and controversial question? I cannot claim to be an expert, but for what it's worth, my impressions correspond more closely to those of Murray and Hansen than to those of Lux, Klare, and the IEA.

Photo: At Chevron's Kern River oil field in Bakersfield, Calif., U.S., enhanced production technologies such as steam flooding have made it possible to extract oil once considered economically unfeasible to obtain. Ken James/Bloomberg/Getty Images

Solar Panels Return to the White House

It took President Obama three years to return solar panels to the rooftop of the White House, but the real saga began long before he took office.

Back in 2010, former Energy Secretary Steven Chu said that the administration would install between 20 and 50 solar panels. Despite the pledge, however, the White House did not respond to offers of free solar photovoltaic systems from companies such as Sungevity, according to Renewable Energy World.

Now, in 2013, President Obama has found new resolve to discuss climate change and a more resilient energy landscape. Earlier this summer, the president delivered a speech calling for stricter regulations on existing coal-fired power plants, more wind and solar generation on public lands, and more energy-efficient buildings in both the public and private sector.

At the time, he said the changes would start with the federal government, especially in the realm of improving energy efficiency; for instance, he called for new efficiency targets for federal buildings.

Obama is now taking that message back to his own home, installing solar PV as “part of an energy retrofit that will improve the overall energy efficiency of the building,” a White House official told the Washington Post.

Of course, this all goes back much further than Obama's time in office. In 1986, President Ronald Reagan removed the solar panels that Jimmy Carter had installed in 1979. President George W. Bush also put a solar array on a small building on the White House grounds in 2003 to help heat the pool.

“Better late than never—in truth, no one should ever have taken down the panels Jimmy Carter put on the roof way back in 1979,” Bill McKibben, director of the climate group 350.org, told the Washington Post. “But it’s very good to know that once again the country’s most powerful address will be drawing some of that power from the sun.”

Today, solar panels are 97 percent cheaper than they were when Carter was in office, but the U.S. still has far higher soft costs—such as permits for installation and interconnection fees—than some other countries, such as Germany.  

Although the panels are already being installed, there is no word yet on the final panel count or the total energy output. President Obama has pledged that 20 percent of the federal government’s energy use will be powered by renewables by 2020.

Photo: Chuck Kennedy

Supercomputing a Quieter Wind Turbine

Noise created by giant wind turbines is high on the list of barriers to renewable energy deployment, with NIMBY and health complaints threatening or at least delaying a number of projects around the world. But noise also is related to efficiency, and now the research division for turbine manufacturing giant GE says it has figured out how to reduce noise and boost output. Win all around, apparently.

GE worked with Sandia National Laboratories in Albuquerque, New Mexico, to engineer new designs that would reduce noise. They used the Red Mesa supercomputer, which, when it first began operations in 2010, could reach speeds (500 teraflops) that made it the 10th fastest computer in the world. GE used it to run a high-fidelity Large Eddy Simulation code developed at Stanford University "to predict the detailed fluid dynamic phenomena and resulting wind blade noise," according to a GE press release. After three months of continuous runs with the Large Eddy Simulation, the researchers apparently had "valuable insights that were used to assess current engineering design models, the assumptions they make that most impact noise predictions, and the accuracy and reliability of model choices."

That's a bit vague, but the bottom line is unequivocal: a turbine rotor design that is one decibel quieter equates to a 2 percent increase in annual energy yield, GE says. And with 240 gigawatts of wind power forecast to be installed around the world in the next five years, that 2 percent increase could be worth 5 GW.
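
The arithmetic behind that estimate is easy to reproduce. In the minimal sketch below, the 240-GW figure and the 2 percent gain come from the post above; the 35 percent capacity factor is my own assumption, not a GE number:

# 2 percent more annual energy from the wind capacity expected over five years.
new_capacity_gw = 240        # forecast installations, from the post above
yield_gain = 0.02            # one decibel quieter -> 2 percent more energy, per GE
capacity_factor = 0.35       # assumed fleet-average capacity factor

equivalent_gw = new_capacity_gw * yield_gain
extra_gwh_per_year = new_capacity_gw * capacity_factor * 8760 * yield_gain
print(f"equivalent capacity: {equivalent_gw:.1f} GW")          # roughly 5 GW
print(f"extra energy: {extra_gwh_per_year:,.0f} GWh per year")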

Aside from efficiency, reducing noise could cut down on NIMBY fights when it comes to getting wind projects built, and could possibly allow turbines to be built slightly closer to where people live. Reports of health problems attributed mainly to turbine noise (as well as "shadow flicker") remain largely anecdotal, with bigger studies suggesting that conditions such as "wind turbine syndrome" are likely overblown. But it's clear that cutting down on noise would benefit pretty much everybody, whether or not they live near turbines.

Image: GE

A Smart Phone Uses as Much Energy as a Refrigerator?

When you plug your smart phone into the wall, it draws a negligible amount of energy compared with other household electronics such as your set-top box or refrigerator.

But add in the amount of electricity it takes to move data across networks to deliver, say, an hour of video to your smart phone or tablet each week, and over a year it adds up to more energy than two new Energy Star refrigerators consume in a year.
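
The comparison is easier to weigh with the arithmetic laid out. The sketch below is illustrative only: the data volume, the network energy intensity (the most contested input), and the refrigerator figure are all my assumptions, not numbers taken from the report.

# The structure of the arithmetic behind the smartphone-vs-refrigerator comparison.
hours_per_week = 1.0
gb_per_hour = 0.35            # assumed data volume of an hour of mobile video
fridge_kwh_per_year = 350.0   # assumed annual use of one new Energy Star refrigerator

for kwh_per_gb in (2.0, 19.0):  # low and high assumed network energy intensities
    yearly_kwh = hours_per_week * 52 * gb_per_hour * kwh_per_gb
    print(f"at {kwh_per_gb:4.1f} kWh/GB: {yearly_kwh:5.0f} kWh/year "
          f"(one new fridge uses about {fridge_kwh_per_year:.0f} kWh/year)")

The result scales linearly with the assumed kilowatt-hours per gigabyte, which is exactly why that figure is where most of the argument over the report's conclusion takes place.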

And though phones and other electronics and appliances are becoming ever more efficient, that efficiency does not offset the proliferation of these devices around the world.

A new paper, “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power," [PDF] investigates the energy draw of information-communications technologies (ICT) and how they are dwarfing what we traditionally think of as energy hogs in the home. The paper was commissioned by the U.S. National Mining Association and the American Coalition for Clean Coal Electricity.

The global ICT ecosystem uses about 1500 terawatt-hours of electricity annually, which is equal to the electricity used by Japan and Germany combined. That figure will only increase, especially as cloud architecture overtakes wired networks.


Blackout Threat Unmitigated a Decade After the Northeast Went Dark

The Northeast Blackout struck seven U.S. states and Ontario ten years ago today, prompting mandatory standards to prevent such a cascading power outage from happening again. Then, two years ago, it did. Arizona, California and Mexico’s Baja California took the hit in 2011, but the story was much the same. In both cases, inadequate information and planning and human error left the power grid unstable enough that a single downed power line unleashed an electrical tsunami that swamped neighboring lines and darkened millions of homes and businesses.

Power system experts who study blackouts say that they see a similar pattern in most cascading outages. They cite other recent notables: Western Europe in 2006, Brazil in 2009, and India, twice, last year. The commonality is evidence to the experts that cascading failures are a dangerous facet of modern power grids that remains all but impossible to predict or prevent. "Large blackouts are likely to recur at regular intervals," says Ian Dobson, a cascading failures expert and electrical and computer engineering professor at Iowa State University.

Worse still, bigger and more frequent blackouts may be coming. The Northeast Blackout, the worst in U.S. history, shut off 61 800 megawatts (MW) of power consumption. But a 100-year blackout, one so large that there is only a 1 percent chance of its occurring in any given year, would be three times bigger than the 2003 event. According to University of Vermont electrical engineering professor Paul Hines and colleagues at Carnegie Mellon University, who made the calculation with algorithms used in natural disaster planning, such a blackout would interrupt 186 000 MW, or roughly one quarter of all electrical service in the continental United States.
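
The general idea behind such "100-year event" estimates can be shown with a toy calculation. Everything below, from the event rate to the tail exponent, is an assumed placeholder rather than a number from the Hines study: if blackout sizes follow a heavy-tailed (Pareto) distribution, one can solve for the size that is exceeded with 1 percent annual probability.

# Back-of-envelope: size of a "100-year" blackout from an assumed power-law tail.
lam = 2.0        # assumed rate of blackouts above the threshold, per year
x_min = 1000.0   # threshold size in megawatts (assumption)
alpha = 1.2      # assumed Pareto tail exponent for blackout sizes
p = 0.01         # annual exceedance probability for a 100-year event

# Annual rate of blackouts exceeding size x is lam * (x / x_min) ** -alpha.
# Setting that equal to p and solving for x gives the 100-year size.
x_100 = x_min * (lam / p) ** (1.0 / alpha)
print(f"100-year blackout size under these assumptions: {x_100:,.0f} MW")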

Meanwhile, more frequent blackouts are also likely, a result of the increasing incidence of extreme weather such as thunderstorms, hurricanes, and blizzards predicted by climate models. Weather-related outages are already on the rise, according to a report this week from the White House Council of Economic Advisers analyzing U.S. Department of Energy (DOE) statistics on outages affecting 50 000 or more power customers. The report identifies more than 80 such weather-related outages per year, on average, from 2008 to 2012, more than double the average frequency observed during the previous five years.

The White House also cites efforts underway to improve grid reliability and resilience. After the 2003 blackout, Congress mandated that utilities comply with grid maintenance and operating standards set by the North American Electric Reliability Corporation, an industry-funded non-profit. As a result, utilities do things like trim trees under power lines more often. And $4.5 billion in economic stimulus funding allocated for grid upgrades accelerated the deployment of advanced grid technologies, such as sensors called phasor measurement units that give controllers a real-time picture of power flows across the grid.  

Jeff Dagle, chief electrical engineer at the DOE’s Pacific Northwest National Laboratory and a member of the task force that investigated the 2003 blackout, points to a speed-up in state-estimation software. Operators rely on such modeling to understand how their grid is behaving and foresee the impact of losing key components, so faster models enable faster evasive actions. “Ten years ago these commonly took 15 minutes. Today many would deem five minutes to be slow,” says Dagle.
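
For readers curious about what that state-estimation software actually computes, here is a minimal sketch: weighted least squares on a linearized (DC) three-bus network. It is a toy with made-up line data, not the nonlinear AC estimators utilities actually run, but the core calculation is the same.

# Minimal DC state estimation: estimate bus voltage angles from noisy line-flow
# measurements by weighted least squares. All network data below is assumed.
import numpy as np

# Line reactances (per unit) for lines 1-2, 1-3, 2-3.
x12, x13, x23 = 0.1, 0.2, 0.25

# State: voltage angles at buses 2 and 3 (bus 1 is the reference, angle = 0).
# Measurement model: P_ij = (theta_i - theta_j) / x_ij
H = np.array([
    [-1 / x12,  0.0     ],   # P_12 = (0 - th2) / x12
    [ 0.0,     -1 / x13 ],   # P_13 = (0 - th3) / x13
    [ 1 / x23, -1 / x23 ],   # P_23 = (th2 - th3) / x23
])
z = np.array([0.62, 0.41, 0.19])   # noisy flow measurements, per unit (assumed)
W = np.diag([100.0, 100.0, 50.0])  # measurement weights (inverse variances)

# Weighted least squares: theta_hat = (H^T W H)^-1 H^T W z
theta_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
print("estimated angles (rad):", theta_hat)
print("estimated flows (p.u.):", H @ theta_hat)

Running that estimate over thousands of buses, fast enough to redo it every few minutes, is what lets operators spot an insecure operating state and act before one line trip can snowball.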

The problem, say Dagle and other grid experts, is that power lines and power stations are aging at the same time that increasing levels of renewable power generation are straining the grid. “We’re putting more demands on the system,” says Dagle. In the calculus of improvements, degradation, and increasing demands, he says “it’s really hard to know [whether] we’re more or less reliable today.”

At the same time, it is unclear whether the "smart" upgrades touted by the White House will help or hurt. That’s because tools to quantify the risk of cascading failures are still immature. And until the risk can be accurately measured, grid engineers can’t know precisely how to reduce it.

Dobson says the research is coming now that the power engineering community increasingly recognizes cascading failures as a distinct and recurring problem—a concept that still elicited protests from power engineers in the aftermath of the 2003 blackout. Dobson cites work by the IEEE Power Engineering Society’s Cascading Failures Working Group, of which he and Hines are members.

One novel method he unveiled last year assesses the likelihood of cascading outages by counting the number of subsequent lines that trip off each time a power line goes down on a given grid. Using a database from the Bonneville Power Administration (the only U.S. utility whose line-trip data is publicly available), Dobson showed he could quantify the likelihood that a line outage would propagate and how far it would go, on average. But the tool remains too weak to identify trends or evaluate specific engineering solutions. "The events are rare so it takes years of data to get a reasonable estimation," says Dobson.
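
One common way to formalize that counting, used in the cascading-failure literature, is to treat a cascade as a branching process: estimate the average number of further trips each tripped line spawns, then infer the expected cascade size. The sketch below shows the bookkeeping with synthetic records, purely for illustration; it is not Dobson's dataset or code.

# Each entry is one recorded cascade, listed as line trips per "generation":
# e.g. [1, 2, 1] means 1 initiating trip, then 2 more, then 1 more.
cascades = [[1], [1, 1], [2, 3, 1], [1, 2], [1], [3, 2, 2, 1]]

# Propagation ratio lambda: total "offspring" trips divided by total "parent"
# trips, summed over every generation except the last of each cascade.
parents = sum(sum(c[:-1]) for c in cascades)
offspring = sum(sum(c[1:]) for c in cascades)
lam = offspring / parents if parents else 0.0

# For a subcritical branching process (lambda < 1), the expected total number
# of trips per initiating trip is 1 / (1 - lambda).
print(f"estimated propagation ratio: {lam:.2f}")
if lam < 1:
    print(f"expected cascade size per initial trip: {1 / (1 - lam):.1f} lines")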

Until such tools are mature it will be difficult to target funds to those upgrades most likely to reduce blackout risk, say Dobson and others. Even grid upgrades underway could be threatened if their impact on blackout risk is poorly understood. “If newly-introduced technology gets blamed for a future blackout there could be pressure to run it below capacity, or even shut it off entirely. That would be quite a pity if we could have mitigated the failure risk instead,” says Dobson.

Photo: Spencer Platt/Getty Images

IBM Launches Advanced Renewable Forecasting Tool

Weather has always been an important factor in planning for the needs of the electrical grid, but lately it has become even more crucial with the proliferation of grid-scale wind and solar resources.

To meet the needs of utilities that are installing large amounts of renewable energy, IBM just launched software that brings together data analytics and cutting-edge weather prediction. The Hybrid Renewable Energy Forecasting tool, or HyRef, incorporates cloud-imaging technology, sky-facing cameras, and operational and environmental sensors to build customized models of renewable output.

“If you can’t do the weather forecast better, you’re done,” says Lloyd A. Treinish, chief scientist of IBM's Deep Thunder program, which aims to improve local weather forecasting through the use of high-performance computing.

Wind farms, for example, have long had sensors on their turbines, but those sensors are used to monitor the machines rather than the weather, because the data has been too contaminated to be of use, Treinish says.

IBM took its expertise in renewables and weather forecasting, including the micro-forecasts used in Deep Thunder, to develop HyRef. One of the biggest challenges was taking inputs from the front of the turbine blades rather than behind them. Taking measurements from the front of a blade increased the acoustic noise, which had to be filtered out to get an accurate reading.

Once IBM had data that was free of contamination, it built a statistical model that could drill down to the individual turbine scale and provide forecasts for 15-minute intervals or up to a month in advance. “With accuracy and precision, you have that much better information so then you can have inputs that offer far greater fidelity for power output,” says Treinish.
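
HyRef itself is proprietary, but the final step of any such system, turning a forecast wind-speed series into a power forecast, can be sketched generically. The power curve and wind speeds below are assumptions for illustration, not IBM or Jibei data.

# Map a forecast wind-speed time series through a turbine power curve to get a
# power forecast at 15-minute intervals. Purely illustrative values.
import numpy as np

# Simplified power curve for a hypothetical 2-MW turbine (wind speed m/s -> kW),
# with cut-in around 3 m/s and cut-out at 25 m/s.
curve_speed = np.array([0, 3, 5, 8, 11, 13, 25, 25.01])
curve_power = np.array([0, 0, 300, 1100, 1900, 2000, 2000, 0])

# Forecast hub-height wind speeds at 15-minute intervals (assumed values, m/s).
forecast_speed = np.array([6.2, 6.8, 7.5, 8.3, 9.0, 8.6, 7.9, 7.1])

forecast_power = np.interp(forecast_speed, curve_speed, curve_power)
for i, (v, p) in enumerate(zip(forecast_speed, forecast_power)):
    print(f"t+{(i + 1) * 15:3d} min: {v:4.1f} m/s -> {p:6.0f} kW")

The hard part, and the part IBM is selling, is producing the wind-speed forecast itself at turbine scale; the mapping above is only the last, easy step.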

The first client to use the software, Jibei Electricity Power Company Limited—a subsidiary of the State Grid Corporation of China—hopes to increase its integration of renewable power generation by 10 percent. The utility is part of State Grid's Zhangbei 670-MW demonstration project, the world's largest utility-scale renewable power plant, which combines wind and solar with energy storage and transmission.

“Clients keep telling us forecasts aren’t good enough,” Treinish says of current weather forecasts for renewables. IBM’s approach is to customize the software for each utility’s needs, whether that’s a large-scale solar array or offshore wind farm. In the case of Jibei, the utility is interested in day-ahead forecasting for its wind resources.

“We have some of the most high-resolution weather modeling software, and we bring that to bear,” says Treinish. IBM also relishes the multidisciplinary challenge of building better forecasting tools, which combine expertise from mathematics, computing, atmospheric physics, and other sciences.

Much of the research for HyRef was borrowed from solving other weather-related challenges, such as flood forecasting for cities or outage detection for grid operators. “We’re data scavengers,” says Treinish. “We’ll use whatever we can use to make the forecast better.”

The interest in HyRef, so far, has been in areas around the globe that have a high penetration of renewable energy and are already having intermittency problems, according to Stephen Callahan, a partner in IBM’s global business services for the energy and utilities industry. For some clients, the goal is not just better forecasting for the individual utility, but bringing that accuracy to market at the wholesale level so that the economic value of renewable energy can be priced more effectively. “That is the threshold we’re all working to get to,” says Callahan.

Photo: Yagi Studio/Getty Images

Detecting and Correcting Methane Leakage: Is a Technical Fix Ahead?

It's fashionable and sensible in the complicated domains where technology and policy intersect to be suspicious of any narrowly conceived solution. "There's no technical fix," goes the usual refrain (which I certainly have voiced plenty of times myself). In the case of methane leakage from natural gas production and distribution systems, however, there really may be a combination of technical fixes on the horizon. Because of methane's high global warming potential relative to carbon dioxide, detecting and correcting methane leakage is going to be very important in the years ahead.

Two and a half years ago, the Maryland company Earth Networks announced it would start building a network of sensors to directly monitor greenhouse gas emissions on a regional basis, working with the Scripps Institution of Oceanography and others. Each processing unit consists of a US $50 000 box, along with elaborate calibration software that Earth Networks has developed with scientific partners at Scripps, NIST, and NOAA, and connects by a tube to sensors installed 100 meters up an existing tower. The initial plan called for putting about 100 units in the United States, 25 in Europe, and 25 elsewhere in the world.

Earth Networks CEO Robert Marshall says that 28 to 30 units are now installed, mostly in the U.S. Northeast but also in the Los Angeles area and some other places. One of the company's units has taken over the famous job of monitoring global CO2 atop Hawaii's Mauna Loa and, several months ago, produced the definitive measurement of the Keeling curve's crossing the 400-ppm threshold.

Earth Networks, originally known as AWS Convergence Technologies, makes its living primarily by delivering very finely grained weather forecasts and severe weather alerts. Its well-known trademarked product, WeatherBug, is widely used by clubs, schools, and planners of major sports and entertainment events. Before the terrible Moore, Oklahoma, tornado last spring, the company's networks detected dense in-cloud lightning, an early warning signal of the catastrophic twister. As for the greenhouse gas monitoring network that Earth Networks is installing, this work is being done at present on what you might call a pro bono basis—or, if you prefer, as a speculative venture anticipating high future demand for the new service.

These days, says Marshall, there is little demand for GHG monitoring at the national level because there is no real issue of national compliance at present. The system established by the Kyoto Protocol in 1997 has lost traction, and the world awaits a new system of mandatory GHG cuts, to be formulated at a conference in Paris at the end of 2015. But in the meantime, Marshall points out, two regional GHG reduction programs have been established in the United States—in the Northeast and in California—and many cities here and overseas are adopting objectives that will require monitoring.

As the GHG detection networks get built out, they will be able to determine how much methane is escaping from major fracking fields and from aging urban gas distribution systems, to name the two most important situations giving rise to acute concern. At present, reports of emissions from gas fields are often based on one-day spot checks done by aircraft flyovers. "There's nobody else out there doing what we do," says Marshall. "Permanently installing sensors that take data continually and can monitor emissions from an entire gas field, as opposed to just individual wells."

Once the Earth Networks GHG detectors are able to provide alerts to situations where methane leakage is high, then newly developed portable monitors can be used to pinpoint the exact spots where leaks are highest, so that corrective measures can be taken. One such portable device, a "gasbot" developed in Sweden, was described in a recent post here; another, described in a recent New York Times article, was developed by instrument maker Picarro, the same company that makes the Earth Networks GHG boxes. Thus, the region-wide and portable monitoring devices have complementary roles to play so that, to the extent methane leakage turns out to be a really serious problem, it may also turn out to be a fixable problem.

Photo: Earth Networks
