Energywise

Time to Rightsize the Grid?

Last week a team of systems scientists known for counter-intuitive insights on power grids delivered a fresh one that questions one of the tenets of grid design: bigger grids, they argue, may not make for better grids. Iowa State University electrical engineering professor Ian Dobson and physicists David Newman and Ben Carreras make the case for optimal sizing of power grids in last week's issue of the nonlinear sciences journal Chaos. 

In a nutshell, the systems scientists use grid modeling to show that grid benefits such as frequency stabilization and power trading can be outweighed by the debilitating impacts of big blackouts. As grids grow larger, they become enablers for ever larger cascading blackouts. The Northeast Blackout of 2003 was a classic case. From a tripped line in northern Ohio, the outage cascaded in all directions to unplug more than 50 million people from western Michigan and Toronto to New York City.
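
The cascade mechanism can be illustrated with a toy load-redistribution model (a deliberately simplified sketch, not the model used in the Chaos study): when a node trips, its load is shared among the survivors, any of which may then be pushed past capacity and trip in turn.

```python
# Toy load-redistribution cascade (illustrative only; not the
# simulation from the Chaos paper). Each node starts with a load
# between 0.5 and 1.0; when a node fails, its load is shared
# equally among the survivors, which can overload them and
# extend the cascade.

def cascade_size(n, capacity):
    # deterministic spread of initial loads between 0.5 and 1.0
    load = [0.5 + 0.5 * i / (n - 1) for i in range(n)]
    alive = [True] * n
    alive[n - 1] = False          # trip the most heavily loaded node
    shed = load[n - 1]
    failed = 1
    while shed > 0:
        survivors = [i for i in range(n) if alive[i]]
        if not survivors:
            break
        extra = shed / len(survivors)
        shed = 0.0
        for i in survivors:
            load[i] += extra
            if load[i] > capacity:    # overloaded node trips too
                alive[i] = False
                shed += load[i]
                failed += 1
    return failed

print(cascade_size(10, 1.2))    # ample headroom: 1 node lost
print(cascade_size(10, 1.05))   # thin headroom: all 10 nodes lost
```

In this toy, a generous safety margin confines the failure to the initial node, while a thin margin lets the same trip take down the entire network; that all-or-nothing character is what makes heavily loaded interconnections risky.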

This week's findings are more conceptual, however, than some news outlets would have us believe. NBC News, in an online article titled "Researchers Suggest It's Time to Downsize Power Grid," misread the Chaos report as a call to break up the dual grids that interconnect most of eastern and western North America. "It’s not possible to really make that statement," says Carreras, who runs Oak Ridge, Tenn.-based consulting firm BACV Solutions and is a visiting professor at Madrid's Universidad Carlos III.

B.A. Carreras/BACV Solutions

NBC misinterpreted simulations by Carreras and his colleagues showing that grids with just 700 to 1000 nodes (less than one-fifteenth the size of North America's big grids) maximize interconnection benefits while minimizing blackout costs (see above chart). The researchers say this could indicate that some real grids are too large, but there are two big reasons to be cautious about drawing conclusions.

Carreras stresses that the model nodes are not necessarily representative of those on a real grid. Many of the group's simulations, for example, use scale models of the Western grid in which each node in the model represents, on average, 10 nodes on the real grid. 

Newman, a physics professor at the University of Alaska in Fairbanks, notes that the specific models used in this Chaos study were idealized, homogeneous systems. As such, he says, they bear little resemblance to the heterogeneity of real grids with their diversity of voltage levels, branching patterns and other features. "700-1000 nodes was the optimal size for the artificial network we had constructed," says Newman.

Media hype is a problem that has dogged this team of system scientists since they gained notoriety over a decade ago by identifying cascading failures as an innate feature of power grids. Their simulations, which a Spectrum cover story profiled ten years ago, show that economic pressure to maximize return on investment loads power grids to levels that leave them at heightened risk of costly blackouts.

The researchers delivered a complex systems view of blackouts that they hoped would spur novel thinking about the costs and benefits in grid design, and novel approaches to blackout prevention. But their message was often misinterpreted as an attack on the quality of grid engineering, or an argument that trying to prevent blackouts was futile. 

Carreras and his colleagues argue that this week's report has important conceptual value, if one gets beyond the hype. For one thing, blackout risk should be factored into the cost-benefit calculation when grid planners consider expanded interconnection. This could be applicable in developing countries as well as in Europe, which recently expanded its grid to include Turkey's and is considering extensions to North Africa and Russia.

It's also possible that grid design could be engineered to enable extended interconnection without expanding cascading blackout risk. Newman points to the possibility that weak links could be deliberately placed within grids to confine cascading blackouts to their region of origin. The team's next step, says Newman, is to study just that possibility by simulating and optimizing heterogeneous networks. 

A correction to this post was made on 23 April 2014: Ian Dobson is at Iowa State University, not the University of Iowa as originally reported.

Tequila Sunrise: Big Benefit from Co-Locating Agave Crops and Solar Power

Solar power in the desert has problems: big land use requirements, and the need for scarce water to clean the panels and suppress dust. In an unrelated story, biofuels production has problems: life cycle greenhouse gas emission issues, and land use questions again. How about solving both sets of problems at once? Stanford researchers have modeled the co-location of solar panels with agave plants used to make ethanol, and found it to be a winning combination.

The idea of "agrivoltaics", or combined solar power and agricultural production, has been floating around for a while now. It's an idea that springs at least partially from the modern distaste for "monoculture", or the growing of a single crop over huge swaths of land. The reasoning: Instead of "growing" only solar power on a plot of land, why not use the space between and underneath the photovoltaic panels to also grow crops? There are some projects in France that have tried this, and a post-Fukushima Renewable Energy Village in Japan also features crops underneath PV. There are also experiments at the University of Massachusetts, and some small-scale "solar farm" installations in Wisconsin.

It seemed unlikely, however, that the idea of planting crops around the solar plants that have started cropping up across the arid deserts of the American Southwest would take root. But the Stanford folks, led by postdoctoral researcher Sujith Ravi, realized that the water required for a solar plant could actually make the desert more hospitable to agriculture as well. To test the idea, they chose agave plants, biofuel sources that are already quite hardy and require little water to survive.

They found that by combining a PV plant with agave production, a given area could yield more energy for the same amount of water than either PV or agave alone. The study, published in the journal Environmental Science & Technology, showed a "high-yield" scenario in which only 0.42 liters of water would be needed to produce one megajoule of energy.
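
For scale, the reported figure can be put in per-kilowatt-hour terms. The 0.42 L/MJ input is the study's number; the rest is straight unit arithmetic:

```python
# Convert the study's reported water intensity into more familiar
# units: 1 kWh = 3.6 MJ, so multiply L/MJ by 3.6 to get L/kWh.

MJ_PER_KWH = 3.6

def litres_per_kwh(litres_per_mj):
    return litres_per_mj * MJ_PER_KWH

print(litres_per_kwh(0.42))   # about 1.5 L of water per kWh produced
```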

"It could be a win-win situation," Ravi said in a press release. "Water is already limited in many areas and could be a major constraint in the future. This approach could allow us to produce energy and agriculture with the same water." This marriage of agave and solar panels is especially compelling for two reasons. First, because agave plants require roughly the same amount of water needed to keep solar panels clean and to suppress dust, it's possible to use the water dripping off a newly cleaned solar panel to nourish the plant. And a 2011 study found that the plants, also used to make tequila, perform just as well or better than corn, sugarcane, and switchgrass in terms of lifecycle greenhouse gas emissions and other parameters.

Aside from planting crops beneath existing solar panels, there are other ways to think about combining PV with plants. In a report on the topic, the National Renewable Energy Laboratory encouraged farmers to consider locating solar panels in the unused corners of their center-pivot farm plots. In Colorado alone, those currently underused spots could generate enough power to meet all of the state's electricity needs. Clearly, farming and solar power should become much better friends in the future.

Scientists Discover Efficient Way to Turn Carbon Monoxide Into Ethanol

Biofuels, once hailed as a planetary savior and alternative to oil and gas, have not quite fulfilled that destiny. Traditional, mass-produced biofuels from crops such as corn carry a litany of problems, including land use issues and questions of life cycle emissions. If we could generate usable fuels from more benign sources, it could go a long way toward solving a host of energy and environmental problems. A team at Stanford University reports today in Nature that they have a novel way to produce ethanol from carbon monoxide (CO) gas using a metal catalyst made of copper nanocrystals.

"We have discovered the first metal catalyst that can produce appreciable amounts of ethanol from carbon monoxide at room temperature and pressure—a notoriously difficult electrochemical reaction," said senior study author and Stanford chemistry professor Matthew Kanan in a press release.

Copper is the only material known to electrochemically reduce CO to fuels, but it does so at extremely low efficiency. Kanan's group improved on this with a nanocrystalline form of copper produced from copper oxide; the new material boosts the efficiency of the reaction dramatically.

The researchers built a fuel cell with a cathode made of the new copper nanocrystals and suspended it in CO-saturated water; a small voltage applied across the cell drives the reaction that yields ethanol and acetate. The Faraday efficiency using the oxide-derived material was 57 percent, meaning more than half of the current went toward producing ethanol and acetate. Standard copper particles, meanwhile, produced hydrogen almost exclusively (Faraday efficiency of 96 percent) and very little ethanol.
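
Faraday efficiency is simply the fraction of charge passed that ends up in the desired product. A minimal sketch, using the textbook eight-electron count for reducing CO to ethanol; the numerical inputs below are made-up illustrative values, not data from the Nature paper:

```python
# Faraday (faradaic) efficiency = charge stored in the product
# divided by the total charge passed through the cell. Reducing
# 2 CO to one ethanol molecule takes 8 electrons.

F = 96485.0   # Faraday constant, coulombs per mole of electrons

def faradaic_efficiency(mol_product, electrons_per_mol, total_charge_c):
    charge_to_product = mol_product * electrons_per_mol * F
    return charge_to_product / total_charge_c

# e.g. 1e-5 mol of ethanol (8 e- each) from 13.5 C of total charge
print(faradaic_efficiency(1e-5, 8, 13.5))   # ≈ 0.57, i.e. 57 percent
```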

In an e-mail, Kanan said a few years is probably enough to turn this basic work into prototype devices, outside of the lab, that can produce meaningful amounts of fuel. "Some of the technical issues include reformulating the catalyst such that it can be dispersed on high-surface area electrodes, and engineering an electrochemical cell that delivers CO to the catalyst at a high rate," he said. The long-term stability of the oxide-derived copper catalyst is still in question as well. Kanan declined to offer guesses on eventual costs of the device, given its still lab-bound status; copper, however, is not particularly expensive, as catalysts go.

If the details of this were actually to work out and at an acceptable cost, it could be enormous. Though there have been changes proposed recently in the biofuels mandate in the United States, we're still producing billions of gallons of the stuff each year, virtually all of it from corn. The big idea with the new catalyst would be to power the fuel cell using renewable energy rather than from fossil fuels; ideally, one would just grab carbon dioxide out of the atmosphere and turn it into CO.

"There is good technology for converting CO2 to CO using an electrical energy input, although it requires high temperatures," Kanan told me. "There has been much work by many groups including ours to develop a low temperature electrochemical process for converting CO2 to CO, and a number of good catalysts have been found recently. I don't think the CO2-to-CO would be a limiting factor."

Energy Efficiency Grows as Clean Energy Investment Falters

Global investment in clean energy fell 11 percent in 2013. Despite the downward shift, there are still some bright spots that highlight the future of the world’s clean tech industries.

Investment in solar, wind, biofuels, biomass, energy efficiency and energy storage was US $254 billion in 2013, according to a new report [PDF] from Pew Charitable Trusts.

While the stars of the market, wind and solar, have slipped, the unused kilowatt—aka energy efficiency—saw 15 percent growth in the past year. Investment might be down overall, but 2013 was still a record-setting year that also saw energy storage gain a foothold in the market.

Solar and wind, with more than $170 billion in investment combined, still make up the lion’s share of the clean tech industry. But energy efficiency, which includes smart meters and energy storage, was the only sector that saw increased investment, with a total of nearly $4 billion in 2013. Most of the efficiency investment was in the United States, where there is an increased focus on saving energy at the state and federal level.

“While there was an overall decline in investment, there are signs that the sector is reaping the rewards of becoming a more mature industry,” Phyllis Cuttino, director of Pew's clean energy program, said in a statement. “Prices for technologies continue to drop, making them increasingly competitive with conventional power sources. Key clean energy stock indexes rose significantly in 2013, with public market financing up by 176 percent.”

Although the United States led in energy efficiency, Asia is leading the clean tech charge overall with 10 percent growth. China dominated with more than $54 billion in investments in 2013, including a nearly fourfold increase in solar installations.

“With extensive manufacturing capacity in the solar and wind sectors, growing domestic markets, and unequaled national targets for renewable energy, China is poised to be a leader in the world’s clean energy marketplace for many years to come,” the report authors wrote. Even so, China’s investment was down 6 percent from 2012.

China’s slight decline was offset by growth in the Japanese market, which is driven by feed-in tariffs for wind and solar. Those incentives were introduced as a way to advance renewables as an alternative to the nuclear capacity that went offline in the wake of the 2011 Fukushima nuclear disaster. Japanese clean tech investment was up 80 percent in 2013 to nearly $30 billion, putting it third behind China and the United States.

Overall, the European clean tech market has dropped considerably, driven by tighter investment in Germany and Italy in particular. The U.K. is one bright spot for clean energy in Europe, with 13 percent growth in 2012. Most of the growth came in the wind sector, but the U.K. is also second in the G-20 in terms of “other renewables” because of its investment in biomass.

In the Americas, Canada jumped ahead with a nearly 50 percent growth in investment, also mostly driven by wind. Ontario, in particular, has a goal of completely shutting down its coal-fired electricity generation. But solar was up too, attracting $2.5 billion of the country’s $6.5 billion investment.

Canada, the U.K., and Japan were the only G-20 countries that saw growth, but non-G-20 markets grew by 15 percent overall. “Markets for clean energy technologies in fast-growing developing countries are prospering, because these economies view distributed generation as an opportunity to avoid investments in costly transmission systems,” said Pew's Cuttino.

Distributed solar is expected to keep growing in the United States and Japan. Mexico and Turkey each have legislation that could jump start the clean tech industries, according to the report. South Korea is investing in efficiency to manage peak demand. China will continue to lead, however, with goals of 18 gigawatts of wind and 14 gigawatts of solar in 2014. 

“In view of industry maturation,” the Pew authors wrote, “Bloomberg New Energy Finance projects a 2014 rebound in worldwide investment and installation of renewable energy.”


Image: Pew Charitable Trusts

White House Taps ARPA-E to Boost Methane Detection

In this month's issue of IEEE Spectrum we spotlight the methane emissions overlooked by the U.S. EPA's greenhouse gas inventory, and the satellite-based detector launching next year to map this "missing methane." Last week the White House acknowledged EPA's missing methane problem, and laid out a strategy to combat it. While promising to improve EPA's inventory, including more use of top-down methane measurement, the White House also promised federal investment in ground-based methane sensing to plug leaky natural gas systems thought to be the source of much of the missing methane.

Action can't come soon enough, according to the Intergovernmental Panel on Climate Change (IPCC), which on Monday unveiled its latest report on Climate Change Impacts, Adaptation, and Vulnerability. The IPCC said "widespread and consequential" impacts are already visible and world leaders have only a few years to change course to avoid catastrophic warming. Methane is a major contributor, according to the scientific body's update on the physical basis for climate change, released last fall, which deemed methane up to 44 percent more potent as a warming agent than previously recognized.

The White House says that the U.S. Department of Energy's ARPA-E high-risk energy R&D fund will contribute by seeking to improve natural gas sensors, which are presently sensitive or cheap but not both. ARPA-E is preparing a new funding program that the White House says will "deliver an order-of-magnitude reduction on the cost of methane sensing, thus facilitating much wider deployment throughout all segments of natural gas systems."  

One contestant for funding could be robotic systems such as the Swedish-developed Gasbot profiled by Spectrum last year. Gasbot, a project from Sweden's Örebro University, uses a mobile robot from Kitchener, Ont.-based Clearpath Robotics equipped with a laser-based remote gas sensor to map methane concentrations across a potential leak site. Örebro doctoral student Victor Hernandez says the Gasbot team has implemented improvements since Spectrum's coverage, including the addition of an anemometer to help determine where detected emissions are coming from.

Using a robot might reduce labor costs and accelerate the process of mapping a site, such as a natural gas plant or a landfill, and Hernandez says a market survey conducted last year has confirmed commercial interest in Gasbot. But Örebro's package doesn't come cheap in its present incarnation. The gas sensor alone costs about 10 000 (US $13 760), he says, and the Clearpath A-200 robot is another $12 000 or so.

Another contestant could be the laser science research group at Rice University, in Houston, which has recently demonstrated two novel strategies for building compact, sensitive, and potentially low-cost methane detectors. The more developed of the two relies on recently miniaturized mid-infrared quantum cascade lasers and cheap piezo-electric devices to detect the laser-excited heating of traces of methane gas—concentrations as low as 13 parts per billion (ppb), according to group leader Frank Tittel, a professor of electrical engineering. His newer system uses advanced optics to more than double the methane sensitivity.

Tittel's group has already proven its devices at a Houston landfill through a NASA program designed to calibrate space-based measurements of methane and other pollutants. He projects that the piezo-electrically tuned sensor could be scaled down and mass produced to deliver a $1000 system the size of a smart phone. The key, says Tittel, is mass production of the lasers, which currently cost $12 000.

Tittel says his group has teamed up with Newton, N.J.-based Thorlabs, which makes the required quantum cascade lasers as well as the electronics, mechanical stabilizers, and optics to build an integrated product.

Thorlabs appears to be keen. The company presented at an ARPA-E methane technology workshop last year, and declared its intention to "grow the [mid-infrared laser] market by reducing component costs." 

Message to missing methane: You may soon have nowhere to hide.

Hybrids Are a Perfect Fit for Traffic Jams of India and China

If you can't beat stop-and-go traffic in India and China, you might as well join it, but only if you're driving a hybrid car. The resulting fuel savings of about 50 percent are greater than what drivers experience in the United States and could make a big difference in countries experiencing a rapid growth in the number of cars on the road, according to recent U.S. Department of Energy research. 


China Wants to Make a Splash in Ocean Energy

The U.K. has some serious competition in the race to tap the ocean for energy.

China is increasing spending on tidal power, according to The Wall Street Journal, and its largest proposed project could outmatch any planned development in Europe. Currently, the U.K. is the leader in marine power, with a goal of 2000 megawatts installed by 2020.

The largest potential project in China is a US $30 billion tidal wall that could have an installed capacity of about a gigawatt. The dam-like structure would house turbines with curved blades that allow marine life to swim through while harnessing the energy in the water. The project has domestic and international backing, including from the Dutch government and eight Dutch companies, according to The Wall Street Journal.


Why a Floating Electric Car Is Not an Ideal Disaster Vehicle

When disasters strike, people are often stranded until specialized transport comes to the rescue. When you think of missions that are deployed after floods, hurricanes and tsunamis, what comes to mind is usually military helicopters, people in small nimble boats, and rugged vehicles.

One Japanese company, however, wants to make a 460-kilogram, all-electric, floating car the vehicle of choice in low-lying, flood-prone regions. The car is targeting the Asian market, but the most recent report from the UN’s Intergovernmental Panel on Climate Change found that more regions across the globe face an increasing number of heavy precipitation events. The frequency and intensity of such heavy rain events is likely increasing in North America and Europe, the report notes.

While a floating, no-emissions car sounds cool, it is unclear how the vehicle, the Fomm Concept One, would actually fare in a large-scale disaster.

The idea for the amphibious vehicle, which has a water jet generator to propel it through water-covered roads, came after the company’s president, Hideo Tsurumaki, saw the devastation caused by the tsunami that hit Japan in 2011, according to Grist.

But the car is likely to take range anxiety to a whole new level. The Concept One is not unlike other small all-electric vehicles that run on lithium ion batteries, but for trips across water-ravaged territories, its 100-kilometer range could be very limiting indeed.

The car would have to be fully charged ahead of a disaster. It would also need to be in a location that already has electric vehicle charging infrastructure and yet also not too far from where the disaster strikes, unless it was brought in by another vehicle, such as a flatbed truck.

Although a fleet of cars could operate through extended hours, it’s unlikely that electrical infrastructure would be available to recharge cars once battery levels are running low in the case of a widespread emergency.

If the car actually needs to turn into an amphibious vehicle, it will require some maintenance work afterwards, Fomm warns. The company has yet to specify exactly what that will entail or just how much floating it can do before needing a stop off at a service station. 

According to AutoBlog Green, the floating function is only for emergencies, and the company noted that the car’s movement is limited when it’s floating. If the car were completely submerged by severe flooding, it sounds as if it might not be of much use at all afterwards.

Lastly, the car is small. Fomm bills it as the “world’s smallest class four-seater electric vehicle,” but in reality it's smaller than Daimler-Benz's Smart car. So, in the case of an emergency, you might only be helping the one or two other people you could shoehorn into the tiny passenger cabin.

There are plenty of other technology breakthroughs that could be far more practical in disaster recovery, such as cyborg cockroaches, robot snakes and bouncing balls.

Although it’s billed as a “water-world spaceship,” the Fomm concept car is probably more useful in areas where there is minor road flooding on a regular basis, rather than a solution for an actual watery world.  

Fomm expects commercial production of the car to begin in 2015.

Rainwater Microturbines Purify Water, Make Some Electricity

The parts of the world that lack consistent electricity also, unfortunately, often lack consistent clean water. Students at the Technological University of Mexico have come up with a simple system to at least partially address both needs using a microturbine and collected rainwater.

They named the device Pluvia and tested it in Iztapalapa, a large, poverty-stricken part of Mexico City. Rainwater is collected by funneling it into a gutter on the rooftop, or by adding sheeting to simulate a slope. The water passes through a filter designed to clean the rainwater of the first two weeks of the rainy season, which carries higher acidity and more contaminants; the filtered water is stored in a tank. A pump then pushes the stored water past the small turbine, which generates the electricity.
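
How much electricity can falling rooftop water actually supply? The standard hydropower formula P = ρgQhη gives a rough ceiling. The flow rate, head, and efficiency below are illustrative assumptions of mine, not Pluvia's published specifications:

```python
# Rough upper bound on power from falling rooftop water:
# P = rho * g * Q * h * efficiency. All inputs here are
# illustrative assumptions, not Pluvia's specs.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydro_power_watts(flow_m3_s, head_m, efficiency=0.5):
    return RHO * G * flow_m3_s * head_m * efficiency

# e.g. 0.2 L/s of rain runoff falling from a 3 m rooftop
print(hydro_power_watts(0.0002, 3.0))   # ≈ 3 W: enough for LED
                                        # lighting or phone charging
```

The small number explains why the device is pitched as purifying water first and making "some" electricity second.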


World's Highest Wind Turbine Will Hover Above Alaska

The title for world's largest wind turbine is constantly up for grabs as manufacturers build higher and bigger to capture more energy from the passing air.

One turbine in Alaska, however, will now spin high above the rest. Altaeros Energies will launch its high-altitude floating wind turbine south of Fairbanks to bring more affordable power to a remote community. Ben Glass, CEO of Altaeros, told The New York Times that the company expects to provide power at about $0.18 per kilowatt-hour, about half the price of off-grid electricity in Alaska.

Unlike its earth-bound brethren, the airborne turbine is not intended to supply power for large electric grids. Instead, its sweet spot is serving far-flung villages, military bases, mines, or disaster zones. Various researchers have been developing floating wind turbines for years, but the 18-month project in Alaska will be the first longer-term, commercial project to test the technology, according to Altaeros.

Altaeros’ Buoyant Airborne Turbine (BAT) is an inflatable, helium-filled ring with a wind turbine suspended inside. It will float at a height of 300 meters, where winds tend to be far stronger than they are on the ground. The altitude of the BAT is about double the hub height of the world’s largest wind turbine.
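
The payoff from altitude comes from basic aerodynamics: the power available per square meter of swept area scales with the cube of wind speed, so the stronger winds aloft are worth far more than their raw speed advantage suggests. A quick sketch (generic wind physics, not Altaeros performance data):

```python
# Wind power density: P/A = 0.5 * rho * v^3, so power grows with
# the cube of wind speed.

RHO_AIR = 1.225   # air density at sea level, kg/m^3 (it decreases
                  # somewhat with altitude, which offsets part of
                  # the gain from faster winds aloft)

def wind_power_density(v_m_s):
    """Power per square meter of swept area, in W/m^2."""
    return 0.5 * RHO_AIR * v_m_s ** 3

# doubling wind speed from 5 to 10 m/s gives 8x the power density
print(wind_power_density(10) / wind_power_density(5))   # 8.0
```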

The BAT has a power capacity of 30 kilowatts and will create enough energy to power about 12 homes, the company says. But that’s just the beginning. It can also lift communications equipment such as cellular transceivers or meteorological devices and other sensing equipment. Altaeros said additional equipment does not affect the energy performance of the turbine. 
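
The "about 12 homes" figure can be sanity-checked with a back-of-the-envelope calculation. The capacity factor and per-home consumption below are my assumptions, not numbers from Altaeros:

```python
# Rough check on the "about 12 homes" claim for a 30 kW turbine.
# The 45% capacity factor and 10,000 kWh/year household figure
# are illustrative assumptions, not company data.

CAPACITY_KW = 30.0
HOURS_PER_YEAR = 8760

def homes_powered(capacity_factor, kwh_per_home_year=10_000):
    annual_kwh = CAPACITY_KW * HOURS_PER_YEAR * capacity_factor
    return annual_kwh / kwh_per_home_year

print(homes_powered(0.45))   # ≈ 11.8 homes at a 45% capacity factor
```

Under those assumptions the company's figure roughly checks out, though it implies capacity factors well above what ground-level turbines typically achieve.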

The technology can be deployed in under 24 hours, because it does not require cranes or underground foundations. Instead it uses high-strength tethers, which hold the BAT steady and allow the electricity to be sent back to the ground. A power station on the ground controls the winches that hold the tethers and pulls in the power from the turbine before sending it on to a grid connection. Altaeros has tested its BAT prototype in 70 kilometer-per-hour winds, but because it uses the same technology as other industrial blimps that are rated to withstand hurricane-level winds, it might be able to withstand stronger gusts.

Altaeros says there is a US $17-billion remote power and microgrid market that could benefit from the technology. Many off-grid sites, including small islands, mining sites, and military bases, rely on expensive diesel generators to provide some or all of their power needs. There are many projects trying to develop integrated solutions to tackle this market, particularly microgrids that integrate some type of renewable energy.

The Boston-based startup is hardly alone in flying power stations, either. Last year, Google X purchased Makani Power, which makes airborne wind turbines that resemble small airplanes. At the time, Google told TechCrunch that the appeal of Makani was that “They’ve turned a technology that today involves hundreds of tons of steel and precious open space into a problem that can be solved with really intelligent software.” Other airborne wind companies include WindLift, SkySails, Sky Windpower, and NTS.

The $1.3 million project in Alaska is financed by Alaska Energy Authority’s Emerging Energy Technology Fund and RNT Associates International, which is owned by the former chairman of the Indian conglomerate Tata Group, which includes Tata Power, India’s largest integrated power company.

