Energywise


NASA to Test Upgraded Earth Models for Solar Storm Threat

NASA relies on spacecraft to help keep watch for solar storms erupting on the sun. But on the ground, the space agency’s Solar Shield project aims to improve computer simulations that predict if those solar storms will create electrical hazards for power plants and transmission lines.


How to Pinpoint Radioactive Releases: Put the Genie Back in the Bottle

In 2014, a 55-gallon drum of nuclear waste exploded at the underground Waste Isolation Pilot Plant (WIPP) in New Mexico, the world’s only underground repository for transuranic waste (radioactive isotopes, most with very long half-lives). The amount of radioactive material released into the atmosphere was below regulatory safety limits, but the facility has been closed for recovery since the incident.

One positive result came from this accident: Robert Hayes, a nuclear engineer at North Carolina State University in Raleigh, was subsequently able to demonstrate how it is possible to pinpoint the site of the release as well as to measure the amount of released nuclear material. Hayes used sophisticated software, meteorological data supplied by the National Oceanic and Atmospheric Administration (NOAA), and information from air sampling and monitoring equipment located several kilometers from the WIPP facility. His research will be published in the April 2016 issue of the journal Health Physics.

Hayes’ approach offers a wide field of possibilities for investigating and locating releases of radioactive material into the atmosphere—whether released accidentally from, say, a vent at a nuclear facility, or dispersed deliberately by a dirty bomb. The principle is simple: By attempting to get the genie back into the bottle (figuratively, of course), you find out where the bottle is.

Let Hayes explain. When radioactive material is released into the atmosphere, it spreads out in directions set by meteorological conditions, mainly wind. By collecting data from different sampling stations in an area, and by looking at how the concentration of the radioactive material evolves over time, it is possible to map how and in which direction it has moved. Then, using recent meteorological data, computer modeling can, in a sense, back-extrapolate the cloud until it is concentrated above the site of the radiological release. Hayes notes, however, that this method works only if you know when the release of radionuclides happened. A nuclear explosion, for example, can be pinpointed precisely because the time of the explosion can be verified independently with seismic or optical data.

When you don’t know the time of the release, you have to use some tricks. “You still back-extrapolate, but you make the assumption that it took place an hour ago, two hours ago, three hours ago. For these periods of time, you then back-extrapolate, and once you find a location that is credible—let’s say, a nuclear facility—then you pretty much know where the radiation came from, and you have the time when the radioactivity was released,” says Hayes.
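The time-scanning trick Hayes describes can be sketched in a few lines of Python. Everything below is invented for illustration (a constant wind speed, made-up station readings, and two hypothetical facility positions along the wind axis); a real analysis would use NOAA wind fields and a full atmospheric transport model.

```python
# Hypothetical sketch of scanning assumed release times and back-extrapolating
# a plume upwind until it lands on a credible source. All numbers are invented.

WIND_SPEED_KM_H = 10.0                          # assumed constant wind speed
KNOWN_FACILITIES_KM = {"Site A": 0.0, "Site B": 42.0}  # positions on wind axis

# (hours since first sample, plume-centroid position in km) from air samplers
observations = [(0.0, 30.0), (1.0, 40.0), (2.0, 50.0)]

def back_extrapolate(obs, release_age_h):
    """Move the first observed centroid upwind by the assumed plume age."""
    t_first, x_first = obs[0]
    return x_first - WIND_SPEED_KM_H * (release_age_h + t_first)

def find_source(obs, tolerance_km=2.0):
    # Assume the release happened 1, 2, 3, ... hours before the first sample,
    # and accept the first assumption that lands on a known facility.
    for age in range(1, 13):
        origin = back_extrapolate(obs, age)
        for name, position in KNOWN_FACILITIES_KM.items():
            if abs(origin - position) <= tolerance_km:
                return name, age
    return None, None

site, hours = find_source(observations)  # both the "where" and the "when"
```

With these made-up readings, a plume seen 30 km downwind and drifting at 10 km/h back-extrapolates onto Site A for an assumed release three hours before the first sample.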

The location and release time aren’t the only things that can be deduced from these measurements. “If you already know the time and location, you have a much better scenario because you are able to extrapolate much more accurately and find out how much radioactive material actually was released and the time profile of that release,” says Hayes.

Detecting and locating releases from nuclear facilities in areas that are not accessible to investigators, such as North Korea, presents an additional difficulty. The radioactive isotopes spread through the atmosphere by nuclear bomb tests during the last century make it a challenge to sniff out new material of the same kind. But one clue researchers are trying to exploit is the difference in the ratio of radioactive isotopes between recently released nuclear material and older nuclear fallout. Those differences can be detected because the materials have different half-lives.

“We are looking at differences in those ratios, and they are the ‘fingerprints’ of recent events. For example, in North Korea, radioactive releases by nuclear facilities would be difficult to prevent, and this would provide a potential fingerprint that can be identified off-site,” says Hayes.
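A rough illustration of such a fingerprint: the half-lives below are real (cesium-134, about 2.06 years; cesium-137, about 30.1 years), but the equal starting inventory is an assumption made only to show how sharply the ratio of a short-lived to a long-lived isotope decays with the age of the material.

```python
import math

def remaining_fraction(t_years, half_life_years):
    """Fraction of an isotope still present after t years of decay."""
    return math.exp(-math.log(2) * t_years / half_life_years)

def cs134_to_cs137_ratio(age_years, initial_ratio=1.0):
    # The initial 1:1 ratio is an illustrative assumption, not measured data.
    return initial_ratio * (remaining_fraction(age_years, 2.06)
                            / remaining_fraction(age_years, 30.1))

fresh = cs134_to_cs137_ratio(0.0)   # a recent release keeps its ratio
old = cs134_to_cs137_ratio(20.0)    # in decades-old fallout, Cs-134 is long gone
```

After 20 years the ratio has collapsed by a factor of several hundred, which is why old fallout and a fresh release look very different to a detector.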

Beetles, Cacti, and Killer Plants Inspire Energy Efficiency

What do you get when you mix a desert beetle, a pitcher plant, and a cactus? Pick the right parts and you get an extremely slippery surface with an uncanny capacity to condense and collect water, according to research reported today in the journal Nature.

The advance could be a big deal for the energy world because, when it comes to energy efficiency, condensation lies somewhere between a necessary evil and a major drag. Nuclear, coal, and thermal solar power plants, for example, require large heat exchangers to condense the steam exiting from their turbines so that they can raise a new round of hotter steam. For other devices, such as wind turbines and refrigerator coils, condensation is the first step towards energy-sapping ice formation. 

Over the past decade rapid advances in biologically inspired materials have raised hopes for accelerating or guiding condensation via engineered surfaces, promising unprecedented control over water.

At least one bio-inspired surface treatment is in commercial use: The NeverWet surface protectant that paint manufacturer Rust-Oleum began selling in 2013, for example, contains nanoparticles that self-assemble to form a water-shedding surface roughness akin to that of a lotus leaf.

The research published today by Harvard University materials scientists pushes this biomimicry trend to the max with a triply bio-inspired surface [at left in video above]. Its most novel elements are asymmetric 0.9-millimeter-tall mounds, inspired by the condensation-promoting bumps adorning the backs of desert beetles such as Namibia’s Stenocara darkling beetles [photo].

Photo: Martin Harvey/Alamy
Darkling beetle harvesting fog in the Namibian desert.

Joanna Aizenberg, whose Harvard lab produced the work, says their paper provides the first theoretical framework explaining how convex bumps promote the condensation of water vapor into droplets—a trick that helps some desert beetles harvest moisture from fog. 

Droplets formed on the Harvard team’s mounds glom together and roll off via a side ramp modelled after the water-droplet-guiding concavity of cactus spines. The final element is a layer of nano-pores across the entire surface, infused with lubricant to create a slippery coating modelled after the trap of the carnivorous pitcher plant, in which prey literally hydroplane to their demise.

Harvard has already refined that final element—lubricant-infused nano-porosity—into an impressive water-hustling technology in its own right. The so-called SLIPS design is en route to commercialization via SLIPS Technologies, a spin-off launched in 2014 with US $3 million in venture funding and a joint development agreement with German chemicals giant BASF.

SLIPS Technologies chief technology officer Philseok Kim, a coauthor on today’s paper, says SLIPS-produced surface treatments and films are presently being tested on a skyscraper in New York City to document their ability to prevent ice and snow accumulation. Other applications are coming, including an anti-fouling coating for marine environments.

Harvard’s latest surface, however, is several times better at condensing water into droplets and hustling those droplets away than SLIPS alone [at right in the video above]. That could be of big value for heat exchange surfaces in power plant condensers, says Aizenberg, because droplets form an insulating barrier that slows further condensation: “You want to make sure they’re effectively removed from the surface as fast as possible.”

In fact, argues Aizenberg, the water condensing performance may be fast enough to enable a directly bio-inspired application: moisture harvesting systems for remote communities. Their surface may be able to condense moisture and collect it fast enough, even in arid environments where water droplets evaporate quickly. 

Tak-Sing Wong, a materials scientist at Pennsylvania State University who is designing his own bio-inspired slippery surfaces, says the Harvard work could double the efficiency of power plant heat exchangers. “Forming droplets that can shed off of the surface is very important because it takes heat away immediately. The amount of water collected will be proportional to the heat that’s taken away from the surface,” says Wong.
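The proportionality Wong describes is just the latent heat of condensation: every kilogram of steam that condenses at atmospheric pressure releases roughly 2.26 megajoules into the cold surface. A back-of-the-envelope version:

```python
# Latent heat of vaporization of water at atmospheric pressure, ~2.26 MJ/kg.
LATENT_HEAT_J_PER_KG = 2.26e6

def heat_removed_joules(condensate_kg):
    """Heat carried away from a surface per kilogram of collected condensate."""
    return condensate_kg * LATENT_HEAT_J_PER_KG

# One liter (about 1 kg) of collected water implies ~2.26 MJ removed.
q = heat_removed_joules(1.0)
```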

Other energy-related devices in line to benefit are refrigerators, which periodically heat their coolant coils in a constant battle against frost buildup. Wong says bio-inspired coatings could ultimately reduce refrigerators’ electrical consumption by 30 percent or more.

Commercialization could come quickly, he says, because these sophisticated designs need not be difficult to manufacture. Mature embossing and imprinting methods can produce millimeter-scale bumps in a wide range of metals and other materials, while the nanotexturing required for SLIPS can be etched into materials with acids or high-temperature steam. With the right application, Wong bets that the first commercial uses will begin in as little as two to five years.

Achieving Paris Climate Targets Could Save Nearly 300,000 American Lives

When the world’s nations set a goal in Paris last year of limiting the average temperature increase to no more than 2 degrees Celsius, the impact on health wasn’t front and center. But a study published today in Nature Climate Change suggests that if the United States reduces emissions from the transportation and electricity sectors in order to meet those targets, 295,000 American lives could be saved by 2030.


Supercapacitor On-a-Chip Now One Step Closer

In 2010, Spectrum reported on a new approach for creating chip-scale supercapacitors on silicon wafers, proposed by researchers at Drexel University in Philadelphia and the Université Paul Sabatier in Toulouse, France. In an article published in Science, the researchers described how supercapacitor electrodes made from porous carbon could be made to stick to the surface of silicon wafers and then micromachined into on-chip supercapacitors.

Now the same team has finally succeeded in doing just that. 

In a paper published in this week’s Science, researchers from the two initial teams report creating efficient porous carbon electrodes that really do stick to the surface of a silicon wafer. They made layers of porous carbide-derived carbon (CDC) that are completely compatible with all treatments used in the semiconductor industry, says Patrice Simon, a researcher at Université Paul Sabatier who has researched porous CDC electrodes for the last ten years and co-authored both the 2010 paper and this week’s paper in Science.

In an initial experiment circa 2010, the researchers exposed a 5-millimeter-thick ceramic plate of titanium carbide to chlorine at 500 °C. They obtained a porous carbon layer on its surface that would fit the bill as electrode material for a supercapacitor. This layer, called carbide-derived carbon, contained pores that could store large numbers of ions from the electrolyte of a supercapacitor. Deposited on the surface of a silicon wafer, CDC, the researchers theorized, could be micromachined into electrodes.

But to make their theory reality, the researchers had to resolve several issues, such as creating a CDC layer that would actually stick to the surface of a silicon wafer and that could adsorb enough ions to store energy in a supercapacitor.

Fast-forward to today, when the researchers report that they made mechanically stable porous carbon films by fine-tuning the chlorination process. Using a standard sputtering technique, they deposited a 6.3-micrometer-thick layer of titanium carbide on the thin insulating SiO2 layer of a standard silicon wafer. Then they exposed this layer to chlorine at 450 °C. The chlorination reaction progressed from the top of the titanium carbide (TiC) layer to the bottom, with chlorine atoms snatching up the titanium atoms and leaving empty 0.6-nanometer pores in the carbon layer.
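The selective etching behind this step is well documented in the carbide-derived-carbon literature: chlorine carries the titanium away as volatile titanium tetrachloride while the carbon lattice stays behind.

```latex
\mathrm{TiC\,(s)} \;+\; 2\,\mathrm{Cl_2\,(g)} \;\longrightarrow\; \mathrm{TiCl_4\,(g)} \;+\; \mathrm{C\,(s)}
```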

“We stopped the chlorination before the whole titanium carbide layer was transformed into CDC,” says Simon. To their surprise, the researchers found that the thin residual one-micrometer TiC layer caused the CDC film to adhere strongly to the SiO2 layer. As a bonus, the TiC layer, being an electric conductor, allows electrons to polarize the CDC layer, which then attracts ions into its pores to compensate for the charge, says Simon.

Using standard microfabrication methods, the team fabricated 2-by-2-millimeter supercapacitors with 18 interdigitated CDC electrodes on a 7.62-centimeter silicon wafer, and reported a capacitance of 170 farads per square centimeter, which allows energy storage that outperforms current state-of-the-art micro-supercapacitors.
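To put a capacitance figure in context, the energy stored in any capacitor is E = ½CV². The capacitance and voltage below are hypothetical, chosen only to show the arithmetic for a small on-chip cell, not taken from the paper:

```python
# E = 1/2 * C * V^2 for any capacitor; the inputs here are illustrative only.
def stored_energy_joules(capacitance_farads, voltage_volts):
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A hypothetical 1-millifarad on-chip cell charged to 2 volts stores 2 mJ.
e = stored_energy_joules(1e-3, 2.0)
```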

A second, and welcome, surprise was that if you chlorinated the initial titanium carbide layer completely, changing the entire layer into CDC, you could simply peel the layer off the silicon wafer. “The film doesn’t become brittle and remains mechanically stable,” says Simon. This opens the way to flexible supercapacitors that could be glued onto flexible materials, such as polyethylene, says Simon.

New Materials Push Solar-to-Hydrogen Closer

The term “solar energy” usually conjures up visions of blue glass rectangles, absorbing sunlight and turning it into electricity. But there’s another way to take advantage of the sun’s power—use it to create hydrogen fuel.


Fusion Stellarator Wendelstein 7-x Fires Up for Real

Today, German Chancellor Angela Merkel, at a ceremony at the Max Planck Institute for Plasma Physics in Greifswald, Germany, pressed a button that caused a two-megawatt pulse of microwave radiation to heat hydrogen gas to 80 million degrees for a quarter of a second.

No, she was not setting off some new kind of hydrogen bomb. She was inaugurating the fusion reactor Wendelstein 7-X, the world’s largest stellarator, by generating its first hydrogen plasma.


Japan Building World's Largest Floating Solar Power Plant

Kyocera Corp. has come up with a smart way to build and deploy solar power plants without gobbling up precious agricultural land in space-challenged Japan: build the plants on freshwater dams and lakes.

The concept isn’t exactly new. Ciel et Terre, based in Lille, France, began pioneering the idea there in 2006. And in 2007, Far Niente, a Napa Valley wine producer, began operating a small floating solar-power generation system installed on a pond to cut energy costs and to avoid destroying valuable vine acreage.

Kyocera TCL Solar and joint-venture partner Century Tokyo Leasing Corp. (working together with Ciel et Terre) already have three sizable water-based installations in operation near the city of Kobe, in Hyogo Prefecture on the island of Honshu. Now they’ve begun constructing what they claim is the world’s largest floating solar plant, in Chiba, near Tokyo.

The 13.7-megawatt power station, being built for Chiba Prefecture’s Public Enterprise Agency, is located on the Yamakura Dam reservoir, 75 kilometers east of the capital. It will consist of some 51,000 Kyocera solar modules covering an area of 180,000 square meters, and will generate an estimated 16,170 megawatt-hours annually. That is “enough electricity to power approximately 4,970 typical households,” says Kyocera. That output is sufficient to offset 8,170 tons of carbon dioxide emissions a year, the amount put into the atmosphere by consuming 19,000 barrels of oil.
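Kyocera's figures can be sanity-checked with a quick capacity-factor calculation (annual output divided by what the plant would produce running flat out all year); the inputs are the article's own numbers.

```python
# Capacity factor = actual annual output / (nameplate capacity * hours per year)
CAPACITY_MW = 13.7          # nameplate capacity from the article
ANNUAL_OUTPUT_MWH = 16_170  # estimated annual generation from the article

capacity_factor = ANNUAL_OUTPUT_MWH / (CAPACITY_MW * 8_760)
# Works out to roughly 0.13, a plausible value for fixed-tilt solar in Japan.
```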

Three substations will collect the generated current, which is to be integrated and fed into Tokyo Electric Power Company’s (TEPCO) 154-kilovolt grid lines.

The mounting platform is supplied by Ciel et Terre. The support modules making up the platform use no metal; recyclable, high-density polyethylene resistant to corrosion and the sun’s ultraviolet rays is the material of choice. In addition to helping conserve land space and requiring no excavation work, these floating installations, Ciel et Terre says, reduce water evaporation, slow the growth of algae, and do not impact water quality.

To maintain the integrity of the Yamakura Dam’s walls, Kyocera will anchor the platform to the bottom of the reservoir. The company says the setup will remain secure even in the face of typhoons, which Japan experiences every year.

Kyocera, a Kyoto-based manufacturer of advanced ceramics, has branched out into areas like semiconductor packaging and electronic components, as well as manufacturing and operating conventional solar-power generating systems. Now, several Kyocera companies are working together to create a niche industry around floating solar installations.

The parent company supplies the 270-watt, multicrystalline 60-cell solar modules (18.4-percent cell efficiency, 16.4-percent module efficiency); Kyocera Communications Systems undertakes plant engineering, procurement, and construction; Kyocera Solar Corp. operates and maintains the plants; and, as noted above, the Kyocera TCL Solar joint venture runs the overall business.

“Due to the rapid implementation of solar power in Japan, securing tracts of land suitable for utility-scale solar power plants is becoming difficult,” Toshihide Koyano, executive officer and general manager of Kyocera’s solar energy group told IEEE Spectrum. “On the other hand, because there are many reservoirs for agricultural use and flood control, we believe there’s great potential for floating solar-power generation business.”

He added that Kyocera is currently working on developing at least 10 more projects and is also considering installing floating installations overseas.

The cost of the Yamakura Dam solar power station is not being disclosed. But a Kyocera spokesperson told Spectrum that although the cost of the floating support modules making up the platform is higher than that of platforms used in land-mounted installations, “Implementation costs for floating solar plants and ground-mounted systems are about the same,” given that there is no civil engineering work involved.

The Yamakura Dam plant is due to begin operation by March 2018.

NOAA Model Finds Renewable Energy Could be Deployed in the U.S. Without Storage

The majority of the United States’ electricity needs could be met with renewable energy by 2030—without new advances in energy storage or cost increases. That’s the finding of a new study conducted by researchers from the National Oceanic and Atmospheric Administration (NOAA). The key will be having sufficient transmission lines spanning the contiguous U.S., so that energy can be delivered from where it’s generated to the places where it’s needed.

Reporting their results today in Nature Climate Change, the researchers found that a combination of solar and wind energy, plus high-voltage direct current transmission lines that travel across the country, would reduce the electric sector's carbon dioxide emissions by up to 80 percent compared to 1990 levels.

Conventional thinking around renewable energy has been that it is too variable to be broadly implemented without either fossil fuels to fill in the gaps or a significant ability to store surplus energy, says Sandy MacDonald, co-lead author of the paper and previously the director of NOAA's Earth System Research Laboratory. However, MacDonald thought that previous estimates had not used accurate weather data and so he wanted to design a model based on more precise and higher resolution weather data.

In the study, the team used historical and projected carbon dioxide emission and electricity cost data from the International Energy Agency, which projects that U.S. electricity will cost 11.5 cents per kilowatt hour, on average, in 2030, and that carbon dioxide emissions will be 6 percent above 1990 levels.

They designed a model called National Electricity with Weather System that took into consideration demand across one-hour time increments as well as generation capability. The main constraint of the model was that it had to use existing technologies.

The researchers ran three different scenarios: one that assumed renewables at a low cost and natural gas at a high cost; a second that accounted for a low cost of natural gas and a high cost of renewable energy; and a third that assumed mid-range prices for both.

For all three scenarios, both carbon emissions and price were reduced. The low-cost renewables/high-cost natural gas scenario resulted in the greatest reduction in carbon emissions (78 percent below 1990 levels), and average electricity prices at 10 cents per kilowatt hour. The mid-range model resulted in a 61-percent drop in emissions, and electricity prices at 10.2 cents per kilowatt hour. The high-cost renewables/low-cost natural gas scenario reduced carbon emissions by 33 percent; electricity cost 8.6 cents per kilowatt hour.

MacDonald says that some of the findings were a bit counterintuitive and highlighted the cost benefits of having a national electricity grid. For instance, the model chose to implement very few offshore wind generators, finding that building transmission lines from wind-generating plants in North Dakota to New York was in fact cheaper than building wind off the coast of New York, despite the longer distance the electricity would have to travel. However, “as you start to make the geographical area smaller, it does pick up offshore wind because you’re restricting access to some areas,” MacDonald says.

The study also looked at land and water use requirements, finding that water consumption in the electric sector could be reduced by 65 percent. The amount of land that would need to be converted for use by renewables would be 6,570 square kilometers, or about 0.08 percent of the U.S. Land use has proven to be a contentious issue for energy development, and although the model prohibited renewable energy development on protected lands, urban areas, and steep slopes, and restricted natural gas development to sites where a fossil fuel plant existed in 2012, there could still be hurdles.

Christopher Clack, co-lead author of the paper and now a research scientist at the University of Colorado, says that one major difference between the group's model and other recent research evaluating the impacts of renewable energy is the resolution at which they evaluate weather data and the timescale at which they anticipate implementing renewables.

For instance, a recent study published in the Proceedings of the National Academy of Sciences concluded that water, wind, and solar could supply the bulk of U.S. energy needs by 2050.

That study focused on more than just the electric sector; plus those researchers designed their model using weather data at a resolution of 250 kilometers. The NOAA team's weather resolution was much higher, at 13 kilometers. “The resolution of the weather data is a key point,” Clack says, given how variable wind can be over even small regions.
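For a sense of what that resolution difference means, moving from a 250-kilometer grid to a 13-kilometer grid multiplies the number of cells covering the same area by the square of the ratio:

```python
# Each 250 km x 250 km cell in the coarse grid covers (250/13)^2 of the
# 13 km x 13 km cells in NOAA's finer grid (both figures from the article).
cells_per_coarse_cell = (250 / 13) ** 2  # roughly 370x more spatial detail
```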

Another key difference is that the PNAS study predicts 2050 costs for energy and energy storage, while the NOAA model “optimizes in a similar way that markets operate today,” Clack adds.

“We are agnostic to technology and allow the model to select the cheapest mix. What we find is the cheapest mix, at a national scale, is large amounts of wind and solar, enabled by cost-effective transmission,” he says.

And although the model still selects some amount of natural gas, hydro, and nuclear, it is based on technology available today and doesn't make far-out predictions of price, MacDonald adds.

Clack and MacDonald acknowledge that there will be significant hurdles in implementing such a model. Although electricity markets in various states do coordinate with each other now, they don’t do so to the extent that would be required for the model to work.

But MacDonald likened the model to the U.S. interstate highway system, which was overlaid atop regional, state, and local systems. “If you have a national transmission network, it’s less expensive and more reliable,” he says.

“In the coming years, different entities will have to work out what their energy mix will be to meet legislation requirements that are in the works,” Clack says. “The idea of this tool is to give options to help policy makers decide what's best.”

Environmentally Friendly Liquid Battery

A new liquid battery that is more environmentally friendly than its existing counterparts could help lead to safe, inexpensive storage of renewable energy for power grids, researchers in Shanghai say.

The new battery also has a much longer cycle life and much greater power than any current rechargeable battery, the scientists add.

The sun and wind are variable sources of power. As such, utility companies want massive rechargeable battery farms that can store the surplus energy from these renewable power sources for use when the sun goes down and the wind does not blow.

