EnergyWise

First New U.S. Nuclear Reactor in Two Decades to Begin Fueling in Tennessee

Yesterday, U.S. federal regulators approved an operating license for Unit 2 of Tennessee Valley Authority's Watts Bar nuclear power plant; it's only taken 19 years and almost 4.5 billion dollars. The Gen II plant should be producing power by the end of the year, and it shouldn’t bother you in the least that we mostly stopped building Gen II reactors sometime in the mid ‘90s. 


Startup Time for Fukushima's Frozen Wall. Here’s Why it Should Work

Japan's TEPCO is about to flip the switch on the infamous ‘ice wall’ intended to divert flowing groundwater around its crippled reactors at Fukushima and thus help stem the contamination of fresh groundwater at the site. The widely mischaracterized and maligned installation—which is a barrier of frozen soil rather than a wall of ice—has every chance of delivering the hoped-for results, according to radiation cleanup experts at U.S. national laboratories and to feedback from initial system tests.

"The frozen barrier is going to work,” predicts Brian Looney, senior advisory engineer at the U.S. Department of Energy's Savannah River National Laboratory in South Carolina and co-author of an independent assessment of TEPCO’s frozen barrier. The report, produced in collaboration with researchers at Looney's lab and at Pacific Northwest National Laboratory, was completed in February but only released late last month; it found the system’s design to be sound and within the bounds of prior practice. 

Since the 2011 meltdowns at Fukushima, TEPCO has been fighting a losing battle with groundwater that flows downhill towards the sea, permeates Fukushima’s fractured reactor buildings, and contacts their melted-down nuclear fuel. Until recently, TEPCO was sucking 300-400 tons of contaminated water out of the buildings every day, adding continuously to the site’s ballooning fields of radioactive water storage tanks. The frozen barrier is one of a suite of measures intended to stem that tide. 

In September, TEPCO started operating a system of ‘subdrains’ to capture groundwater around the reactors before it enters the buildings. After months of negotiations, fishermen’s groups agreed to allow TEPCO to treat the lightly contaminated water and then discharge it to the sea. The subdrains could cut groundwater inundation of the reactor buildings in half, leaving “only” about 150 tons per day, according to World Nuclear News.
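
For a sense of scale, here is a back-of-the-envelope sketch using only the figures above (and the rough equivalence of one ton of water to one cubic meter):

```python
# Rough storage arithmetic using only the figures cited above (1 ton of water ~ 1 cubic meter).
inflow_before_tons_per_day = (300 + 400) / 2   # midpoint of the 300-400 tons/day cited pre-subdrain
inflow_after_tons_per_day = 150                # with the subdrains running

for label, tons_per_day in [("before subdrains", inflow_before_tons_per_day),
                            ("with subdrains", inflow_after_tons_per_day)]:
    per_year_m3 = tons_per_day * 365           # approximate annual volume needing tank storage
    print(f"{label}: ~{per_year_m3:,.0f} m^3 of new contaminated water per year")
```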

The frozen barrier is a more definitive solution, intended to completely isolate the reactor buildings from groundwater by encircling them with a 1.5-kilometer-long, 30-meter-deep wall of frozen soil. It was designed by Japanese engineering and construction firm Kajima Corp., which began piecing together its infrastructure in June 2014. Since then, Kajima has perforated the ground surrounding Fukushima’s four reactors with 1571 bore holes, lined them with chiller pipes, and hooked up those pipes to refrigeration plants that pump out brine at an icy -30 degrees Celsius.
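
A quick calculation from the published dimensions gives a feel for how tightly those chiller pipes are packed; this is only an average, since the actual spacing varies around the perimeter:

```python
# Average chiller-pipe spacing along the frozen barrier, from the figures cited above.
perimeter_m = 1500    # the barrier is roughly 1.5 kilometers long
depth_m = 30          # and frozen to a depth of about 30 meters
boreholes = 1571      # chiller-pipe bore holes drilled by Kajima

spacing_m = perimeter_m / boreholes      # average distance between adjacent pipes
wall_face_m2 = perimeter_m * depth_m     # vertical face area of the frozen wall

print(f"average pipe spacing: ~{spacing_m:.2f} m")       # roughly one meter apart
print(f"frozen wall face area: ~{wall_face_m2:,} m^2")   # about 45,000 square meters
```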

This groundwater isolation technology has been applied hundreds of times since the 1950s at mines and at deep excavations for tall buildings. But at Fukushima, it has been a magnet for scorn. The negativity stems from an understandable dearth of confidence in TEPCO and misreporting by media outlets. 

Many news outlets mistakenly conflated Kajima's frozen barrier with a distinct and ill-fated TEPCO effort to freeze 5,000 to 6,000 tons of contaminated seawater in a utility trench adjacent to one of the reactors. CleanTechnica’s August 2014 report, “TEPCO Concedes Failure of Fukushima Ice Wall”, was one of many to describe the failed trench freeze as a section of the frozen barrier.

And this confusion persists. Al Jazeera opined in March that the frozen barrier had “turned out to be another of the cleanup’s dramatically costly and utterly ineffective schemes.” It is also one of many news outlets to erroneously assert that Kajima's design was beyond the scale or applications of prior frozen walls.

The U.S. national labs’ analysis is as complimentary of Kajima’s design as the media’s opinions have been skeptical. According to Looney, their report found that Kajima’s design is “within the envelope of experience for successful barriers.” The team identified installations larger than Kajima’s design, such as a 3.66-km-long freeze wall at an open pit gold mine in Ontario that was four times deeper than Fukushima's, as well as urban projects that had more buried structures to work around (or through) than is the case at Fukushima. 

Site-specific analysis by Looney and his co-authors, meanwhile, projects that the freezing energy to be deployed by Kajima’s system will be equal to the task of managing Fukushima’s hydrological conditions. That finding is affirmed by initial testing of about 60 meters of the barrier in May 2015; the ground chilled as predicted.

The national labs’ study did identify small areas that could potentially resist freezing. But it also identified 10 to 12 available fixes, and it concluded that small leaks would be of little consequence given the suite of other groundwater control systems TEPCO has in place. “The goal of the barrier is to minimize flow to the reactors. You don’t actually need 100 percent effectiveness to reach that goal,” says Looney.

While skepticism over the barrier’s water-stopping capability reigns in the popular media, Japanese regulators have been fretting over whether it might prove overly effective. Japan’s Nuclear Regulation Authority (NRA) has been holding back TEPCO from beginning the freeze while it assesses the risk of groundwater levels plummeting within the perimeter as the freeze kicks in, drawing highly radioactive water out of the reactors and contaminating the soil below.

Looney argues that this scenario is unlikely. It will take 1-2 months of refrigeration to establish each section of the barrier and at least six months of refrigeration before it has a measurable impact on groundwater levels within the frozen wall's perimeter, according to Looney. He adds that extensive monitoring should give TEPCO several months' notice of potential groundwater imbalances. 

But to minimize the risk of a groundwater crash, the NRA has requested that TEPCO first freeze the barrier’s side and downhill segments, saving the uphill segment (which Kajima finished first) for last. According to Looney and recent media reports, the final piping should be complete and the system should be ready to switch on within weeks.

If the barrier kicks in next year, concluding what will have been a five-year battle against groundwater, TEPCO will then be in a position to attack a still-tougher foe: its shattered reactors’ melted fuel. Naohiro Masuda, president of TEPCO’s cleanup subsidiary, Fukushima Daiichi Decommissioning Company, told Japanese state broadcaster NHK in March that TEPCO has “no idea” what the physical state of the fuel is (though it's got robots on the hunt) and no idea how to get it out. As Masuda put it: “We have to remove it remotely from 30 meters above. But we don’t have that kind of technology yet. It simply doesn’t exist.”

Only 15% of California's Big Solar Projects Are on the Right Kind of Land

The real estate agent’s mantra is well known: location, location, location. But location matters, too, when choosing where to site utility-scale solar projects, and most of California's existing or planned projects are in less-than-ideal spots, according to a new study. As a result, these projects may have negative impacts on the environment and will not be as cost-effective or as carbon neutral as they could be.

Researchers from Stanford University and the University of California’s Riverside and Berkeley campuses identified 161 planned or proposed utility-scale solar projects and applied an algorithm to determine how compatible each is with its location.

The results, which were published today in the Proceedings of the National Academy of Sciences, found that only 15 percent of sites were on compatible land.
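
The paper's own model is more detailed than a news summary can convey, but a toy version of such a siting screen illustrates the idea of scoring each candidate site against a few land-use criteria. Everything below is hypothetical, not the study's actual algorithm:

```python
# Hypothetical land-compatibility screen for a candidate solar site (illustrative only;
# the study's real model uses far richer environmental and infrastructure data).
def site_compatibility(already_disturbed: bool, km_to_transmission: float,
                       protected_area: bool, prime_farmland: bool) -> str:
    if protected_area:
        return "incompatible"                      # protected land is ruled out immediately
    score = 0
    score += 2 if already_disturbed else 0         # favor previously developed or degraded land
    score += 1 if km_to_transmission < 10 else 0   # favor sites near existing transmission lines
    score -= 1 if prime_farmland else 0            # penalize productive agricultural land
    if score >= 2:
        return "compatible"
    return "potentially compatible" if score >= 1 else "incompatible"

print(site_compatibility(already_disturbed=True, km_to_transmission=5.0,
                         protected_area=False, prime_farmland=False))   # -> compatible
```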


Batteries Running on Shrooms

Mushrooms have energized many a marinara sauce, not to mention a few vivid hallucinations, but soon the fungi may be powering your Prius or getting your Galaxy phone to run longer. Engineers at the University of California have shown that mushrooms can create long-lasting, environmentally friendly anodes for lithium-ion batteries.


Nuclear Cybersecurity Woefully Inadequate

The risk of a major cyberattack on the nuclear industry is rising, researchers say, and such an attack could lead to blackouts or even meltdowns.

The 2010 Stuxnet worm's infiltration of Iran's nuclear program was the most dramatic cyberattack the nuclear sector has ever seen. But it was not the only one. In one case in 2003, the Slammer worm infected the Davis-Besse nuclear power plant in Ohio, leaving reactor core safety data unavailable for nearly five hours. In another example from 2014, hackers stole blueprints of at least two nuclear reactors and other sensitive data from Korea Hydro and Nuclear Power Co., then demanded money from the company in exchange for not releasing potentially important files.


Arizona Utility Blinks in Bitter Battle Over Rooftop Solar

Arizona’s biggest utility, Arizona Public Service, is withdrawing its bid to jack up monthly fees for rooftop solar users in its territory. The retreat, tendered last week to the Arizona Corporation Commission (ACC), capped an eventful month in the high-stakes battle between utilities and solar advocates that's raging across Arizona rooftops. The party with the most bruises is not Arizona Public Service (APS), however, but the ACC itself. The elected body referees the state's power markets, but all five of its commissioners now face accusations of bias that challenge their ability to fairly adjudicate the rooftop solar dispute.

Arizona’s solar dispute is hot, but not unique. Across the United States utilities are fighting to contain or eliminate “net metering” policies that pay rooftop solar users retail prices for the surplus power that their panels export to the grid (thus offsetting retail charges for power they consume at night). Utilities argue that solar customers rely heavily on the grid but, under net metering, pay little or nothing to maintain it. Over the past year all of Arizona’s utilities levied or proposed new fees for customers installing rooftop solar systems. APS’s proposal worked out to about $21 per month.
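
To make the stakes concrete, here is a minimal monthly-bill sketch under net metering; the consumption and rate figures are made up for illustration, and only the roughly $21 fee comes from APS's proposal:

```python
# Minimal net-metering bill sketch (consumption and rate are illustrative assumptions).
retail_rate_usd_per_kwh = 0.12   # assumed retail electricity price
consumed_kwh = 1000              # assumed household consumption for the month
exported_kwh = 400               # assumed surplus solar exported to the grid

bill_without_fee = (consumed_kwh - exported_kwh) * retail_rate_usd_per_kwh  # exports credited at retail
bill_with_fee = bill_without_fee + 21.0                                     # APS's proposed monthly solar fee

print(f"net-metered bill: ${bill_without_fee:.2f}")   # $72.00 in this example
print(f"with proposed fee: ${bill_with_fee:.2f}")     # $93.00 in this example
```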

Solar advocates argue that rooftop solar provides a variety of benefits to the grid—such as reducing consumption of fossil fuels and lessening reliance on distant power plants. They see fees from utilities such as APS, which owns fossil-fueled and nuclear power plants, as unfair competition. 

In August, ACC staff sided with solar advocates’ call to defer consideration of the proposed fees so they could be reviewed in the broader context of the utility’s overall business. When the ACC commissioners voted to overrule the staff, calling for immediate hearings on solar fees, San Francisco-based solar installer Sunrun and two former commissioners filed challenges with the ACC, alleging bias.

The bias filings allege that the two commissioners elected in 2014 benefited from $3.2 million in secret campaign donations by APS to independent groups. The filings also cite a third commissioner elected in 2012 for inappropriate public comments about rooftop solar users. (Earlier challenges accuse the remaining two commissioners of bias based on lobbying activities prior to their election in 2012.)

APS presents these attacks as a bid by solar advocates to avoid debating the proposed utility fees on the merits. APS writes:

They have retreated to procedural tactics and character attacks designed to discredit elected officials and undermine the integrity of the Arizona Corporation Commission. The obvious goal is to paralyze the Commission.

However, allegations of improper campaign contributions by APS have been swirling in the Arizona media for over a year. APS acknowledges that it is politically active, and has refused to confirm or deny the allegations. 

Under state law, independent groups financing political advertisements in Arizona are not obligated to reveal their donors, so tracking such “dark money” spending is difficult. The Arizona Republic, Phoenix’s leading daily newspaper, reported last month that two commissioners are seeking ACC staff advice as to whether the ACC can compel utilities such as APS to reveal their political contributions.

According to the Republic, the commissioners elected in 2012 benefited from contributions from the Arizona Chamber of Commerce and Industry, including money from Arizona Public Service. And it writes that the two commissioners elected last year benefited from "independent political campaigns widely believed to be financed with so-called dark-money from APS.”

In March 2015 an organization tracking campaign finance contributions revealed that a foundation led by a former APS chairman and CEO that is normally dedicated to supporting Arizona State University had inexplicably given $100,000 to a "shadowy" nonprofit called Save Our Future Now. That group spent $2.4 million on TV ads attacking pro-solar ACC candidates in 2014. 

Green Flow Battery Based on Cheap, Nontoxic Reagents

Flow batteries are an interesting alternative to conventional batteries because they store charge in a liquid electrolyte that can be kept in tanks; only the size of the tanks limits the amount of energy that can be stored. Utility companies and energy engineering firms have been eyeing these devices because they might replace conventional storage batteries, which have a limited lifetime, are known fire hazards, require metals such as lithium that are limited in supply, and can only store energy in the electrode material, which has a fixed volume. What stands in the way of wide implementation of flow batteries, even though they are commercially available, is that the compounds they use are expensive, toxic, and corrosive. Additionally, the energy storage capacity per unit volume of the electrolyte is low, typically just squeaking past 20 watt-hours per liter.
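
To see why that last number matters, here is a quick tank-sizing sketch built on the roughly 20 watt-hours per liter cited above; the one-megawatt-hour target is an arbitrary example:

```python
# Electrolyte volume needed to store a given amount of energy at a flow battery's energy density.
energy_density_wh_per_liter = 20    # typical electrolyte energy density cited above
target_storage_kwh = 1000           # arbitrary example target: 1 MWh

tank_volume_liters = target_storage_kwh * 1000 / energy_density_wh_per_liter
print(f"electrolyte needed: {tank_volume_liters:,.0f} L (~{tank_volume_liters / 1000:.0f} m^3)")
# -> 50,000 L, i.e. about 50 cubic meters of electrolyte to hold 1 MWh
```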

Recently, IEEE Spectrum reported on a flow battery that performs better by using a basic electrolyte instead of an acidic one to keep a zinc compound in solution. Now a team of researchers at Harvard University has reported in the 25 August issue of Science that it has created a version that uses two alkaline electrolytes containing quinone and ferrocyanide—both widely available and non-toxic compounds—in solution. The researchers reported that after 100 charge-discharge cycles, the battery’s stored energy capacity had degraded by less than 1 percent.
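
That degradation figure implies a very small capacity fade per cycle, as a simple compounding estimate shows:

```python
# Upper bound on per-cycle capacity fade implied by less than 1 percent loss over 100 cycles.
total_loss_fraction = 0.01
cycles = 100

per_cycle_retention = (1 - total_loss_fraction) ** (1 / cycles)
print(f"per-cycle fade: under {(1 - per_cycle_retention) * 100:.3f}%")   # roughly 0.01% per cycle
```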

Michael Aziz, who led the research group, realized that if the negative points of today’s flow batteries—cost and toxicity—could be overcome, the flow battery could become a commercially viable alternative for the storage now badly needed for intermittent energy sources such as solar and wind.  

“This looks like a compelling value proposition if you can find inexpensive chemicals that work well,” says Aziz. “We noticed that there is a molecule in plants that takes the electrons from chlorophyll, and it forms an electron shuttle in photosynthesis that ports electrons over and over, without any sign of degradation. That is exactly the functionality you want for the battery.”

However, the molecule did need some work; it was not soluble, and the reduction potential was not the right value.  “All these things can be changed,” he noted. “We found ways to render the molecule soluble, and change the voltage, so we have something that works and that is highly soluble.” 

The team made it clear that it was headed in this direction last year, when the researchers published a paper in Nature describing how they paired this compound with bromine, a toxic substance. Aziz explains: “We switched to alkaline chemistry because of the availability of a positive electrode material that is stable and soluble in base, but not in acid, and that is ferrocyanide.” Ferrocyanide is a widely available compound used as a food additive; paradoxically, it is not toxic because its cyanide groups are bonded so strongly to iron that they cannot attack the iron atoms in hemoglobin. “So now we have fulfilled our promise by coming through with non-toxic molecules on both sides [of the ion-selective membrane],” says Aziz. “We now have an entirely non-toxic chemistry.”

Flow cells need electrolytes that keep these compounds in solution with extreme pH values so that electrons and ions can flow easily.  Most current flow batteries use acids, but the use of a base has other advantages. “Base is just less corrosive than acid, and this allows us to contain these electrolytes with much less expensive materials,” says Aziz.

At this point, about 95 percent of stored energy in the United States is in the form of water pumped up into a reservoir, which can be released to generate power by driving turbines when flowing back down. But in flat or arid areas, this storage option is not available, and it is here that flow batteries could play an important role, argues Aziz. “We are looking at a technology that can be used where pumped hydro cannot—in the middle of a city, on rooftops, near windfarms and solar farms,” he says.  However, reaching this goal will require further work.  “We need to prove that these molecules can last many thousands of cycles of oxidation and reduction, without doing anything else.” 

Is industry interested? When the team published its first paper in Nature last year, there was a lot of interest from companies. According to Aziz, “Most of them said, this is really interesting, call us as soon as you get rid of the bromine.”

Carbon Polluters Fund XPrize to Repurpose Their Emissions

XPRIZE—the organization behind grand technology challenges such as the race to space won in 2004 by SpaceShipOne and current contests to land a lunar rover and to build a Star Trek-style medical tricorder—unveiled a competition today that tackles a more mundane yet critical challenge: transforming carbon dioxide emissions from power plants into saleable products to help slow or reverse climate change. The competition's $20 million kitty has been raised from major carbon emitters: a coalition of oil and gas companies producing high-carbon oil from Alberta’s oilsands, and New Jersey-based electric utility NRG Energy.

Entrants will have until early 2020 to develop CO2 conversion technologies on two tracks: one targeting flue gas emissions from coal-fired power plants, and a second targeting the less concentrated emissions from natural gas-fired generators. The technologies that convert the most CO2 into products with the highest net value will win. 
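
XPRIZE has not spelled out a scoring formula in detail, but the judging criterion described above amounts to weighing how much CO2 an entry converts against the net value of what it produces. A purely hypothetical sketch:

```python
# Hypothetical ranking of competition entries (illustrative only; not XPRIZE's actual scoring rules).
entries = [
    {"team": "A", "tons_co2_converted": 500, "net_value_usd_per_ton": 40},
    {"team": "B", "tons_co2_converted": 800, "net_value_usd_per_ton": 15},
]

for entry in entries:
    # More CO2 converted into higher-value products scores better.
    entry["score"] = entry["tons_co2_converted"] * entry["net_value_usd_per_ton"]

leader = max(entries, key=lambda entry: entry["score"])
print(f"leading entry: team {leader['team']} (score {leader['score']:,})")   # -> team A
```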

XPRIZE Chairman and CEO Peter Diamandis said in a statement that the Carbon XPRIZE confronts the fact that our "age of unprecedented technological progress and prosperity” is powered primarily by fossil fuels. According to the statement, competing technologies could incorporate CO2 into such products as chemicals, cement and other building products, and transportation fuels. 

Some carbon dioxide is already being repurposed today, but the price is high. A few oil producers are using post-industrial CO2 as a working fluid to loosen up aging oil reservoirs, simultaneously boosting the flow of oil to the surface and storing the fossil carbon underground. Researchers in China are exploring a twist on such ‘enhanced oil recovery’ to produce water, proposing to sequester CO2 captured from a coal-fired power plant in Tianjin while yielding an estimated 1.4 million cubic meters of deep water annually.

Of course, burning the extra oil produced via enhanced oil recovery releases more fossil CO2, clawing back some of the environmental benefit. And a supply of CO2 for such projects is hard to come by due to the high cost of equipping power plants for carbon capture and operating the equipment.

The Carbon XPRIZE seeks to catalyze carbon capture by turning CO2 molecules into products with higher added value. Scientists are exploring the possibilities already. Austrian researchers have, for example, demonstrated the use of enzymes and electricity to convert CO2 into alcohol-based fuels. And last year a demonstration plant in San Antonio began capturing CO2 from a cement plant and converting it into minerals and chemicals, including sodium carbonate, hydrochloric acid and bleach.

Unfortunately the environmental benefits of synthesizing CO2 into something new remain dubious because capturing CO2 and chemically refashioning it requires considerable energy. In the case of the alcohol fuels, the energy requirements of the chemical processing completely negate the climate protection achieved by recycling CO2.

The ultimate irony of the Carbon XPRIZE is that it could turn out winners that still do not pencil out economically or environmentally. Meanwhile, the oil producers backing it via Canada’s Oil Sands Innovation Alliance, a Calgary-based trade group, are sitting on advanced production technology that promises to profitably slash their emissions at the source. 

Oilsands emissions are rising as the industry shifts from open-pit mines that scrape Alberta’s tarry bitumen off the surface to operations that attack deeper deposits by drilling wells and injecting steam underground to melt the bitumen and pump it to the surface. But options that could shrink that footprint exist. 

Calgary-based oilsands process developer N-Solv has proven the effectiveness of a low-energy process that uses butane rather than steam to dissolve bitumen, at a 250 to 300 barrel-per-day demonstration site near Fort McMurray. This month, the company celebrated production of its 60,000th barrel of heavy oil since starting the demonstration plant in 2014, and announced it had won its own prize—a place among the 2015 honorees for Canada's Clean50 Awards.

Eliminating the steam production fueled by natural gas makes the overall process cheaper while cutting carbon emissions per barrel by as much as 80 percent. “We can be as clean or cleaner than conventional oil,” says John Nenniger, N-Solv’s founder and chief technology officer.

Oilsands operators have conducted their own experiments with solvent-based production over the past five years, but implementation is lagging. Nenniger says oilsands producers neglected the technology when oil prices were high because they could make a profit with the older and dirtier steam technology. Weak Canadian climate policies meant they were not obligated to take a risk on the cleaner approach. And now that oil prices are low and oilsands projects are losing money, capital for new operations is scarce. 

“It’s so frustrating from my perspective because every other industry is so aggressively competing to get to the bottom of the supply cost curve,” says Nenniger. “The oilsands industry says stuff, but they don’t actually do anything. Investment has been 10 to 20 fold below what it should have been.”

Canadian Prime Minister Stephen Harper pulled his country out of the Kyoto Protocol in 2011, arguing that complying with the treaty's prescribed greenhouse gas reductions would hurt Canada's energy-intensive economy. Change may be coming, however. Harper, who hails from Alberta and has strong support from the oil and gas sector, finds himself in a tight race for re-election in October.

Thomas Mulcair of the New Democratic Party, a former Quebec environment minister who is leading in nationwide polls, unveiled plans this weekend for a cap-and-trade program to cut carbon emissions 80 percent by 2050—the same goal established by climate policy leaders such as the European Union and California. As of 2013, Canada's emissions were 18 percent above 1990 levels.

Scotland and Ireland Consider a Linked Renewable Energy Future

The governments of Scotland, the Republic of Ireland, and Northern Ireland plan to coordinate the development of offshore renewable energy projects in their shared waters. The goal is to build an interconnected network of offshore wind, tidal, and wave generation and transmission in the Irish Sea, the Straits of Moyle, and off the western coast of Scotland.

The countries launched a feasibility study five years ago. It culminated last week in a series of reports including: a business plan; recommendations for how to implement projects; three proposed projects to serve as initial proof of concepts; and a spatial plan that provides guidance to potential developers regarding the best places to install offshore wind, tidal, and wave energy projects.

The area between Ireland and Scotland has the potential to generate around 16.1 gigawatts of renewable energy, including 12.1 GW from offshore wind and 4.0 GW from wave and tidal energy. The ISLES project's initial goal is to connect 6.2 GW of that potential generation by 2020.


Researchers Tweak Artificial Photosynthesis for More Efficient Hydrogen Production

A team of researchers from Germany and the U.S. has announced a new record of 14 percent for the efficiency of water splitting by solar energy in a single cell. The previous record of 12.4 percent was achieved 17 years ago by the National Renewable Energy Laboratory, and in subsequent experiments with the technology, called artificial photosynthesis, the value has hovered around that figure. The researchers published the result last week in Nature Communications.

These figures should not be confused with the light-conversion percentages of photovoltaic cells, explains Thomas Hannappel of the Technical University Ilmenau in Germany, who was the academic advisor for the researchers. “The percentages refer to the hydrogen efficiency; that is, you compare the light energy captured by the photovoltaic cell to the energy that can be supplied by burning the produced hydrogen,” says Hannappel.
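
Put as a calculation, the solar-to-hydrogen efficiency Hannappel describes divides the chemical energy of the hydrogen produced by the incident solar energy. The numbers below are illustrative assumptions chosen to land near the reported 14 percent; the 286 kilojoules per mole is hydrogen's higher heating value (some definitions use the Gibbs energy of about 237 kJ/mol instead):

```python
# Sketch of the solar-to-hydrogen efficiency definition (illustrative numbers, not the paper's data).
h2_rate_mol_per_s = 1.2e-6       # assumed hydrogen production rate of a small test cell
h2_energy_j_per_mol = 286e3      # energy released by burning one mole of hydrogen (higher heating value)
irradiance_w_per_m2 = 1000       # standard solar irradiance
cell_area_m2 = 2.5e-3            # assumed illuminated cell area (25 cm^2)

efficiency = (h2_rate_mol_per_s * h2_energy_j_per_mol) / (irradiance_w_per_m2 * cell_area_m2)
print(f"solar-to-hydrogen efficiency: {efficiency:.1%}")   # ~13.7% with these example numbers
```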

Artificial photosynthesis can be achieved by two different approaches. In the first, a photovoltaic cell supplies current to the electrodes in a separate cell that splits water molecules into their constituents, hydrogen and oxygen. In the second, the photovoltaic cell itself acts as an electrode in contact with water, and the voltage it produces splits the water directly. Combining these two functions, photoelectricity generation and electrolysis, in one unit makes the technology more practical, says Hannappel. “We have a greater range for cost reduction with one unit than with two different units,” he says.

However, a lot of research will still be necessary to reach that stage. “One of the referees during the publication process asked us, ‘Is this just a matter of dunking a high-efficiency PV cell into a solution and then getting out hydrogen?’” remembers Matthias May of the Helmholtz Zentrum Berlin, whose doctoral dissertation dealt with this research. Indeed, the electrolyte, the photosensitive semiconductor, and the catalyst surface form a trio that does not get along easily, and the liquid-solid interfaces among them all have to be managed.

First, the researchers opted to use III-V semiconductors as the photovoltaic material. They are not the cheapest or experimentally easiest choice, but these materials, made from elements in the third and fifth columns of the periodic table, convert light into electricity more efficiently than silicon does. To achieve the required voltage, the researchers used tandem cells, in which two layers with different band gaps convert photons from across the solar spectrum into electricity.
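
The tandem structure is there mainly for voltage: splitting water takes at least 1.23 volts thermodynamically, plus overpotential losses at the catalysts, which is more than a typical single junction delivers. A rough budget with assumed loss values:

```python
# Rough voltage budget for direct water splitting (the overpotential figure is an assumption).
thermodynamic_v = 1.23       # minimum voltage to split water at standard conditions
overpotentials_v = 0.5       # assumed combined catalyst and resistive losses (varies by system)
required_v = thermodynamic_v + overpotentials_v

single_junction_v = 0.7      # approximate operating voltage of a single silicon junction

print(f"required: ~{required_v:.2f} V, versus ~{single_junction_v} V from one silicon junction,")
print("which is why two stacked junctions with different band gaps are used instead")
```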

Achieving the improvement of nearly two percentage points required some experimental ingenuity. “We tuned the surface of our III-V solar cell on a subnanometer scale, transforming aluminum-indium phosphide into phosphate species and then depositing the catalyst on top,” says May. “What was important here was that the photochemical transformation process was all done in situ. This means that this interface never saw ambient air before the catalyst was deposited. That was very important because otherwise you will get charge-carrier recombination centers at the interface and this will reduce your overall device efficiency.”

The stability of these devices, complicated by chemical interactions between the electrolyte and the photovoltaic surface, is also still a far cry from that of today's photovoltaics, although the prototypes ran for 40 hours. “One year ago we had stabilities of a couple of seconds, and we have improved that by three-four orders of magnitude, so we are optimistic that we can improve that by another three or four orders of magnitude.”

Ultimately, higher efficiencies, starting at 18 to 20 percent, will allow the conversion of solar energy into hydrogen to become part of the burgeoning hydrogen economy. “In Germany we have a company that uses windmills connected to electrolyzers and they inject the hydrogen directly into the methane gas grid; you can do that up to five volume percent without changing the grid. This also forms some storage capacity if you have an overcapacity of electricity in the grid,” says May.
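
Because hydrogen carries far less energy per cubic meter than methane, that 5 percent blend by volume is a much smaller share by energy; a rough estimate with approximate heating values:

```python
# Energy share of a 5-percent-by-volume hydrogen blend in a methane grid (approximate heating values).
hhv_h2_mj_per_m3 = 12.7      # approximate higher heating value of hydrogen
hhv_ch4_mj_per_m3 = 39.8     # approximate higher heating value of methane
h2_volume_fraction = 0.05

h2_energy = h2_volume_fraction * hhv_h2_mj_per_m3
total_energy = h2_energy + (1 - h2_volume_fraction) * hhv_ch4_mj_per_m3
print(f"hydrogen's share of the blended energy: {h2_energy / total_energy:.1%}")   # ~1.7%
```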
