Energywise

Consequences of Japan’s Energy Market Reforms Not Easy to Predict

This April 1st will be no laughing matter for Japan’s 10 regional power suppliers. The Japanese government has chosen All Fools Day to start the second phase of its energy reforms: the liberalization of the electricity retail market. The third phase, the unbundling of transmission and distribution from the generation of electricity, is to follow in 2020.

The first phase got under way last April when the government established the Organization for Cross-regional Coordination of Transmission Operators (OCCTO), an independent institution to promote and coordinate wide-area operations.

“It is OCCTO’s job to plan how to lay out an efficient transmission network across the country,” Hiroshi Ohashi, professor of economics at the University of Tokyo, told IEEE Spectrum. “It must also decide how much reserve power each region should have, as well as write up new rules governing the use of transmission rights.”

Reform became a priority, according to a 2015 Ministry of Economy, Trade and Industry report, after the Great East Japan Earthquake of 11 March 2011 revealed that: the regional monopolies were unable to transmit electricity beyond their fiefdoms; they had become inflexible in changing their energy mix; and they had difficulties in increasing the ratio of renewable energy they could use.

Now, come 1 April, not only will the Big Ten electricity providers have free rein to compete in each other’s domains, they’ll also face competition from more than 150 new start-up energy groups, as well as 30 existing small competitors. All these new entrants want a share of a retail electricity market worth more than 6.5 trillion yen (US $67 billion), representing 40 percent of the country’s total energy usage.

The gas industry will also undergo similar deregulation in April 2017. With that in mind, gas companies have become a significant presence among the electric energy start-ups by creating strategic partnerships and alliances. They’ve linked up with each other and with: the major mobile telecom firms NTT Docomo, Softbank, and KDDI’s au; major trading companies; convenience-store chains such as Lawson; and other industry sector leaders.

New entrants’ share of the electricity market, as of last summer, had already reached roughly 10 percent, according to the Institute of Energy Economics, Japan (IEEJ), a private think tank in Tokyo. By early February, 106,000 customers had applied to switch suppliers—though 99 percent of these switches will occur within the regions currently dominated by Tokyo Electric Power Co. (TEPCO) and Kansai Electric Power Co. (KEPCO, which covers Osaka and Kyoto in western Japan).

Their status as the two largest power suppliers partly accounts for this concentration of applicants in the TEPCO and KEPCO regions, as does the fact that both suppliers have a number of nuclear power stations that remain offline in the wake of the Fukushima Daiichi Power Station accident of March 2011.

“Nuclear power generation represents a major share of their energy mix, so the costs for these utility companies are rather high,” says Masakazu Toyoda, chairman and CEO of IEEJ, who briefed the press recently on the ongoing energy reforms.

Among developed economies, Japan has come late to the energy deregulation party, so it has been able to study how energy liberalization has gone in the United States and Western Europe. One area of concern noted by Toyoda is that those regions’ reforms have not brought about adequate investment in the reformed industries. This is attributed to the uncertainty over energy sources and prices, and the fact that power-generation facilities have high fixed costs, which require a lengthy time to recover.

“This [issue] has required some form of government intervention such as [introducing] the capacity market,” notes Toyoda.

A capacity market is a mechanism that compensates suppliers for building generating capacity beyond what is presently needed but that, based on future projections, may be required in a few years or in the event of a natural disaster.

Japan, too, will require some kind of capacity market mechanism, says Ohashi. “If only to maintain existing conventional generating facilities, which face lower levels of utilization over the long term because of increasing penetration of solar and other renewable generation.”

Another area where the government may need to step in—and is already doing so—is revising feed-in tariffs (FITs), which were set at high levels after the Fukushima Daiichi nuclear accident led to the shutdown or decommissioning of all 55 of Japan’s nuclear plants. The generous tariffs triggered a surge in new producers, mainly focused on solar generation. The former regional monopolies, however, have balked at taking on more new applicants, voicing concern about their ability to maintain stable energy supplies.

According to IEEJ data, resource-poor Japan charged industrial customers 20.4 cents per kilowatt-hour in 2012; residential customers paid 29.01 cents. Thanks in part to tariff reductions, these charges had decreased to 18.8 cents and 25.3 cents respectively by 2014. By comparison, in the resource-rich United States, the average industrial customer was paying 7 cents per kWh, and the average household paid 12.5 cents in 2014.

“FIT has been heavily biased towards unstable, renewable power supplies, especially solar,” says Ohashi. “As the government decreases FIT prices, this will become less of an important issue. But it will take time to see the consequences of this policy.”

FIT adjustments aren’t the only point of uncertainty. Given the various challenges Japan faces in transforming its energy market, it is safe to say that no one can foresee the consequences of the changes that are about to get underway on All Fools Day.

Japan’s Nuclear Energy Comeback Takes a Tumble

Just when it seemed Japan was poised to get its nuclear plants up and running again after the 2011 accident at Fukushima Daiichi brought about the shutdown of all the country’s nuclear operations, a series of mishaps has raised doubts over the government’s ability to achieve its goal of supplying 20-22 percent of Japan’s energy needs with nuclear power by 2030.

Last month, TEPCO, the regional electric utility that operated the Fukushima plant, issued a press release admitting that according to the results of a recent investigation, staffers had not followed guidelines requiring them to quickly declare a meltdown following the Daiichi accident.

“In the course of our investigations, it was discovered that TEPCO’s internal manual at the time clearly stated that a core meltdown was to be determined if the percentage of core damage exceeded 5%,” states the release. It goes on to say that, “We have confirmed that there were events where it may have been possible to issue notifications and reports more promptly immediately after the tsunami hit on March 11, 2011.”

Two days before last month’s TEPCO announcement, Kansai Electric Power Co. (KEPCO, which serves the Osaka and Kyoto regions) revealed that it had found a leak on 20 February in the filtering system of the Unit 4 reactor at its Takahama Nuclear Plant in Fukui Prefecture, some 500 kilometers west of Tokyo. A contaminated pool of water was also discovered. The incident happened during preparations to restart the reactor after Japan’s Nuclear Regulation Authority (NRA) had deemed it safe to go back on line.

“Subsequently, the puddle was wiped [up] and it was confirmed that there was no remaining contamination,” the KEPCO announcement explained.

Convinced that all was well, KEPCO started up the reactor on 26 February. It shut down automatically three days later due to a “main transformer/generator internal failure,” the company reported.

But the biggest blow came on 9 March, when the district court in Otsu, Shiga Prefecture—near the Takahama plant but, in an unprecedented step, not in the prefecture that hosts it—ordered the immediate shutdown of Units 3 and 4. The court sided with a group of local plaintiffs who argued that the plant did not satisfy all of the NRA’s safety requirements. The Unit 3 reactor had gone back online in January.

Despite this setback, KEPCO has a reasonable chance of having the injunction overturned, says Takayuki Terai, director of the Institute of Engineering Innovation at the University of Tokyo. According to Terai, KEPCO’s saving grace might come from the higher court believing that the scientific grounds for the district court’s decision are not strong and involve the impractical idea of “absolute safety.”

The court’s ruling also suggested that, because the discrepancies between what happened and what should have happened in the immediate aftermath of the Fukushima Daiichi accident have never been fully resolved, doubts remain about the validity of the NRA’s safety standards.

“We do take issue with that,” said Shunichi Tanaka, head of the NRA, at a press conference on 22 March. “We made a thorough analysis of the causes of the accident, and we believe we have (incorporated this knowledge) in our new set of regulations.”

In addition, Tanaka said the NRA had consulted with countries around the globe and the International Atomic Energy Agency about international safety practices. “So my current position is that just because the decision (to shut down the plant) came out of the District Court of Otsu, it doesn’t mean that we need to change the regulations at this point in time.”

Nevertheless, says the University of Tokyo’s Terai, “Should there be more legal actions of this kind inside and outside the prefectures where the plants are located, the power companies would face serious problems in starting up their nuclear power plants.”

Given that some 30 lawsuits and petitions for injunctions have been reported in the press, such an outcome seems likely. Currently, the NRA is reviewing 20 nuclear reactors in 16 power stations to see if they meet the new regulatory rules. Meanwhile, the Takahama closures leave just two reactors in operation—both at the Sendai plant run by Kyushu Electric Power Co., also in western Japan.

Clearly, the power companies’ missteps are not helping the NRA’s efforts to rebuild trust with citizens—a critical factor in winning the necessary approval of local governments. Tanaka admitted as much, saying, “I can’t really dispute that.” He added, “Rebuilding trust and confidence is about building a good track record of operations over the long term. So the utilities need to understand that fully.”

One way they can do that, says Terai, is to get some plants up and running without fumbling. “Then they can reduce electricity prices. They and the government also need to explain the risks and benefits of nuclear power generation to the citizens and mass media in an easy-to-understand manner.”

ARPA-E Funding Personal Climate Control Systems with Robots, Foot Coolers, and More

ARPA-E is the Advanced Research Projects Agency-Energy, a DARPA-style government agency that funds high-risk, high-reward energy projects. Like most government agencies, it tends to bury its projects in strained, dull-sounding acronyms like DELTA, which stands for Delivering Efficient Local Thermal Amenities. But ARPA-E's DELTA project is, fortunately, much more interesting than it sounds: DELTA is funding a bunch of different ways in which climate control can be moved from the building level to the personal level.

The problem that ARPA-E wants to solve with DELTA is the ridiculous amount of energy that we waste heating and cooling buildings that are, statistically speaking, almost entirely unoccupied. When you turn on the heat or the AC, you're dumping energy into changing the temperature of an entire structure, when all you really care about is the little area of comfort surrounding you. And if some people like it warmer and some people like it colder, one of those two groups is doomed to misery. It's a terrible, ridiculous system.

A much better approach would be to develop technologies for highly localized and customizable temperature control. Why bother heating or cooling an entire building, when all of the people inside it can instead customize their own little climate bubble to their ideal temperature? This is what ARPA-E's DELTA project is all about. Here’s a look at three different technologies from ARPA-E's annual summit that are in the process of moving from prototype to commercial reality within the next two years.


NASA to Test Upgraded Earth Models for Solar Storm Threat

NASA relies on spacecraft to help keep watch for solar storms erupting on the sun. But on the ground, the space agency’s Solar Shield project aims to improve computer simulations that predict if those solar storms will create electrical hazards for power plants and transmission lines.


How to Pinpoint Radioactive Releases: Put the Genie Back in the Bottle

In 2014, a 55-gallon drum of nuclear waste exploded at the underground Waste Isolation Pilot Plant (WIPP) in New Mexico, the world’s only underground repository for transuranic waste (radioactive isotopes, mostly with very long half-lives). The amount of radioactive material released into the atmosphere was below regulatory limits, but the facility has been closed for recovery since the incident.

One positive result came from this accident: Robert Hayes, a nuclear engineer at North Carolina State University in Raleigh, was subsequently able to demonstrate how it is possible to pinpoint the site of the release and to measure the amount of nuclear material released. Hayes used sophisticated software, meteorological data supplied by the National Oceanic and Atmospheric Administration (NOAA), and information from air sampling and monitoring equipment located several kilometers from the WIPP facility. His research will be published in the April 2016 issue of the journal Health Physics.

Hayes’ approach offers a wide field of possibilities for investigating and locating releases of radioactive material into the atmosphere—whether released accidentally from, say, a vent at a nuclear facility, or dispersed deliberately by a dirty bomb. The principle is simple: By attempting to get the genie back into the bottle (figuratively, of course), you find out where the bottle is.

Let Hayes explain. When radioactive material is released into the atmosphere, it spreads out in directions defined by meteorological conditions, mainly wind. By collecting data from different sampling stations in an area, and by looking at how the concentration of the radioactive material evolves over time, it is possible to map how and in which direction it has moved. Then, using recent meteorological data, computer modeling can, in a sense, back-extrapolate the cloud until it is concentrated above the site of the radiological release. Hayes notes, however, that this method works only if you know when the release of radionuclides happened. A nuclear explosion, for example, can be pinpointed precisely because the time of the explosion can be verified independently with seismic or optical data.

When you don’t know the time of the release, you have to use some tricks. “You still back-extrapolate, but you make the assumption that it took place an hour ago, two hours ago, three hours ago. For these periods of time, you then back-extrapolate, and once you find a location that is credible—let’s say, a nuclear facility—then you pretty much know where the radiation came from, and you have the time when the radioactivity was released,” says Hayes.
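The unknown-release-time trick can be sketched with a toy model. Everything below—the one-dimensional Gaussian puff, the wind speed, the diffusivity, and the sampler layout—is an invented illustration, not Hayes’ actual software: forward-model each hypothesized release site and time, and keep the hypothesis that best reproduces the sampler readings.

```python
import numpy as np

def gaussian_puff(x, t, x0, t0, q=1.0, u=3.0, k=50.0):
    """Concentration at sampler position x (m) and time t (s) from an
    instantaneous release of q units at site x0 and time t0, carried by
    wind speed u (m/s) and spread by eddy diffusivity k (m^2/s)."""
    dt = t - t0
    if dt <= 0:
        return 0.0  # the puff has not been released yet
    sigma2 = 2.0 * k * dt
    return q / np.sqrt(2.0 * np.pi * sigma2) * \
        np.exp(-(x - x0 - u * dt) ** 2 / (2.0 * sigma2))

# Synthetic "observations": a release at x0 = 0 m, t0 = 0 s,
# measured by three samplers at three times each.
stations = [5000.0, 8000.0, 12000.0]   # sampler positions (m)
times = [1800.0, 3600.0, 5400.0]       # sampling times (s)
obs = [gaussian_puff(x, t, 0.0, 0.0) for x in stations for t in times]

# Back-extrapolation with unknown release time: scan hypothesized
# sites and times, forward-model each, keep the least-squares best.
best = None
for x0 in np.linspace(-2000.0, 2000.0, 81):
    for t0 in np.linspace(-1200.0, 1200.0, 49):
        pred = [gaussian_puff(x, t, x0, t0) for x in stations for t in times]
        err = sum((p - o) ** 2 for p, o in zip(pred, obs))
        if best is None or err < best[0]:
            best = (err, x0, t0)

print(f"recovered release site: {best[1]:.0f} m, release time: {best[2]:.0f} s")
```

Real dispersion models work in three dimensions with time-varying wind fields, but the search structure is the same: each candidate release time defines a back-trajectory, and credibility is judged by how well the forward model matches the measurements.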

The location and release time aren’t the only things that can be deduced from these measurements. “If you already know the time and location, you have a much better scenario because you are able to extrapolate much more accurately and find out how much radioactive material actually was released and the time profile of that release,” says Hayes.

Detecting and locating releases from nuclear facilities in areas that are not accessible to investigators, such as North Korea, presents an additional difficulty. The residue of similar radioactive isotopes released into the atmosphere by last century’s nuclear bomb tests makes it a challenge to sniff out the new material without accompanying information. But one clue researchers are trying to exploit is the difference in the ratio of radioactive isotopes between recently released nuclear material and older fallout. Those differences can be detected because the materials have different half-lives.

“We are looking at differences in those ratios, and they are the ‘fingerprints’ of recent events. For example, in North Korea, radioactive releases by nuclear facilities would be difficult to prevent, and this would provide a potential fingerprint that can be identified off-site,” says Hayes.
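The fingerprint rests on simple decay arithmetic: two isotopes released together drift apart in ratio at a rate set by their half-lives, so a near-original ratio marks a fresh release. A minimal sketch (the equal initial amounts and the 2-year and 30-year half-lives below are hypothetical, chosen only for illustration):

```python
import math

def atom_ratio(n1_0, n2_0, t_half1, t_half2, t):
    """Ratio N1/N2 after time t, for isotopes with half-lives
    t_half1 and t_half2 (all times in the same units)."""
    lam1 = math.log(2.0) / t_half1  # decay constant of isotope 1
    lam2 = math.log(2.0) / t_half2  # decay constant of isotope 2
    return (n1_0 * math.exp(-lam1 * t)) / (n2_0 * math.exp(-lam2 * t))

# A fresh release keeps a ratio near 1, while decades-old fallout
# retains almost none of the short-lived partner.
for years in (0.0, 5.0, 50.0):
    ratio = atom_ratio(1.0, 1.0, 2.0, 30.0, years)
    print(f"after {years:>4.0f} y: ratio = {ratio:.2e}")
```

After 50 years the short-lived isotope has all but vanished, which is why an off-site sample with a high ratio points to a recent event rather than legacy fallout.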

Beetles, Cacti, and Killer Plants Inspire Energy Efficiency

What do you get when you mix a desert beetle, a pitcher plant, and a cactus? Pick the right parts and you get an extremely slippery surface with an uncanny capacity to condense and collect water, according to research reported today in the journal Nature.

The advance could be a big deal for the energy world because, when it comes to energy efficiency, condensation lies somewhere between a necessary evil and a major drag. Nuclear, coal, and thermal solar power plants, for example, require large heat exchangers to condense the steam exiting from their turbines so that they can raise a new round of hotter steam. For other devices, such as wind turbines and refrigerator coils, condensation is the first step towards energy-sapping ice formation. 

Over the past decade rapid advances in biologically inspired materials have raised hopes for accelerating or guiding condensation via engineered surfaces, promising unprecedented control over water.

At least one bio-inspired surface treatment is in commercial use: The NeverWet surface protectant that paint manufacturer Rust-Oleum began selling in 2013, for example, contains nanoparticles that self-assemble to form a water-shedding surface roughness akin to that of a lotus leaf.

The research published today by Harvard University materials scientists pushes this biomimicry trend to the max with a triply bio-inspired surface [at left in video above]. Its most novel element is an array of asymmetric, 0.9-millimeter-tall mounds, inspired by the condensation-promoting bumps adorning the backs of desert beetles such as Namibia’s Stenocara darkling beetles [photo].

Photo: Martin Harvey/Alamy
Darkling beetle harvesting fog in the Namibian desert.

Joanna Aizenberg, whose Harvard lab produced the work, says their paper provides the first theoretical framework explaining how convex bumps promote the condensation of water vapor into droplets—a trick that helps some desert beetles harvest moisture from fog. 

Droplets formed on the Harvard team’s mounds glom together and roll off via a side ramp modelled after the water-droplet-guiding concavity of cactus spines. The final element is a network of nano-pores across the entire surface, infused with lubricant to create a slippery finish modelled after the trap of the carnivorous pitcher plant, in which prey literally hydroplane to their demise.

Harvard has already refined that final element—lubricant infused nano-porosity—into an impressive water-hustling technology in its own right. Their so-called SLIPS design is en route to commercialization via SLIPS Technologies, a spin-off launched in 2014 with US $3 million in venture funding and a joint development agreement with German chemicals giant BASF. 

SLIPS Technologies chief technology officer Philseok Kim, a coauthor on today’s paper, says SLIPS-produced surface treatments and films are presently being tested on a skyscraper in New York City to document their ability to prevent ice and snow accumulation. Other applications are coming, including an anti-fouling coating for marine environments.

Harvard’s latest surface, however, is several times better at condensing water into droplets and hustling those droplets away than SLIPS alone [at right in the video above]. That could be of big value for heat exchange surfaces in power plant condensers, says Aizenberg, because droplets form an insulating barrier that slows further condensation: “You want to make sure they’re effectively removed from the surface as fast as possible.”

In fact, argues Aizenberg, the water condensing performance may be fast enough to enable a directly bio-inspired application: moisture harvesting systems for remote communities. Their surface may be able to condense moisture and collect it fast enough, even in arid environments where water droplets evaporate quickly. 

Tak-Sing Wong, a materials scientist at Pennsylvania State University who is designing his own bio-inspired slippery surfaces, says the Harvard work could double the efficiency of power plant heat exchangers. “Forming droplets that can shed off of the surface is very important because it takes heat away immediately. The amount of water collected will be proportional to the heat that’s taken away from the surface,” says Wong.

Other energy-related devices in line to benefit are refrigerators, which periodically heat their coolant coils in a constant battle against frost buildup. Wong says bio-inspired coatings could ultimately reduce refrigerators’ electrical consumption by 30 percent or more.

Commercialization could come quickly, he says, because these sophisticated designs need not be difficult to manufacture. Mature embossing and imprinting methods can produce millimeter-scale bumps in a wide range of metals and other materials, while the nanotexturing required for SLIPS can be etched into materials with acids or high-temperature steam. With the right application, Wong bets that the first commercial uses will begin in as little as two to five years.

Achieving Paris Climate Targets Could Save Nearly 300,000 American Lives

When the world's nations set a goal in Paris last year of limiting the average temperature increase to no more than 2 degrees Celsius, the impact on health wasn't front and center. But a study published today in Nature Climate Change suggests that if the United States reduces emissions from the transportation and electricity sectors enough to meet those targets, 295,000 American lives could be saved by 2030.


Supercapacitor On-a-Chip Now One Step Closer

In 2010 Spectrum reported a new approach for creating chip-scale supercapacitors on silicon wafers, proposed by researchers at Drexel University in Philadelphia and the Université Paul Sabatier in Toulouse, France. In an article published in Science, the researchers described how to make supercapacitor electrodes from porous carbon that could stick to the surface of silicon wafers so that they could be micromachined into electrodes for on-chip supercapacitors.

Now the same team has finally succeeded in doing just that. 

In a paper published in this week’s Science, researchers from the two initial teams report creating efficient porous carbon electrodes that really do stick to the surface of a silicon wafer. They made layers of porous carbide-derived carbon (CDC) that are completely compatible with all treatments used in the semiconductor industry, says Patrice Simon, a researcher at Université Paul Sabatier who has studied porous CDC electrodes over the last ten years and co-authored both the 2010 paper and this week’s.

In an initial experiment, circa 2010, the researchers exposed a 5-mm-thick ceramic plate of titanium carbide to chlorine at 500 °C. They obtained a porous carbon layer on its surface that would fit the bill as electrode material for a supercapacitor. This layer, called carbide-derived carbon, contained pores that could store large numbers of ions from the electrolyte of a supercapacitor. Deposited on the surface of a silicon wafer, the researchers theorized, CDC could be micromachined into electrodes.

But to make their theory reality, the researchers had to resolve several issues, such as creating a CDC layer that would actually stick to the surface of a silicon wafer and that would be capable of absorbing sufficient ions to allow storing energy in a supercapacitor.

Fast-forward to today, when the researchers report that they made mechanically stable porous carbon films by fine-tuning the chlorination process. Using a standard sputtering technique, they deposited a 6.3-micrometer-thick layer of titanium carbide on the thin insulating SiO2 layer of a standard silicon wafer. Then they exposed this layer to chlorine at 450 °C. The chlorination reaction progressed from the top of the titanium carbide (TiC) layer to the bottom, as chlorine atoms snatched up the titanium atoms, leaving empty 0.6-nanometer pores in the carbon layer.

"We stopped the chlorination before the whole titanium carbide layer became transformed into CDC," says Simon. To their surprise the researchers found that the thin residual one-micrometer TiC layer caused the CDC film to adhere strongly to the SiO2 layer.  As a bonus, the TiC layer, being an electric conductor, allows electrons to polarize the CDC layer, which then attracts ions into its pores to compensate for the charge, says Simon.

Using standard microfabrication methods, the team fabricated 2-by-2-mm supercapacitors with 18 interdigitated CDC electrodes on a 7.62-cm silicon wafer and reported a capacitance of 170 millifarads per square centimeter, an energy-storage density that outperforms current state-of-the-art micro-supercapacitors.
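To put an areal capacitance of that order in perspective, the standard relation E = ½CV² gives the energy one small device could hold. The numbers below are illustrative only: the 170 mF/cm² figure is taken at that order of magnitude, and the 2.7 V operating window is an assumption typical of organic electrolytes, not a value from the paper.

```python
# E = 1/2 * C * V^2 for a supercapacitor.
cap_areal = 0.170        # farads per square centimeter (illustrative)
voltage = 2.7            # volts (assumed organic-electrolyte window)
area_cm2 = 0.2 * 0.2     # one 2 mm x 2 mm device, in cm^2

capacitance = cap_areal * area_cm2           # farads for this device
energy_j = 0.5 * capacitance * voltage ** 2  # joules stored
print(f"stored energy per device: {energy_j * 1000.0:.1f} mJ")
```

Tens of millijoules in a 4-square-millimeter footprint is what makes such films interesting for powering on-chip sensors, where batteries are impractical.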

A second, and welcome, surprise was that if the initial titanium carbide layer is chlorinated completely, changing the entire layer into CDC, the film can simply be peeled off the silicon wafer. “The film doesn’t become brittle and remains mechanically stable,” says Simon. This opens the way to flexible supercapacitors that could be glued onto flexible materials, such as polyethylene, he says.

New Materials Push Solar-to-Hydrogen Closer

The term “solar energy” usually conjures up visions of blue glass rectangles, absorbing sunlight and turning it into electricity. But there’s another way to take advantage of the sun’s power—use it to create hydrogen fuel.


Fusion Stellarator Wendelstein 7-X Fires Up for Real

Today German Chancellor Angela Merkel, at a ceremony at the Max Planck Institute for Plasma Physics in Greifswald, Germany, pressed a button that caused a two-megawatt pulse of microwave radiation to heat hydrogen gas to 80 million degrees for a quarter of a second.

No, she was not setting off some new kind of hydrogen bomb. She was inaugurating the fusion reactor Wendelstein 7-X, the world’s largest stellarator, by generating its first hydrogen plasma.

