Four-Junction Solar Cell Claims Efficiency Record
Researchers in Australia have built a solar cell whose light-conversion efficiency far exceeds that of commercial solar cells.
Fuel cells convert the chemical energy stored in fuels such as hydrogen into electricity. They do so by reacting the fuel with oxygen or another oxidizing agent that can strip electrons from the fuel. An electrolyte—commonly a polymer or ceramic—interposed between the fuel and oxidizer helps shuttle ions within the fuel cell.
Fuel cells are typically more efficient and environmentally friendly than heat engines, such as the internal combustion engines that usually power cars. However, fuel cells are often limited by how well their electrolytes can prevent electrons from leaking through them at the interface where the fuel and the oxidizing agent meet. Such electron conduction not only reduces fuel cell power output, but it can also lead to catastrophic fractures in the electrolyte.
A new battery material based on nanowires that can be recharged hundreds of thousands of times could lead to commercial batteries for smartphones, laptops, appliances, cars, and spacecraft that have greatly enhanced lifespans, researchers say.
Scientists have long sought to create batteries using nanowires—strands only nanometers, or billionths of a meter, wide. Their tremendous surface area when compared with their volume makes them spectacular at storing and transferring electrical charge.
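That surface-area advantage is easy to quantify: for a long cylinder, the surface-to-volume ratio scales as the inverse of the radius. A minimal Python sketch, using illustrative dimensions rather than figures from the study:

```python
def surface_to_volume(radius_m: float) -> float:
    """Surface-to-volume ratio of a long cylinder, end caps neglected.

    Lateral area / volume = (2*pi*r*L) / (pi*r^2*L) = 2/r.
    """
    return 2.0 / radius_m

# Illustrative comparison: a 100-nm-diameter nanowire vs. a 1-mm-diameter wire
nanowire = surface_to_volume(50e-9)
bulk_wire = surface_to_volume(0.5e-3)

print(f"nanowire:  {nanowire:.2e} m^-1")
print(f"bulk wire: {bulk_wire:.2e} m^-1")
print(f"ratio:     {nanowire / bulk_wire:.0f}x")
```

Shrinking the diameter by four orders of magnitude buys a 10,000-fold gain in surface area per unit volume, which is why nanowires excel at storing and transferring charge.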
“Nanowires enable battery technology with higher power; you're able to get more current out of a battery the same size as [today’s] batteries, and make batteries smaller and get the same performance out of them as [the ones that are commercially available],” says Reginald Penner, an electrochemist at the University of California, Irvine.
The one drawback of nanowires has been their fragility, Penner says. Their thinness increases their susceptibility to dissolving and fragmenting under repeated cycles of discharging and recharging.
But now, Penner and his colleagues have developed a way to counteract that fragility. The researchers coated gold nanowires in manganese dioxide shells and formed an electrode by encasing hundreds of these nanowires together in a gel made from the same molecule found in acrylic glass. “The gel has the consistency of peanut butter,” Penner says. The scientists detailed their findings in the 20 April online edition of the journal Energy Letters.
In experiments, the scientists cycled their electrode on and off up to 200,000 times over the course of three months without detecting any loss of energy storage capacity or power and without fracturing any nanowires. Typically, nanowire-based battery materials die after 7,000 or so cycles at the most, the researchers say. “It's surprising that such a simple change can trigger such a profound change in cycle stability,” Penner says.
The researchers suspect that the highly viscous and elastic acrylic glass plasticizes the manganese dioxide shell, granting it flexibility and preventing it from cracking. A control sample of gold nanowires with manganese dioxide shells but without the acrylic glass gel lasted only about 8,000 cycles.
Future research will explore if using a gel electrolyte can enhance other kinds of nanowires, Penner says. Subsequent work can also investigate other kinds of gels, and determine what effects will result from modifying the viscosity and other properties of the gels.
The populous island of Kyushu in southwest Japan has been shaken by hundreds of earthquakes and aftershocks over the past eight days, and there is no immediate end in sight to Mother Nature’s upheavals.
The tremors have impacted manufacturing for some companies in the auto and electronics industries, while concerns are growing over the safety of Japan’s two active nuclear reactors (the only two presently online), which are located about 120 km south of where the main shaking is occurring.
Nanophotonic technology could be the key to driving up the efficiencies of solar cells, making them feasible for widespread global deployment, say researchers from the FOM Institute for Atomic and Molecular Physics (AMOLF) in the Netherlands.
The researchers published a review article in Science today describing current solar technologies and their limitations with regard to efficiency. Silicon-based solar, which is now considered a mature technology, occupies about 90 percent of the photovoltaic market, the researchers wrote. Yet, over the last few years, silicon solar cells have realized only modest gains in efficiency, stalling out in the 20-percent range.
But, according to lead author Albert Polman, advances in nanophotonics could help increase efficiencies for single-junction solar cells to 40 percent and higher, and do so cost effectively. In addition, he said, the technology could be compatible not just with silicon, but any type of solar material.
“It's really an upcoming field,” Polman says.
Efficiency and cost are the two main barriers for solar, and often, one is compromised for the sake of the other, Polman says. Using less material, such as for thin-film solar cells, brings costs down, but drags efficiency down right along with it.
Nanophotonics can be applied to existing solar technologies to harness light more effectively to increase efficiency.
When sunlight hits a solar panel, a good amount of the potential energy is lost because the light is reflected and scattered, Polman says. But nanostructures incorporated into a panel can redirect the scattered light within the solar cell, “so that the light travels back and forth within the cell and is trapped inside it,” he adds.
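The benefit of trapping light this way can be bounded with classical ray optics: for a textured slab of refractive index n, the well-known Yablonovitch limit puts the average path-length enhancement at 4n². The figure below is standard ray-optics theory rather than a number from the article; nanophotonic structures aim to approach, and in some regimes exceed, this limit:

```python
def lambertian_path_enhancement(n: float) -> float:
    """Classical ray-optics (Yablonovitch) limit on the average path-length
    enhancement for light randomized inside a slab of refractive index n."""
    return 4.0 * n ** 2

# Silicon's refractive index is roughly 3.5 in the near-infrared
n_si = 3.5
print(f"Path-length enhancement limit for silicon: "
      f"{lambertian_path_enhancement(n_si):.0f}x")
```

For silicon the limit works out to about 49 passes' worth of absorption length, which is why even very thin textured cells can absorb most of the usable sunlight.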
In the research described in the Science article, Polman's team calculated that the theoretical maximum efficiency for a single-junction monocrystalline silicon solar cell is 29.4 percent (the majority of commercial silicon panels are multicrystalline silicon, which have efficiencies of around 20.8 percent). But that's on paper. Thus far, the highest recorded experimental efficiencies are 25.6 percent for monocrystalline silicon and 21.3 percent for multicrystalline silicon.
Other materials don't fare much better. Solar cells made from gallium arsenide (GaAs) have the efficiency record for single-junction solar cells at 28.8 percent, but GaAs solar cells are expensive and mostly have niche applications for space and satellite technology, Polman says.
Meanwhile, less expensive materials like thin-film silicon, dye-sensitized titanium dioxide, and organic solar cells have not broken the 12-percent-efficiency mark.
Nanophotonic technology can help, though. Using printing techniques, nanostructures with improved light-harnessing properties can be printed onto silicon-based solar cells, Polman says. Alternatively, cells can be designed with nanostructures incorporated into them from the beginning.
Polman's lab is currently conducting small-scale experiments using a printing technique to layer nanoscale structures onto silicon solar panels, he says, and is in the midst of building larger panels to test in the field.
Incorporating such nanostructures into silicon cells could help silicon reach beyond its maximum efficiency, but even greater gains will be realized when solar cells are built that combine different materials with nanostructures.
For instance, perovskite has recently been touted as a promising material for solar cell technology; demonstrations have shown that it can reach efficiencies of 20 percent. Polman says that layering perovskite on top of silicon could provide further advantages since the two materials capture different wavelengths of light. Earlier this year, researchers demonstrated that layering perovskite on top of a silicon solar cell boosted the efficiency by 7.3 percent.
Incorporating nanostructures could provide a further boost by allowing researchers to “engineer the scattering of the light in a clever way,” he says.
Looking ahead, Polman says he envisions solar cells that make use of not just two materials, but three or four materials with complementary properties and nanophotonics to make the most use of the incoming sunlight.
“Further advances in nanophotovoltaics will lead to enhanced photocurrents, and thus enhanced efficiency, in several different PV materials and architectures,” the AMOLF team wrote, enabling “very large-scale penetration into our energy system.”
Solar cells could someday generate electricity even during rainshowers with the help of graphene, scientists say.
Rain helps solar cells operate efficiently by washing away dust and dirt that block the sun’s rays. Still, photovoltaic cells depend on light to produce electricity, and so generate a negligible amount of power when there are clouds overhead.
To find buried oil reserves, surveyors have for decades used gravity meters, or gravimeters, along with other instruments. Gravimeters are hypersensitive versions of accelerometers: They measure extremely tiny changes in the acceleration due to gravity. These minuscule changes can occur because of the presence of subterranean geological features like oil deposits. The best gravimeters in use today are the size of a shopping basket, weigh a few kilograms, and cost around US $100,000, which limits their use.
But a new postage stamp–size device developed by Scottish researchers could make oil exploration faster, easier, safer, and more economical.
The new microelectromechanical (MEMS) device, along with all of its electronics, fits in a shoebox. And according to Richard Middlemiss—the physics and astronomy graduate student at the University of Glasgow who, along with other researchers at the school’s Institute for Gravitational Research, created the gadget—it could be shrunk down to the size of a tennis ball.
The team described the breakthrough in a paper recently published in Nature. They note that the gravimeter could be made in bulk from silicon wafers, so it should cost no more than a few thousand dollars.
Right now, oil surveying is done by driving or flying low over areas with a bulky gravimeter, measuring gravity’s effect on a mass attached to a spring. Such surveys can’t assess rough terrain and “are expensive to do because of people time,” Middlemiss says.
He adds that, “With small, low-cost gravimeters, you could instead leave a network of sensors around a whole site for months.” He also imagines swarms of drones carrying ultra-light tennis-ball-size gravimeters flying low over otherwise inaccessible areas to do oil prospecting. “Instead of buying one device, you could buy a hundred; that would change the way surveys are carried out,” he says.
Middlemiss, University of Glasgow professor Giles Hammond, and their colleagues pushed gravity measurement forward by shrinking things down to the micrometer level. They carved a 15-millimeter-by-15-millimeter-by-7-micrometer piece of silicon to form a rectangular central mass attached to a square frame with three 7-µm-wide springs. The frame is hung between a white LED and a photodiode so that when the mass moves due to gravity, its shadow falling on the photodiode changes. The result: detectable changes in current.
The design is similar to the MEMS accelerometers used in smartphones. But the new gravimeter is a thousand times as sensitive as those, allowing it to detect extremely small changes in gravity.
To test the device, the team bolted the box to the floor in their building’s basement and used it to measure Earth tides, movements of the Earth’s crust due to the gravitational pull of the sun and moon. As the crust moved up and down by about 40 centimeters over the course of a day, changes in gravity moved the silicon mass in the gravimeter by 16 nanometers. That sensitivity was partly due to the thin springs, but also because the researchers enclosed the device in a copper shield and vacuum box and maintained the temperature around it to within 1 millikelvin.
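The link between that 16-nanometer displacement and the underlying gravity signal follows from the standard spring-mass relation: for signals well below the sensor's resonance, the proof mass moves by dx = dg / ω₀², so dg = (2πf₀)² · dx. A hedged Python estimate, combining the displacement quoted above with an assumed resonant frequency of a few hertz (a typical value for soft-spring MEMS gravimeters; the device's true figure is in the Nature paper):

```python
import math

def displacement_to_accel(dx_m: float, f0_hz: float) -> float:
    """For a spring-mass sensor measuring quasi-static signals, a change in
    gravity dg displaces the proof mass by dx = dg / omega0^2, so the
    implied gravity change is dg = (2*pi*f0)^2 * dx."""
    omega0 = 2.0 * math.pi * f0_hz
    return omega0 ** 2 * dx_m

# 16-nm tidal displacement from the article; f0 = 2.3 Hz is an assumption
dg = displacement_to_accel(16e-9, 2.3)
print(f"Implied gravity change: {dg:.2e} m/s^2")
```

Under these assumptions the tidal signal comes out to a few microns per second squared, i.e. a few hundred microgal, which is the right order of magnitude for Earth tides.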
Still, the device isn’t quite as sensitive as commercial gravimeters, Middlemiss says. So the researchers are working to improve its sensitivity even as they make it smaller. Later this year, they plan to test it in the field alongside its commercial counterparts. The team says it is now in talks with a geophysical exploration company and a survey instrument maker to take the technology closer to commercialization.
Besides oil exploration, the smaller, cheaper gravimeters could open up many other applications, including monitoring volcanoes by measuring magma levels under the crust, and studying geological formations and buried archeological features. As Hazel Rymer of The Open University puts it in an accompanying news and views piece: “Once these instruments become commercially available, the applications will be limited only by the user's imagination.”
This April 1st will be no laughing matter for Japan’s 10 regional power suppliers. The Japanese government has chosen All Fools’ Day to start the second phase of its energy reforms: the liberalization of the electricity retail market. The third phase, the unbundling of transmission and distribution from the generation of electricity, is to follow in 2020.
The first phase got under way last April when the government established the Organization for Cross-regional Coordination of Transmission Operators (OCCTO), an independent institution to promote and coordinate wide-area operations.
“It is OCCTO’s job to plan how to lay out an efficient transmission network across the country,” Hiroshi Ohashi, professor of economics at the University of Tokyo, told IEEE Spectrum. “It must also decide how much reserve power each region should have, as well as write up new rules governing the use of transmission rights.”
Reform became a priority, according to a 2015 Ministry of Economy, Trade and Industry report, after the Great East Japan Earthquake of 11 March 2011 revealed that: the regional monopolies were unable to transmit electricity beyond their fiefdoms; they had become inflexible in changing their energy mix; and they had difficulties in increasing the ratio of renewable energy they could use.
Now, come 1 April, not only will the Big Ten electricity providers have free rein to compete in each other’s domains, they’ll also face competition from more than 150 new start-up energy groups, as well as 30 existing small competitors. All these new entrants want a share of a retail electricity market worth more than 6.5 trillion yen (US $67 billion), representing 40 percent of the country’s total energy usage.
The gas industry will also undergo similar deregulation in April 2017. With that in mind, gas companies have become a significant presence among the electric energy start-ups by creating strategic partnerships and alliances. They’ve linked up with each other and with: the major mobile telecom firms NTT Docomo, Softbank, and KDDI’s au; major trading companies; convenience-store chains such as Lawson; and other industry sector leaders.
New entrants’ share of the electricity market, as of last summer, had already reached roughly 10 percent, according to the Institute of Energy Economics, Japan (IEEJ), a private think tank in Tokyo. By early February, 106,000 customers had applied to switch suppliers—though 99 percent of these switches will occur within the regions currently dominated by Tokyo Electric Power Co. (TEPCO) and Kansai Electric Power Co. (KEPCO, which covers Osaka and Kyoto in western Japan).
Their status as the two largest power suppliers partly accounts for this concentration of applicants in the TEPCO and KEPCO regions, as does the fact that both suppliers have a number of nuclear power stations that remain offline in the wake of the Fukushima Daiichi Power Station accident in March 2011.
“Nuclear power generation represents a major share of their energy mix, so the costs for these utility companies are rather high,” says Masakazu Toyoda, chairman and CEO of IEEJ, who briefed the press recently on the ongoing energy reforms.
Among developed economies, Japan has come late to the energy deregulation party, so it has been able to study how energy liberalization has gone in the United States and Western Europe. One area of concern noted by Toyoda is that those regions’ reforms have not brought about adequate investment in the reformed industries. This is attributed to the uncertainty over energy sources and prices, and the fact that power-generation facilities have high fixed costs, which require a lengthy time to recover.
“This [issue] has required some form of government intervention such as [introducing] the capacity market,” notes Toyoda.
A capacity market mechanism compensates suppliers for building generating capacity beyond what is presently needed but that, based on future projections, may be required in a few years or in the event of a natural disaster.
Japan, too, will require some kind of capacity market mechanism, says Ohashi, “if only to maintain existing conventional generating facilities, which face lower levels of utilization over the long term because of increasing penetration of solar and other renewable generation.”
Another area where the government may need to step in—and is already doing so—is revising the feed-in tariffs (FITs), which were set at high levels after the Fukushima Daiichi nuclear accident led to the shutdown or decommissioning of all 55 of the country’s nuclear reactors. The generous tariffs led to a surge in new producers, most of them focused on solar generation. The former regional monopolies, however, have balked at taking on more new applicants, voicing concern about their ability to maintain stable energy supplies.
According to IEEJ data, resource-poor Japan charged industrial customers 20.4 cents per kilowatt-hour in 2012; residential customers paid 29.01 cents. Thanks in part to tariff reductions, these charges had decreased to 18.8 cents and 25.3 cents respectively by 2014. By comparison, in the resource-rich United States, the average industrial customer was paying 7 cents per kWh, and the average household paid 12.5 cents in 2014.
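As a quick arithmetic check on those IEEJ figures, the decreases work out to roughly 8 percent for industrial customers and 13 percent for residential ones:

```python
def pct_drop(old: float, new: float) -> float:
    """Percentage decrease from an old value to a new one."""
    return 100.0 * (old - new) / old

# Figures quoted above, in US cents per kilowatt-hour (2012 -> 2014)
industrial = pct_drop(20.4, 18.8)
residential = pct_drop(29.01, 25.3)
print(f"Industrial rates fell {industrial:.1f}%, residential {residential:.1f}%")
```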
“FIT has been heavily biased towards unstable, renewable power supplies, especially solar,” says Ohashi. “As the government decreases FIT prices, this will become less of an important issue. But it will take time to see the consequences of this policy.”
FIT adjustments aren’t the only point of uncertainty. Given the various challenges Japan faces in transforming its energy market, it is safe to say that no one can foresee the consequences of the changes that are about to get underway on All Fools’ Day.
Just when it seemed Japan was poised to get its nuclear plants up and running again after the 2011 accident at Fukushima Daiichi brought about the shutdown of all the country’s nuclear operations, a series of mishaps has raised doubts over the government’s ability to achieve its goal of supplying 20-22 percent of Japan’s energy needs with nuclear power by 2030.
Last month, TEPCO, the regional electric utility that operated the Fukushima plant, issued a press release admitting that according to the results of a recent investigation, staffers had not followed guidelines requiring them to quickly declare a meltdown following the Daiichi accident.
“In the course of our investigations, it was discovered that TEPCO’s internal manual at the time clearly stated that a core meltdown was to be determined if the percentage of core damage exceeded 5%,” states the release. It goes on to say that, “We have confirmed that there were events where it may have been possible to issue notifications and reports more promptly immediately after the tsunami hit on March 11, 2011.”
Two days before last month’s TEPCO announcement, Kansai Electric Power Co. (KEPCO, which serves the Osaka and Kyoto regions) revealed that it had found a leak on 20 February in the filtering system of the Unit 4 reactor at its Takahama Nuclear Plant in Fukui Prefecture, some 500 kilometers west of Tokyo. A contaminated pool of water was also discovered. The incident happened during preparations to restart the reactor after Japan’s Nuclear Regulation Authority (NRA) had deemed it safe to go back online.
“Subsequently, the puddle was wiped [up] and it was confirmed that there was no remaining contamination,” the KEPCO announcement explained.
Convinced that all was well, KEPCO started up the reactor on 26 February. It shut down automatically three days later due to a “main transformer/generator internal failure,” the company reported.
But the biggest blow came on 9 March, when the District Court in Otsu, Shiga Prefecture (near the Takahama plant but, in an unprecedented move, not in the prefecture that hosts it), ordered the immediate shutdown of Units 3 and 4. The court sided with a group of local plaintiffs who argued that the plant did not satisfy all of the NRA’s safety requirements. The Unit 3 reactor had gone back online in January.
Despite this setback, KEPCO has a reasonable chance of having the injunction overturned, says Takayuki Terai, director of the Institute of Engineering Innovation at the University of Tokyo. According to Terai, KEPCO’s saving grace might come from the higher court believing that the scientific grounds for the district court’s decision are not strong and involve the impractical idea of “absolute safety.”
The court’s decision also suggested that, because the discrepancies between what happened and what should have happened in the immediate aftermath of the Fukushima Daiichi accident have not yet been fully resolved, doubts remain about the validity of the NRA’s safety standards.
“We do take issue with that,” said Shunichi Tanaka, head of the NRA, at a press conference on 22 March. “We made a thorough analysis of the causes of the accident, and we believe we have (incorporated this knowledge) in our new set of regulations.”
In addition, Tanaka said the NRA had consulted with countries around the globe and the International Atomic Energy Agency about international safety practices. “So my current position is that just because the decision (to shut down the plant) came out of the District Court of Otsu, it doesn’t mean that we need to change the regulations at this point in time.”
Nevertheless, says the University of Tokyo’s Terai, “Should there be more legal actions of this kind inside and outside the prefectures where the plants are located, the power companies would face serious problems in starting up their nuclear power plants.”
Given that some 30 lawsuits and petitions for injunctions have been reported in the press, such an outcome seems likely. Currently, the NRA is reviewing 20 nuclear reactors in 16 power stations to see if they meet the new regulatory rules. Meanwhile, the Takahama closures leave just two reactors in operation—both at the Sendai plant run by Kyushu Electric Power Co., also in western Japan.
Clearly, the power companies’ missteps are not helping the NRA’s efforts to rebuild trust with citizens—a critical factor in winning the necessary approval of local governments. Tanaka admitted as much, saying, “I can’t really dispute that.” He added, “Rebuilding trust and confidence is about building a good track record of operations over the long term. So the utilities need to understand that fully.”
One way they can do that is to get some plants up and running without fumbling, says Terai. “Then they can reduce electricity prices. They and the government also need to explain the risks and benefits of nuclear power generation to the citizens and mass media in an easy-to-understand manner.”
ARPA-E is the Advanced Research Projects Agency-Energy, a DARPA-type government agency that funds energy-related high-risk, high-reward projects. Like most government agencies, it tends to bury its projects in strained, dull-sounding acronyms like DELTA, which stands for Delivering Efficient Local Thermal Amenities. But ARPA-E's DELTA project is, fortunately, much more interesting than it sounds: DELTA is funding a bunch of different ways in which climate control can be moved from the building level to the personal level.
The problem that ARPA-E wants to solve with DELTA is the ridiculous amount of energy that we waste heating and cooling buildings that are, statistically speaking, almost entirely unoccupied. When you turn on the heat or the AC, you're dumping energy into changing the temperature of an entire structure, when all you really care about is the little area of comfort surrounding you. And if some people like it warmer and some people like it colder, one of those two groups is doomed to misery. It's a terrible, ridiculous system.
A much better approach would be to develop technologies for highly localized and customizable temperature control. Why bother heating or cooling an entire building, when all of the people inside it can instead customize their own little climate bubble to their ideal temperature? This is what ARPA-E's DELTA project is all about. Here’s a look at three different technologies from ARPA-E's annual summit that are in the process of moving from prototype to commercial reality within the next two years.