

Full Cost of Electricity: Modeling Natural Gas Prices Offers Insight Into Future Fuel Prices


In any business investment, price forecasting plays an important role in determining the viability of the project. Energy investing is no different, especially when it comes to fuel for a generating project. Quicker to construct than coal-fired power plants, natural-gas-fired projects offer environmental gains, but fuel prices are one key—if volatile—element of their success. Our role in the Full Cost of Electricity (FCe-) project by the University of Texas at Austin Energy Institute? Come up with a model that forecasts the long-term level of natural gas prices.

There are several approaches to developing long-term forecasts for commodity prices, including many types of econometric models, equilibrium models, and expert survey forecasts. We use an approach that is based upon calibrating some of the commonly used stochastic process models [PDF] with data from the commodities markets. (For an explanation of commodities markets, spot prices, futures, and other market modeling terms, see this article. For more on our specific model details, equations, and math, see our white paper [PDF].)

Data, equations, and models in hand, we used a two-factor economic model (one driven by two underlying sources of uncertainty) to develop forecasts and confidence intervals for both the risk-neutral price (the price consistent with an investor who demands no premium for bearing risk) and the expected spot price (the forecast of the market price for immediate physical delivery). The risk-neutral version of the model yields a slightly lower forecast. The historical data show that from 2009 through 2014, spot prices oscillated around a mean just under US $4 per million Btu, and then dropped significantly and rapidly in 2015.
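
Our white paper gives the exact equations and the calibrated parameter values. Purely to illustrate the mechanics, here is a minimal Python sketch of a Schwartz-Smith-style two-factor simulation, a common specification for commodity prices; every parameter value below is invented for illustration, not taken from our calibration:

```python
import numpy as np

# Minimal sketch of a Schwartz-Smith-style two-factor price model:
# ln(S_t) = chi_t + xi_t, where chi_t is a short-term, mean-reverting
# deviation and xi_t is a long-term level that drifts like a random
# walk. All parameter values are invented for illustration; the FCe-
# white paper reports the actual calibrated values.

rng = np.random.default_rng(42)

kappa = 1.2        # mean-reversion speed of the short-term factor
sigma_chi = 0.35   # short-term volatility
mu_xi = 0.02       # long-term drift under the "true" (expected) measure
mu_xi_rn = 0.01    # risk-neutral drift (lower, hence a lower forecast)
sigma_xi = 0.10    # long-term volatility

years, n_paths = 9, 10_000   # horizon out to 2025, number of sample paths
dt = 1.0 / 12.0              # monthly steps
n_steps = 12 * years

def simulate(drift):
    """Simulate spot-price paths; returns an (n_steps, n_paths) array."""
    chi = np.full(n_paths, 0.10)          # initial short-term deviation
    xi = np.full(n_paths, np.log(2.8))    # initial long-term level (~$2.80)
    prices = np.empty((n_steps, n_paths))
    for t in range(n_steps):
        chi += -kappa * chi * dt + sigma_chi * np.sqrt(dt) * rng.standard_normal(n_paths)
        xi += drift * dt + sigma_xi * np.sqrt(dt) * rng.standard_normal(n_paths)
        prices[t] = np.exp(chi + xi)
    return prices

expected = simulate(mu_xi)          # expected spot-price paths
risk_neutral = simulate(mu_xi_rn)   # risk-neutral paths

final = expected[-1]
print(f"Expected spot price in {years} years: ${final.mean():.2f}/MMBtu")
print(f"90% confidence envelope: ${np.percentile(final, 5):.2f} "
      f"to ${np.percentile(final, 95):.2f}")
print(f"Risk-neutral forecast: ${risk_neutral[-1].mean():.2f}/MMBtu")
```

The risk-neutral run uses a lower long-term drift, which is why it produces the slightly lower forecast noted above.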

When the model is calibrated to shale-gas-era futures price data, leaving out the very high historical prices before 2009, the expected spot price is forecast to recover to only about $3.00 per million Btu over the next two years, and then grow at a modest rate to $4.35 per million Btu by the end of the forecast horizon, 2025 in our calculation. This forecast aligns with the Low Oil Price case projections from the Energy Information Administration’s 2015 Annual Energy Outlook, coming in slightly lower than the Reference case.

Our research shows that the choice of data set has some effect on the two-factor model’s parameter estimates and the resulting forecast. The longer-term data set yields a slightly lower forecast, thanks to the long downward trend from the high prices realized in the middle to latter part of the decade from 2000 to 2010. With either data set, however, the forecasts roughly align with the High Oil and Gas Resource and Low Oil Price scenarios from the EIA’s 2015 Annual Energy Outlook, two outcomes that seem increasingly likely as judged by market sentiment. This market-based forecasting model offers the added benefits of simple updating (as new futures data become available) and a statistical basis for uncertainty analysis, through the confidence envelope around the future expected spot prices.

For more information, read the full report, “Market-Calibrated Forecasts for Natural Gas Prices” [PDF] on the University of Texas Energy Institute’s page for The Full Cost of Electricity.

Warren J. Hahn and James S. Dyer are professors at the University of Texas at Austin’s McCombs School of Business.

John Goodenough, co-inventor of the lithium-ion battery, heads a team of researchers developing the technology that could one day supplant it

Will a New Glass Battery Accelerate the End of Oil?

Electric car purchases have been on the rise lately, posting an estimated 60 percent growth rate last year. They’re poised for rapid adoption by 2022, when EVs are projected to cost the same as internal-combustion cars. However, these estimates all presume that the incumbent lithium-ion battery remains the go-to EV power source. So when researchers at the University of Texas at Austin unveiled a promising new lithium- or sodium-glass battery technology this week, it threatened to accelerate even the rosiest projections for battery-powered cars.

“I think we have the possibility of doing what we’ve been trying to do for the last 20 years,” says John Goodenough, coinventor of the now ubiquitous lithium-ion battery and emeritus professor at the Cockrell School of Engineering at the University of Texas, Austin. “That is, to get an electric car that will be competitive in cost and convenience with the internal combustion engine.” Goodenough added that this new battery technology could also store intermittent solar and wind power on the electric grid.

Yet the world has seen alleged game-changing battery breakthroughs come to naught before. In 2014, for instance, Japanese researchers offered up a new cotton-based (!) battery design that was touted as “energy dense, reliable, safe, and sustainable.” And if the cotton battery is still going to change the world, its promoters could certainly use a new wave of press releases: An Internet search on the technology today produces links no more current than 2014-2015 vintage.

So, on whose authority might one claim a glass battery could be any different?

For starters, Donald Sadoway’s. Sadoway, a preeminent battery researcher and MIT materials science and engineering professor, says, “When John Goodenough makes an announcement, I pay attention. He’s tops in the field and really a fantastic scientist. So, his pronouncements are worth listening to.”

Goodenough himself says that when he first coinvented the lithium-ion battery in the 1980s, almost no one in the battery or consumer electronics industries took the innovation seriously. It was only Japanese labs and companies like Sony that first began to explore the world we all today inhabit—with lithium-ions powering nearly every portable device in the marketplace, as well as electric vehicles and even next-generation airliners.

In other words, who better than Goodenough to cocreate the technology that could one day supplant his mighty lithium-ion battery?

The new battery technology uses a form of glass, doped with reactive “alkali” metals like lithium or sodium, as the battery’s electrolyte (the medium between cathode and anode that ions travel across when the battery charges and discharges). As outlined in a research paper and a recent patent filing (of which Goodenough, 94, says more are forthcoming), the lithium- or sodium-doped glass electrolyte offers a new medium for novel battery chemistry and physics.

The researchers find, for instance, that the lithium- or sodium-glass battery has three times the energy storage capacity of a comparable lithium-ion battery. What’s more, its electrolyte is neither flammable nor volatile, and it doesn’t appear to build up the spiky “dendrites” that plague lithium-ion batteries as they repeatedly charge and discharge, and that can ultimately short a battery out, sometimes sparking fires. So if glass batteries can be scaled up commercially (still an open question for research at the proof-of-concept stage), the frightening phenomenon of flaming or exploding laptops, smartphones, and EVs could become a thing of the past.

Moreover, says lithium-glass battery codeveloper Maria Helena Braga, a visiting research fellow at UT Austin and engineering professor at the University of Porto in Portugal, the glass battery charges in “minutes rather than hours.” This, she says, is because the lithium- or sodium-doped glass endows the battery with a far greater capacity to store energy in the electric field. So, the battery can, in this sense, behave a little more like a lightning-fast supercapacitor. (In technical terms, the battery’s glass electrolyte endows it with a higher so-called dielectric constant than the volatile organic liquid electrolyte in a lithium-ion battery.)
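
To see why a higher dielectric constant should help, recall the textbook expression for the energy stored in a capacitor (an analogy only; a battery’s electrochemistry is far richer than a parallel-plate capacitor):

$$U = \tfrac{1}{2} C V^{2}, \qquad C = \frac{\kappa \varepsilon_{0} A}{d}$$

At a fixed voltage, then, the energy held in the electric field grows linearly with the dielectric constant κ.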

In addition, Braga says, early tests of the technology suggest it’s capable of perhaps thousands of charge-discharge cycles, and could perform well in both extremely cold and hot weather. (Initial estimates put its operating range at below -20 °C up to 60 °C.) And if the researchers can switch the battery’s ionic messenger from lithium to sodium, they could source the batteries’ raw material more reliably and sustainably: Rather than turning to controversial mining operations in a few South American countries for lithium, they could draw sodium in essentially limitless supply from the world’s seawater.

Sadoway says he’s eager to learn more about the technology as it continues to be developed. In particular, he’s paying attention not so much to how quickly the battery charges but how well it can retain its energy. “The issue is not can you do something at a high charge rate,” he says. “My big question is about capacity fade and service lifetime.”

But, Sadoway adds, perhaps the chief innovation behind Goodenough and Braga’s technology is the possibility that they’ve solved the flaming and exploding battery problem.

“Addressing the [battery] safety issue is, I think, a giant step forward,” he says. “People have been talking about solid-state electrolytes for 20 years. But I can’t point to a commercial product yet…. If he can give us an electrolyte that is devoid of these flammable, organic solvents, that’s salutary in my opinion.”

If Goodenough, Braga, and collaborators can ramp up their technology, there would clearly be plenty of upsides. Goodenough says the team’s anode and electrolyte are more or less ready for prime time. But they’re still figuring out if and how they can make a cathode that will bring the promise of their technology to the commercial marketplace.

“The next step is to verify that the cathode problem is solved,” Goodenough says. “And when we do [that] we can scale up to large-scale cells. So far, we’ve made jelly-roll cells, and it looks like they’re working fairly well. So I’m fairly optimistic we’ll get there. But the development is going to be with the battery manufacturers. I don’t want to do development. I don’t want to be going into business. I’m 94. I don’t need the money.”


Kemper County and the Perils of Clean Coal Technology

Politicians who talk about the future of “clean coal” as part of the U.S. energy mix need look no farther than the Kemper County Energy Facility in Mississippi to see both the promise and the peril that the technology has to offer.

Kemper is years behind schedule and billions of dollars over the $2.2-billion cost estimate given in 2010 when construction began. And a recent financial analysis paints a dim picture of the plant’s potential for profit.


How EPA Calculates the Cost of Environmental Compliance for Electricity Generators


People pay for electricity directly, out of pocket, when they pay their electric bill. But they may also pay in an indirect way, when they bear the environmental and health costs associated with pollution from electricity generation. With a new EPA administrator recently installed, how these costs are calculated is under new scrutiny. The University of Texas Energy Institute’s Full Cost of Electricity Study includes estimates of these environmental pollution costs as one part of the full system cost of electricity.

A well-established body of literature at the intersection of toxicology, epidemiology, and economics governs how the Environmental Protection Agency estimates the benefits of regulations that reduce pollution from power plants. As part of the University of Texas Energy Institute’s Full Cost of Electricity (FCe-) Study, my colleagues and I took a deep dive into the cost of these environmental externalities. Our goal: describe in detail how the EPA estimates the dollar value of pollution reductions.

Whenever the EPA proposes a major new rule, it undertakes a rigorous analysis, comparing a benefit estimate with its estimate of the societal costs of complying with the proposed rule. Our analysis [PDF] illustrates how the EPA completed this kind of analysis for three recent, major rules targeting fossil-fueled power plants: the Cross-State Air Pollution Rule (regulating pollutant transport to downwind communities), the Mercury and Air Toxics Rule, and the Clean Power Plan (regulating greenhouse gas emissions).

In each of these three rulemakings, the EPA concluded that the health and environmental benefits greatly exceeded compliance costs, even though in some cases compliance costs were in the billions of dollars.

These analyses are not without controversy. Many dispute the dollar value that the EPA places on a premature death, and many others disagree with the value assigned to a ton of carbon emissions. For the mercury rule and the greenhouse gas rule, benefits dwarf costs only because of so-called co-benefits—reduction of pollution other than the pollutant targeted by the rule.
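
To make the structure of such a comparison concrete, here is a stylized sketch of the arithmetic. This is not the EPA’s model, and every figure below is invented for illustration; the agency’s actual analyses rest on detailed air-quality, epidemiological, and economic modeling:

```python
# Stylized benefit-cost arithmetic for a hypothetical power-plant rule.
# Every figure is invented for illustration; the EPA's actual analyses
# use detailed air-quality and epidemiological models.

VSL = 9.0e6          # value of a statistical life, in dollars (illustrative)
SCC = 40.0           # social cost of carbon, dollars per ton (illustrative)

avoided_deaths = 1_500       # premature deaths avoided per year
avoided_co2_tons = 50e6      # tons of CO2 avoided per year (a "co-benefit"
                             # when the rule targets a different pollutant)
other_benefits = 2.0e9       # avoided illness, improved visibility, etc.

compliance_cost = 9.5e9      # annual industry cost of complying

benefits = avoided_deaths * VSL + avoided_co2_tons * SCC + other_benefits
print(f"Monetized benefits: ${benefits / 1e9:.1f} billion per year")
print(f"Compliance costs:   ${compliance_cost / 1e9:.1f} billion per year")
print(f"Net benefits:       ${(benefits - compliance_cost) / 1e9:.1f} billion per year")
```

Removing a co-benefit line item from the tally can materially change the comparison, which is exactly why co-benefits draw such scrutiny.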

These and other measurement issues are laid out in our white paper, “EPA’s Valuation of Environmental Externalities from Electricity Production” [PDF].

David Spence is a professor at the McCombs School of Business and School of Law, part of the University of Texas at Austin.

Software coders William Stevens, from left, Michael Harrison, and Brack Quillen work on computers at the Bit Source LLC office in Pikeville, Kentucky, U.S., on Monday, Feb. 1, 2016.

The Kentucky Startup That Is Teaching Coal Miners to Code

Coal’s role in American electricity generation is fast diminishing. A few large coal-mining companies declared bankruptcy last year, and several coal power plants have been shuttered. The biggest loss in all this has been felt by the tens of thousands of coal miners who have been laid off. But despite the U.S. president’s campaign pledges, those jobs are going to be hard to bring back. Besides facing competition from natural gas and cheaper renewables, coal mining, like mining in general, is losing jobs to automation.

But now, a small startup in the middle of Appalachian coal country has a forward-looking plan to put miners back to work. Pikeville, Ky.-based Bit Source has trained displaced coal industry veterans in Web and software development.

The retrained workers now design and develop websites, tools, games, and apps. Bit Source is proving that coal miners can, indeed, learn how to code, and do it well.

“Appalachia has been exporting coal for a long time,” says Justin Hall, the company’s president. “Now we want to export code. We’ve got blue-collar coders. It’s the vision of the future of work: How do you train a workforce and adapt it for technology.”


What Role Do Household Incomes Play in the Full Cost of Electricity?


Electricity is something many of us take for granted. Flip a switch and lights come on, air-conditioning fires up, and computers hum. But how much will new energy cost relative to the income of the people who will consume it? It’s not something most of us in the United States think about as we flip that switch, but it is something we need to understand as we build next-generation power plants and the grid to move that electricity to demand centers.

As part of The Full Cost of Electricity project of the University of Texas at Austin Energy Institute, we wanted to ensure that public and policy discussions had baseline information. So we asked a few simple questions: “How much are households paying for household energy overall? How much of this cost is for electricity? How does this cost compare to incomes?” To answer these questions, we used the data acquired by the U.S. Energy Information Administration, via its Residential Energy Consumption Survey, for the state of Texas [PDF].

Besides being our home, Texas is a microcosm: We have rural and urban areas, flat and hilly country, desertlike areas and coastline.

Here's what we found:

  • Twenty-two percent of Texas households are energy burdened, spending more than 8 percent of income on household energy, and 16 percent of households spend more than 10 percent (the arithmetic behind such shares is sketched after this list)
  • The vast majority of the cost of household energy is for electricity
  • Fifteen percent of Texas households spend more than 8 percent of household income on electricity alone, and 11 percent spend more than 10 percent
  • Higher incomes translate to higher household electricity consumption, but there are important differences between urban and rural households
  • Other than income, there are several demographic variables that explain whether or not a household spends more than 8 percent on energy. For example, a household where someone is at home during the work day is more likely to be energy burdened
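
For readers who want to replicate the flavor of the analysis, here is a minimal Python sketch of the energy-burden computation. The records and field layout are invented for illustration, and a real RECS-based analysis would also apply the survey’s sampling weights:

```python
# Minimal sketch: computing energy-burden shares from household
# microdata. The sample records are invented for illustration.

households = [
    # (annual_income, annual_energy_cost, annual_electricity_cost)
    (24_000, 2_300, 1_900),
    (61_000, 2_800, 2_200),
    (105_000, 3_400, 2_600),
    (18_000, 2_100, 1_800),
]

def share_burdened(records, cost_index, threshold):
    """Fraction of households spending more than `threshold` of income."""
    burdened = [r for r in records if r[cost_index] / r[0] > threshold]
    return len(burdened) / len(records)

print("Share spending >8% of income on energy:     ",
      share_burdened(households, 1, 0.08))
print("Share spending >8% of income on electricity:",
      share_burdened(households, 2, 0.08))
```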

For the details, read the full white paper, "Household Energy Costs for Texans" [PDF] on the Energy Institute’s page for The Full Cost of Electricity.

Carey W. King is the assistant director and a research scientist with the University of Texas Energy Institute.

No need for alarm over high Fukushima radiation levels, say experts

Nuclear Experts: High Radiation Estimates at Fukushima No Surprise to Us

With two robot-probe operations apparently encountering increasingly high radiation levels inside the crippled Fukushima Daiichi nuclear plant during the past three weeks, some media reports suggested the radiation count was climbing rapidly. It didn’t help temper that view when plant operator Tokyo Electric Power Company (TEPCO) had to prematurely halt the second operation last Thursday to yank out the robot probe. Radiation had begun to dim the view of the attached cameras, threatening to leave the robot blinded and therefore unable to retrace its steps and escape the rubble.

The first operation, conducted at the end of January, used a remote-controlled robot equipped with a camera attached to a 10.5-meter-long telescopic rod. Captured video and stills showed images of a dark mass of rubble inside the No. 2 reactor’s primary containment vessel near the pedestal that supports the reactor vessel.

Analysis of the images, meant to determine whether the rubble encountered is corium (a mix of melted fuel and other materials), is still ongoing.

A TEPCO official explained that nuclear engineers conducted radiation lab tests prior to the operations taking place. This enabled the engineers to study the images taken in the first probe and estimate the different radiation levels—the highest of which was estimated to be 530 sieverts an hour. An estimate based on images taken during the second probe put the level as high as 650 sieverts an hour. (To put those numbers in context: an abdominal X-ray exposes you to about 4 millisieverts of radiation, so a single hour at 530 sieverts is the equivalent of more than 130,000 such X-rays.)

TEPCO says it is not particularly surprised at these numbers given that its probes were approaching the reactor vessel. “And these are not direct measurements, but are based on the amount of image noise produced,” a company official emphasized. “There is also a plus or minus error margin of 30 percent.” 
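
TEPCO has not published the details of its noise-based method, but the general idea of calibrating an image-noise metric against known dose rates in the lab, then inverting that fit for a field observation, can be sketched as follows; all numbers here are invented for illustration:

```python
import numpy as np

# Illustrative sketch of a noise-to-dose calibration. TEPCO has not
# published its method; this only shows the general idea of fitting
# lab data (image-noise metric vs. known dose rate) and inverting the
# fit for a field observation. All numbers are invented.

# Lab calibration: bright-pixel noise events per frame at known dose rates.
dose_rates = np.array([10.0, 50.0, 100.0, 200.0, 400.0])      # Sv/h
noise_events = np.array([120.0, 610.0, 1190.0, 2450.0, 4880.0])

# Least-squares slope for a line through the origin: noise = k * dose.
k = (noise_events @ dose_rates) / (dose_rates @ dose_rates)

# Invert for an observation from inside the containment vessel, and
# report the plus-or-minus 30 percent margin TEPCO cites.
observed_noise = 6400.0
estimate = observed_noise / k
print(f"Estimated dose rate: {estimate:.0f} Sv/h "
      f"({0.7 * estimate:.0f} to {1.3 * estimate:.0f} Sv/h with 30% margin)")
```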

Will Davis, a former U.S. Navy reactor operator and a communications consultant to the American Nuclear Society who has followed the Fukushima accident since it began, agrees with that conclusion.

“I don’t think we can realistically make assumptions about rising and lowering radiation levels in these camera-based detection methods yet,” he told IEEE Spectrum. “Not only is the presence of localized [radiation] point-sources possible, but there is also the possibility that streaming of radiation is taking place. In other words, we cannot say that all of the radiation in the containment vessel is coming from one unified lump of damaged fuel in the reactor vessel, and perhaps from a second unified lump sitting under it.”

Davis added that it is only to be expected that the closer the robot probes get to the damaged reactor, the higher the dose rates will be. “This has been expected since the beginning. And the high recent readings—even with the chance of up to 30 percent error—only confirm what experts already knew.”

He pointed out that comparably high radiation levels had been recorded in the aftermath of the Three Mile Island and Chernobyl nuclear accidents. 

TEPCO sent in the two robot probes to pave the way for a third operation planned for later this month. This third probe will use a remotely controlled Scorpion robot equipped with a camera, a dosimeter, and a temperature gauge. 

By contrast, the main purpose of the second probe was to remove sediment. The robot was outfitted with a water cannon and a scraper tool, as well as three cameras. The hope was to blast a path for the Scorpion, which cannot easily maneuver over uneven surfaces.

Despite the operation being halted early due to the impact of radiation, the company official said no further preparatory probes were planned. 

The official added that the information gleaned so far was not regarded as a negative, but rather as an aid in helping the engineers who are conducting these operations. “They are combining and analyzing everything right now, and this will help them determine whether to use the Scorpion or not, and what the next best step is to be.”

The American Nuclear Society’s Davis noted that just getting through the approach and planning stages that will precede the removal of the damaged nuclear fuel inside the reactor vessels and the primary containment vessels “is going to take a very long time, probably many, many years.”

But he also pointed out that while the new estimated radiation levels gleaned from the probes may shock people not following the cleanup closely, “it is important to remember that they are extremely localized and have no impact whatsoever on anyone outside the nuclear plant.”

Pilot testing Quanta3's continuous methane monitoring system at a Texas drill pad

Congress to Curtail Methane Monitoring

Innovation in methane detection is booming amid tightened state and federal standards for oil and gas drillers and targeted research funding. Technology developers, however, may see their market diminished by a regulation-averse Republican Congress and president. Senate Republicans are expected to attempt to complete a first strike against federal methane detection and emissions rules as soon as this week.

Methane is a potent greenhouse gas responsible for an estimated one-fifth to one-quarter of the global warming caused by humans since the Industrial Revolution, and oil and gas production puts more methane into the atmosphere than any other activity in the United States. Global warming, however, is not a moving issue for Republican leaders or President Donald Trump, who reject the scientific consensus on anthropogenic climate change.

What moves them are complaints from industries that “burdensome” regulations unnecessarily hinder job growth and—in the case of methane rules—domestic oil and gas output. The House of Representatives got the methane deregulation ball rolling on 3 February, voting along party lines to quash U.S. Bureau of Land Management rules designed to prevent more than a third of methane releases from nearly 100,000 oil and gas wells and associated equipment operating on federal and tribal lands.

The House vote is one of the first applications of the hitherto obscure Congressional Review Act of 1996, which gives Congress 60 legislative days to overturn new regulations. If the Senate concurs and President Trump signs the measure, the resulting act will scrap the bureau’s ban on methane venting and flaring, along with its leak-monitoring requirements. It will also bar the bureau from ever revisiting those mandates.

Artist's concept of a NuScale nuclear power facility

NuScale Reactor Nears One Milestone, With More to Follow

The U.S. Nuclear Regulatory Commission (NRC) is expected to decide by mid-March whether to accept an application, running to no fewer than 12,000 pages of technical detail, that supports NuScale Power’s design for a small modular nuclear reactor.

As Winston Churchill might say, the milestone may not mark the beginning of the end but, just maybe, the end of the beginning.

That’s because the NRC’s act of accepting the application does nothing more than trigger a license certification review for the reactor. (The modular reactor might one day generate electric power for small cities, large hospitals, industrial facilities, and even remote water desalination plants.)

As part of its certification review, the NRC will follow a design-specific standard that lays out multiple requirements NuScale's design must meet. Completing that review and certification process could consume anywhere from 30 to 40 months.

It may be no surprise, then, that NuScale’s is the first small modular reactor (SMR) design to have made it this far in the U.S. regulatory process. And it’s had some help.

Slow Nuclear Restarts and a Lukewarm Reception for Electricity Deregulation Are Forcing an Energy Mix Rethink

Pressure on Japan’s Government to Revamp Country's Energy Mix

Following the 2011 Fukushima Daiichi nuclear plant accident, the Japanese government, starting in 2013, drafted a plan for a new energy mix. The aim was threefold: improve the country’s energy security by supporting renewable and nuclear energy; lower costs by utilizing cheaper coal-fired power generation; and reduce CO2 emissions by leveraging renewables and optimizing efficient coal- and LNG-fired generation.

At the same time, the government has also been pushing ahead with liberalizing the energy market. It deregulated the electricity retail market last April; the gas market will follow suit this year. Unbundling of electric generation, transmission, and distribution is due to take place in 2020.

Yet, three years after the new energy plan was published, little is working out as hoped, and the government is expected to produce a revamped plan this year.

Ten months after last April’s deregulation of the electricity market, prices have hardly changed, and only 3 percent of customers have switched suppliers. This despite the Big Ten power providers now having a free hand to compete in each other’s formerly protected regions, and despite hundreds of new competitors entering the market from sectors such as telecommunications and oil and gas.

As detailed in the Japanese press, reasons for the paucity of interest in changing suppliers include little difference in pricing, poorly defined benefits, and switching procedures that have proven off-putting.

At the same time, most of the 3 percent that have changed providers reside in the Tokyo and Osaka regions, by far the country’s most populated areas. The new power providers have also targeted these areas, meaning that customers residing outside the two megacities have few or no opportunities to choose a new provider, even if they were so inclined.

When it comes to nuclear power, the government had aimed for it to provide between 20 and 22 percent of the country’s energy mix by 2030, while gradually easing back on this controversial energy source by decommissioning older reactors. Before the Daiichi accident, nuclear power accounted for about 29 percent of the country’s energy mix. But after the 2011 tsunami devastated Fukushima Daiichi and the surrounding countryside, all 48 of the remaining reactors were shut down as a safety precaution.

Given the strong anti-nuclear attitude many Japanese now harbor following the Daiichi accident, 20-22 percent seemed overly optimistic in 2013. Today, it appears out of reach.

After Japan’s Nuclear Regulation Authority drew up a new set of safety standards following the accident—standards it claims are the most stringent in the world—only five reactors have managed to obtain licenses to restart operations. Of these, just three are in operation; the other two are stalled by court injunctions brought by local governments or citizens’ groups over safety concerns. Similar injunctions are sure to follow as the power companies attempt to restart more reactors.

“So, there are many difficulties in reaching the 20-22 percent figure,” says Professor Takeo Kikkawa of Tokyo University of Science Graduate School of Innovative Studies, who spoke to the press on 8 February. 

“On top of those issues, current Japanese law says that after 40 years of operation, reactors are to be decommissioned,” he said. This means 24 reactors, including the four at Fukushima Daiichi, would have to be decommissioned by 2030. So, even if all the remaining reactors seeking licenses to operate were successful in going back online, nuclear power would still only account for 15 percent of Japan’s energy mix, said Kikkawa. 

And should the government seek to extend the life of the reactors to 60 years, it would be breaking its public pledge to “decrease reliance on nuclear as much as possible,” he pointed out.

Because of all these issues, the government has no choice but to produce a new energy mix plan.

Kikkawa, who was a member of the advisory committee the government relied on to come up with its 2013 energy plan, has put forward a new energy mix solution he believes can be achieved by 2030: nuclear 15 percent, renewables 30 percent, fossil fuels 40 percent, cogeneration 15 percent.

Yet given the apparent lack of answers to the many issues that exist today, even such a revamped plan may have to be revised yet again three years from now.
