Energywise


Is This Twilight for the Golden Age of Earth Observation?

When leaders of the Congressional committees that approve NASA’s missions and budgets put forth their priorities in February, only space science and deep space exploration made the cut. Conspicuously absent was Earth science—a US $2 billion function within NASA that tracks our rapidly changing “home planet.”

Add in White House skepticism of climate science, and what experts call today’s “golden age” of monitoring Earth via satellite faces some serious challenges.

That age began in 2009, when President Barack Obama responded to a U.S. National Research Council warning that budget cuts had left the United States’ Earth observing system “at risk of collapse.” NASA, the lead federal agency for satellite development, saw its Earth science budget rise 56 percent between 2008 and 2016, and during that period it placed eight new Earth-observing satellites, packed with state-of-the-art sensors, into orbit.

The data they deliver inform a widening range of activities—crop planning and management, wildfire risk assessment, extreme air pollution warnings, and more. NASA delivered 1.42 billion data products in 2015—174 times as many as it delivered in 2000—according to a November 2016 review by the agency’s Inspector General.

More missions are in the pipeline, such as NASA’s second Ice, Cloud, and land Elevation Satellite (ICESat-2), whose primary objectives are tracking melting polar ice sheets and glaciers and quantifying the carbon locked up in the globe’s forests.

ICESat-2, however, exemplifies both the present strength of the U.S. Earth observation program and a less visible weakness. To understand why, you need a sense of the ambitious nature of ICESat-2’s mission.

Rather than rerunning the first ICESat mission, which ended in 2009, NASA redesigned the laser altimeter to boost its impact. One laser beam firing sporadically became six beams firing 365 days a year; higher-precision digital photon counting replaced analog detection of beams bouncing back from Earth.

ICESat-2 should enable measurement of annual elevation changes in ice sheets at ± 4-millimeter accuracy (and better for other targets), and at 17 times the spatial resolution of its predecessor, according to Thorsten Markus, chief of cryospheric sciences at NASA’s Goddard Space Flight Center, in Maryland. Such data, he says, will elucidate some basic physical processes that elude climate models, and thus improve their predictions.
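To see what that precision demands, consider the basic time-of-flight arithmetic of a laser altimeter: surface elevation falls out of each photon’s round-trip travel time, so millimeter-level height accuracy implies picosecond-level timing. Below is a minimal sketch of that arithmetic only; the numbers are illustrative, not ICESat-2’s actual processing chain, which aggregates many photons per footprint and corrects for orbit, attitude, and atmosphere.

```python
# Toy time-of-flight calculation for a photon-counting laser altimeter.
C = 299_792_458.0  # speed of light, m/s

def surface_elevation(orbit_altitude_m: float, round_trip_s: float) -> float:
    """Elevation = orbit altitude minus the one-way range implied by travel time."""
    return orbit_altitude_m - C * round_trip_s / 2.0

orbit = 500_000.0                    # ICESat-2 flies at roughly 500 km
t = 2 * (orbit - 1234.5) / C         # simulated return from a 1,234.5 m surface
print(surface_elevation(orbit, t))   # -> 1234.5

# A 4 mm elevation change shifts the round trip by only 2 * 0.004 / C,
# about 27 picoseconds, hence the move to single-photon digital timing.
print(2 * 0.004 / C)                 # ~2.7e-11 seconds
```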

But pushing for the best has not come cheap. Instead of $300 million for an ICESat rerun, NASA’s estimate for ICESat-2’s development started at $559 million and has grown to $764 million. Including operations for up to seven years, the mission could cost nearly $1.1 billion, according to that November Inspector General report. Launch dates, meanwhile, have slipped from 2015 to 2018.

Delays and cost creep in ICESat-2 and other missions, as well as several failed launches, have tarnished Earth observation’s golden age. Extending existing missions to avoid gaps in observational data creates risk, according to NASA’s Inspector General: “More than half the Agency’s 16 operating missions have surpassed their designed lifespan and are increasingly prone to failures that could result in critical data loss….”

Similar risks confront the National Oceanic and Atmospheric Administration, a key partner in climate and weather observation, according to a February report by Congress’s watchdog agency, the Government Accountability Office. NOAA’s polar readings currently come from a dying NASA demonstration mission. If it fails before the agencies’ long-awaited Joint Polar Satellite System launches, weather forecasts would degrade, “exposing the nation to a 15 percent chance of missing an extreme weather event forecast,” writes the GAO.

If the golden age of Earth observation harbored weak spots before the 2016 election, experts say the new administration introduces new risks. One is the $54 billion in belt-tightening proposed for federal agencies by President Donald Trump. In early March, the Washington Post reported that the president would ask for 17 percent less funding for NOAA.

Another is potential interference with climate science. In February, Lamar Smith, chairman of the House Committee on Science, Space, and Technology, called for “rebalancing” of NASA’s portfolio. A former chairman, Robert Walker, now a lobbyist for space-related industries, built a similar plank into the space platform that he drafted for Trump’s campaign. Both men question human-induced climate change—a view held by many Republicans in Congress and Trump appointees.

Walker says expanded Earth observation under Obama came at the expense of other science programs, particularly deep space robotic missions. He also alleges that NASA science was “tainted” by a political agenda against fossil fuels, focusing on impacts from burning coal, oil, and natural gas and neglecting natural climate influences such as volcanic eruptions. “There’s an extremely complex system that involves a lot more than CO2,” he says.

The Intergovernmental Panel on Climate Change’s 2014 assessment, however, expressed “very high confidence” that volcanic eruptions caused only “a small fraction” of the warming observed since the Industrial Revolution. And it cites “robust evidence” from satellite data showing that natural factors have had “near-zero” effect since 1980.

The notion that human activities alter climate is not a political invention, but a scientific judgment based on gigabytes of data downloaded daily from a gilded era’s orbiting sensors. “It’s not a belief,” says NASA’s Markus. “That’s what the data show.”


Full Cost of Electricity: Modeling Natural Gas Prices Offers Insight for Future Fuel Prices


In any business investment, price forecasting plays an important role in determining the viability of the project. Energy investing is no different, especially when it comes to fuel for a generating project. Quicker to construct than coal-fired power plants, natural-gas-fired projects offer environmental gains, but fuel prices are one key—if volatile—element of their success. Our role in the Full Cost of Electricity (FCe-) project by the University of Texas at Austin Energy Institute? Come up with a model that forecasts the long-term level of natural gas prices.

There are several approaches to developing long-term forecasts for commodity prices, including many types of econometric models, equilibrium models, and expert survey forecasts. We use an approach that is based upon calibrating some of the commonly used stochastic process models [PDF] with data from the commodities markets. (For an explanation of commodities markets, spot prices, futures, and other market modeling terms, see this article. For more on our specific model details, equations, and math, see our white paper [PDF].)

Data, equations, and models in hand, we used a two-factor economic model (one in which two underlying stochastic factors drive price movements) to develop forecasts and confidence ratings for both the risk-neutral price (the price implied when investors demand no premium for bearing risk) and the expected spot price (the current market price for a physical commodity). The risk-neutral version of the model yields a slightly lower forecast. The historical data show that from 2009 through 2014, spot prices oscillated around a mean price just under US $4 per million Btu, and then prices dropped significantly and rapidly in 2015 [see graph above].
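To make that structure concrete, here is a minimal simulation sketch of a commonly used two-factor specification (in the spirit of the Schwartz-Smith model): the log spot price is the sum of a mean-reverting short-term deviation and a drifting long-term equilibrium level. Every parameter value below is an illustrative placeholder, not a calibrated estimate from our white paper.

```python
import numpy as np

# Two-factor spot price sketch: S_t = exp(chi_t + xi_t), where chi mean-reverts
# to zero (short-term shocks) and xi drifts (long-term equilibrium level).
rng = np.random.default_rng(seed=0)

kappa, sigma_chi = 1.5, 0.35    # mean-reversion speed and volatility of chi
mu_xi, sigma_xi = 0.02, 0.10    # drift and volatility of xi (per year)
dt, years = 1 / 52, 8           # weekly steps out to an 8-year horizon

chi, xi = 0.0, np.log(3.0)      # start near $3.00 per million Btu
path = []
for _ in range(int(years / dt)):
    chi += -kappa * chi * dt + sigma_chi * np.sqrt(dt) * rng.standard_normal()
    xi += mu_xi * dt + sigma_xi * np.sqrt(dt) * rng.standard_normal()
    path.append(np.exp(chi + xi))

print(f"simulated end-of-horizon spot price: ${path[-1]:.2f}/MMBtu")
```

In practice the parameters are not chosen by hand as above but calibrated so that the model’s implied futures curve matches observed futures prices; simulating many such paths then traces out the confidence envelope around the expected spot price.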

When the model is calibrated to shale-gas-era futures price data, leaving out the very high historical prices before 2009, the expected spot price is forecast to recover to only about $3.00 per million Btu over the next two years, and then to grow at a modest rate to $4.35 per million Btu by the end of the forecast horizon, 2025 in our calculation. This forecast aligns with the Low Oil Price case projections from the Energy Information Administration’s 2015 Annual Energy Outlook, coming in slightly lower than the Reference case.
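As a quick plausibility check on that trajectory (assuming the recovery to roughly $3.00 arrives around 2019, leaving about six years to the 2025 horizon), the implied annual growth rate is indeed modest:

```python
# Back-of-envelope growth rate implied by the forecast endpoints.
# The six-year interval is an assumption about when the recovery lands.
start, end, years = 3.00, 4.35, 6
cagr = (end / start) ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")  # roughly 6.4% per year
```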

Our research shows that the choice of data set has some effect on the two-factor model parameter estimates and the resulting forecast. The longer-term data set yields a slightly lower forecast, owing to the long-term downward trend from the high prices of the mid-to-late 2000s. With either data set, however, forecasts roughly align with the High Oil and Gas Resource and Low Oil Price scenarios from the 2015 EIA Annual Energy Outlook, two outcomes that seem increasingly likely as judged by market sentiment [see graph above]. This market-based forecasting model provides the added benefits of simple updating (as new futures data become available) and a statistical basis for uncertainty analysis, through the confidence envelope around the future expected spot prices.

For more information, read the full report, “Market-Calibrated Forecasts for Natural Gas Prices” [PDF] on the University of Texas Energy Institute’s page for The Full Cost of Electricity.

Warren J. Hahn & James S. Dyer are professors at the McCombs School of Business at the University of Texas at Austin.

John Goodenough, co-inventor of the lithium-ion battery, heads a team of researchers developing the technology that could one day supplant it

Will a New Glass Battery Accelerate the End of Oil?

Electric car purchases have been on the rise lately, posting an estimated 60 percent growth rate last year. They’re poised for rapid adoption by 2022, when EVs are projected to cost the same as internal combustion cars. However, these estimates all presume the incumbent lithium-ion battery remains the go-to EV power source. So, when researchers at the University of Texas at Austin unveiled a promising new lithium- or sodium-glass battery technology this week, it threatened to accelerate even rosy projections for battery-powered cars.

“I think we have the possibility of doing what we’ve been trying to do for the last 20 years,” says John Goodenough, coinventor of the now ubiquitous lithium-ion battery and emeritus professor at the Cockrell School of Engineering at the University of Texas, Austin. “That is, to get an electric car that will be competitive in cost and convenience with the internal combustion engine.” Goodenough added that this new battery technology could also store intermittent solar and wind power on the electric grid.

Yet, the world has seen alleged game-changing battery breakthroughs come to naught before. In 2014, for instance, Japanese researchers offered up a cotton-based (!) new battery design that was touted as “energy dense, reliable, safe, and sustainable.” And if the cotton battery is still going to change the world, its promoters could certainly use a new wave of press and media releases: An Internet search on the technology today turns up nothing more current than links of 2014-2015 vintage.

So, on whose authority might one claim a glass battery could be any different?

For starters, Donald Sadoway’s. Sadoway, a preeminent battery researcher and MIT materials science and engineering professor, says, “When John Goodenough makes an announcement, I pay attention. He’s tops in the field and really a fantastic scientist. So, his pronouncements are worth listening to.”

Goodenough himself says that when he first coinvented the lithium-ion battery in the 1980s, almost no one in the battery or consumer electronics industries took the innovation seriously. It was only Japanese labs and companies like Sony that first began to explore the world we all today inhabit—with lithium-ions powering nearly every portable device in the marketplace, as well as electric vehicles and even next-generation airliners.

In other words, who better than Goodenough to cocreate the technology that could one day supplant his mighty lithium-ion battery?

The new battery technology uses a form of glass, doped with reactive “alkali” metals like lithium or sodium, as the battery’s electrolyte (the medium between cathode and anode that ions travel across when the battery charges and discharges). As outlined in a research paper and recent patent filing (of which Goodenough, 94, says more are forthcoming), the lithium- or sodium-doped glass electrolyte offers a new medium for novel battery chemistry and physics.

They find, for instance, that the lithium- or sodium-glass battery has three times the energy storage capacity of a comparable lithium-ion battery. But its electrolyte is neither flammable nor volatile, and it doesn’t appear to build up the spiky “dendrites” that plague lithium-ion cells as they charge and discharge repeatedly and can ultimately short out, causing battery fires. So, if the glass batteries can be scaled up commercially, which remains uncertain for research still at the proof-of-concept stage, the frightening phenomenon of flaming or exploding laptops, smartphones, or EVs could be a thing of the past.

Moreover, says lithium-glass battery codeveloper Maria Helena Braga, a visiting research fellow at UT Austin and engineering professor at the University of Porto in Portugal, the glass battery charges in “minutes rather than hours.” This, she says, is because the lithium- or sodium-doped glass endows the battery with a far greater capacity to store energy in the electric field. So, the battery can, in this sense, behave a little more like a lightning-fast supercapacitor. (In technical terms, the battery’s glass electrolyte endows it with a higher so-called dielectric constant than the volatile organic liquid electrolyte in a lithium-ion battery.)
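A rough way to see why the dielectric constant matters: for a parallel-plate capacitor, the energy stored in the electric field scales linearly with the relative permittivity of the medium. The sketch below uses entirely hypothetical geometry and permittivity values; it illustrates the scaling, not measurements from the Braga and Goodenough cells.

```python
# Energy stored in the electric field of a parallel-plate capacitor:
# C = eps_r * eps_0 * A / d, and E = (1/2) * C * V^2.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def stored_energy_joules(eps_r: float, area_m2: float, gap_m: float, volts: float) -> float:
    capacitance = eps_r * EPS0 * area_m2 / gap_m
    return 0.5 * capacitance * volts ** 2

# Hypothetical comparison: a modest-permittivity organic liquid electrolyte
# versus a high-permittivity doped glass, same geometry and voltage.
liquid = stored_energy_joules(eps_r=20, area_m2=1e-2, gap_m=20e-6, volts=3.0)
glass = stored_energy_joules(eps_r=1000, area_m2=1e-2, gap_m=20e-6, volts=3.0)
print(f"energy ratio (glass/liquid): {glass / liquid:.0f}x")  # 50x, linear in eps_r
```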

Braga also says early tests of their technology suggest it’s capable of perhaps thousands of charge-discharge cycles, and could perform well in both extremely cold and hot weather. (Initial estimates place its operating range from below -20 °C up to 60 °C.) And if they can switch the battery’s ionic messenger atom from lithium to sodium, the researchers could even source the batteries more reliably and sustainably. Rather than turning to controversial mining operations in a few South American countries for lithium, they’d be able to source sodium in essentially limitless supply from the world’s seawater.

Sadoway says he’s eager to learn more about the technology as it continues to be developed. In particular, he’s paying attention not so much to how quickly the battery charges but how well it can retain its energy. “The issue is not can you do something at a high charge rate,” he says. “My big question is about capacity fade and service lifetime.”

But, Sadoway adds, perhaps the chief innovation behind Goodenough and Braga’s technology is the possibility that they’ve solved the flaming and exploding battery problem.

“Addressing the [battery] safety issue is, I think, a giant step forward,” he says. “People have been talking about solid-state electrolytes for 20 years. But I can’t point to a commercial product yet…. If he can give us an electrolyte that is devoid of these flammable, organic solvents, that’s salutary in my opinion.”

If Goodenough, Braga, and collaborators can ramp up their technology, there would clearly be plenty of upsides. Goodenough says the team’s anode and electrolyte are more or less ready for prime time. But they’re still figuring out if and how they can make a cathode that will bring the promise of their technology to the commercial marketplace.

“The next step is to verify that the cathode problem is solved,” Goodenough says. “And when we do [that] we can scale up to large-scale cells. So far, we’ve made jelly-roll cells, and it looks like they’re working fairly well. So I’m fairly optimistic we’ll get there. But the development is going to be with the battery manufacturers. I don’t want to do development. I don’t want to be going into business. I’m 94. I don’t need the money.”


Kemper County and the Perils of Clean Coal Technology

Politicians who talk about the future of “clean coal” as part of the U.S. energy mix need look no farther than the Kemper County Energy Facility in Mississippi to see both the promise and the peril that the technology has to offer.

Kemper is years behind schedule and billions of dollars over the $2.2-billion cost estimate given in 2010 when construction began. And a recent financial analysis paints a dim picture of the plant’s potential for profit.


How EPA Calculates the Cost of Environmental Compliance for Electricity Generators


People pay for electricity directly, out of pocket, when they pay their electric bill. But they may also pay in an indirect way, when they bear the environmental and health costs associated with pollution from electricity generation. With a new EPA administrator recently installed, how these costs are calculated is under new scrutiny. The University of Texas Energy Institute’s Full Cost of Electricity Study includes estimates of these environmental pollution costs as one part of the full system cost of electricity.

There is a well-established body of literature at the intersection of toxicology, epidemiology, and economics, and it governs how the Environmental Protection Agency estimates the benefits of regulations that reduce pollution from power plants. As part of the University of Texas Energy Institute’s Full Cost of Electricity (FCe-) Study, my colleagues and I took a deep dive into the cost of these environmental externalities. Our goal: Describe in detail how the EPA estimates the dollar value of pollution reductions.

Whenever the EPA proposes a major new rule, it undertakes a rigorous analysis, comparing a benefit estimate with its estimate of the societal costs of complying with the proposed rule. Our analysis [PDF] illustrates how the EPA completed this kind of analysis for three recent and major rules targeting fossil-fueled power plants: the Cross-State Air Pollution Rule (regulating pollutant transport to downwind communities), the Mercury and Air Toxics Rule, and the Clean Power Plan (regulating greenhouse gas emissions).

In each of these three rulemakings, the EPA concluded that the health and environmental benefits greatly exceeded compliance costs, even though in some cases compliance costs were in the billions of dollars.

These analyses are not without controversy. Many dispute the dollar value that the EPA places on a premature death, and many others disagree with the value assigned to a ton of carbon emissions. For the mercury rule and the greenhouse gas rule, benefits dwarf costs only because of so-called co-benefits—reduction of pollution other than the pollutant targeted by the rule.
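To show the structure of such a benefit-cost tally (not the EPA’s actual figures), here is a toy version of the arithmetic, with every number below a hypothetical placeholder. Mortality benefits enter through a value of statistical life (VSL), climate benefits through a social cost of carbon (SCC), and co-benefits are tracked as their own line item.

```python
# Toy regulatory benefit-cost tally (all values hypothetical, annualized).
VSL = 9.0e6          # value of a statistical life, dollars
SCC = 40.0           # social cost of carbon, dollars per ton of CO2

avoided_deaths = 1_500          # premature deaths avoided per year
avoided_co2_tons = 50_000_000   # tons of CO2 reduced per year
co_benefits = 2.0e9             # e.g., incidental PM2.5 reductions, $/yr
compliance_cost = 8.0e9         # industry cost of compliance, $/yr

benefits = avoided_deaths * VSL + avoided_co2_tons * SCC + co_benefits
net = benefits - compliance_cost
print(f"benefits ${benefits / 1e9:.1f}B vs. costs ${compliance_cost / 1e9:.1f}B "
      f"-> net ${net / 1e9:.1f}B")
```

Note how sensitive the bottom line is to the VSL and SCC inputs, which is exactly why those two parameters attract most of the controversy.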

These and other measurement issues are laid out in our white paper, “EPA’s Valuation of Environmental Externalities from Electricity Production” [PDF].

David Spence is a professor at the McCombs School of Business and School of Law, part of the University of Texas at Austin.

Software coders William Stevens, from left, Michael Harrison, and Brack Quillen work on computers at the Bit Source LLC office in Pikeville, Kentucky, U.S., on Monday, Feb. 1, 2016.

The Kentucky Startup That Is Teaching Coal Miners to Code

Coal’s role in American electricity generation is fast diminishing. A few large coal-mining companies declared bankruptcy last year, and several coal power plants have been shuttered. The biggest loss in all this has been felt by the tens of thousands of coal miners who have been laid off. But despite the U.S. president’s campaign pledges, those jobs are going to be hard to bring back. Besides competition from natural gas and cheaper renewables, coal mining, and mining in general, is losing jobs to automation.

But now, a small startup in the middle of Appalachian coal country has a forward-looking plan to put miners back to work. Pikeville, Ky.-based Bit Source has trained displaced coal industry veterans in Web and software development.

The retrained workers now design and develop websites, tools, games, and apps. Bit Source is proving that coal miners can, indeed, learn how to code, and do it well.

“Appalachia has been exporting coal for a long time,” says Justin Hall, the company’s president. “Now we want to export code. We’ve got blue-collar coders. It’s the vision of the future of work: How do you train a workforce and adapt it for technology.”


What Role Do Household Incomes Play in the Full Cost of Electricity?


Electricity is something many of us take for granted. Flip a switch and lights come on, air-conditioning fires up, and computers hum. But how much will new energy cost relative to the income of the people who will consume it? It’s not something most of us in the United States think about as we flip that switch, but it is something we need to understand as we build next-generation power plants and the grid to move that electricity to demand centers.

As part of The Full Cost of Electricity project of the University of Texas at Austin Energy Institute, we wanted to ensure that public and policy discussions had baseline information. So we asked a few simple questions: “How much are households paying for household energy overall? How much of this cost is for electricity? How does this cost compare to incomes?” To answer these questions, we used the data acquired by the U.S. Energy Information Administration, via its Residential Energy Consumption Survey, for the state of Texas [PDF].

Besides being our home, Texas is a microcosm: We have rural and urban areas, flat and hilly country, desertlike areas and coastline.

Here's what we found:

  • Twenty-two percent of Texas households are energy burdened, spending more than 8 percent of income on household energy, and 16 percent of households spend more than 10 percent (the burden calculation is sketched just after this list)
  • The vast majority of the cost of household energy is for electricity
  • Fifteen percent of Texas households spend more than 8 percent of household income on electricity alone, and 11 percent spend more than 10 percent
  • Higher incomes translate to higher household electricity consumption, but there are important differences between urban and rural households
  • Other than income, there are several demographic variables that explain whether or not a household spends more than 8 percent on energy. For example, a household where someone is at home during the work day is more likely to be energy burdened
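For readers who want the mechanics, the energy-burden calculation itself is straightforward; the sketch below shows its shape on a tiny made-up sample. Column names and values are hypothetical, and a faithful replication would apply the RECS survey weights rather than a simple average.

```python
import pandas as pd

# Energy burden = annual household energy cost / annual household income.
df = pd.DataFrame({
    "income": [28_000, 95_000, 41_000, 17_500],        # dollars per year
    "energy_cost": [2_600, 3_100, 2_900, 2_000],       # dollars per year
})
df["burden"] = df["energy_cost"] / df["income"]

share_over_8 = (df["burden"] > 0.08).mean()
share_over_10 = (df["burden"] > 0.10).mean()
print(f"energy burdened (>8%): {share_over_8:.0%}, severe (>10%): {share_over_10:.0%}")
```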

For the details, read the full white paper, "Household Energy Costs for Texans" [PDF] on the Energy Institute’s page for The Full Cost of Electricity.

Carey W. King is the assistant director and a research scientist with the University of Texas Energy Institute.

No need for alarm over high Fukushima radiation levels, say experts

Nuclear Experts: High Radiation Estimates at Fukushima No Surprise to Us

With two robot-probe operations apparently encountering increasingly high radiation levels inside the crippled Fukushima Daiichi nuclear plant during the past three weeks, some media reports suggested the radiation count was climbing rapidly. It didn’t help temper that view when plant operator Tokyo Electric Power Company (TEPCO) had to prematurely halt the second operation last Thursday to yank out the robot probe. Radiation had begun to dim the view of the attached cameras, threatening to leave the robot blinded and therefore unable to retrace its steps and escape the rubble.

The first operation, conducted at the end of January, used a remote-controlled robot equipped with a camera attached to a 10.5-meter-long telescopic rod. Captured video and stills showed images of a dark mass of rubble inside the No. 2 reactor’s primary containment vessel near the pedestal that supports the reactor vessel.

Analysis of the images, meant to determine whether the rubble encountered is corium (a mix of melted fuel and other materials), is still ongoing.

A TEPCO official explained that nuclear engineers conducted radiation lab tests prior to the operations taking place. This enabled the engineers to study the images taken in the first probe and estimate the different radiation levels—the highest of which was estimated to be 530 sieverts an hour. An estimate based on images taken during the second probe put the level as high as 650 sieverts an hour. (To put those numbers in context, when you take an abdominal X-ray, you’re exposed to about 4 millisieverts of radiation.)
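Some quick unit arithmetic puts those dose rates in perspective (using the article’s own figures; the 650-sieverts-an-hour estimate scales the same way):

```python
# How long at 530 Sv/h to absorb the dose of one ~4 mSv abdominal X-ray?
dose_rate_sv_per_s = 530 / 3600   # ~0.15 Sv/s
xray_dose_sv = 4e-3
print(f"{xray_dose_sv / dose_rate_sv_per_s * 1000:.0f} ms")  # ~27 milliseconds
```

A few sieverts received in a short period is lethal to humans, which is why only robots can survey this environment, and why even their cameras degrade.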

TEPCO says it is not particularly surprised at these numbers given that its probes were approaching the reactor vessel. “And these are not direct measurements, but are based on the amount of image noise produced,” a company official emphasized. “There is also a plus or minus error margin of 30 percent.” 

Will Davis, a former U.S. Navy reactor operator and a communications consultant to the American Nuclear Society who has followed the Fukushima accident since it began, agrees with that conclusion.

“I don’t think we can realistically make assumptions about rising and lowering radiation levels in these camera-based detection methods yet,” he told IEEE Spectrum. “Not only is the presence of localized [radiation] point-sources possible, but there is also the possibility that streaming of radiation is taking place. In other words, we cannot say that all of the radiation in the containment vessel is coming from one unified lump of damaged fuel in the reactor vessel, and perhaps from a second unified lump sitting under it.”

Davis added that it is only to be expected that the closer the robot probes get to the damaged reactor, the higher the dose rates will be. “This has been expected since the beginning. And the high recent readings—even with the chance of up to 30 percent error—only confirm what experts already knew.”

He pointed out that comparably high radiation levels had been recorded in the aftermath of the Three Mile Island and Chernobyl nuclear accidents. 

TEPCO sent in the two robot probes to pave the way for a third operation planned for later this month. This third probe will use a remotely controlled Scorpion robot equipped with a camera, a dosimeter, and a temperature gauge. 

By contrast, the main purpose of the second probe was to remove sediment. The robot was outfitted with a water cannon and a scraper tool, as well as three cameras. The hope was to blast a path for the Scorpion, which cannot easily maneuver over uneven surfaces.

Despite the operation being halted early due to the impact of radiation, the company official said no further preparatory probes were planned. 

The official added that the information gleaned so far was not regarded as a negative, but rather as an aid in helping the engineers who are conducting these operations. “They are combining and analyzing everything right now, and this will help them determine whether to use the Scorpion or not, and what the next best step is to be.”

The American Nuclear Society’s Davis noted that just getting through the approach and planning stages that will precede the removal of the damaged nuclear fuel inside the reactor vessels and the primary containment vessels “is going to take a very long time, probably many, many years.”

But he also pointed out that while the new estimated radiation levels gleaned from the probes may shock people not following the cleanup closely, “it is important to remember that they are extremely localized and have no impact whatsoever to anyone outside the nuclear plant.”

Pilot testing Quanta3's continuous methane monitoring system at a Texas drill pad

Congress to Curtail Methane Monitoring

Innovation in methane detection is booming amid tightened state and federal standards for oil and gas drillers and targeted research funding. Technology developers, however, may see their market diminished by a regulation-averse Republican Congress and president. Senate Republicans are expected to complete a first strike against federal methane detection and emissions rules as soon as this week.

Methane is a potent greenhouse gas responsible for an estimated one-fifth to one-quarter of the global warming caused by humans since the Industrial Revolution, and oil and gas production puts more methane in the atmosphere than any other activity in the United States. Global warming, however, is not a moving issue for Republican leaders or President Donald Trump, who reject the scientific consensus on anthropogenic climate change.

What moves them are complaints from industries that “burdensome” regulations unnecessarily hinder job growth and—in the case of methane rules—domestic oil and gas output. The House of Representatives got the methane deregulation ball rolling on 3 February, voting along party lines to quash U.S. Bureau of Land Management rules designed to prevent more than a third of methane releases from nearly 100,000 oil and gas wells and associated equipment operating on federal and tribal lands.

The House vote is one of the first applications of the hitherto obscure Congressional Review Act of 1996, which gives Congress 60 legislative days to overturn new regulations. If the Senate concurs and President Trump signs, the resulting act will scrap the bureau’s ban on methane venting and flaring and its leak-monitoring requirements. It will also bar the bureau from ever revisiting those mandates.

Artist's concept of a NuScale nuclear power facility

NuScale Reactor Nears One Milestone, With More to Follow

The U.S. Nuclear Regulatory Commission (NRC) is expected to decide by mid-March whether to accept an application, no fewer than 12,000 pages of technical details, that supports NuScale Power’s design for a small modular nuclear reactor.

As Winston Churchill might say, the milestone may not mark the beginning of the end but, just maybe, the end of the beginning.

That’s because the NRC’s act of accepting the application does nothing more than trigger a license certification review for the reactor. (The modular reactor might one day generate electric power for small cities, large hospitals, industrial facilities, and even remote water desalination plants.)

As part of its certification review, the NRC will follow a design-specific standard that lays out multiple requirements NuScale's design must meet. Completing that review and certification process could consume anywhere from 30 to 40 months.

It may be no surprise, then, that NuScale’s is the first small modular reactor (SMR) design to have made it this far in the U.S. regulatory process. And it’s had some help.

Read More