Why the Way We Calculate TV Energy Efficiency Is Wrong

And how the CTA is aiming to change it


Photo: iStockphoto

Energy consumption is probably not the first thing anyone considers when buying a new TV. Compared with obsolete CRTs, today’s LED-illuminated TV displays use very little energy, between 94 and 267 kilowatt-hours per year for a typical 55-inch display. (A CRT television of less than half that size used about twice as much energy.)

But maybe you are someone who isn’t just shopping for screen size, styling, and smart TV features; maybe you care about energy. In the United States, there are two labels to look for. One is the EnergyGuide. This mandatory program, administered by the Federal Trade Commission (FTC), requires TV manufacturers to place a yellow label on the product, indicating the annual cost of operation in dollars, assuming typical use of 5 hours daily and electricity priced at 11 cents per kilowatt-hour. For the aforementioned 55-inch TV, an EnergyGuide estimate might be around $22 per year.
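
To make that arithmetic concrete, here is a rough sketch of how such an estimate falls out of the label’s assumptions. The 200-kilowatt-hour annual figure is a hypothetical value picked from the range above, not a number taken from any actual label.

```python
# Sketch of the EnergyGuide-style cost arithmetic (illustrative values only).
HOURS_PER_DAY = 5      # viewing time assumed by the label program
PRICE_PER_KWH = 0.11   # assumed electricity price, in dollars

annual_kwh = 200       # hypothetical 55-inch TV, within the 94-267 kWh/year range

annual_cost = annual_kwh * PRICE_PER_KWH
implied_watts_while_on = annual_kwh * 1000 / (HOURS_PER_DAY * 365)

print(f"Estimated cost: ${annual_cost:.0f} per year")           # -> $22 per year
print(f"Implied average draw: {implied_watts_while_on:.0f} W")  # -> ~110 W
```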

The other label comes from Energy Star, a voluntary program that recognizes the most energy-efficient models of their generation. A joint effort by the U.S. Environmental Protection Agency (EPA) and the Department of Energy (DOE), Energy Star updates its standards as efficiencies improve across the board.

In each case, these programs rely on a standardized test method, mandated by the DOE, to calculate TV energy consumption.

“In a nutshell,” says Dave Wilson, “you set up a TV, run a test signal—in this case, a set of video clips—and measure the power consumption.” Wilson is vice president of technology and standards for the Consumer Technology Association (CTA), an organization of companies that, among its many programs, runs the annual international CES.

“There are complexities,” he explains. For example, he says, “TVs with automatic brightness control are tested at several different levels of ambient light [to get a] general sense of their screen brightness and energy consumption over various operating conditions.”
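
Here is a rough sketch of the test loop Wilson describes: play the standard video clips at several ambient-light levels and average the measured power. The lux values, the stand-in power “meter,” and the simple unweighted average are illustrative assumptions, not the conditions the DOE procedure actually specifies.

```python
# Illustrative test loop for a TV with automatic brightness control:
# measure average power over the test clips at several ambient-light levels.
AMBIENT_LUX_LEVELS = [3, 12, 35, 100]  # hypothetical dark-to-bright room settings

def measure_average_power(lux: float) -> float:
    """Stand-in for a power-meter reading averaged over the test clips.
    Fakes a plausible trend: a brighter room drives a brighter screen and more watts."""
    return 60.0 + 0.4 * lux  # entirely made-up numbers

readings = {lux: measure_average_power(lux) for lux in AMBIENT_LUX_LEVELS}
overall_watts = sum(readings.values()) / len(readings)
print(f"Average power across ambient conditions: {overall_watts:.0f} W")
```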

The current method of energy-use testing, while it uses standardized videos and lighting conditions, doesn’t allow a complete comparison of performance between TV models from different manufacturers, Wilson explained. Simply put, some TVs produce brighter pictures than others, so their average brightness is higher. And this brightness—or luminance, in industry parlance—can vary dramatically between models and manufacturers. Since a brighter picture generally draws more power, a TV that delivers more light per watt of electricity can still fare worse under a test that measures power alone.

The proposed fix, Wilson explained, is to use a camera to capture the entire screen while the test videos play. Data from the camera’s image sensor would then be used to calculate each TV’s average luminance.
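
A minimal sketch of that calculation, assuming the captured frames arrive as arrays already calibrated to luminance units (candelas per square meter, or “nits”); the synthetic data here stands in for real camera output.

```python
import numpy as np

def average_luminance(frames: list[np.ndarray]) -> float:
    """Mean luminance across every pixel of every captured frame."""
    return float(np.mean([frame.mean() for frame in frames]))

# Synthetic frames standing in for calibrated camera captures (values in nits).
rng = np.random.default_rng(0)
frames = [rng.uniform(0, 400, size=(270, 480)) for _ in range(24)]
print(f"Average luminance: {average_luminance(frames):.0f} nits")
```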

“If a company can design TVs to generate a certain level of brightness using less power than a competitor generating the same brightness, the thinking is, they ought to get credit for that,” Wilson says.
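
One way to express that credit, purely as an illustration, is luminance delivered per watt consumed. The metric and the numbers in the sketch below are assumptions, not the formula the working group will adopt.

```python
# Hypothetical figure of merit for the "credit for brightness" idea:
# average luminance delivered per watt of average power consumed.
measurements = {
    "TV A": {"avg_luminance_nits": 250, "avg_power_watts": 90},
    "TV B": {"avg_luminance_nits": 250, "avg_power_watts": 110},
}

for model, m in measurements.items():
    nits_per_watt = m["avg_luminance_nits"] / m["avg_power_watts"]
    print(f"{model}: {nits_per_watt:.1f} nits per watt")

# Both TVs deliver the same brightness, but TV A does it on less power,
# so it scores higher under a luminance-aware metric.
```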

The working group developing the new standard will also look at ways of improving how other features are tested, he said.

The CTA, working with representatives from television manufacturers and other organizations, like the Natural Resources Defense Council and the American Council for an Energy-Efficient Economy, aims to develop a new, voluntary energy measurement standard that includes updated test techniques. CTA convened a working group and began discussions on that standard early this month.

Once the group has hammered out a standard, explains Doug Johnson, CTA vice president for technology policy, “We will present it to the federal agencies and recommend it be adopted as the new, official DOE test standard.”

The current protocols were locked in based on work done by standards organizations several years ago, Wilson says. “TV technology evolves constantly, and test procedures need to be updated to address new technologies and features.”

“We’re hoping to get this done in a few months,” he says. “We know that’s an aggressive schedule. We’ll see if we can pull it off.”
