5G’s Waveform Is a Battery Vampire

As carriers roll out 5G, industry group 3GPP is considering other ways to modulate radio signals


Power Players: As carriers launch 5G, some experts say the industry should have chosen a more efficient modulation option for the service.
Photo: China News Service/VCG/Getty Images

In 2017, members of the mobile telephony industry group 3GPP were bickering over whether to speed the development of 5G standards. One proposal, originally put forward by Vodafone and ultimately agreed to by the rest of the group, promised to deliver 5G networks sooner by developing more aspects of 5G technology simultaneously.

Adopting that proposal may have also meant pushing some decisions down the road. One such decision concerned how 5G networks should encode wireless signals. 3GPP’s Release 15, which laid the foundation for 5G, ultimately selected orthogonal frequency-division multiplexing (OFDM), a holdover from 4G, as the encoding option.

But Release 16, expected by year’s end, will include the findings of a study group assigned to explore alternatives. Wireless standards are frequently updated, and in the next 5G release, the industry could address concerns that OFDM may draw too much power in 5G devices and base stations. That’s a problem, because 5G is expected to require far more base stations to deliver service and connect billions of mobile and IoT devices.

“I don’t think the carriers really understood the impact on the mobile phone, and what it’s going to do to battery life,” says James Kimery, the director of marketing for RF and software-defined radio research at National Instruments Corp. “5G is going to come with a price, and that price is battery consumption.”

And Kimery notes that these concerns apply beyond 5G handsets. China Mobile has “been vocal about the power consumption of their base stations,” he says. A 5G base station is generally expected to consume roughly three times as much power as a 4G base station. And more 5G base stations are needed to cover the same area.

So how did 5G get into a potentially power-guzzling mess? OFDM plays a large part. OFDM transmits data by chopping it into portions and sending those portions simultaneously on different subcarrier frequencies, spaced so that the portions are “orthogonal” (meaning they do not interfere with one another).
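To make that concrete, here is a minimal sketch in Python (the 64-subcarrier count and QPSK mapping are illustrative assumptions, not anything taken from the 5G specifications). An inverse FFT sums one data symbol per subcarrier into a single time-domain waveform; a forward FFT at the receiver separates the subcarriers again, which is the payoff of orthogonality.

    import numpy as np

    N = 64                                      # subcarriers (illustrative assumption)
    bits = np.random.randint(0, 2, size=2 * N)  # random payload bits
    # Map each bit pair to a QPSK symbol, one symbol per subcarrier
    symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])
    tx = np.fft.ifft(symbols)        # superpose all N orthogonal subcarriers
    rx = np.fft.fft(tx)              # the receiver undoes the superposition
    assert np.allclose(rx, symbols)  # orthogonality: no cross-talk between portions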

The trade-off is that OFDM has a high peak-to-average power ratio (PAPR). Generally speaking, the orthogonal portions of an OFDM signal deliver energy constructively: the very quality that keeps the portions from canceling one another out also keeps each portion’s energy from canceling out the energy of the others, so the portions occasionally pile up in phase. That means any receiver needs to be able to take in a lot of energy at once, and any transmitter needs to be able to put out a lot of energy at once. Those high-energy peaks drive OFDM’s high PAPR and make the method less energy efficient than other encoding schemes.
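The effect is easy to measure on the toy waveform sketched above: divide the peak instantaneous power by the average power and express the ratio in decibels. Under the same illustrative assumptions as before, the average PAPR comes out several decibels above the 0 dB of a constant-envelope carrier, and that is headroom a power amplifier must supply even though it is rarely used.

    import numpy as np

    def papr_db(x):
        # Peak-to-average power ratio of a complex waveform, in dB
        power = np.abs(x) ** 2
        return 10 * np.log10(power.max() / power.mean())

    def random_ofdm_symbol(n=64):
        b = np.random.randint(0, 2, size=2 * n)
        return np.fft.ifft((1 - 2 * b[0::2]) + 1j * (1 - 2 * b[1::2]))

    trials = [papr_db(random_ofdm_symbol()) for _ in range(1000)]
    print(f"mean OFDM PAPR: {np.mean(trials):.1f} dB")  # several dB above 0 dB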

Yifei Yuan, ZTE Corp.’s chief engineer of wireless standards, says there are a few emerging applications for 5G that make a high PAPR undesirable. In particular, Yuan, who is also the rapporteur for 3GPP’s study group on non-orthogonal multiple-access possibilities for 5G, points to massive machine-type communications, such as large-scale IoT deployments.

Typically, when multiple users, such as a cluster of IoT devices, communicate using OFDM, their transmissions are organized using orthogonal frequency-division multiple access (OFDMA), which allocates a distinct chunk of spectrum to each user. (To keep the terms straight: OFDM is how each device’s signal is encoded, while OFDMA is the method for ensuring that one device’s signal doesn’t interfere with any other’s.) The logistics of assigning distinct spectrum to each device could quickly spiral out of control for large IoT networks, but Release 15 established OFDMA for 5G-connected machines, largely because it’s what 4G used.
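A toy allocation shows the bookkeeping involved (again with illustrative numbers: 64 subcarriers split among 4 users). Each user writes symbols into its own private slice of the subcarrier grid, and the scheduler must track who owns which slice for every device in every time slot, which is exactly what becomes unwieldy at IoT scale.

    import numpy as np

    N, USERS = 64, 4                   # illustrative sizes
    chunk = N // USERS                 # subcarriers per user

    grid = np.zeros(N, dtype=complex)  # one OFDM symbol's subcarrier grid
    for user in range(USERS):
        b = np.random.randint(0, 2, size=2 * chunk)
        qpsk = (1 - 2 * b[0::2]) + 1j * (1 - 2 * b[1::2])
        grid[user * chunk:(user + 1) * chunk] = qpsk  # user's private slice

    tx = np.fft.ifft(grid)             # all users combined into one waveform
    # The receiver runs np.fft.fft(tx) and sorts users by subcarrier index.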

One promising alternative that Yuan’s group is considering, non-orthogonal multiple access (NOMA), could deliver the advantages of OFDM while also overlapping users on the same spectrum.
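Power-domain superposition is the textbook way to picture NOMA, though the 3GPP study weighed several candidate schemes, so take this as a sketch of the principle rather than any proposed standard. Two users transmit on the same spectrum at different power levels; the receiver decodes the stronger signal first, subtracts it, and then decodes the weaker one, a step known as successive interference cancellation (SIC).

    import numpy as np

    def bpsk(n):
        # Random BPSK symbols (+1/-1); BPSK chosen only to keep the sketch short
        return 1.0 - 2.0 * np.random.randint(0, 2, n)

    N = 64
    strong, weak = bpsk(N), bpsk(N)
    tx = np.sqrt(0.8) * strong + np.sqrt(0.2) * weak  # both users, same spectrum

    rx = tx + 0.05 * np.random.randn(N)        # assumed mild channel noise
    strong_hat = np.sign(rx)                   # decode the high-power user first
    residual = rx - np.sqrt(0.8) * strong_hat  # subtract it (SIC)
    weak_hat = np.sign(residual)               # then decode the low-power user

    print(np.sum(strong_hat != strong), np.sum(weak_hat != weak))  # ~0 errors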

For now, Yuan believes OFDM and OFDMA will suit 5G’s early needs. He sees 5G first being used by smartphones, with applications like massive machine-type communications not arriving for at least another year or two, after the completion of Release 16, currently scheduled for December 2019.

But if network providers want to update their equipment to provide NOMA down the line, there could very well be a cost. “This would not come for free,” says Yuan. “Especially for the base station sites.” At the very least, base stations would need software updates to handle NOMA, but they might also require more advanced receivers, more processing power, or other hardware upgrades.

Kimery, for one, isn’t optimistic that the industry will adopt any non-OFDMA options. “It is possible there will be an alternative,” he says. “The probability isn’t great. Once something gets implemented, it’s hard to shift.”
