This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.
Much of 5G's buzz centers on the potential it offers society: lightning-fast downloads, barely-there latencies, and new and improved technologies like virtual reality and self-driving cars. Behind the buzz, however, something remarkable is all but disregarded: the fundamental technological breakthroughs of 5G itself. Millimeter wave technology, small cell densification, and massive multiple-input multiple-output (massive-MIMO) antenna systems are paving the way for the next several decades of the wireless industry. Together, these technologies will futureproof wireless networks as we enter an era of wireless cognition and human-style computing. In fewer than 20 years, wireless networks will carry information at the speed of the human brain.
To fully appreciate just how revolutionary the 5G era will be, and the impact it will have on the wireless industry going forward, let's first consider the past 10 years. Over that period of time, the growth in data capacity and data consumption over the global cellular network has far outstripped Cooper's law. Coined by engineer Martin Cooper, the 'law' originally suggested that cellular telephone links would double in capacity about every 30 months (a factor of 16 each decade).
It's true that average download speeds have increased roughly in line with Cooper's law, from a few megabits per second in 2010 to about 50 Mbps in 2020. However, peak throughput data rates over the same time frame have scaled by more than a factor of 1,000, from megabits per second to gigabits per second. In fact, the wireless industry trade association CTIA has shown that the total capacity carried over the US cellular network increased a staggering 96-fold from 2010 to 2019, with the average smartphone user consuming 9.2 gigabytes of data per month in 2019.
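As a sanity check, Cooper's original doubling rate can be compared against the CTIA figure above. The sketch below (assuming a 2010-to-2019 window, matching the CTIA data) does the arithmetic:

```python
# Compare Cooper's law with the observed growth in total US carried capacity.
# Cooper's law: cellular link capacity doubles roughly every 30 months.
DOUBLING_PERIOD_MONTHS = 30
months = 9 * 12  # 2010 through 2019, per the CTIA figures cited above

coopers_factor = 2 ** (months / DOUBLING_PERIOD_MONTHS)  # ~12x over 9 years
observed_factor = 96  # CTIA: total US carried capacity grew 96-fold

print(f"Cooper's law predicts ~{coopers_factor:.0f}x growth over 9 years")
print(f"Observed growth was {observed_factor}x, "
      f"about {observed_factor / coopers_factor:.0f}x beyond the prediction")
```

Even without a single doubling missed, Cooper's law accounts for only about a twelfth of the growth the network actually delivered over that period.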
What is astounding is that this nearly hundredfold increase in carried capacity happened, in the United States at least, with only 40 percent more available spectrum over those 10 years, even as consumers embraced smartphones. What's more, it happened before the industry's adoption of the key 5G enablers: small cells, millimeter wave spectrum, and massive-MIMO antenna technology. The vast spectrum resources available at frequencies above 100 GHz promise orders of magnitude more spectrum in the U.S. alone in the decades to come.
Just as Moore's law brought millions of times more processing power over four decades, the three technological pillars of 5G will unleash an exponential expansion that will bring vast new capacity and use cases in the coming decades. 5G will increase the average and peak data rates for each user, as well as the traffic carried across the network, such that ten years from now these industry metrics will surely be one hundred times today's levels, and most likely closer to two or three hundred times. This means the average smartphone user in 2031 is likely to consume more than a terabyte of data per month, and peak wireless download speeds, typically 2 to 3 Gbps in today's nascent 5G networks, will approach 1 terabit per second by 2031.
The Federal Communications Commission (FCC) saw the clear need for rules that would enable wireless carriers to rapidly densify the network throughout the United States. The agency's 2018 Small Cells Order streamlined the deployment of small cells, one of the three key technical pillars that will eventually support terabit-per-second wireless communications.
The FCC also recognized the potential of spectrum above 95 GHz when it opened that spectrum up through its Spectrum Horizons order in 2019. In the United Kingdom, the Office of Communications (Ofcom) followed the FCC's lead in opening up spectrum above 100 GHz for the first time in 2020, and it continues to pursue more available spectrum in this sub-terahertz region.
Lastly, the third technology pillar for 5G, massive-MIMO, will shift base station antennas from 2-by-2 arrays to 16-by-16 arrays, and eventually to 64-by-64 arrays and beyond, greatly multiplying a single base station's capacity. Carriers around the world are already deploying massive-MIMO in mid-band spectrum using time-division duplexing (which simply means that the signals between a base station and a user share the exact same frequency but are separated in time so they do not collide with one another). The technology will find its way into millimeter wave and sub-terahertz wireless systems in the coming years.
Graduate students (from left) Shihao Ju, Ojas Kanhere, and Yunchou Xing (seated) take rooftop measurements of a 142 GHz signal at NYU Wireless' Brooklyn campus. Photo: NYU Wireless
And here's the reason all of this really matters: Contrary to popular belief, wideband data transfers perform better at millimeter wave and terahertz frequencies than at the lower frequencies used in the first four generations of cellular technology. At NYU Wireless, we showed that, from sub-6 GHz up to 140 GHz, the propagation path loss for an urban radio channel is essentially the same across frequencies, once the radiated signal's first meter of travel is accounted for. This means that once a radio signal reaches what's called the "far field" (in other words, beyond the first meter or so), the frequency has surprisingly little impact on the signal's attenuation as it travels through urban and indoor channels, barring inclement atmospheric conditions such as rain, or frequencies that are absorbed by molecular resonances, like oxygen's. This also means that the extensive work to densify cell sites today will pay a huge dividend in future networks that operate above 100 GHz. Networks likely won't require further densification, and today's new tower sites for 5G will be usable for decades to come without the need to build many more.
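Our group's close-in (CI) free-space reference distance model captures this finding. The sketch below uses an illustrative urban path loss exponent of n = 3 (an assumed value for this example; measured exponents vary by environment) to show that the loss accumulated beyond the first meter works out the same at 3.5 GHz and 140 GHz:

```python
import math

C = 3e8  # speed of light, m/s

def fspl_db(freq_hz, dist_m=1.0):
    """Friis free-space path loss in dB at a given distance."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

def ci_path_loss_db(freq_hz, dist_m, n=3.0):
    """Close-in (CI) reference-distance model: free-space loss over the
    first meter, then 10*n*log10(d) dB beyond it. The exponent n is what
    our measurements found to be largely frequency-independent."""
    return fspl_db(freq_hz, 1.0) + 10 * n * math.log10(dist_m)

# Loss accumulated *beyond* the first meter is identical at both bands:
for f in (3.5e9, 140e9):
    beyond_first_meter = ci_path_loss_db(f, 100) - fspl_db(f, 1.0)
    print(f"{f / 1e9:.1f} GHz: {beyond_first_meter:.0f} dB from 1 m to 100 m")
```

The frequency enters only through the first-meter term; everything after that first meter depends on the environment, not the carrier.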
Industry groupthink has traditionally treated omnidirectional antennas as the norm in wireless communications. But beginning with 5G, wireless systems are using directional antennas that have high antenna gains and narrow beamwidths on both the mobile and the base station ends of each link. This offers more, not less, signal strength to each user as we move to millimeter wave, sub-terahertz, and ultimately terahertz frequencies.
Additionally, it's a myth that for a given distance traveled, radio energy becomes more lossy in free space as frequencies increase beyond the millimeter wave bands. We've shown that up to around 400 GHz, typical air imposes only about 10 decibels of loss per kilometer. That works out to just 1 decibel per 100 meters, which is about the range of a typical 5G small cell today. Even up to 900 GHz, we've shown that much of the spectrum suffers only about 100 dB/km of loss, or, over a 100-meter small cell link, just 10 decibels. That 10 dB will be relatively easy to make up with directional antennas at such high frequencies. While rain and foliage are problematic, once 5G cells are engineered to alleviate the effects of rain for transmissions up to 70 GHz, attenuation grows no worse from those frequencies up to 1 THz. This again shows that the efforts to deploy 5G will hold for decades to come. Snow is a different issue, but engineers must recognize how the directional nature of future antenna technologies overcomes the preconceived notion of greater loss at higher frequencies, a notion that stems from a bygone era when omnidirectional antennas were prevalent.
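The per-link arithmetic behind those figures is simple scaling. This small sketch converts the per-kilometer attenuation values cited above to the scale of a 100-meter small cell:

```python
def atmospheric_loss_db(loss_db_per_km, link_m):
    """Scale a per-kilometer atmospheric attenuation figure to a link length."""
    return loss_db_per_km * link_m / 1000.0

SMALL_CELL_M = 100  # typical 5G small cell range cited in the text

# ~10 dB/km up to ~400 GHz  -> 1 dB over a 100-meter small cell link
print(atmospheric_loss_db(10, SMALL_CELL_M))
# ~100 dB/km toward 900 GHz -> 10 dB over the same link
print(atmospheric_loss_db(100, SMALL_CELL_M))
```

At small cell distances, even the harsher sub-terahertz attenuation figures shrink to amounts that directional antenna gain can readily absorb.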
The upshot of all this is that the additional link gain from these 5G technologies more than offsets any radio channel loss when site-specific deployment is used to avoid massive obstructions. Most surfaces become reflective at and above millimeter wave frequencies, giving directional antennas more opportunities to find and combine signal paths. This improves the radio frequency power budget, which can then be spent on much wider bandwidth channels with no worsening of a channel's signal-to-noise ratio compared to today's wireless systems. In turn, that means wider bandwidth channels may be supported on the existing 5G tower infrastructure as carrier frequencies increase. Remarkably, the advantages offered by the three technical pillars of cell densification, wider bandwidth channels, and massive-MIMO will allow the engineering and deployment of 5G systems to carry forward for decades into the future as more spectrum is opened up in the sub-terahertz and terahertz bands. In fact, we predicted this nearly a decade ago.
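To see roughly how directional link gain can offset the higher first-meter loss at higher carriers, consider this back-of-the-envelope sketch. It assumes ideal, lossless gains for illustrative 16-by-16 arrays at both ends of the link; real antenna efficiencies will be somewhat lower:

```python
import math

C = 3e8  # speed of light, m/s

def first_meter_fspl_db(freq_hz):
    """Friis free-space path loss in dB over the first meter of travel."""
    return 20 * math.log10(4 * math.pi * freq_hz / C)

def ideal_array_gain_db(n_elements):
    """Ideal gain of an n-element phased array over a single element."""
    return 10 * math.log10(n_elements)

# Moving from 3.5 GHz to 140 GHz costs ~32 dB in the first meter...
extra_loss = first_meter_fspl_db(140e9) - first_meter_fspl_db(3.5e9)
# ...but 16x16 (256-element) arrays at both ends recover ~48 dB.
link_gain = 2 * ideal_array_gain_db(16 * 16)

print(f"extra first-meter loss: {extra_loss:.0f} dB")
print(f"directional link gain:  {link_gain:.0f} dB")
```

Under these assumptions the arrays more than recover the first-meter penalty, leaving margin in the power budget that can be spent on wider bandwidth channels.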
Future 6G and 7G cellular networks will be able to use the same cellular infrastructure, and the same fundamental breakthroughs in radio circuits and antenna technologies, that 5G requires. Yet those cellular generations will deliver much greater data rates, creating massive data capacity and new use cases in the coming decades. This alone should motivate governments, funding agencies, wireless providers, and citizens to make the rollout of 5G a priority. On top of that, the vast capabilities of 5G and beyond should also motivate both industry and governments to formulate new architectures, like Open RAN, along with a new emphasis on security. There is a tremendous amount of opportunity at stake in building today's 5G network, with benefits that will accrue for decades to come as we increasingly depend on these invisible waves.
Theodore (Ted) S. Rappaport is the David Lee/Ernst Weber Professor of Electrical Engineering at the NYU Tandon School of Engineering (NYU-Tandon), and founding director of NYU WIRELESS, a multidisciplinary research center focused on the future of wireless communications and applications. He is a professor of computer science at New York University's Courant Institute of Mathematical Sciences and of radiology at the NYU School of Medicine. Earlier in his career, he founded the Wireless Networking and Communications Group (WNCG) at the University of Texas at Austin in 2002, and the mobile communications research center now known as Wireless@Virginia Tech in 1990.