Two on-stage demonstrations at National Instruments’ NI Week, held in Austin, Texas, from 3 to 6 August, showed off working infrastructure prototypes for the 5G networks that will be the backbone of Big Data applications and the Internet of Things (the conference’s themes). 5G’s goals include hundred-fold performance improvements over today’s 4G systems, allowing the transmission of tens of megabits per second to thousands of mobile users, and offering peak rates well north of a gigabit per second within an office.
To open Day 1, researchers from Samsung showed off full-dimensional multiple-input, multiple-output (FD-MIMO), one of several parallel techniques for squeezing more information through the airwaves. FD-MIMO sculpts the signals fed to an array of transmission antennas to form virtual beams that can lock in on multiple receivers in three dimensions. This cuts interference from overlapping simultaneous transmissions to other receivers and increases the power of the signal that reaches the target.
The Samsung demo base station transmitted simultaneously at different data rates to four separate receivers. For demonstration purposes, the base station transmitted at 3.5 gigahertz (~86 millimeter wavelength), though production transmitters will likely use carriers in the tens-of-gigahertz range. (The receivers were built on NI USRP RIOs, or Universal Software Radio Peripheral Reconfigurable IO devices, transceivers that can be programmed to reproduce the characteristics of a variety of RF devices over a range of frequencies.)
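The quoted wavelength is easy to verify from the carrier frequency; a quick sanity check:

```python
# Quick check of the quoted wavelength: lambda = c / f.
c = 3.0e8            # speed of light, m/s
f = 3.5e9            # demo carrier frequency, Hz
wavelength_mm = c / f * 1000
print(round(wavelength_mm, 1))  # ≈ 85.7 mm, i.e. the ~86 mm quoted
```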
Initially, in conventional broadcast mode, interference between the four streams garbled the signals and kept any of the data from getting through. Switching to FD-MIMO, however, let the base station modulate the signals produced by each of its 32 antennas to form beams in three dimensions. The transmitter homed in on each of the four receivers to push through separate, clear signals. Throughputs measured at the receivers jumped from essentially zero to as much as 28 megabits per second. The Samsung engineers cautioned, though, that the demonstration was intended to show how much FD-MIMO can improve signal quality, not to showcase a full-blown 5G concept.
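The core idea behind that beamforming step can be sketched in a few lines. This is a minimal illustration, not Samsung’s implementation: the 8-by-4 planar geometry, half-wavelength spacing, and the azimuth/elevation steering model are all assumptions, chosen only to match the demo’s 32-antenna count. Driving each element with a phase-conjugate weight makes the 32 contributions add coherently toward the intended receiver while largely cancelling elsewhere.

```python
import cmath, math

def steering_vector(n_x, n_y, d_over_lambda, az, el):
    """Phase weights for an n_x-by-n_y planar array, steering a beam
    toward azimuth `az` and elevation `el` (radians)."""
    u = math.cos(el) * math.sin(az)   # direction cosines of the target
    v = math.sin(el)
    w = []
    for m in range(n_x):
        for n in range(n_y):
            phase = 2 * math.pi * d_over_lambda * (m * u + n * v)
            w.append(cmath.exp(-1j * phase))  # conjugate phase per element
    return w

def array_gain(weights, n_x, n_y, d_over_lambda, az, el):
    """Magnitude of the array response in direction (az, el) when the
    elements are driven with `weights`."""
    u = math.cos(el) * math.sin(az)
    v = math.sin(el)
    total, i = 0, 0
    for m in range(n_x):
        for n in range(n_y):
            phase = 2 * math.pi * d_over_lambda * (m * u + n * v)
            total += weights[i] * cmath.exp(1j * phase)
            i += 1
    return abs(total)

# 8 x 4 = 32 elements at half-wavelength spacing, matching the demo's count.
w = steering_vector(8, 4, 0.5, math.radians(20), math.radians(10))
on_target  = array_gain(w, 8, 4, 0.5, math.radians(20), math.radians(10))
off_target = array_gain(w, 8, 4, 0.5, math.radians(-35), math.radians(-20))
print(on_target)   # ≈ 32: full coherent gain toward the steered receiver
print(off_target)  # far smaller: little energy leaks toward other directions
```

Serving four receivers at once, as in the demo, amounts to superposing four such weight sets (with extra processing to suppress cross-beam interference).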
(For a quick look under the hood of Samsung’s FD-MIMO, see the Xcell blog of Xilinx’s Steve Leibson.)
On Day 2, engineers from Nokia demonstrated another cornerstone of 5G: higher frequency. In what Nokia’s head of North American radio research Amitabha Ghosh called the first public demonstration of their 5G base station (it debuted at an invitation-only event at the Brooklyn 5G Summit last April), Ghosh and his colleagues sent two MIMO streams across the stage. The 73-GHz (~4 mm) signals used 2 GHz of bandwidth to achieve a combined throughput greater than 10 gigabits per second with a latency under 1 millisecond. (To see the video, go to the NI Week web site, click the “Wednesday” tab, and select “Nokia.”)
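Those headline numbers can be sanity-checked against the Shannon capacity limit. The figures below are my back-of-envelope arithmetic, not Nokia’s: inverting C = B·log2(1 + SNR) shows what per-stream signal-to-noise ratio a 2-GHz channel carrying two MIMO streams would need to reach 10 gigabits per second.

```python
import math

# Back-of-envelope Shannon check (my numbers, not Nokia's).
bandwidth_hz   = 2e9    # demonstrated channel bandwidth
throughput_bps = 10e9   # demonstrated combined throughput
n_streams      = 2      # MIMO streams sent across the stage

# Combined spectral efficiency, then per MIMO stream.
eff_total = throughput_bps / bandwidth_hz       # b/s/Hz
eff_per_stream = eff_total / n_streams

# Invert Shannon capacity C = B * log2(1 + SNR) for the required SNR.
snr_linear = 2 ** eff_per_stream - 1
snr_db = 10 * math.log10(snr_linear)
print(round(eff_total, 1), round(snr_db, 1))    # 5.0 b/s/Hz, ~6.7 dB
```

A required SNR of only ~7 dB per stream is modest, which is the point of using such a wide channel: the enormous bandwidth, not heroic signal quality, does most of the work.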
Nokia’s proof-of-concept system is the latest iteration of a 1-GHz-bandwidth demonstration shown last year that was built from fully programmable FPGA components. Ghosh also reported that 73-GHz signals had been successfully transmitted outdoors to fast-moving receivers and over distances of more than 200 meters.
The results are significant. Some parts of the millimeter-wave spectrum are open in part because they span the resonant frequencies of atmospheric water and oxygen, which produce spikes in atmospheric absorption. While there is a local attenuation trough around 73 GHz (between flanking spikes at 60 and 120 GHz), atmospheric losses there are still about 20 times higher than they would be for a 1-GHz carrier. This circumstance had bred widespread doubt that useful signals could be carried at all in that part of the spectrum; these results have helped to quiet those doubts.