Polarizing the Data Center: Spin Lasers Deliver 240 Gigabits Per Second

Next-generation, low-power optical data transmission may rely on polarization rather than switching laser pulses on and off

3 min read
A close-up of a metal mount with a thin needle pointed toward it.
Photo: RUB/Kramer

Researchers in Germany have tested a new form of high-speed data transfer that could boost connection speeds fivefold or more. Laser light is still the carrier of information in this prototype technology, but the zeroes and ones are encoded in the oscillating polarization of the light beam rather than its intensity.

The new technology also works at room temperature and consumes less power than the intensity method—in which a “1” is represented by a bright laser burst and a “0” by a dim burst or no laser pulse at all.

Transmitting data via laser polarization, by contrast, would involve encoding a “1” as a burst of circularly polarized light whose corkscrew turns to the left, and a “0” as a burst of circular polarization corkscrewing to the right.
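The contrast between the two encodings can be sketched in a few lines of Python. The Jones-vector representation, the bit-to-handedness mapping, and the sign convention for the S3 Stokes parameter below are our own illustrative assumptions, not the researchers’ actual modulation scheme.

```python
import math

# Each bit maps to a circular-polarization state, represented here as a
# normalized Jones vector (Ex, Ey) with complex field components.
LEFT = (1 / math.sqrt(2), 1j / math.sqrt(2))    # "1": left-circular
RIGHT = (1 / math.sqrt(2), -1j / math.sqrt(2))  # "0": right-circular

def encode(bits):
    """Map a bit string to a sequence of polarization states."""
    return [LEFT if b == "1" else RIGHT for b in bits]

def decode(states):
    """Recover bits from the sign of the S3 Stokes parameter,
    S3 = 2 * Im(Ex* . Ey): positive for left-circular and negative
    for right-circular light under this convention."""
    bits = []
    for ex, ey in states:
        s3 = 2 * (ex.conjugate() * ey).imag
        bits.append("1" if s3 > 0 else "0")
    return "".join(bits)

print(decode(encode("1011001")))  # round-trips the bit string
```

Note that the receiver only needs the sign of S3, not the full field, which is one reason polarization detection can be done with simple optics.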

The laser polarization method, if it can scale beyond the lab bench, would provide potentially welcome relief to data centers and server farms, whose high-speed interconnects produce substantial waste heat that requires additional cooling.

By the researchers’ calculations, the polarization-based data transfer method can hit 240 gigabits per second but still generate only about 7 percent as much heat as a traditional connection running at 25 gigabits per second.
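That claim can be restated as energy per bit, using only the figures quoted above; the arithmetic below is our own back-of-the-envelope framing, not the researchers’ calculation.

```python
# Heat ratio and link speeds are from the text; the per-bit
# comparison is illustrative arithmetic.
heat_ratio = 0.07                # spin-laser heat vs. conventional link
speed_new, speed_old = 240, 25   # gigabits per second

# Energy per bit scales as (heat) / (bits per second).
energy_per_bit_ratio = heat_ratio / (speed_new / speed_old)
print(round(1 / energy_per_bit_ratio))  # ~137x less energy per bit
```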

“Normally, you make the throughput faster by pumping the laser harder, which automatically means you consume a lot of power and produce a lot of heat,” says Nils Gerhardt, chair of photonics and terahertz technology at Ruhr-University Bochum in Germany. “Which is actually a big problem for today’s server farms. But the bandwidth in our concept does not depend on the power consumption. We have the same bandwidth even [at] low currents.”

Gerhardt’s group, which published its work this month in the journal Nature (after posting an earlier draft to the preprint server arXiv), has so far developed only a proof-of-concept system in its lab. The technology has a long backstory, too: since 1997, researchers have been studying the connection between electron spin in certain semiconductor lasers and the polarization of the light they emit.

Yet only in the past few years have the technologies converged to deliver data transfer speeds exceeding the fastest that intensity-based lasers provide.

A man wearing a blue shirt in a lab reaches over a machine to hold up a lens in front of it. Photo: RUB/Kramer

In 2016, for instance, Gerhardt’s group published findings from an earlier version of their technology that achieved 44-GHz polarization modulation—potentially translating into data transfer speeds of 44 gigabits per second. This is in the neighborhood of the known speed limit of current-generation light intensity technologies (estimated to be between 40 and 50 gigabits per second).

On the other hand, don’t write off the traditional methods just yet. There may be other ways for light-intensity-based data transmission to compete in the hundreds-of-gigabits-per-second realm. But as Gerhardt and coauthors point out, even exotic technologies like “mode-locked semiconductor laser diodes” and “quantum cascade lasers” haven’t yet pushed intensity-based laser data transfer much above the 100-gigabit-per-second threshold.

By contrast, Gerhardt’s group estimates that 240 gigabits per second may be only a stepping stone to still higher data transfer speeds. For instance, their paper discusses gallium-arsenide-based semiconductor lasers that could theoretically achieve polarization data transfer speeds in excess of 500 gigabits per second.

Of course, proving that this technology could actually work in the real world and not just in the lab is an important next step.

“Every new server farm placed by Google or Facebook or Amazon needs to have higher bandwidth with less energy consumption,” Gerhardt says. “And in particular, the speed of the interconnect is a limiting factor right now.”

The technology Gerhardt’s group is testing, he says, would be more suitable for interconnecting nodes in a data center or server farm than, say, acting as an Internet backbone—which requires more time-tested technologies that can reliably run at high volumes.

As it is, they’re still trying to figure out how to modulate circular polarization back and forth at high rates with high reliability and repeatability. (At the moment, they achieve some of their clever high-speed polarization modulation by physically bending the circuit board without breaking it, which is more of a wonky lab trick than a manufacturing technique for mission-critical hardware in a data center.)

“Right now, this is a concept,” Gerhardt says. “We still have a lot of research before we can make a device that you can buy off the shelf. Many challenges to go.”


Metamaterials Could Solve One of 6G’s Big Problems

There’s plenty of bandwidth available if we use reconfigurable intelligent surfaces

12 min read
An illustration depicting cellphone users at street level in a city, with wireless signals reaching them via reflecting surfaces.

Ground level in a typical urban canyon, shielded by tall buildings, will be inaccessible to some 6G frequencies. Deft placement of reconfigurable intelligent surfaces [yellow] will enable the signals to pervade these areas.

Chris Philpot

For all the tumultuous revolution in wireless technology over the past several decades, there have been a couple of constants. One is the overcrowding of radio bands, and the other is the move to escape that congestion by exploiting higher and higher frequencies. And today, as engineers roll out 5G and plan for 6G wireless, they find themselves at a crossroads: After years of designing superefficient transmitters and receivers, and of compensating for the signal losses at the end points of a radio channel, they’re beginning to realize that they are approaching the practical limits of transmitter and receiver efficiency. From now on, to get high performance as we go to higher frequencies, we will need to engineer the wireless channel itself. But how can we possibly engineer and control a wireless environment, which is determined by a host of factors, many of them random and therefore unpredictable?

Perhaps the most promising solution, right now, is to use reconfigurable intelligent surfaces. These are planar structures typically ranging in size from about 100 square centimeters to about 5 square meters or more, depending on the frequency and other factors. These surfaces use advanced substances called metamaterials to reflect and refract electromagnetic waves. Thin two-dimensional metamaterials, known as metasurfaces, can be designed to sense the local electromagnetic environment and tune the wave’s key properties, such as its amplitude, phase, and polarization, as the wave is reflected or refracted by the surface. As waves fall on such a surface, it can alter their direction to strengthen the channel. In fact, these metasurfaces can be programmed to make these changes dynamically, reconfiguring the signal in real time in response to changes in the wireless channel. Think of reconfigurable intelligent surfaces as the next evolution of the repeater concept.
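The idea of “programming” such a surface can be sketched numerically: pick each element’s reflection phase so that every transmitter-to-receiver path arrives in phase. All positions, the element spacing, and the 28-GHz carrier below are illustrative assumptions, not details from the article.

```python
import math
import cmath

C = 3e8                                   # speed of light, m/s
wavelength = C / 28e9                     # ~10.7 mm at 28 GHz
k = 2 * math.pi / wavelength              # wavenumber

tx = (-5.0, 3.0)                          # transmitter position (m)
rx = (4.0, 2.0)                           # receiver position (m)
elements = [(0.005 * n, 0.0) for n in range(64)]  # 64-element strip

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Phase each element applies to cancel its own path delay.
phases = [(-k * (dist(tx, p) + dist(rx, p))) % (2 * math.pi)
          for p in elements]

# The reflected contributions now sum coherently at the receiver.
field = sum(cmath.exp(1j * (k * (dist(tx, p) + dist(rx, p)) + ph))
            for p, ph in zip(elements, phases))
print(round(abs(field)))  # 64: all elements add in phase
```

With random (unprogrammed) phases, the same sum would average closer to √64 ≈ 8, which is the gap a reconfigurable surface is meant to close.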
