18 February 2009—The day an average telecom carrier can send 1 trillion bits (one terabit) of data per second down a single optical fiber may still be many years away. But in the lab, the single-fiber terabit threshold may well be crossed just one or two years from now, thanks to recent research.
Two groups of engineers—one from Australia and Denmark and the other from California—have independently created new optics technologies that could greatly raise the Internet’s speed limit. The key to both is nonlinear optics, in which intense light alters a material’s optical properties, allowing a fiber’s behavior to be adjusted from moment to moment.
On the Internet, packets of data are carried mostly by laser light over optical fibers. But to route a data packet through each leg of its trip, the address of the packet’s destination must be decoded. This can be done only electronically, by converting the data’s photons into electrons. The electronics are the horse and buggy of the system, unable to keep pace with even a fraction of the bits per second that an optical fiber can carry.
Two leading contenders in the terabit Internet race involve tunable optical fibers and tunable arsenic-glass chips. In both cases, lasers transmit the signal’s zeroes and ones, while a separate laser (or set of lasers) adjusts the properties of the optics in order to “demultiplex,” or break up, the data into separate streams slow enough for electronics to handle.
“The highest bit rate that you can do electrically is 40 gigabits per second right now,” says Leif Katsuo Oxenlowe, associate professor of photonics engineering at the Technical University of Denmark, in Lyngby. “That’s at least what’s commercially available. In labs around the world, [electronics is] reaching 100 Gb/s.”
Oxenlowe says his group of 13 scientists—led by Benjamin Eggleton at the Centre for Ultrahigh Bandwidth Devices for Optical Systems (CUDOS), which is part of the Laser Physics Centre at the Australian National University in Canberra—has been testing a 5-centimeter-wide glass chip that can receive optical signals of up to 640 Gb/s. When a laser periodically changes the chip’s index of refraction billions of times per second, the chip siphons off a 10 Gb/s signal, which can then be piped into a standard electronic router.
Oxenlowe says that his group’s chip, made of an arsenic and sulfur compound called a chalcogenide, would be relatively cheap and easy to mass-produce and thus could someday be a part of a terabit Internet’s fiber-optic backbone. The drawback is that only one stream can be split off per chip. To completely demultiplex a 640 Gb/s stream into a series of 10 Gb/s streams, you’d need 64 individual chips. If 40 Gb/s electronics were available, you’d need only 16 chips.
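The chip-count arithmetic above follows directly from how optical time-division multiplexing works: the fast stream interleaves many slow tributaries, and each chip gates out exactly one of them. The sketch below is an illustrative toy model, not a simulation of the CUDOS device; the function names and the bit-list representation are assumptions made for clarity.

```python
# Toy model: an optical time-division-multiplexed stream interleaves
# N slow tributaries, and each chalcogenide chip extracts one tributary.

def chips_needed(aggregate_gbps: int, tributary_gbps: int) -> int:
    """One chip is needed per demultiplexed tributary."""
    return aggregate_gbps // tributary_gbps

def demux_one_tributary(stream: list, n_tributaries: int, k: int) -> list:
    """Gate out tributary k: keep every n-th bit, offset by k time slots."""
    return stream[k::n_tributaries]

print(chips_needed(640, 10))  # 64 chips to fully break up a 640 Gb/s stream
print(chips_needed(640, 40))  # 16 chips if 40 Gb/s electronics were usable
```

The slicing in `demux_one_tributary` mirrors what the pump laser does physically: it opens a gate during one time slot per cycle, letting only that tributary’s bits through.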
Meanwhile, a group of researchers at the University of California, San Diego, funded by the Defense Advanced Research Projects Agency, has developed a method of demultiplexing high-speed optical signals that works right inside the optical fiber.
“It’s an almost philosophical difference,” says group leader Stojan Radic, professor of electrical engineering at UCSD. “From their perspective, it’s very good to go very fast. From our perspective, we don’t think people want to build 16 or 32 devices.”
Radic’s approach involves sending the near-terabit signal down an extremely thin length of specially prepared optical fiber. Just 3 micrometers wide but 50 meters long, this nonlinear medium has been precision-engineered to change its optical properties in response to pulses from a “pump” laser. The pump pulses combine with the data stream in the fiber to create a series of eight “overtones.” These overtones are essentially copies of the original 320 Gb/s data stream, only now in a rainbow of colors. A second pump laser then acts like a strobe light, picking out one-eighth of each color’s data stream. The end result is a single fiber containing eight different-colored data signals, each carrying a separate 40 Gb/s portion of the 320 Gb/s original. A simple prism can then separate the 40 Gb/s streams, with each stream traveling to electronics that can handle its comparatively slow data rate.
Peter Andrekson, head of the photonics laboratory at Chalmers University of Technology, in Gothenburg, Sweden, says he sees a need for both technologies. The UCSD team’s fiber-based approach uses a robust medium that is well tested and understood and thus could be useful in high-volume or high-sensitivity applications that leave little room for error or chance. On the other hand, Andrekson says, the Danish-Australian group’s chip might be developed further to perform more-sophisticated optical processing, such as complex switching routines.
“To get to the terabit, it looks like you really have to rely on these fibers or other optical nonlinearities,” Andrekson says. “There’s no way in the near future we will see any electronic or semiconductor solution.”
About the Author
Mark Anderson is an author and science writer based in Northampton, Mass. In October 2008 he wrote for Spectrum Online about the bailout of the controversial physics experiment Gravity Probe B.
To Probe Further
Oxenlowe’s group’s work has been published in the latest issue of the online journal Optics Express.
Radic’s latest findings will be presented next month at the Optical Fiber Communication Conference and Exposition and the National Fiber Optic Engineers Conference in San Diego.