Tech Talk

Researchers Demonstrate A Terahertz Multiplexer

Terahertz waves, which span a frequency range of 300 to 3,000 gigahertz, promise data transmission a hundred times faster than today’s cellular and wireless networks. Researchers now face the daunting task of designing and building a communication system from all-new components that can work with terahertz radiation. Many efforts so far have focused on building compact terahertz sources, transmitters, and detectors.

Now a group at Brown University in Providence, R.I., has built another key component of any wireless communication system: a multiplexer. Multiplexers combine separate data streams, typically each at a different frequency, into a single combined stream that is sent over optical fibers, TV cables, or telephone lines, making it possible for that medium to carry thousands of phone calls or tens of TV channels at the same time. A demultiplexer separates the signals at the receiver end.
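As a toy illustration of frequency-division multiplexing (not tied to the Brown device, and with arbitrary low-frequency carriers), the sketch below adds two stand-in data streams, each riding on its own carrier, into one combined signal that could share a single medium:

```python
import numpy as np

# Toy frequency-division multiplexing: two data streams each modulate their own
# carrier, and the carriers are summed so they can share one medium. A receiver
# separates the channels by filtering around each carrier frequency. The
# carrier values are illustrative, far below terahertz.
t = np.linspace(0, 1e-3, 100_000)                    # 1 ms of signal
data_a = np.sign(np.sin(2 * np.pi * 2e3 * t))        # stand-in bit stream A
data_b = np.sign(np.sin(2 * np.pi * 3e3 * t))        # stand-in bit stream B
carrier_a = np.sin(2 * np.pi * 50e3 * t)             # channel A at 50 kHz
carrier_b = np.sin(2 * np.pi * 80e3 * t)             # channel B at 80 kHz
combined = data_a * carrier_a + data_b * carrier_b   # the multiplexed signal
print(combined.shape)
```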

The new device, which acts as both a multiplexer and demultiplexer, is reported in the journal Nature Photonics. It consists of two smooth aluminum plates (any metal should work) placed in parallel, with a few-millimeter gap between them and a narrow slit cut along the top plate. The plates act as a waveguide for terahertz waves, which travel between them.

To explain how the multiplexer works, it’s helpful to first describe how signals are separated at the receiving end. As the waves travel through the waveguide, some of the radiation leaks out of the slit, and the angle at which it emerges depends on the frequency of the wave. So if an entering wave contains multiple frequencies, each carrying a separate data channel, each frequency comes out at a different angle, effectively separated, or demultiplexed.

The converse happens when the device is run as a multiplexer: it accepts a signal of a given frequency only if that signal arrives at the corresponding angle.
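To make the frequency-to-angle mapping concrete, here is a minimal sketch that assumes the lowest-order transverse-electric mode of an ideal parallel-plate waveguide; the 1-millimeter gap and the sample frequencies are illustrative, not the values from the Brown experiment.

```python
import numpy as np

c = 3.0e8  # speed of light, m/s

def emission_angle_deg(freq_hz, plate_gap_m):
    """Angle, measured from the waveguide axis, at which a wave of the given
    frequency leaks out of the slit, assuming the lowest-order TE mode of an
    ideal parallel-plate waveguide: cos(theta) = sqrt(1 - (f_cutoff / f)**2)."""
    f_cutoff = c / (2.0 * plate_gap_m)       # mode cutoff set by the plate gap
    if freq_hz <= f_cutoff:
        return None                          # below cutoff, the mode doesn't propagate
    cos_theta = np.sqrt(1.0 - (f_cutoff / freq_hz) ** 2)
    return float(np.degrees(np.arccos(cos_theta)))

# Illustrative values only: a 1 mm plate gap and a few terahertz-band frequencies.
for f in (0.2e12, 0.3e12, 0.5e12):
    print(f"{f / 1e12:.1f} THz -> {emission_angle_deg(f, 1e-3):.1f} degrees")
```

Because the plate separation sets the cutoff frequency in this relation, the same sketch also suggests why changing the gap between the plates, discussed below, retunes which frequency emerges at a given angle.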

“We’re envisioning free-space wireless terahertz communication networks,” says Daniel Mittleman, a professor of engineering at Brown University. “But the important difference between a terahertz network and the existing cellular network is that a cellular antenna is broadcasting in all directions. With terahertz, there’s 100 times less diffraction, so it’s much more reasonable to think of it as a beam with some directionality.”

The researchers demonstrated a simple multiplexing scheme in which they sent two independent THz-frequency channels into the multiplexer at two different angles. For the source, they used a commercial femtosecond-laser system of the kind normally used for spectroscopy, which delivers short pulses of terahertz radiation. In a practical multiplexer, specially shaped waveguides and optics could send hundreds of different channels into the multiplexer at different angles.

The device is completely passive right now. But Mittleman says he and his team could make it dynamic by varying the spacing between the plates. The researchers found that this spacing affects the frequency, angle, and bandwidth of the signal that leaks out of the slit. This could, for instance, be used to tune the bandwidth of the channels, Mittleman says.

“In the waveguide, imagine that the lower plate has a trench dug into it parallel to the slot and directly underneath,” he says. “You could use a silicon microelectromechanical switch to dynamically tune the depth of the trench at any location, so that the plate separation varies underneath the slot. That would change the bandwidth of that channel.”

The researchers hope to build and demonstrate an active device of that type, one that can change its configuration on a timescale of a few milliseconds.

Disney Seeks to Make Visible Light Communication Practical

Visible light communication (VLC) is a method of transmitting data using LEDs. It's a simple idea: because an LED can be switched on and off much faster than the human eye can resolve, lights can carry binary data as a stream of pulses without any detectable flickering that would drive a human nuts. This data transmission is ambient, detectable by other LEDs (acting as photodiodes), and it seems like a promising way to Internet-of-Thingsify household objects without having to rely on Wi-Fi.
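As a rough sketch of that idea (a toy encoding, not the modulation any commercial Li-Fi product actually uses), here is how a string of bytes might map onto a sequence of LED on/off states:

```python
def led_states(payload: bytes) -> list[int]:
    """Toy on/off keying: each bit of the payload becomes an LED state
    (1 = on, 0 = off). Real systems add preambles, clock recovery, error
    correction, and dimming control so average brightness stays constant."""
    states = []
    for byte in payload:
        for bit in range(7, -1, -1):            # most significant bit first
            states.append((byte >> bit) & 1)    # 1 -> LED on, 0 -> LED off
    return states

print(led_states(b"Hi"))  # 16 states, toggled far faster than the eye can follow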

Using modulated ambient light to transmit data isn't a new idea; this "Li-Fi" is already being used commercially, with GE and Philips (among others) deploying data transmitting LEDs in places like stores and warehouses. However, it's difficult to encourage wider adoption of the technology because Li-Fi is generally used in specific (and isolated) situations where it operates alongside existing communications technologies rather than being integrated with them. In other words, you can have Wi-Fi, and you can have Li-Fi, but they don't work together all that well.

For easy and convenient Internet-of-Things-scale adoption, Li-Fi needs to be able to run on the same Internet Protocol (IP) that all of our other Web-connected stuff runs on. Taking the lead on this is Disney Research, which has a visible light communication system that uses a friendly Linux-based networking backbone to, among other things, let you use a magic wand to light up a princess dress. Sign me up.

New U.S. Military Chip Self Destructs on Command

A new chip built on strained glass can shatter within 10 seconds when remotely triggered. It’s not quite as fast as the fictional Mission: Impossible messages that self-destruct in five seconds, but such vanishing electronics could prove tremendously useful for the U.S. military and corporations by keeping data secure and out of unwanted hands.

Stimulating Damaged Spines Rewires Rats for Recovery

A promising new study shows that the nervous system can rewire itself—with a little help from neural engineers. 

For someone with a spinal cord injury, destroyed neurons act like a roadblock that prevents movement commands from traveling down the spinal cord and along the nerves. Although an injured person wills his fingers to grasp a cup, for example, the command never makes it to his hand.

But a study published today suggests that precisely controlled electrical stimulation can encourage the nervous system to create detours around that roadblock, allowing the command to get through. 

Neuroscientist Steve Perlmutter and his colleagues at the University of Washington devised a clever experiment using rats. The animals were first trained to perform a task in which they reached through narrow slots with their dominant forelimbs to grab food pellets. The rats were then given incomplete spinal cord injuries that almost totally paralyzed those limbs.

Next the rats were divided into three groups and, as if they were in physical therapy, trained again on the same task. The control group tried to perform the reach-and-grasp task unaided, the second group received random pulses of electrical stimulation in their spinal cords during the task, and the third group received stimulation pulses that were triggered by the rats’ attempts to move their immobilized limbs. 

Image: Perlmutter et al.
The head-mounted Neurochip device recorded electrical signals from the muscles (called electromyographic, or EMG, activity) and triggered pulses of intraspinal microstimulation (ISMS).

The key advance here is that triggering technique. The researchers used a device called the Neurochip-2, which recorded the weak electrical signals from the limb muscles and used them as the cue to initiate a pulse of electrical stimulation in the spinal cord. Because the attempted muscle movement was synchronized with neural stimulation, the researchers believe, surviving neurons in the spinal cord formed new connections linking the muscles to the brain’s motor control region.
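A minimal sketch of that closed loop might look like the following; the threshold, the refractory pause, and the read_emg()/stimulate() functions are hypothetical stand-ins for illustration, not the actual Neurochip-2 interface.

```python
import time

# Hypothetical sketch of the triggering loop described above: watch the weak
# muscle (EMG) signal and, whenever an attempted movement is detected, deliver
# one stimulation pulse to the spinal cord.
EMG_THRESHOLD_V = 0.05   # assumed detection threshold for "attempted movement"
REFRACTORY_S = 0.1       # pause so one movement attempt triggers one pulse

def run_closed_loop(read_emg, stimulate, duration_s=10.0):
    """read_emg() returns the latest muscle voltage; stimulate() fires one
    intraspinal microstimulation pulse. Both are user-supplied callables."""
    stop_at = time.time() + duration_s
    while time.time() < stop_at:
        if abs(read_emg()) > EMG_THRESHOLD_V:   # attempted movement detected
            stimulate()                          # synchronized stimulation pulse
            time.sleep(REFRACTORY_S)
```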

What’s the underlying mechanism behind this remarkable repair work? I have just one word for you: neuroplasticity. Neural networks are malleable, and changing the patterns of connections between neurons can restore lost function. That’s why people who’ve suffered spinal cord injuries do rehab: They’re not trying to bring dead neurons back to life, but rather to teach the nervous system to work around them. However, people typically don’t recover much function with rehab alone. 

Perlmutter’s research suggests that adding electrical stimulation to rehab could provide a real boost. Over the course of the three-month study, the rats with neurochips showed dramatic improvement. The synchronized-stimulation rats ultimately performed the task 63 percent as well as they had before their injuries. Both the control group and the random-stimulation group performed about 30 percent as well as they did pre-injury.

Spectrum has covered “closed-loop” neurostimulation systems before, most notably in a feature article written by researchers from the companies Medtronic, Cyberonics, and NeuroPace. The authors described systems that used various bodily signals to trigger electrical pulses that countered epilepsy attacks and chronic pain. Such smart and responsive systems, which are now being used in humans, seem a clear step forward in electrical therapeutics.

While the study from Perlmutter and his colleagues was conducted in rats, it points the way toward a new rehab strategy for people with spinal cord injuries. What’s more, it serves as a proof of principle for a strategy that may help people with other nervous system dysfunctions. By leveraging “the nervous system’s intrinsic capacity for reorganization and repair,” the authors write, electrical stimulation could help people regain lost motor abilities, perhaps, or bladder, bowel, or sexual function. 

"Tardis" Memory Could Enable Huge Multi-Core Computer Chips

Future generations of computer chips could become much more powerful, with processors containing hundreds or even thousands of cores. But these huge multi-core processors will also require loads of memory, because their coherence directories must keep track of which cores hold copies of shared data and coordinate updates to it. A new MIT technique promises to greatly reduce the memory required for that coordination as multi-core processors scale up in the coming years.
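To see why that memory adds up, here is a rough sketch of a conventional "full-map" directory, in which every tracked cache line keeps one sharer bit per core; the cache size is an illustrative assumption, and the MIT scheme instead tags data with logical timestamps, which grow far more slowly with core count.

```python
# One sharer bit per core, per tracked cache line: storage grows linearly
# with the number of cores. The line count below is an assumed example.
def sharer_bits(num_cores: int, num_cache_lines: int) -> int:
    return num_cores * num_cache_lines

LINES = 1 << 20                                  # assume ~1 million tracked lines
for cores in (16, 256, 1024):
    mib = sharer_bits(cores, LINES) / 8 / 2**20  # bits -> MiB
    print(f"{cores:5d} cores -> ~{mib:.0f} MiB just for sharer bits")
```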

Mosquitos Have Brought a Nasty New Disease to the Americas. Can Computer Models Predict Its Spread?

So far in 2015, more than 565,000 people in the Americas and the Caribbean have come down with chikungunya, a viral disease spread by mosquitoes, according to estimates from the Pan American Health Organization. That’s pretty impressive work for a virus that made its first appearance in the western hemisphere in December 2013. 

Public health officials throughout the Americas have been scrambling to contain this unprecedented outbreak of chikungunya. To do so, it would certainly be helpful to be able to predict when and where the next hotspots will occur. But right now, public health officials simply don’t have the necessary tools, DARPA program manager Matthew Hepburn tells IEEE Spectrum.

“If we ask decision makers, whether they’re in the United States, or the ministry of health in one of these other countries, or the Pan American Health Organization, ‘What models or forecasts are you using today to make predictions?’ the answer is, ‘We’re not using any,’” Hepburn says.

DARPA sought to correct this situation last year by issuing a challenge to computer modelers, asking them to predict the spread of chikungunya for the six-month period between September 2014 and March 2015. 
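For a sense of what such a forecast involves computationally, the simplest models in this space are compartmental simulations. The sketch below is a bare-bones SIR model with made-up parameters, not fitted to chikungunya, and a realistic model would also have to track the mosquito population that carries the virus.

```python
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """One day of a basic SIR (susceptible-infected-recovered) model.
    beta is the transmission rate, gamma the recovery rate; both values
    used below are illustrative assumptions."""
    new_infections = beta * s * i * dt
    recoveries = gamma * i * dt
    return s - new_infections, i + new_infections - recoveries, r + recoveries

s, i, r = 0.999, 0.001, 0.0      # fractions of a population
for day in range(180):           # roughly the challenge's six-month horizon
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1)
print(f"after 180 days, about {r:.0%} of the population has been infected")
```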

The results don’t seem very encouraging. “No one did a really good job of forecasting when a disease was going to spread to a new country, and when it would go into an exponential growth phase,” Hepburn told the crowd at a DARPA conference this summer.

What's Next After 25 Years of Wi-Fi?

In 1997, the first version of Wi-Fi appeared. (That same year, about half of U.S. homes used AOL as their Internet service provider, Netscape had the most web browser users, and Microsoft rescued Apple from the verge of bankruptcy.) Today, the Wi-Fi standard known as IEEE 802.11 celebrates its 25th anniversary in a world where many people take Wi-Fi access for granted while streaming high-definition video and checking in on social media through their smartphones and laptops.

Experiments Show How Lasers Can Despin Asteroids by Turning Them Into Rockets

Sometime in the 2020s, NASA will launch the Asteroid Redirect Mission (ARM) toward a 30-meter space rock with the goal of plucking a boulder off of its surface and hauling it back to lunar orbit for us to have a look at. NASA has to be very careful in deciding which asteroid to plunder for this mission, because the spacecraft the space agency plans to send won't have a good way of dealing with an asteroid that's spinning, which lots of asteroids are. And realistically, how the heck do you stop a giant space boulder from spinning, anyway? The answer is of course to use lasers, because, well, lasers solve everything.
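To get a feel for the scale of the problem, here is a back-of-the-envelope sketch of despinning a rock with a small, rocket-like ablation thrust; every number in it is an assumed, illustrative value rather than a figure from the research.

```python
import math

# Laser ablation at the asteroid's edge acts like a tiny rocket; its torque
# gradually cancels the spin. All values below are illustrative assumptions.
radius_m = 15.0                          # half of a ~30 m rock
density = 2000.0                         # kg/m^3, a rubble-pile-like guess
mass = density * (4.0 / 3.0) * math.pi * radius_m**3
inertia = 0.4 * mass * radius_m**2       # treat the rock as a uniform sphere

spin_period_s = 2 * 3600.0               # assume one rotation every two hours
omega = 2.0 * math.pi / spin_period_s    # angular speed, rad/s

thrust_n = 0.1                           # assumed ablation thrust at the rim
torque = thrust_n * radius_m
despin_days = inertia * omega / torque / 86400.0
print(f"~{despin_days:.0f} days to stop the spin under these assumptions")
```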

Metasurface Optics for Better Cellphone Cameras and 3-D Displays

Engineers at the California Institute of Technology have created a metasurface, a structured plane so thin it counts as two-dimensional, out of tiny pillars of silicon that act as waveguides for light. Such periodic designs can manipulate light in unusual ways: the arrangement of the pillars lets the researchers control the phase of light passing through the surface, and with it how the light is focused, as well as its polarization, which matters for uses such as liquid crystal displays and 3-D glasses.

“We're trying to create kind of a new platform for optics,” says Amir Arbabi, a postdoc in Andrei Faraon's Nanoscale and Quantum Optics Lab. The team described their work in the latest issue of Nature Nanotechnology.

The silicon pillars have to be somewhat shorter than the wavelength of light they're designed to manipulate. In the case of the metasurface described in the paper, the pillars are 715 nanometers tall, to handle infrared light with a wavelength of 915 nm. But they could easily be made shorter for visible light, Arbabi says. The pillars range in diameter from 65 to 455 nm, and they're elliptical in shape. The ellipses are not all oriented in the same direction; the pillars’ thickness and orientation determine how they focus and polarize the light passing through them.
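As a concrete, simplified illustration of how such a surface is laid out, the sketch below computes the phase each pillar would need to impose for the surface to act as a lens. Only the 915-nanometer wavelength comes from the paper; the focal length and pillar spacing are illustrative assumptions.

```python
import numpy as np

# Each pillar imposes a local phase delay. Choosing the hyperbolic phase
# profile below makes the surface focus light of wavelength lam to a point a
# distance f away. In the real device, each pillar's diameter (and, for
# polarization control, its ellipticity and orientation) would be picked from
# a precomputed library to realize the target phase at its location.
lam = 915e-9                  # design wavelength, m (from the paper)
f = 100e-6                    # focal length, m (illustrative)
pitch = 500e-9                # pillar center-to-center spacing, m (illustrative)

n = 201                       # pillars per side
coords = (np.arange(n) - n // 2) * pitch
x, y = np.meshgrid(coords, coords)

phase = (2 * np.pi / lam) * (f - np.sqrt(x**2 + y**2 + f**2))
phase_needed = np.mod(phase, 2 * np.pi)   # wrap to one optical cycle per pillar
print(phase_needed.shape)                 # one target phase per pillar
```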

Many of the same effects can be achieved with traditional optics, but that requires lining up multiple components such as lenses, prisms, and beam splitters. The metasurface gets the job done with less bulk, allowing, among other things, thinner, lighter-weight cell phone camera lenses and better systems for directing the beams of industrial cutting lasers. It could also lead to novel applications. Using one of these devices, a display could switch between two polarizations and show two different holographic images. Or, with an intermediate polarization, it could superimpose one image on the other. The metasurface could also provide the optics for an LCD to create a 3-D display viewable from many angles without glasses.

What’s more, all of this can be done using the same lithography techniques used to build computer chips, doing away with individual fabrication and manual alignment of components. “We're trying to take these free-space components that are bulky and large and put them on a chip,” Arbabi says.

It shouldn't take much effort to move these metasurfaces from the lab to the marketplace, says Faraon. It's mainly a question of figuring out which optical system applications could benefit from the kind of mass production this technology makes possible. The array of potential applications is vast, Faraon says. “It gives you a unified framework, so you can design whatever optical component you would like.”

Say Hello to MIAOW, the First Open Source Graphics Processor

While open-source hardware is already available for CPUs, researchers from the Vertical Research Group at the University of Wisconsin-Madison announced at the Hot Chips conference in Cupertino, Calif., that they have created the first open source general-purpose graphics processor (GPGPU).

Called MIAOW, which stands for Many-core Integrated Accelerator Of the Waterdeep, the processor is a register-transfer-level (RTL) implementation of AMD's open source Southern Islands instruction set architecture. The researchers published a white paper on the device.

The creation of MIAOW is the latest in a series of steps meant to keep processor development in step with Moore's Law, explains computer scientist Karu Sankaralingham, who leads the Wisconsin research group. 
