Visible light is all well and good for things like eyeballs, but here at IEEE, we do our best to cover the entire spectrum. As always, we’re especially interested in anything that confers superhero-like abilities, like X-ray vision, or in this case, M-wave vision, which sounds even more futuristic. At MIT, researchers have been working on a prototype time-of-flight microwave camera that can image objects through walls, in 3-D.
A microwave camera is sort of like a cross between a visible-light camera and a radar imaging system, incorporating some of the advantages of each. Like radar, a microwave camera doesn’t really notice things like darkness or fog or walls, but unlike radar, it’s not confused by the kinds of angled surfaces that make the stealth fighter so stealthy. Radar systems also tend to be big, complex, low resolution, and expensive. By taking a more camera-like approach to radio-frequency imaging (essentially treating microwaves like waves of light and using a passive reflector as a lens), MIT has been able to leverage computational-imaging techniques to develop a low-cost, high-resolution imaging system.
MIT’s microwave camera can do 3-D imaging using time of flight, the same way that Microsoft’s latest Xbox Kinect sensor works. A time-of-flight camera sends out bursts of microwaves and then keeps careful track of how long it takes for those microwaves to bounce off of something and return to the sensor. After doing some not very fancy math with the speed of light, you can then calculate how far away that something is. MIT’s camera has a temporal resolution of 200 picoseconds, allowing it to resolve distances with an accuracy of 6 cm, enough for usable 3-D imaging.
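That “not very fancy math” is just the round-trip echo time multiplied by the speed of light and divided by two. Here’s a minimal sketch of it; the 200-picosecond figure comes from the article, while the function and its name are just illustrative:

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflector, given the round-trip echo time."""
    # The microwave burst travels out and back, so halve the total path.
    return C * t_seconds / 2.0

# A 1-nanosecond round trip corresponds to about 15 cm of range:
print(distance_from_round_trip(1e-9))  # ~0.15 m

# And 200 ps of light travel is about 6 cm, which lines up with
# the camera's quoted distance accuracy:
print(C * 200e-12)  # ~0.06 m
```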
Here's a video showing the microwave camera taking pictures of (among other things) a mannequin through a solid wall:
If the mannequin in the video looks suspiciously like it’s covered in aluminum foil, that’s almost certainly because it is, in fact, covered in aluminum foil. Doing this actually makes the mannequin more humanlike: we’re very good at reflecting microwaves in this frequency range because we’re ugly bags of mostly water, and covering the plastic mannequin in tin foil makes it a close approximation of the real thing. You can see the resolved 3-D image at the tail end of the video, and at 41 x 41 pixels, it’s sufficient resolution “to [be] able to see how many limbs a person has,” according to MIT. You know, just in case whatever is on the other side of the wall has extra limbs, in which case you probably don’t want to enter that room.
Images: Camera Culture Group/MIT Media Lab
One other trick that the microwave camera is capable of is multispectral imaging. As the camera takes each measurement, the microwave emitter sweeps through a frequency range of 7.835 GHz to 12.817 GHz over 10 ms (for comparison, your microwave oven operates at 2.45 GHz). Different materials respond to microwaves differently at the lower and higher ends of this range, and the camera can separate out these spectra. This gives you an image with multiple frequency-response “colors,” and the patterns of colors that you get provide information about the materials.
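To get a feel for what that sweep means physically, you can convert the endpoint frequencies into free-space wavelengths (the frequencies are from the article; the conversion is just wavelength = c / frequency):

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

# Sweep endpoints from the article, plus a microwave oven for scale:
for f_ghz in (7.835, 12.817, 2.45):
    print(f"{f_ghz:7.3f} GHz -> {wavelength_m(f_ghz * 1e9) * 100:.2f} cm")
```

So the camera’s sweep covers wavelengths of roughly 2.3 to 3.8 cm, which is why the article describes the system as a 3-cm microwave imager.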
The microwave camera is, at the moment, probably not something that you’d want to carry around. The reflector is over a meter wide, and in order to acquire an image, it has to be mechanically scanned along the entire focal plane, a process that takes something like an hour. However, MIT suggests a few ways of smallerizing the system, including the use of reconfigurable focal-plane sensors or shrinking the transmission wavelength from microwave (3 cm) down to millimeter wave (5 mm), which would significantly reduce the size of the reflector. The idea is that the camera will be useful for finding survivors in disaster situations and for imaging in hazardous conditions, even if it may never be scalable down to cellphone size. Obviously, this is a disappointment for those of us who were looking forward to regularly misusing this kind of technology, as well as for those of us who were hoping to have a camera that could also warm up our Hot Pockets.
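A rough way to see why a shorter wavelength shrinks the reflector: for a diffraction-limited aperture, angular resolution scales with wavelength divided by aperture diameter, so holding resolution fixed lets the reflector shrink in proportion to the wavelength. This is a standard optics rule of thumb, not a figure from MIT; the wavelengths are from the article, and the 1-meter starting diameter rounds down the article’s “over a meter”:

```python
def scaled_diameter(d_m: float, lam_old_m: float, lam_new_m: float) -> float:
    """Aperture diameter giving the same diffraction-limited angular
    resolution (~ wavelength / diameter) at a new wavelength."""
    return d_m * (lam_new_m / lam_old_m)

# Going from 3 cm microwaves to 5 mm millimeter waves shrinks a
# 1 m reflector by a factor of six:
print(scaled_diameter(1.0, 0.03, 0.005))  # ~0.17 m
```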
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.