This is part of IEEE Spectrum's SPECIAL REPORT: WINNERS & LOSERS 2009, The Year's Best and Worst of Technology.
Ryan Smith, CEO of ImSAR, holds his company’s featherweight radar unit, which can provide real-time aerial images of the ground day or night, even through smoke or dense fog.
Other than having an oddly shaped protrusion in its baggage door, the Cessna 172 looks unremarkable, at least for one of its age—this single-engine plane was built during the Apollo program. But behind that lump is mounted a radar set far more sophisticated than anything NASA’s astronauts ever took to the moon.
“That’s a building,” says Ryan Smith, chief executive officer of ImSAR, the radar’s maker, narrating from the back seat as the plane taxis toward the runway of a small municipal airport near Salt Lake City. Strangely, he’s not looking out the window as he says this; his face is instead glued to the laptop perched on his knees.
Good thing Smith is there to interpret, because the signal trace he is pointing to on his computer screen shows nothing more than a vague swell in a sea of seemingly random noise. Could this radar, about the size and weight of a paperback dictionary, really turn that hash into a detailed image?
As the plane lifts off into the bright desert sky, Smith clicks a button and the laptop begins to paint the scene below and immediately to the left of the plane. From an altitude of 300 meters, the radar scans the ground up to roughly a kilometer away, producing a black-and-white image of the terrain the plane has just flown over.
The houses, roads, and shopping malls beneath appear stark and eerily otherworldly in the radar’s 10-gigahertz light. But the scenery is remarkably well resolved, with each pixel on the laptop’s scrolling image representing about a meter on the ground. The picture would look no different had it been obtained at night or through thick fog or smoke. The magic that makes it possible to see the world in such sharp focus using an antenna that’s smaller than a clipboard is called synthetic-aperture radar, or SAR.
It takes a while to get your mind around the idea of imaging with radar—especially if your experience with radio detection and ranging is limited to peering at a boat’s radar screen, which indicates the position of other vessels as just blips and perhaps can show the general shape of a distant coastline. But radar can produce crisp images, too, at least when it’s used to look sideways out of a plane.
Normally, if you were using, say, a dish antenna, you’d expect the angular resolution to be no better than the wavelength being sensed divided by the aperture of your instrument. A plain-vanilla 3-centimeter-wavelength radar with a 30-cm antenna, for example, would provide at best an angular resolution of 0.1 radian, or about 6 degrees. With such a device, you might be able to resolve two objects 1 meter apart as long as they were no more than 10 meters away. But a pair of reflectors positioned a kilometer away could be resolved only if they were separated by the length of a football field.
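The arithmetic behind those figures is easy to check. Here is a minimal sketch (the wavelength and antenna size are the ones in the text; the variable names are mine):

```python
import math

# Real-aperture angular resolution: roughly wavelength / aperture.
wavelength = 0.03   # 3 cm, i.e. X-band radar
aperture = 0.30     # 30-cm physical antenna

angular_res = wavelength / aperture          # radians
print(round(angular_res, 2))                 # 0.1 rad
print(round(math.degrees(angular_res), 1))   # about 6 degrees

# Cross-range resolution of a real aperture degrades linearly with range:
for distance in (10, 1000):                  # meters
    print(distance, "m away ->", angular_res * distance, "m separable")
# At 10 m you can split objects 1 m apart; at 1 km, only 100 m apart.
```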
SAR does an end run around this fundamental limitation of physics. Because the antenna is carried in a line that is roughly perpendicular to the direction it is facing, you can “synthesize” an aperture that is much broader than the physical antenna by combining measurements collected as you fly along—and thus obtain much improved resolution in the direction of motion. Surprisingly enough, SAR can obtain spatial resolutions that for the most part do not depend on the distance to the target.
How can that be? The complications of SAR data processing are enough to make most people’s heads spin, but it’s easy to grasp in a general way why image resolution does not degrade with distance.
That the resolution perpendicular to the flight line is constant is not at all hard to understand. A radar—for instance, one that emits short pulses—can measure the round-trip travel time of the early-arriving reflections just as accurately as it can the later ones, assuming those are not so faint as to be lost in the noise. Figuring out why the resolution parallel to the direction of flight doesn’t fall off with distance requires more insight.
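For a short-pulse radar, that cross-track (range) resolution follows directly from the pulse length. A quick sketch, with an illustrative pulse length of my own choosing, not a figure from the article:

```python
# Range resolution of a simple pulsed radar: two reflectors can be told
# apart if their round-trip delays differ by at least one pulse length.
c = 3.0e8            # speed of light, m/s
pulse_length = 1e-8  # a 10-nanosecond pulse (illustrative value)

# The factor of 2 accounts for the round trip out and back.
range_resolution = c * pulse_length / 2
print(range_resolution, "m")   # 1.5 m, the same at any distance
```

Note that nothing in this expression depends on how far away the reflectors are, which is exactly the point the text makes.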
The key is to realize that distant objects are in view of the moving radar for a longer period than closer ones. Therefore the length of the synthetic aperture used to scan an object is proportional to the distance to it. And as the aperture grows, the angular resolution that can be obtained improves, canceling the usual loss of resolution caused by geometrical spreading of the antenna’s several-degree-wide beam.
Of course, to combine the many measurements and synthesize a multitude of different apertures requires serious computing power. But if all the data processing is done right, you can, in theory, compress the spatial resolution of the image in the along-track direction to a value that’s half the physical size of the antenna employed. Real-world systems, of course, have a hard time achieving that limit. In any case, the ultimate resolution in both along-track and sideways directions doesn’t depend on distance. This is a remarkable result, and it distinguishes SAR from other kinds of remote sensing, including using your eyes.
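The cancellation described above can be made concrete with a few lines of arithmetic. This is the textbook stripmap-SAR result, sketched under idealized assumptions (the variable names and the function are mine):

```python
# Why SAR's along-track resolution is independent of range.
wavelength = 0.03   # meters
antenna = 0.30      # physical antenna length, meters

def along_track_resolution(distance):
    beamwidth = wavelength / antenna            # real-beam width, radians
    synthetic_aperture = beamwidth * distance   # grows linearly with range
    # The two-way phase history doubles the effective aperture, so the
    # synthetic beamwidth is wavelength / (2 * synthetic_aperture).
    synthetic_beamwidth = wavelength / (2 * synthetic_aperture)
    return synthetic_beamwidth * distance       # back to meters on the ground

for r in (500, 1000, 10_000):
    print(r, "m:", along_track_resolution(r), "m")
# The range terms cancel, leaving antenna / 2 = 0.15 m at every distance.
```

The distance appears once in the numerator and once in the denominator, so it drops out, leaving the half-antenna-length limit the text quotes.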