This is part of IEEE Spectrum's SPECIAL REPORT: WINNERS & LOSERS 2009, The Year's Best and Worst of Technology.
Ryan Smith, CEO of ImSAR, holds his company’s featherweight radar unit, which can provide real-time aerial images of the ground day or night, even through smoke or dense fog.
Other than having an oddly shaped protrusion in its baggage door, the Cessna 172 looks unremarkable, at least for one of its age—this single-engine plane was built during the Apollo program. But behind that lump is mounted a radar set far more sophisticated than anything NASA’s astronauts ever took to the moon.
“That’s a building,” says Ryan Smith, chief executive officer of ImSAR, the radar’s maker, narrating from the back seat as the plane taxis toward the runway of a small municipal airport near Salt Lake City. Strangely, he’s not looking out the window as he says this; his face is instead glued to the laptop perched on his knees.
Good thing Smith is there to interpret, because the signal trace he is pointing to on his computer screen shows nothing more than a vague swell in a sea of seemingly random noise. Could this radar, about the size and weight of a paperback dictionary, really turn that hash into a detailed image?
Lifting off into the bright desert sky, Smith clicks a button and the laptop begins to paint the scene below and immediately to the left of the plane. From an altitude of 300 meters, the radar scans the ground up to roughly a kilometer away, producing a black-and-white image of the terrain the plane has just flown over.
The houses, roads, and shopping malls beneath appear stark and eerily otherworldly in the radar’s 10-gigahertz light. But the scenery is remarkably well resolved, with each pixel on the laptop’s scrolling image representing about a meter on the ground. The picture would look no different had it been obtained at night or through thick fog or smoke. The magic that makes it possible to see the world in such sharp focus using an antenna that’s smaller than a clipboard is called synthetic-aperture radar, or SAR.
It takes a while to get your mind around the idea of imaging with radar—especially if your experience with radio detection and ranging is limited to peering at a boat’s radar screen, which indicates the position of other vessels as just blips and perhaps can show the general shape of a distant coastline. But radar can produce crisp images, too, at least when it’s used to look sideways out of a plane.
Normally, say, if you were using a dish antenna, you’d expect the angular resolution to be no better than the wavelength being sensed divided by the aperture of your instrument. A plain-vanilla 3-centimeter-wavelength radar with a 30-cm antenna, for example, would provide at best an angular resolution of 0.1 radian, or about 6 degrees. With such a device, you might be able to resolve two objects 1 meter apart as long as they were no more than 10 meters away. But a pair of reflectors positioned a kilometer away could be resolved only if they were separated by a football field.
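The arithmetic behind those figures is simple enough to check. Here is a back-of-the-envelope sketch using the article’s example numbers (a 3-centimeter wavelength and a 30-cm antenna); Python is used purely for illustration:

```python
# Real-aperture rule of thumb: angular resolution ~ wavelength / aperture,
# in radians. Numbers match the article's 3-cm-wavelength, 30-cm-antenna example.
import math

wavelength = 0.03   # radar wavelength, meters (3 cm)
aperture = 0.30     # physical antenna size, meters (30 cm)

theta = wavelength / aperture        # 0.1 radian
print(math.degrees(theta))           # ~5.7, i.e. "about 6 degrees"

# The smallest separation that can be resolved grows linearly with range:
for range_m in (10, 1000):
    print(range_m, theta * range_m)  # 10 m -> 1 m; 1000 m -> 100 m
```

At a kilometer, 0.1 radian spreads to 100 meters on the ground, roughly the length of a football field, which is exactly the limitation SAR is designed to beat.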
SAR does an end run around this fundamental limitation of physics. Because the antenna is carried in a line that is roughly perpendicular to the direction it is facing, you can “synthesize” an aperture that is much broader than the physical antenna by combining measurements collected as you fly along—and thus obtain much improved resolution in the direction of motion. Surprisingly enough, SAR can obtain spatial resolutions that for the most part do not depend on the distance to the target.
How can that be? The complications of SAR data processing are enough to make most people’s heads spin, but it’s easy to grasp in a general way why image resolution does not degrade with distance.
That the resolution perpendicular to the flight line is constant is not at all hard to understand. A radar—for instance, one that emits short pulses—can measure the round-trip travel time of the early-arriving reflections just as accurately as it can the later ones, assuming those are not so faint as to be lost in the noise. Figuring out why the resolution parallel to the direction of flight doesn’t fall off with distance requires more insight.
The key is to realize that distant objects are in view of the moving radar for a longer period than closer ones. Therefore the length of the synthetic aperture used to scan an object is proportional to the distance to it. And as the aperture grows, the angular resolution that can be obtained improves, canceling the usual loss of resolution caused by geometrical spreading of the antenna’s several-degree-wide beam.
Of course, to combine the many measurements and synthesize a multitude of different apertures requires serious computing power. But if all the data processing is done right, you can, in theory, compress the spatial resolution of the image in the along-track direction to a value that’s half the physical size of the antenna employed. Real-world systems, of course, have a hard time achieving that limit. In any case, the ultimate resolution in both along-track and sideways directions doesn’t depend on distance. This is a remarkable result, and it distinguishes SAR from other kinds of remote sensing, including using your eyes.
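That cancellation can be sketched numerically. The toy calculation below uses the same illustrative 3-cm wavelength and 30-cm antenna as before; it is a simplified model of the geometry, not a description of ImSAR’s actual processing:

```python
# Why along-track resolution doesn't degrade with range: the synthetic
# aperture grows linearly with range, cancelling the beam's geometric spread.
wavelength = 0.03   # meters
D = 0.30            # physical antenna length, meters

def along_track_resolution(range_m):
    beamwidth = wavelength / D              # real-antenna beam, radians (~0.1)
    L = beamwidth * range_m                 # synthetic aperture = beam footprint,
                                            #   so it grows with range
    synth_beamwidth = wavelength / (2 * L)  # two-way propagation doubles the
                                            #   phase sensitivity, hence the 2
    return synth_beamwidth * range_m        # resolution on the ground, meters

for range_m in (100, 1000, 10000):
    print(range_m, along_track_resolution(range_m))  # always D/2 = 0.15 m
```

The range terms cancel algebraically, leaving D/2 at every distance—the half-antenna-length limit mentioned above.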
Synthetic-aperture radar was pioneered in the 1950s, although it didn’t really come into vogue until decades later. One challenge for SAR pioneers was that so much data processing was necessary that it had to be done long after the measurements were acquired. This was still the case when NASA conducted a series of SAR missions in the 1980s using the space shuttle. One surprise finding from those experiments was the technique’s ability to map not just the surface of the Earth but also the subsurface—at least in places like the Sahara Desert, where the ground is so dry that the radar waves can penetrate without significant attenuation. The shuttle imaging radar could therefore peer beneath the dunes.
But the SAR systems flown in space and on aircraft in the past have been frighteningly large and complicated, not to mention expensive. Only recently have systems become small enough to carry on typical private aircraft.
For example, the special SAR equipment that Denver-based Intermap Technologies uses for three-dimensional terrain mapping fits on a Learjet. And Rockwell Collins is now in the process of commercializing a SAR system developed by researchers at Sandia National Laboratories that weighs only 12 kilograms. ImSAR has carried this evolution further still, reducing the package to something smaller than a shoebox, including the GPS and inertial-measurement units required to obtain precise navigational data. The company has also shrunk the price, charging less than US $100 000 for its system.
At that size and that price, the radar becomes practical for diverse applications. The U.S. military is keen to put it on small unmanned aerial vehicles (UAVs), for example. And many others—such as law-enforcement agencies, emergency rescue services, even fishermen—might soon use it to obtain SAR images from small planes.
This diminutive unit, fittingly dubbed NanoSAR, is an outgrowth of the thesis work that Ryan Smith did for his master’s degree while a student at Brigham Young University, in Provo, Utah, which is just down the road from ImSAR’s offices. At BYU, Smith investigated ways to make SAR instrumentation ever smaller—which had been one of the research thrusts of David Long, who with fellow professor David Arnold founded a radar lab at BYU in the early 1990s.
In particular, Smith abandoned the usual system of having the radar set send out a short-duration pulse and then listen for the echoes. Instead, he worked on a technique that transmits radio waves continuously while modulating the frequency with a sawtooth pattern that ramps up linearly. Previous efforts in the BYU radar lab had used this linear-frequency, continuous-wave (LF-CW) approach for other projects—a radar altimeter, for example—but at the time there were only vague hints in the technical literature that it might be appropriate for SAR.
LF-CW proved advantageous in making a SAR system small and relatively inexpensive, for two reasons. First, the approach reduces the instantaneous power levels that the electronics must handle. Second, the device mixes the received signal not with a local oscillator—as receiver front ends normally do—but with a copy of the transmitted waveform. The result contains comparatively low frequencies—the difference between those present in the incoming and outgoing waves. The signal can therefore be digitized and numerically processed with relative ease.
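The bookkeeping behind that mixing step is just the ramp’s slope multiplied by the round-trip delay. The sketch below uses an assumed bandwidth and ramp duration—the article doesn’t give ImSAR’s actual parameters—to show why the result lands at such manageable frequencies:

```python
# LF-CW dechirping: mixing the echo with a copy of the outgoing ramp leaves
# a beat tone whose frequency is proportional to range. The ramp parameters
# here are illustrative assumptions, not ImSAR's real numbers.
C = 3e8          # speed of light, m/s
BANDWIDTH = 200e6   # swept bandwidth, Hz (assumed)
RAMP_TIME = 1e-3    # duration of one sawtooth ramp, s (assumed)

def beat_frequency(range_m):
    delay = 2 * range_m / C                  # round-trip travel time, s
    return (BANDWIDTH / RAMP_TIME) * delay   # ramp slope (Hz/s) x delay (s)

print(beat_frequency(1000))   # a 1-km target beats at roughly 1.3 MHz
```

A tone in the low-megahertz range is trivial to digitize, compared with sampling anywhere near the 10-GHz carrier itself—which is the point of the dechirp architecture.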
Even before he graduated from BYU in 2002, Smith went to work for Wavetronix, a company Arnold had established in nearby Lindon, Utah, to build radar sensors for monitoring highway traffic. But after leaving school, the idea of putting together a tiny SAR package stuck with Smith, so he worked on it on the side. “I set up a lab in my basement,” he recalls.
Smith had reasonable confidence that he could do the job with LF-CW, having modeled the problem and prototyped hardware for his thesis, but there was still a long way to go. “You can build all the models you want, but the world of physics has a lot of ‘gotchas’ you won’t know until you’re there,” says Smith. And with a mortgage to pay and a family to support, he prudently kept his day job.
It wasn’t lost on Smith that a SAR unit of the size he was working on could fit in a UAV, even one of the smaller varieties. So he started researching UAV manufacturers, and in 2004 he gave a presentation to InSitu, a Bingen, Wash., company (acquired by Boeing in July) that produces small UAVs, including its 12-kg, 1.2-meter-long ScanEagle. Seeing the obvious synergies, InSitu offered him funding to pursue his vision.
That was Smith’s Rubicon: he decided that he was going to leave Wavetronix and form a company of his own to build the world’s tiniest SAR unit. For that, he enlisted the help of Logan Harris, another graduate of BYU’s department of electrical and computer engineering, who was the chief technical officer of Wavetronix at the time. Together they worked with the occasional consultant while maintaining what eventually became part-time positions with Wavetronix. Only in 2007 did they commit fully to ImSAR. Since then the firm has grown to include about 20 full-time employees, including Adam Robertson, who, like Smith, studied SAR imaging in BYU’s radar lab for his master’s degree.
After completing that training in electrical engineering, Robertson obtained a degree in business administration and now serves as the company’s NanoSAR program manager. Smith says it’s critical to have people like Robertson in what are nominally the company’s business roles. There’s nothing worse, says Smith, than having someone with inadequate engineering knowledge attempting to sell a high-tech product, because potential customers will judge the technology by the person fielding their questions: “If you don’t have the right answers, you’re dead.”
ImSAR’s business focus has so far been on working with InSitu to demonstrate the radar to military customers. ImSAR expects that the first battlefield deployments will come in the early months of 2009—though Smith, Harris, and Robertson are, naturally enough, tight-lipped on the details. They have to be especially mindful about what technical information they release, because SAR is on the list of “munitions” controlled by the U.S. government’s International Traffic in Arms Regulations (ITAR).
Even so, some of the more obvious military applications of a UAV-borne SAR aren’t hard to imagine. If the shuttle can use imaging radar to map subsurface features, then maybe a UAV can use it to detect buried explosive devices from the air [see “Countering IEDs,” IEEE Spectrum, September 2008].
ImSAR’s 10-GHz system penetrates soil less deeply than a longer-wavelength radar would, but it sometimes shows features that aren’t at all apparent from just looking down at the ground. “We found some old settlements that had been abandoned who knows how many years ago, buildings and things,” says Robertson, referring to the results of an earlier radar survey of some sagebrush-covered land nearby.
Even though these ImSAR engineers are closemouthed about their product’s military uses, they gush about what’s possible in other spheres: while the technology falls under ITAR and thus demands a special license for export, it’s available without restriction to U.S. customers. So ImSAR’s unit should see some civilian deployments in 2009 as well.
UAV-borne SAR systems could, for example, help ice-breaking ships find openings in pack ice, even when visibility is too poor for conventional aerial imaging. And because the radar reflectivity of the ground depends on moisture content, this form of remote sensing could help farmers target their irrigation efforts more efficiently. It could also detect oil slicks, which change the surface characteristics of water in a way that shows up clearly in radar images.
Perhaps more important, though, will be the application of SAR imagery to search-and-rescue operations carried out in darkness or under other low-visibility conditions. The nearby snow-covered peaks overlook Utah’s famous winter resorts, bringing to mind searches for skiers or snowmobilers lost at night. Would NanoSAR serve for that? “A human being in the snow might as well have a lightbulb on him when it comes to radar,” says BYU’s Long.
ImSAR’s offices are close to a lot of recreational boating, too. And Smith’s demonstration flight shows off what his NanoSAR system might do one day in a search for someone lost on the water.
The plane turns away from the towering Wasatch Range, flying now over the 400-square-kilometer Utah Lake, one of the largest freshwater bodies in the western United States. Off to the left, a water-skier preparing for some fun floats behind a small pleasure boat. Smith points to his laptop screen shortly after the plane passes over the scene. The boat stands out as a bright white spot in the radar image, which, because the surrounding water reflects the radio waves away, appears largely black. Even the mostly submerged water-skier is easy to pick out.
“We have a very clean radar,” Smith boasts, later referring to what he has stored on his laptop that day as “some of the best SAR imagery ever collected.” That may be an exaggeration, spoken by the understandably proud father of a newborn device. Nevertheless, the tiny package mounted behind that funny lump in the Cessna’s baggage door indeed seems to be doing a bang-up job.
Snapshot: Sideways Glances
Goal: To develop the world’s smallest synthetic-aperture radar system.
Why it’s a winner: It crams a roomful of electronic equipment into something the size of a shoebox.
Where: Salem, Utah
Staff: About 20
Budget: Info not available
When: First field deployments expected in 2009