

Smart Roads Get Better Eyesight

A new way of fusing camera and radar data helps track vehicles at greater distances



USTC researchers captured car-tracking data from a radar [green], camera [blue], and a fusion of the two [yellow] on an expressway in Hefei, China.

Yao Li

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Smart roads with advanced vehicle-sensing capabilities could be the linchpin of future intelligent transportation systems and could even help extend driverless cars’ perceptual range. A new approach that fuses camera and radar data can now track vehicles precisely at distances of up to 500 meters.

Real-time data on the flow and density of traffic can help city managers avoid congestion and prevent accidents. So-called “roadside perception,” which uses sensors and cameras to track vehicles, can help create smart roads that continually gather this information and relay it to control rooms.


Installing large numbers of roadside sensors can be expensive, though, as well as time-consuming to maintain, says Yanyong Zhang, a professor of computer science at the University of Science and Technology of China (USTC) in Hefei. For smart roads to be cost-effective you need to use as few sensors as possible, she says, which means the sensors need to be able to track vehicles at significant distances.

Using a new approach to fuse data from high-definition cameras and millimeter-wave radar, her team has created a system that can pinpoint vehicle locations to within 1.3 meters at ranges of up to 500 meters. The results were outlined in a recent paper in IEEE Robotics and Automation Letters.

“If you can extend the range as far as possible, then you can reduce the number of sensing devices you need to deploy,” says Zhang. “This is the first work that offers a practical solution that combines these two types of data and works in real-world deployments and with really challenging distances.”

Where Camera-Radar Fusion Becomes Necessary

Cameras and radar are both good low-cost options for vehicle tracking, says Zhang, but individually they struggle at distances much beyond 100 meters. Fusing radar and camera data can significantly extend that range, but doing so means surmounting challenges that arise because the two sensors generate completely different kinds of data. While the camera captures a simple 2D image, the radar output is inherently 3D and can be processed to generate a bird’s-eye view. Most approaches to camera-radar fusion to date have simply projected the camera data onto the radar’s bird’s-eye view, says Zhang, but the researchers discovered that this was far from optimal.
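
To see why the direction of projection matters, consider a minimal numeric sketch. The pinhole-camera parameters and the flat-ground assumption here are illustrative, not values from the paper; the point is that back-projecting a camera pixel onto a bird’s-eye-view ground plane amplifies tiny image errors at long range, while projecting a radar return into the image stays well behaved:

```python
import numpy as np

# Illustrative pinhole-camera intrinsics -- not values from the paper.
K = np.array([[2000.0,    0.0, 960.0],   # focal lengths and principal point, pixels
              [   0.0, 2000.0, 540.0],
              [   0.0,    0.0,   1.0]])

def radar_to_image(p):
    """Project a 3D radar return (x right, y down, z forward, meters) to pixels,
    assuming the camera and radar share one frame (a simplification)."""
    q = K @ p
    return q[:2] / q[2]

def image_to_bev(pixel, cam_height=1.5):
    """Back-project a pixel to the road surface, assuming a flat ground plane
    at y = cam_height. The farther the point, the more a single pixel of
    error shifts the recovered position -- the weakness of camera-to-BEV mapping."""
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    scale = cam_height / ray[1]          # intersect the viewing ray with the ground
    return (scale * ray)[[0, 2]]         # ground-plane (lateral, range) in meters

px = radar_to_image(np.array([3.0, 1.5, 400.0]))   # a car 400 m away
print(image_to_bev(px))                            # -> [  3. 400.]
print(image_to_bev(px + np.array([0.0, 1.0])))     # one pixel off -> ~[2.6, 353.]
```

In this toy setup, a single pixel of error on a car 400 meters out shifts its back-projected position by roughly 47 meters of range, which suggests why mapping camera data into the radar’s bird’s-eye view degrades so badly with distance.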

To better understand the problem, the USTC team installed a radar and a camera on a pole at the end of a straight stretch of expressway close to the university. They also installed a lidar on the pole to take ground-truth vehicle-location measurements, and two vehicles with high-precision GPS units were driven up and down the road to help calibrate the sensors.

The researchers installed a camera, radar, and lidar to track vehicles on an expressway in Hefei, China.

Yao Li

One of Zhang’s Ph.D. students, Yao Li, then carried out experiments with the data collected by the sensors. He discovered that projecting the 3D radar data onto the 2D images resulted in considerably lower location errors at longer ranges than the standard approach, in which image data is mapped onto the radar data. This led the team to conclude that it made more sense to fuse the data in the 2D image plane, before projecting the result back to a bird’s-eye view for vehicle tracking.
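
In outline, such a pipeline might look like the sketch below, which matches radar returns to camera detections in the image plane and then uses the radar’s own range measurement to place each fused vehicle in the bird’s-eye view. The projection model, matching threshold, and data are hypothetical stand-ins, not the authors’ code:

```python
import numpy as np

# Same illustrative pinhole model as above (an assumption, not the paper's calibration).
K = np.array([[2000.0, 0.0, 960.0], [0.0, 2000.0, 540.0], [0.0, 0.0, 1.0]])

def project(p):
    q = K @ p                            # 3D radar point -> homogeneous pixel
    return q[:2] / q[2]

def fuse_in_image(radar_points, camera_boxes, max_px=30.0):
    """Match each radar return to the nearest camera bounding box in the
    2D image, then use the radar's own depth to place the fused vehicle
    in bird's-eye view. A hypothetical sketch of the idea, not the authors' code."""
    centers = np.array([[(u0 + u1) / 2, (v0 + v1) / 2]
                        for (u0, v0, u1, v1) in camera_boxes])
    fused = []
    for p in radar_points:
        uv = project(p)                  # radar return projected into the image
        if np.linalg.norm(centers - uv, axis=1).min() < max_px:
            fused.append((p[0], p[2]))   # BEV (lateral, range) from the radar depth
    return fused

radar = [np.array([3.0, 1.5, 400.0])]    # one return, 400 m out
boxes = [(960.0, 530.0, 990.0, 560.0)]   # one detected vehicle box (u0, v0, u1, v1)
print(fuse_in_image(radar, boxes))       # -> [(3.0, 400.0)]
```

Because the association happens in pixels, where both sensors are accurate, the fused track inherits the radar’s reliable long-range depth instead of the camera’s error-prone one.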

As well as allowing precise localization at distances of up to 500 meters, the new technique boosted the average precision of tracking at shorter distances by 32 percent compared with previous approaches, the researchers showed. While they have tested the approach only offline, on previously collected datasets, Zhang says the underlying calculations are relatively simple and should be possible to implement in real time on standard processors.

Using more than one sensor also entails careful synchronization to ensure that the data streams match up. Over time, environmental disturbances inevitably cause the sensors to drift out of alignment, and they have to be recalibrated. Normally, this involves driving the GPS-equipped vehicles up and down the expressway to collect ground-truth location measurements that can be used to retune the sensors.

That is time-consuming and costly, so the researchers built a self-calibration capability into their system. The process of projecting the radar data onto the 2D image is governed by a transformation matrix based on the sensors’ parameters and on physical measurements made during the calibration process. Once the data has been projected, an algorithm tries to match up radar data points with the corresponding image pixels.

If the distance between these data points starts to increase, that suggests the transformation matrix is becoming increasingly inaccurate as the sensors move. By carefully tracking this drift, the researchers are able to automatically adjust the transformation matrix to account for the error. This only works up to a point, says Zhang, but it could still significantly reduce the number of full-blown calibrations required.
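
A rough sketch of that self-calibration loop is below. Here the drift is modeled as a simple constant image-plane offset estimated by least squares; that correction is a stand-in assumption for illustration, not the paper’s method of adjusting the transformation matrix:

```python
import numpy as np

def calibration_drift(matched_pairs, threshold_px=5.0):
    """Watch the residual between projected radar points and their matched
    image pixels; once it grows past a threshold, return a correction.
    A sketch only: a constant pixel offset stands in for updating the
    full radar-to-image transformation matrix."""
    proj = np.array([p for p, _ in matched_pairs])  # projected radar returns, pixels
    obs = np.array([q for _, q in matched_pairs])   # matched image pixels
    offset = (obs - proj).mean(axis=0)              # least-squares constant shift
    if np.linalg.norm(offset) > threshold_px:
        return offset                               # apply to future projections
    return np.zeros(2)                              # drift still tolerable

pairs = [(np.array([975.0, 547.5]), np.array([981.0, 551.0])),
         (np.array([400.0, 500.0]), np.array([406.5, 503.2]))]
print(calibration_drift(pairs))                     # -> [6.25 3.35]
```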

Altogether, Zhang says, this makes their approach practical to deploy in the real world. As well as providing better data for intelligent transportation systems, she thinks this kind of roadside perception could also provide future self-driving cars with valuable situational awareness.

“It’s a little futuristic, but let’s say there is something happening a few hundred meters away and the car is not aware of it, because it’s congested, and its sensing range couldn’t reach that far,” she says. “Sensors along the highway can disseminate this information to the cars that are coming into the area, so that they can be more cautious or select a different route.”
