CMU's Autonomous Car Doesn't Look like a Robot

This self-driving Cadillac uses seemingly invisible, automotive-grade sensors


The future of automobile autonomy isn't going to involve cars covered in cameras and radar and lasers. It's all going to be invisible, and CMU is already there.

The 2011 Cadillac SRX in the picture above is an autonomous car. Carnegie Mellon University had it drive itself 33 miles last week on public roads, from Cranberry, Pa., to Pittsburgh International Airport. At first glance, you probably wouldn't be able to tell that the car is self-driving, because self-driving cars looked like this just five years ago:

That's CMU's BOSS competing in the DARPA Urban Challenge in 2007, with who knows how many sensors mounted all over it. And even Google's autonomous cars have that signature Velodyne LIDAR mounted on top of them:

By contrast, CMU's SRX relies entirely on automotive-grade radars, lidars, and cameras. You can see them if you look closely in the picture at the top of this article (there's one above the windshield, for example), but you do have to look closely. Inside, there are some extra buttons and screens, but all of the computers are stuffed under the floor in the trunk. And despite the lack of giant and complicated and expensive sensor systems, the car is still able to achieve the level of autonomy that we all want it to, as CMU's Raj Rajkumar explains:

"This car is the holy grail of autonomous driving because it can do it all — from changing lanes on highways, driving in congested suburban traffic and navigating traffic lights."

In addition to controlling the steering, speed, and braking, the autonomous systems also detect and avoid obstacles in the road, including traffic cones and barrels as well as pedestrians and bicyclists, pausing until they are safely out of the way. The systems provide audible warnings of obstacles and communicate the vehicle's status to its passengers using a human-like voice.

It's unfortunate that while the technology for all of this is arguably mostly ready, society (socially and legally) just isn't yet. You can buy cars with adaptive cruise control and lane departure warnings, which could hypothetically let the car drive itself, at least under some specific circumstances. And despite the fact that even a bad autonomous (or semi-autonomous) car would still save lives overall, there's no legal infrastructure in place to make it possible for manufacturers to implement such technology without undue risk of being sued into oblivion the first time something goes wrong.

Via [ CMU ]


How Robots Can Help Us Act and Feel Younger

Toyota’s Gill Pratt on enhancing independence in old age

An illustration of a woman making a salad with robotic arms around her holding vegetables and other salad ingredients. Illustration: Dan Page

By 2050, the global population aged 65 or more will be nearly double what it is today. The number of people over the age of 80 will triple, approaching half a billion. Supporting an aging population is a worldwide concern, but this demographic shift is especially pronounced in Japan, where more than a third of Japanese will be 65 or older by midcentury.

Toyota Research Institute (TRI), which was established by Toyota Motor Corp. in 2015 to explore autonomous cars, robotics, and “human amplification technologies,” has also been focusing a significant portion of its research on ways to help older people maintain their health, happiness, and independence as long as possible. While an important goal in itself, improving self-sufficiency for the elderly also reduces the amount of support they need from society more broadly. And without technological help, sustaining this population in an effective and dignified manner will grow increasingly difficult—first in Japan, but globally soon after.
