A self-driving vehicle can avoid a lot of problems by just going slowly. A slow bot hurts a lot less when it hits you, and cheap sensors are good enough when you don’t have to see far to avoid a collision.
And cheap is more than a feature, it’s a strategy. Make a costly product cheaply, and it’s not the same product—just look to the supercomputer in your pocket that calls itself a phone.
That’s the idea behind PerceptIn, a little startup founded in 2016 by Shaoshan Liu, who got a Ph.D. from the University of California, Irvine, and then worked for a decade at the U.S. branch of Baidu. Liu asked himself just how much robocar he could build on a shoestring, and made the task easier by specifying a top speed of 20 kilometers per hour (12 mph).
Researchers at Alphabet’s DeepMind today described a method that they say can construct a three-dimensional layout from just a handful of two-dimensional snapshots.
So far the method, based on deep neural networks, has been confined to virtual environments, they write in Science magazine. Natural environments are still too hard for current algorithms and hardware to handle.
The article doesn’t speculate on commercial applications, and the authors weren’t available for interview. That gives me license to speculate: The new method might be useful for any surveillance system that has to reconstruct a crime from a few snapshots. Self-driving cars and household robots would also seem likely beneficiaries of the technique.
China’s Alibaba says that it has built the world’s first self-driving vehicle guided by solid-state lidar. The vehicle’s a truck; the lidar comes from China’s Robosense.
That’s one small step for delivery bots but one giant leap for solid-state lidar.
A small step, because delivery bots are already out there, managing without lidar. A giant leap, because solid-state lidar has so far been mostly just a smile and a shoeshine. The one production car to sport lidar—the upcoming 2019 Audi A8—packs a mechanical form of the device and uses it only for traffic-jam assist and other functions below true self-driving.
Alibaba’s G Plus vehicle is billed as a road-going truck that indeed will drive itself—someday—thanks in part to its three lidar sets, two fore and one aft. Such a truck would sure fill a need—in the United States, for instance, there’s a massive shortage of truck drivers. But how close that dream is to realization is almost irrelevant here. Everybody’s making airy self-driving claims nowadays.
What really matters is that Alibaba is embracing solid-state lidar rather than the big, burly, and costly mechanical alternatives put out by Velodyne, the market leader (and, any day now, by upstart Luminar).
Robosense’s lidar instead uses microelectromechanical systems (MEMS) mirrors to steer the beam.
According to the company’s release, the RS-LiDAR-M1Pre’s MEMS micromirror swings through two axes, letting just a few laser emitters and receivers cover the entire field of view, with a vertical angular resolution of 0.2° throughout.
How that translates into overall point density, the release does not say. The company’s website lists various models, one apparently providing 32 beams, another only 16. Velodyne’s top-of-the-line mechanical scanner offers 128.
There’s also a tradeoff between spatial resolution and the number of frames a system can scan in a second. Lidar can either scan fast for a few pixels or slowly for a lot more; combining high resolution with a fast frame rate requires muscle—more muscle than any solid-state system has yet been able to muster.
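That tradeoff is just a fixed point budget being divided up. As a rough sketch (the numbers below are illustrative assumptions, not any vendor’s specifications), a lidar’s emitters and receivers cap how many range measurements it can make per second, and the frame rate decides how those points are split across frames:

```python
# Illustrative point budget, not a real device's spec: the laser/receiver
# hardware caps total range measurements per second.
POINTS_PER_SECOND = 600_000

def points_per_frame(frame_rate_hz: float) -> float:
    """Points available in each frame at a given scan rate."""
    return POINTS_PER_SECOND / frame_rate_hz

# Scan fast and each frame is sparse; scan slowly and each frame is dense.
print(points_per_frame(30))  # 20000.0 points per frame at 30 Hz
print(points_per_frame(5))   # 120000.0 points per frame at 5 Hz
```

Raising resolution and frame rate together means raising the budget itself, which is exactly the “muscle” that solid-state designs have struggled to supply.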
“New York would need 30 percent fewer vehicles if the taxi fleet, even with human drivers, is managed better,” Carlo Ratti, the director of MIT's Senseable City Lab, tells IEEE Spectrum. That’s a big savings, both in taxis and in the space they take up on city streets. New York’s 14,000-odd taxis log some 500,000 trips a day.
The technology would seem to help the beleaguered taxi business fend off private ride-hailing services, like Uber and Lyft. They have their own algorithms, optimized partly to match drivers and passengers and partly to pool ride-sharing customers.
Waymo CEO John Krafcik announced last week that the company would be launching a driverless taxi service in Phoenix later this year. An application Waymo filed with the California Department of Motor Vehicles (DMV) for driverless testing, obtained by IEEE Spectrum using public record laws, reveals more about how that service might work.
Waymo is already operating a fully driverless pilot test in Arizona, where companies do not have to seek permission for self-driving cars, with or without human safety operators, or report on their progress. It’s a different matter in California, where many self-driving companies are based. In April, the state’s DMV started accepting applications for fully driverless testing. So far, the DMV has received two applications—one from Waymo, an Alphabet company, and the other from U.S./China startup JingChi.ai.
Self-driving vehicle startup Drive.ai has only been around since 2015, but has moved aggressively toward getting autonomous cars out into the world to do useful things safely and efficiently. Drive struck a partnership with Lyft last September to test its technology with the ride-sharing service in San Francisco, and this week, the company announced an on-demand self-driving car service in Frisco, Texas, just north of Dallas.
Starting in July, 10,000 local residents in a small area consisting mostly of office parks and restaurants will gradually receive access to Drive.ai's app, which they'll be able to use to hail themselves an autonomous car to drive them around (for free). A few fixed routes will connect specific locations. If everything goes well after six months, the company will add more routes in other areas.
Drive.ai is not the first self-driving car company to run a pilot, but there are a few things that make its effort particularly interesting. First is the introduction of hardware and software to allow autonomous vehicles to communicate with pedestrians and other drivers, a desperately needed capability that we haven't seen before. And second, Drive will implement a safety system with remote "tele-choice operators" to add a touch of humanity to tricky situations—and keep humans in the loop even after safety drivers are eventually removed from the vehicles.
We spent a very hot Monday in Frisco at Drive.ai's launch event, took a demo ride and talked with as many engineers as we could to bring you the details on all the new stuff that Drive has been working on.
One of the truisms of the self-driving car business is that you can’t begin to function properly without super-detailed, constantly updated digital maps that show buildings, trees, and other features.
That might seem no problem at all if you’re a Google spinoff called Waymo. After all, your corporate parent possesses vast mapping capabilities, and besides, you’re driving in your home turf—Mountain View, or maybe Phoenix. But how can even mighty Google map every last country lane, then freshen up the data every month or two, so a car won’t be surprised to find that a freshly planted cornfield is now knee-high?
Israel-based Innoviz has announced that it will supply solid-state lidar to BMW. The device, along with radar and other systems, will be incorporated into a self-driving package from Magna, a major auto supplier.
Innoviz says that when volume production begins, the lidar’s price should drop to the hundreds of dollars, down from the “single-digit thousands” that today’s test units go for. The company says that it can now make several thousand units a month on its existing assembly line, in Israel, and that it’s building another line in China.
The company argues that today’s deal with BMW vindicates the solid-state approach to lidar, in which the laser beam is steered without machinery. Innoviz does the trick with microscopic, moveable mirrors. Most recent lidar startups also use solid-state approaches.
Automatic emergency braking that can help cars avoid hitting pedestrians could become standard in many cars in the coming years. But a new study suggests such safety systems will need sensor coverage spanning almost 180 degrees in front of the car to avoid colliding with faster-moving cyclists.
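A back-of-envelope geometry argument (my own, not the study’s model) shows why fast cyclists push the requirement so wide: on perpendicular paths, a constant-bearing collision course puts the cyclist at an angle of atan(v_cyclist / v_car) off the car’s heading. A cyclist faster than the car sits beyond 45° off-axis, and since the threat can come from either side, the needed field of view heads toward 180°:

```python
import math

def required_fov_deg(v_car_kmh: float, v_cyclist_kmh: float) -> float:
    """Total horizontal field of view needed to see a cyclist on a
    constant-bearing collision course, crossing from either side.
    Back-of-envelope geometry, not the study's model."""
    bearing = math.degrees(math.atan2(v_cyclist_kmh, v_car_kmh))
    return 2 * bearing  # symmetric: threat may come from left or right

print(round(required_fov_deg(30, 30)))  # 90: cyclist matching the car's speed
print(round(required_fov_deg(10, 25)))  # 136: fast cyclist, slow car
```

The slower the car and the faster the cyclist, the closer the answer gets to the near-180° coverage the study calls for.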
In the coming months, an unnamed manufacturer will bring an electric car to market that offers wireless charging from WiTricity, Alex Gruzen, the company’s chief executive, tells IEEE Spectrum.
Unnamed, yes, but not utterly unguessable. Among the companies that have demonstrated wireless charging are BMW and Hyundai. And, though there are other wireless charging companies out there—Qualcomm, for example—Hyundai has explicitly named WiTricity as the supplier of the system it showed on its new Kona EV last week at the Geneva International Motor Show. Other companies known to be working with WiTricity include Honda, Nissan, and Toyota.