Cars That Think

The Self-Driving Car Industry Goes to Washington

A bevy of robocar-oriented companies has founded a lobby—a move that provides the single clearest sign that the industry is maturing.

The lobby’s chief, David Strickland, would like all regulatory decisions to be coordinated by the National Highway Traffic Safety Administration (NHTSA). Strickland sure knows what to do: he’s a former administrator of NHTSA—and yet another example of Washington’s revolving door.

The lobby is called the Self-Driving Coalition for Safer Streets, and it includes Google, Ford, Volvo, Uber and Lyft. It looks as if Google is the prime mover here. Yesterday, at a public hearing that NHTSA held at Stanford University, Google’s robocar chief Chris Urmson emphasized the importance of uniform regulations.

Read More

 Brings Deep Learning to Self-Driving Cars

 is the 13th company to be granted a license to test autonomous vehicles on public roads in California. This is exciting news, especially because we had no idea that even existed until just last week. The company has been in stealth mode for the past year, working on applying deep learning techniques to self-driving cars. We spoke with two of's co-founders, Sameep Tandon and Carol Reiley, about why their approach to self-driving cars is going to bring us vehicle autonomy that's more efficient, more adaptable, more reliable, and safer than ever.

Read More

The GeoOrbital Wheel Lets You Make Your Bike Electric in Minutes

The first thing I notice about a bicycle retrofitted with an electric motor and battery within its front wheel is that it demands a light thumb. Touch the throttle lever on the handlebar a little too hard, and you lurch ahead.

The second thing: I’m being watched. Even jaded New Yorkers, rushing by the IEEE Spectrum building, in midtown Manhattan, crane their necks to get a load of the bicycle and the bozo perched uneasily atop it. Nowadays, the only bike I ever mount is the stationary one in my gym.

Which raises the question of which market this invention—the GeoOrbital Wheel—is meant to serve. Most cyclists are fitness buffs; why would they want a motor-assisted leg up?

“Maybe you don’t want to arrive at the office covered in sweat,” suggests Michael Burtov, the founder of GeoOrbital, which is based in Cambridge, Mass. “And you always can put the original wheel back in if you do want to do some cardio.”

Another market, he notes, is the older demographic—those who bike for fun and exercise but need help climbing hills. It should do well in San Francisco. Here in New York City it’s a tougher sell, if only because motor-powered bicycles are illegal. In theory. Nobody arrested me, and you do see the occasional electric scooter on the streets and sidewalks.

The cost, so far, has been in the tens of thousands of dollars. Now that the wheel is ready for mass manufacturing, it will go to any Kickstarter backer who is prepared to pony up US $500.

Read More

Jia Yueting’s Sprawling Self-Driving Car Empire

Google may have the most self-driving cars on the road and Ford might be testing its driverless vehicles in the most extreme conditions, but Jia Yueting certainly has the most autonomous auto brands under his belt.

The Chinese Internet billionaire owns self-driving EV startup Faraday Future, has a big investment in its stealthy rival Atieva, and has partnered with Aston Martin to build a third EV. Now, Jia just revealed yet another self-driving car, under the LeEco brand.

Read More

Mobileye Bullish on Full Automation, but Pooh-Poohs Deep-Learning AI for Robocars

Mobileye, the Israeli car automation company that came onto the self-driving car scene as sort of an anti-Google, is now looking at the future in terms that seem a bit closer to Google's than used to be the case.

Speaking Friday at a conference organized by Goldman Sachs (which owned a chunk of Mobileye’s shares when the company first became publicly traded in 2014), Amnon Shashua, Mobileye’s founder and chief technical officer, placed a lot of emphasis on mapping, something Google has done all along. And now Shashua is predicting utterly hands-free driving—if only on the highway—by 2021.

Mobileye had always emphasized incremental steps, such as adaptive cruise control and emergency braking, collectively called advanced driver assistance systems (ADAS). It was Google that proposed to skip all half measures and get right to full-bore self-driving cars.

But Google is also taking a step back from its original position. Chris Urmson, Google’s robocar chief, used to say he expected its car—with no steering wheel, accelerator, or brake pedal—to go on sale in time for his own teenage son to avoid ever having to take a driver’s qualification test. In March, though, he said full autonomy might trickle into various driving environments over a three-to-30 year period.

Still, Mobileye’s approach differs from Google's in a number of ways.

First, Google uses an expensive array of sensors costing tens of thousands of dollars per car. Mobileye got its start with a system built around a single camera, at a cost to manufacturers of less than US $1,000. The prospect of a relatively simple and super-cheap robocar system explains why Mobileye wowed Wall Street with sky-high market valuations.

Second, Google has professional drivers test a relatively small fleet of experimental cars, while Mobileye and its automotive collaborators—notably Tesla—have gotten their data from customers, through crowdsourcing. Shashua says that the collaborators can also include mapping and navigation companies, such as TomTom, in the Netherlands, and Here, in Germany.

Now Google appears to be moving ever more in the direction of the "deep learning" approach to teaching cars to drive themselves. This approach, in which deep neural networks train themselves into expertise with little or no human intervention, is what powered Google's AlphaGo program to its recent victory over a leading master of the game of Go.

AlphaGo first learned from the games of human masters and then honed its play through trial and error. At no point did a human being step in and tell the machine to pay attention to, say, points near the edge of the board. The same researchers earlier trained machines to play Atari games—again, without giving them any hints. The program had to learn the rules as it went along.
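The trial-and-error recipe those systems share can be sketched in miniature. Here is a toy tabular Q-learning loop on a five-cell corridor—my own illustrative example, vastly simpler than the deep networks Google uses—in which the agent is never told where the reward sits, yet learns a "go right" policy from reward alone:

```python
import random

# Toy illustration of learning by trial and error: tabular Q-learning
# on a 5-cell corridor with a reward at the far right end. No hints
# are hand-coded; the policy emerges from reward feedback alone.
random.seed(0)
N = 5                                # cells 0..4; reward at cell 4
Q = [[0.0, 0.0] for _ in range(N)]   # Q[state][action]: 0 = left, 1 = right
alpha, gamma, eps = 0.5, 0.9, 0.1    # learning rate, discount, exploration

for _ in range(500):                 # 500 episodes of trial and error
    s = 0
    while s != N - 1:
        # epsilon-greedy: mostly exploit the best-known action
        a = random.randrange(2) if random.random() < eps else int(Q[s][1] >= Q[s][0])
        s2 = max(0, s - 1) if a == 0 else min(N - 1, s + 1)
        r = 1.0 if s2 == N - 1 else 0.0
        # standard Q-learning update
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [int(Q[s][1] >= Q[s][0]) for s in range(N - 1)]
print(policy)  # 1 in every cell means the agent learned "move right"
```

The point of the sketch is Shashua's worry in miniature: the agent masters this well-defined toy world quickly, but nothing in the loop guarantees coverage of rare events it has never stumbled into.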

But Shashua has poured a little cold water on the idea of cars being self-taught. Deep learning does well on games and other well-defined tasks, like recognizing images in a database, or translating from one language to another—two of Google’s other specialties. But driving, taken as a whole, says Shashua, is not so well defined.

“What makes both driving assist and autonomous driving real is the ability to find a needle in a haystack,” he said. “There are many rare events that need to be covered to reach 99.999% capability. Building for demonstration is manageable, but building something that will reach production-worthiness requires this remaining 10 percent, and that makes all the difference.”

Mobileye plans to continue using human experts to break self-driving down into parts that it can automate—an expert-system approach. “We have 600 people annotating images at Mobileye; at the end of this year, it will be 1,000,” Shashua said.

Sure, Shashua was answering his company’s critics, who maintain that it got the direction of the industry wrong when it bet on simple camera systems and incremental automation. But the anti-Google is suddenly looking like less of an outlier than was previously thought.

Ford Testing Autonomous Cars in the Dark, Wants You to Be Impressed

Driving a car in the middle of the desert at night without any headlights is easy for any driver. The trip becomes a little more challenging if there are obstacles the driver has to avoid, and harder still if there’s a road to navigate. At this point, most humans (those without immediate access to a high-quality night vision system) might start to have some trouble. Robots, being much better at this whole driving thing than humans are, don't really care whether there’s daylight or street lights, as some recent testing from Ford demonstrates.

Read More

Will We Prove That Autonomous Cars Are Safe Before They Go on Sale?

It could take hundreds of billions of kilometers of driving before autonomous cars are pronounced safe, according to a new report. Logging that many kilometers, in order to generate a large enough cache of safety data, could take decades. To put those hundreds of billions of kilometers in context, Google’s self-driving cars have driven 2.4 million kilometers since 2009.

Analysts from the nonprofit RAND Corporation in Santa Monica, Calif., who authored the report say that proving self-driving cars to be as safe as human-controlled ones will require robocar developers and testers to employ test methods—virtual testing and simulators, mathematical modeling, and scenario testing among them—that don’t require the rubber to meet the road.

The U.S. National Highway Traffic Safety Administration says that more than 90 percent of car crashes are caused by human errors such as speeding, drunk driving, distraction, and fatigue. Self-driving cars, by contrast, remain ever alert and are incapable of failing a breathalyzer test; their ability to communicate with one another and with fixed infrastructure like traffic lights could also make driving far more efficient.

But the statistics will never stack up enough for us to be completely certain that accidents won’t happen, the analysts say.

That’s because human error is a critical benchmark with which to compare self-driving cars. And though we tend not to view it that way, the rate of road injuries and deaths because of human error is pretty low compared to the total distance traveled. Americans drive roughly 4.9 trillion kilometers every year, according to the Bureau of Transportation Statistics, and for every 160 million kilometers driven, there are about 77 injuries and about 1 death. To prove that autonomous cars are safe—that they have similar or lower injury and death rates—they would have to log comparable travel distances, the analysts contend. And that will be impossible before self-driving cars are sold to the public.
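The arithmetic behind the analysts' pessimism is easy to reproduce. A minimal back-of-envelope sketch, using only the rates quoted above plus the statistical "rule of three" (an assumption of this sketch: observing zero fatalities over n kilometers bounds the true rate below 3/n at 95 percent confidence):

```python
# Back-of-envelope version of the argument, using the figures quoted
# above: roughly 1 death per 160 million km of human driving.
human_fatality_rate = 1 / 160e6      # deaths per km

# Rule of three: zero fatalities in n km implies, with 95% confidence,
# that the true rate is below 3 / n. So merely matching the human rate
# requires n = 3 / rate kilometers of flawless driving.
km_needed = 3 / human_fatality_rate

google_km = 2.4e6                    # Google's fleet total since 2009
print(f"{km_needed:.1e} km needed")                    # → 4.8e+08 km needed
print(f"{km_needed / google_km:.0f}x Google's total")  # → 200x Google's total
```

And that half-billion kilometers only matches the fatality rate under ideal conditions; statistically comparing injury rates between fleets demands far more data, which is how RAND's estimate climbs into the hundreds of billions of kilometers.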

Google seems to be aware of these limitations and the safety concerns related to robotic cars. The company’s Chris Urmson recently outlined plans for an incremental roll-out of the cars, with vehicles meant for sunny weather and wide-open roads coming out before models designed for places where it snows and traffic is regularly snarled.

5D Robotics Can Locate You To Within An Inch

GPS falls from the sky and costs nothing to use, but it may not reach a car roving the canyons of Manhattan or a forklift moving boxes in a warehouse. For uninterrupted autonomous driving, you need some backup.

Sure, you can festoon your vehicles with a vast array of overlapping sensors, but even that won’t always give you a clear sense of where you are. So, when the GPS satellites can’t pinpoint you, why not resort to land-based beacons?

That’s the solution proposed by 5D Robotics, a Carlsbad, Calif. company that marks anchor points with transmitters that broadcast in ultra-wide-band frequencies. The UWB signal is very low in power, so you have to be within 200 meters, but that’s okay if you have enough beacons, according to Philip Mann, the company’s vice president for sales.
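The geometry behind beacon-based positioning is straightforward to sketch. Here is a minimal 2D trilateration example; the beacon layout, the ranges, and the least-squares formulation are my own illustration, since 5D Robotics has not published its algorithm:

```python
import numpy as np

# Hypothetical sketch: recover a receiver's position from measured
# ranges to fixed beacons (the idea behind UWB anchor points).
# Coordinates and ranges below are invented; the true position is (30, 40).
beacons = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])  # meters
ranges = np.array([50.0, np.sqrt(6500.0), np.sqrt(4500.0)])   # meters

# Each beacon gives (x - xi)^2 + (y - yi)^2 = ri^2. Subtracting the
# first equation from the others cancels the quadratic terms, leaving
# a linear system we can solve by least squares.
x0, y0 = beacons[0]
r0 = ranges[0]
A, b = [], []
for (xi, yi), ri in zip(beacons[1:], ranges[1:]):
    A.append([2 * (xi - x0), 2 * (yi - y0)])
    b.append(r0**2 - ri**2 + xi**2 + yi**2 - x0**2 - y0**2)

pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
print(pos)  # → [30. 40.]
```

With more than three beacons the same least-squares step averages out range noise, which is why density of beacons, not just their 200-meter reach, determines accuracy.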

Read More

Nvidia to Supply Robocar Brains for Roborace Formula E Series

Nvidia’s chief executive says its self-driving system will be installed in all the cars in the Roborace Formula E series, an all-robotic, all-electric variant of Formula One that’s to begin by early next year.

In a speech on Tuesday at the GPU Technology Conference, in San Jose, Calif., Jen-Hsun Huang said his company’s Drive PX 2 system would be standard in all the cars that the 10 Roborace teams will manage. The hardware, which was unveiled in January at CES, can be held in one hand and can perform 24 trillion operations per second while wrangling data from a dozen cameras as well as radar sets and LIDAR.

Read More

Will Robocars Make You Puke?

When the robocar revolution comes, we’re told, the person formerly known as the driver will curl up with an e-reader or swivel fully around in order to talk with people in the back seat.

That’s the vision epitomized in the Mercedes-Benz F 015 concept, which figures on our list of the Top Ten Tech Cars of 2016. It’s a lounge on wheels, and it’s undeniably cool, in a retro-jet-age way.

But even more retro is the accessory that nobody mentions: the barf bag. Because if you read or face backward while being driven, you may well get carsick and lose your lunch. 

Automakers think about the problem more than they talk about it. “I am working in this area with Ford Europe as well as Valeo [a Paris-based auto supplier],” says Cyriel Diels, an ergonomics specialist at Britain’s Coventry University. “I know that various other manufacturers and suppliers are aware of my papers, including Nissan and Jaguar Land Rover.”

Diels argues that the designers of self-driving cars must begin by considering the comfort of passengers. “Automated vehicles cannot simply be thought of as living rooms, offices, or entertainment venues on wheels,” he argues. 

Attending to the causes of motion sickness is key. Reading and facing backward tend to induce the problem because they create a mismatch between what your eyes are seeing and what your inner ear is feeling. With every unexpected jolt and every wrench to one side during a turn, your brain loses its sense of where it is. That’s when the queasiness starts creeping up on you.

The driver is largely immune to car sickness because he sees what’s happening just when he feels it; no string of surprises for him. That’s why a bad driver who repeatedly accelerates and brakes will feel just fine even as his hapless passengers turn 50 shades of green.

Passive passengers have felt the same sickness at sea and in outer space. In fact, astronauts calibrate space sickness on the informal “Garn scale”; a full garn is held to be the ultimate in misery. It’s named after former U.S. Senator Jake Garn, who, as head of the subcommittee that handled budget requests from NASA, wangled himself a trip on the Space Shuttle in 1985 that he later must have regretted taking. “Barfin’ Jake Garn,” the comic strip Doonesbury called him.  

The inner ear is key to the problem. Tellingly, deaf people do not get motion sickness; blind people can. And quick reflexes are apparently no defense, either: young adults develop motion sickness the most and octogenarians the least. Women get it more often than men (or are more likely to admit to getting it).

A key way of defending against motion sickness is by keeping yourself fully apprised of upcoming lurches in your vehicle. It’s good to look out the window, better to look at the road up ahead, and best of all to be behind the wheel, so that you don’t merely predict the future but take a hand in bringing it about. So, one obvious rule of thumb for robocar designers is to let in a lot of daylight, both from the front and the side of the car, which means arranging interior posts so as to obstruct the view as little as possible.  

In a paper published this month in Applied Ergonomics, Diels and Jelte Bos of the Netherlands Organisation for Applied Scientific Research suggest that additional visual cues could be provided through augmented reality. They cite one study conducted in flight simulators that showed that people had a fourfold reduction in airsickness when they were presented with a projected trajectory of motion.

Reading or playing on a gaming console can also be made easier on a passenger’s stomach. You can simply make the e-reader or other electronic device small in relation to the view of the outside world, particularly when driving conditions involve start-and-stop motion. That way, people can more readily track road conditions with their peripheral vision. You can also use virtual reality displays to create a see-through cockpit, with walls and A-pillars appearing transparent, so people can keep tabs on the outside world no matter where they may be looking.

But a settled stomach comes at a cost. If engineers tune autonomous cars to drive in such a way as to minimize the chance of carsickness, such cars would find it harder to pack themselves more efficiently onto the roads. Maybe road congestion is here to stay—but at least we could entertain ourselves while mired in traffic.


Cars That Think

IEEE Spectrum’s blog about the sensors, software, and systems that are making cars smarter, more entertaining, and ultimately, autonomous.
Contact us:

Senior Editor
Philip E. Ross
New York City
Assistant Editor
Willie D. Jones
New York City
Senior Writer
Evan Ackerman
Berkeley, Calif.
Lucas Laursen
