MEMS Makers Want to Put Even More Sensors in Smartphones for Total Situational Awareness

Our phones are going to know everything about us and our immediate environments


Yeah, yeah, so your phone can sense when you hold it close to your head, it tracks your location and helps you navigate, and it dims automatically when you turn down the lights. But just when you thought your smartphone was, well, pretty smart, it turns out that it has a long way to go. Because it isn’t yet tracking things like air pressure, humidity, and temperature. And it could and should.

That's according to sensor industry leaders attending the MEMS Executive Congress held this month in Napa, Calif. Of course, they would be expected to argue for more sensors in phones—their companies sell sensors, after all—but they did make a pretty compelling case for adding at least a few new ones.

The first (of what turned out to be many) times that I heard the case made for these additional sensors, I was skeptical. Pressure, humidity, and temperature are useful, I thought, only if you're a weatherman. With Siri able to fetch me a weather forecast in an instant, why would I build my own? Actually, Stefan Finkbeiner, CEO of Bosch Sensortec, told me, I would. Or rather, I need my mobile device to build a hyper-local weather report for me, because it could tie that reading of my immediate microclimate to my fitness-tracking program and suggest when I might want to dial down my workout—or push a little harder, based on how I've handled such conditions in the past. It also might look at the temperature and humidity of a room, compare them to a more general weather forecast, connect that to information from a sleep tracker, and let me know when I might want to open a window to sleep better. Of course, that would just be frustrating in a hotel room with sealed windows, but the idea makes sense. (To date, Samsung is the only major manufacturer that has added pressure, humidity, and temperature sensors to a phone, the Galaxy S4. The pressure sensor has gotten a lot of attention from developers of indoor navigation systems.)
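The microclimate-to-fitness idea can be sketched in a few lines. This is purely illustrative: the apparent-temperature formula is a crude placeholder, and the function names and thresholds are invented, not from any real fitness API.

```python
# Hypothetical sketch: phone temperature/humidity readings feeding a
# fitness app's workout suggestion. All names and thresholds are invented.

def apparent_temp_c(temp_c: float, humidity_pct: float) -> float:
    """Crude placeholder for a "feels like" temperature estimate."""
    return temp_c + 0.05 * humidity_pct

def workout_suggestion(temp_c: float, humidity_pct: float,
                       past_tolerance_c: float = 30.0) -> str:
    """Compare today's conditions against how the user has coped before."""
    if apparent_temp_c(temp_c, humidity_pct) > past_tolerance_c:
        return "dial it down"
    return "push a little harder"

# A hot, humid afternoon versus a mild morning:
print(workout_suggestion(28.0, 80.0))  # hot and humid -> ease off
print(workout_suggestion(20.0, 40.0))  # mild -> room to push
```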

Both these scenarios—linking my workouts and sleep quality to ambient conditions—assume I'm always attached to some kind of wearable device. That is actually becoming a reasonable assumption. Indeed, many of my Silicon Valley friends are so attached to their Fitbits and other fitness trackers that they are naming them. If you assume everyone has a smartphone and a wearable, Ivo Stivoric, vice president of research and development at Jawbone, told me, all sorts of things become possible. If a phone is tracking the temperature of the room and a wearable tracks when you fall asleep, the wearable could alert the phone, which in turn could tell your smart thermostat to turn down the heat. In the morning, as you start stirring, your wearable would know you were beginning to wake up, and the smart thermostat could raise the heat in response, rather than going with a pre-programmed setting.
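The wearable-to-phone-to-thermostat chain is really just event-driven control instead of a fixed schedule. Here is a minimal sketch of that idea; the device classes, method names, and temperature setpoints are all invented for illustration.

```python
# Hypothetical sketch of the chain described above: the wearable's sleep
# event reaches the phone, which retargets the thermostat. Everything here
# is invented; no real smart-home API is assumed.

AWAKE_SETPOINT_C = 21.0   # comfortable daytime temperature
ASLEEP_SETPOINT_C = 17.0  # cooler for sleeping

class Thermostat:
    def __init__(self) -> None:
        self.setpoint_c = AWAKE_SETPOINT_C

    def set_target(self, temp_c: float) -> None:
        self.setpoint_c = temp_c

class Phone:
    def __init__(self, thermostat: Thermostat) -> None:
        self.thermostat = thermostat

    def on_sleep_state(self, asleep: bool) -> None:
        # React to the wearable's event rather than a pre-programmed time.
        target = ASLEEP_SETPOINT_C if asleep else AWAKE_SETPOINT_C
        self.thermostat.set_target(target)

thermostat = Thermostat()
phone = Phone(thermostat)
phone.on_sleep_state(asleep=True)   # wearable detects sleep onset
phone.on_sleep_state(asleep=False)  # stirring detected: warm up the room
```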

Besides more sensors and more sensor integration, the other trend spotted at the MEMS Executive Congress is the growing attention being paid to the motion coprocessor, a low-power chip that collects and processes the signals from all the sensors in a mobile device. The iPhone 5S is pioneering the use of a motion coprocessor—in its case, the M7.

The most important thing, said Becky Oh, president and CEO of PNI Sensor Corp., is that a low-power motion coprocessor can be always on—even when the rest of the mobile device is asleep. So it knows things that can help you—like whether you're in a car or walking when you call up a map application.
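That car-versus-walking distinction can come from very cheap signals: how fast you are moving and how much the accelerometer jitters. Below is a toy sketch of the kind of classification an always-on coprocessor might run; the thresholds are made up and a real implementation would be far more robust.

```python
# Hypothetical sketch: guessing motion context from buffered accelerometer
# magnitudes (in g) and speed (m/s). Thresholds are illustrative only.

import statistics

def classify_motion(accel_magnitudes_g: list[float], speed_mps: float) -> str:
    """Crude context guess from accelerometer variance and speed."""
    jitter = statistics.pstdev(accel_magnitudes_g)
    if speed_mps > 8.0 and jitter < 0.1:
        return "driving"     # moving fast with a smooth signal
    if 0.5 < speed_mps < 3.0 and jitter > 0.1:
        return "walking"     # slow, with step-induced shake
    return "stationary"

# Smooth signal at highway speed vs. bouncy signal at a stroll:
print(classify_motion([1.00, 1.01, 1.00, 0.99], 15.0))  # driving
print(classify_motion([1.2, 0.8, 1.3, 0.7], 1.5))       # walking
```

A map application could then default to driving directions or walking directions the moment it opens, without waiting to sample the sensors itself.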

Of course, this information isn’t only interesting to you. It could also be interesting to Google or another company collecting data about how consumers move through a grocery store or shopping mall, where they pause, and where they speed up. Even if this kind of data were gathered anonymously, the concept does make me squirm. I’ll get over that, assured Kevin Shaw, chief technology officer of Sensor Platforms, speaking on a panel at the conference. “Anybody using Gmail?” he asked the attendees. “They are scanning every word you read and write. And we are comfortable with that.”

In the future, Shaw said, we will have a trillion sensors—more than 100 for every person on the planet—and each of us will carry 7 to 10 connected devices. These devices will be acting somewhat autonomously, looking at the endless stream of data from the sensors and deciding when it's worth alerting you to something that is going on.

Flavio Bonomi, founder of Vulcano, says it is when your devices “break their silos” and connect with other mobile devices that it will get really interesting. “Information from multiple phones could be fused into a bigger perspective, like multiple cameras with different angles, giving you a better analysis.”
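One simple version of that fusion idea is combining one noisy reading per phone into a single, better estimate. The sketch below uses inverse-variance weighting, a standard statistical technique; the pressure values and noise figures are invented for illustration.

```python
# Hypothetical sketch of "breaking silos": fusing noisy barometric pressure
# readings from several phones via inverse-variance weighting. The readings
# and variances are invented example data.

def fuse_readings(readings: list[tuple[float, float]]) -> float:
    """readings: (value, variance) pairs; lower variance = more trusted."""
    weights = [1.0 / var for _, var in readings]
    weighted_sum = sum(w * value for (value, _), w in zip(readings, weights))
    return weighted_sum / sum(weights)

# Three phones report pressure in hPa with differing noise levels:
estimate = fuse_readings([(1013.2, 0.04), (1013.6, 0.16), (1012.9, 0.09)])
print(round(estimate, 2))
```

The fused estimate sits between the individual readings but leans toward the least noisy phone, which is the "bigger perspective" Bonomi describes.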

So my takeaway from the MEMS Congress? Our phones are going to know everything about us and our immediate environments, and their sensors will be friends with sensors on other phones, on the walls and in the cars around us. And all these sensors will talk about us behind our backs, but we'll be OK with that.

And maybe I ought to name my Fitbit.
