Beyond Touch: Tomorrow’s Devices Will Use MEMS Ultrasound to Hear Your Gestures

Touch screens are on the way out; piezoelectric gesture control is on the way in

Illustration: Brian Stauffer

Today, we control our electronic world by touch—we tap, we swipe, we pinch and zoom. The touch interface went from being a novelty back in 2007, when Apple first brought it to the iPhone, to a ubiquitous feature in less than a decade. It is so commonplace that toddlers declare a display broken when it doesn’t respond to their fingers. But touch isn’t the end of the story. You can’t use it when you’re dripping wet in the shower, you can’t wear a touch screen on your eyeglasses, and you probably won’t explore virtual reality by swiping and pinching a handheld slab of glass.

Fast-forward to 2020, when, I predict, the touch interface will be used on a few phones and tablets, but not much else. You’ll start your morning run by sweeping your fingers near your fitness band. Tiny ultrasonic transceivers embedded in the band will sense the motion of your fingertips, identify the gesture, and turn on your favorite tunes. After your run, if your cellphone rings while you are showering, you’ll answer the call by sticking your arm out of the shower to pass your palm somewhere above the phone’s display. Later, in the car, navigation alerts and incoming text messages might threaten to distract you, but you’ll dismiss these with a flick of your hand.
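The principle behind those tiny ultrasonic transceivers is pulse-echo ranging: the device emits a brief ultrasonic chirp, listens for the reflection off your hand, and converts the round-trip delay into a distance; tracking how that distance changes over successive pulses reveals the motion that makes up a gesture. As a rough illustration only (the function names and the crude threshold classifier here are hypothetical, not taken from any actual product), the core arithmetic looks like this:

```python
# Hypothetical sketch of pulse-echo ranging, the idea behind ultrasonic
# gesture sensing: emit a pulse, time the echo, convert delay to distance.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C


def echo_to_distance(round_trip_s: float) -> float:
    """Convert a round-trip echo delay (seconds) to target distance (meters).

    The sound travels out and back, so the one-way distance is half
    the total path the pulse covered.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0


def classify_swipe(distances: list[float], threshold: float = 0.01) -> str:
    """Label a sequence of range samples as a crude gesture.

    A steadily shrinking range reads as an approaching hand, a growing
    one as a retreating hand. Real devices fuse echoes from many
    transceivers and use far more robust classifiers than this.
    """
    delta = distances[-1] - distances[0]
    if delta < -threshold:
        return "approach"
    if delta > threshold:
        return "retreat"
    return "none"


# A 100-microsecond round trip puts the fingertip about 1.7 cm away.
fingertip_range = echo_to_distance(100e-6)
```

Because sound covers only about 1.7 centimeters in that 100-microsecond round trip, even modest microsecond-level timing resolves fingertip motion at the millimeter scale—which is what makes in-air gesture recognition practical on a wrist-size device.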


