I’m wearing a first-generation Fitbit Flex. It’s been a while since it’s been cool.
Sitting near me in a Stanford University conference room last month was someone wearing the latest Apple Watch. It seemed like the latest in wearable tech when the Wearable Tech + Digital Health + Neurotech Conference started—not so much a few hours later.
That’s because the advances in hardware and software discussed by researchers and entrepreneurs on the stage are already, at minimum, laboratory prototypes. And their developers expect that these devices will soon migrate from the labs out onto people’s bodies, relegating simple step-counters and heart-rate sensors to a retrospective display at the Computer History Museum or IEEE Spectrum’s own Consumer Electronics Hall of Fame.
None of these new wearables is guaranteed to succeed in the marketplace. But the technology has clearly evolved rapidly. Shahin Farshchi, a partner at Lux Capital, said: “It feels like neurotech now is where PCs were in the 70s; a lot of tech was there but the applications weren’t clear.”
Though nobody knows exactly which new applications of wearables and neurotech are going to be hits, people have plenty of ideas. Here’s a sampling of the sorts of gadgets being tested today, and what they might look like if and when you’re able to order them on Amazon:
Mind-reading smart glasses. Julia Brown, CEO of MindX, said that her company plans to produce glasses that let you “access information with a single thought.” The company is using technology licensed from the Johns Hopkins Applied Physics Laboratory to pick up signals from eye movement and from brain waves, to know where you are looking and what you are thinking when you look there. Brown is excited about the possibilities of visual search, for example. No word on when the first devices might come to market; the company is still in the development stage and is currently looking to hire a brain-computer interface software engineer who is a “full stack neuro nerd” and a neural data scientist who is an “expert in extracting meaning from bio signals,” along with more traditional software engineers.
Sweat-sensing glasses. Google Glass may have gone down in consumer electronics history as a fail, but developers haven’t given up on the glasses form factor. Joseph Wang, director of the Center for Wearable Sensors at the University of California, San Diego, told conference attendees that for health-centric wearables it’s important to monitor chemical changes in the body, not just activity levels and vital signs. His roundup of the applications of the printable electrochemical sensor arrays being developed by his team included chemical-sensing smart glasses. These would have disposable nose-piece pads that collect and analyze sweat. Wang also pointed to the possibilities of slightly invasive—but painless—wearables, like flexible stickers full of microneedle sensor arrays that can track chemical changes just under the skin.
Goggles that modulate brain waves to eliminate chronic pain. Richard Hanbury, founder of Sana, says his 20-plus-year effort to create and commercialize a technology to treat chronic pain without drugs has finally led to a real product, arriving on the market later this year. The gadget looks like an eyeshade. It produces patterns of lights and sounds while monitoring brain waves in what the company says is a closed-loop biofeedback system. Hanbury said that, worn for 10 minutes a couple of times a day, the device trains the brain to reduce or eliminate the pain.
Hanbury announced that clinical studies are complete and that he expects FDA certification in the very near future. “We get emails every day asking if they can just buy this as a consumer device for pain,” Hanbury says, “but in the consumer space anybody can say anything they want to. You can’t compete like that. But you can compete if you get clinical validation first.”
A gadget that lets you listen to your body—literally. Startup Data Garden started out building devices that let people listen to their plants. The company’s newest project, currently in beta testing, involves an algorithm that translates someone’s heartbeat, as tracked by the user’s wearable of choice, into music. To create the musical track, each heartbeat triggers the choice of a note, while other algorithms choose the appropriate instruments.
This isn’t just a party trick, said Jon Shapiro, Data Garden’s chief product officer. For example, it could provide useful information to people using apps that guide meditation or pace running, without requiring them to look at a screen. In the future, Shapiro said, the software will be able to scrape a user’s Spotify history to personalize the timbre of the music.
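Data Garden hasn’t published its algorithm, but the beat-triggers-a-note idea can be sketched roughly like this. Everything here is an illustrative assumption—the pentatonic scale, the MIDI note numbers, and the heart-rate weighting are mine, not the company’s:

```python
import random

# Hypothetical sketch of a heartbeat-to-music mapper. Assumption: notes
# come from a pentatonic scale, and faster heartbeats bias the pick
# toward higher notes. A separate layer would choose instruments.

PENTATONIC = [60, 62, 64, 67, 69, 72]  # MIDI notes, C major pentatonic

def note_for_beat(inter_beat_interval_s, rng=random):
    """Pick one MIDI note for one detected heartbeat."""
    bpm = 60.0 / inter_beat_interval_s
    # Normalize a typical 40-180 bpm range into [0, 1].
    t = min(max((bpm - 40.0) / 140.0, 0.0), 1.0)
    # Center the pick on a scale position proportional to heart rate,
    # with a little randomness so the melody isn't a flat drone.
    center = t * (len(PENTATONIC) - 1)
    idx = int(round(center + rng.uniform(-1, 1)))
    idx = min(max(idx, 0), len(PENTATONIC) - 1)
    return PENTATONIC[idx]

# One note per beat; intervals are seconds between successive beats.
beats = [0.85, 0.84, 0.80, 0.78]
melody = [note_for_beat(b) for b in beats]
print(melody)  # output varies with the random jitter
```

The key design point is that the beat supplies the rhythm for free; the algorithm only has to decide pitch (and, in a fuller version, instrumentation).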
And a gadget that lets you listen with your body instead of just your ears. David Eagleman has been working for several years with technology that translates sound into patterns of vibration that can be felt instead of heard. He announced that his team has shrunk the device from a vest that covered most of the chest and back to a wristband called the Buzz. Eagleman, a Stanford adjunct professor and co-founder of NeoSensory, says the gadget has been tested on 605 people who are deaf; they were able to learn its language of vibrations simply by wearing it regularly. It will soon roll out to schools that serve the deaf community. He says the gadget can also allow people to “listen” to signals that have no sound, like changes in the stock market or trends in information flowing across the Internet.
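One common way to build this kind of sound-to-touch translation—and this is a generic sketch, not NeoSensory’s actual design—is to split incoming audio into frequency bands and drive one vibration motor per band. The band edges, motor count, and sample rate below are all illustrative assumptions:

```python
import math

# Hedged sketch: split an audio frame into frequency bands and map each
# band's energy to the intensity of one vibration motor. Four bands and
# these edges are arbitrary choices for illustration.

SAMPLE_RATE = 8000  # Hz
BANDS = [(100, 400), (400, 900), (900, 1800), (1800, 3600)]  # one per motor

def band_energies(frame):
    """Return one spectral-energy value per band via a naive DFT."""
    n = len(frame)
    energies = []
    for lo, hi in BANDS:
        e = 0.0
        for k in range(int(lo * n / SAMPLE_RATE), int(hi * n / SAMPLE_RATE)):
            re = sum(frame[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = sum(frame[t] * math.sin(2 * math.pi * k * t / n)
                     for t in range(n))
            e += re * re + im * im
        energies.append(e)
    return energies

def motor_levels(frame):
    """Map band energies to 0-255 vibration intensities (max-normalized)."""
    e = band_energies(frame)
    peak = max(e) or 1.0
    return [int(255 * x / peak) for x in e]

# A 500 Hz test tone falls in the second band, so the second motor
# should vibrate most strongly.
frame = [math.sin(2 * math.pi * 500 * t / SAMPLE_RATE) for t in range(256)]
print(motor_levels(frame))
```

In a real device this loop would run continuously on short frames, so the skin feels a moving pattern of buzzes that tracks the sound’s changing spectrum—which is what wearers then learn to interpret.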
Messages from your future self. Walter Greenleaf, a behavioral neuroscientist at Stanford, has been working on software that uses information from wearables and other sources about how you are treating your body to produce an aged image of your future self. In testing on Stanford students, he discovered that seeing an image of themselves aged by as few as five years can make them highly motivated to take better care of themselves in the present.