In early October, I showed up at an old firehouse on Staten Island for a glimpse into the future of virtual reality. That future depends largely on haptics. Now that we can use VR headsets to transport ourselves to another world, the thinking goes, we need systems to recreate sensations to bring those virtual experiences to life.
I went to Staten Island to meet up with a little-known company that fancies itself “the leader of realistic haptic feedback.” The company—now called HaptX—had promised to let me try out a prototype of its very first product.
I was looking forward to it, because I’d seen a spectacular demo by the same company, then named AxonVR, at CES 2017. That demo consisted of putting on an HTC Vive and sticking my hand into a large metal box to experience the thrill of feeling a tiny virtual deer lie down in my palm.
The technology was bulky and awkward back then, but the results were “absolutely magical,” as my coworker Evan Ackerman wrote at the time. When we left CES, the company promised more announcements later in the year. In September, they said they were about to make a big one.
At the Staten Island firehouse (now an Airbnb the company had rented), the HaptX team showed me a prototype of the HaptX Glove, officially announced today, which will ship in 2018. It looks and feels like a big black ski glove, except it has plastic clips on the fingertips and is connected by a very thick black cord to a slick, glowing box (which the team says is 26 times smaller than the box I stuck my hand into at CES).
HaptX hopes to sell its HaptX Glove to companies that want to give employees a more realistic environment in which to train, practice making repairs, or test a new product. Photo: HaptX
HaptX plans to sell its haptic glove to businesses rather than as a consumer product. The company expects the glove to be used for three main purposes: employee training (in healthcare, defense, and retail); design and manufacturing (primarily within the auto industry); and location-based entertainment (such as in VR arcades or at theme parks). “More than a dozen” companies, including several Fortune 500 companies, have already signed agreements to test it.
Before the demo, I spoke with Andrew Mitrak, director of marketing, Greg Bilsland, senior communications manager, and Kurt Sjoberg, a biomedical engineer, about the company’s plans for its first product. With the announcement, the company also changed its name from AxonVR to HaptX. The Seattle-based startup has raised $9 million and has 32 employees.
An illustration from a HaptX patent shows a pattern of microfluidic channels that the company has incorporated into its HaptX Glove. Photo: HaptX
The glove combines tactile feedback with force feedback—two key components of the sensation we recognize as touch. When you touch a piece of glass, for example, tactile feedback comes partly from the way the glass displaces the surface of your skin; force feedback comes from the resistance you feel when you grip it. Together, tactile and force feedback systems can give you a sense of the “shape and rigidity of objects,” Mitrak says.
To produce this, HaptX engineers invented a special textile laced with microfluidic channels. Those channels fill with air to operate pneumatic actuators, which cause the glove to press against the wearer’s skin in very specific spots. Tendon-like structures along the glove’s exterior reflect the positions of actual tendons in the wearer’s hand, and provide varying degrees of resistance for force feedback. The glove also has a “proprietary motion-tracking solution,” as Mitrak puts it, that pinpoints the positions of a wearer’s fingertips with sub-millimeter accuracy.
These technologies create a far more sophisticated sense of touch than other haptics products, Mitrak says: “So much of haptics today is vibration, but touch is so much more than vibration.”
It was time to try it out. Sjoberg took just one measurement of my hand—from my palm to the tip of my middle finger—to calibrate the system. He said this calibration uses population data taken from a wide variety of sources, including some collected outside of the U.S.
I slipped on the glove (which Sjoberg said is “one size fits most”) and pulled an HTC Vive over my eyes. Once Sjoberg started the demo, I was transported to a cartoonish farm scene. He told me to interact with any objects I saw—so I picked up rocks, moved my hand over a field of wheat, and reached up to feel the curvature of the moon. Each of these actions produced a sensation that was unique and appropriate—it actually felt as though heads of wheat were brushing across my fingertips.
Somewhat reluctantly, I allowed a brown-and-white spider named Gertrude, which I remembered from the CES demo, to crawl into my palm, and felt the uneasy prickle of her many legs. The deer was nowhere to be seen, but a friendly fox showed up and hopped into my hand, which was just as delightful.
Next, I tried to pluck a sunflower from its stem, and it worked—but it felt strangely weightless, like perhaps the sensation was too light for me to register. Part of the challenge in building a high-fidelity haptic system is to produce the appropriate amount of feedback to allow its user to experience the difference between, for example, holding a cloud and holding a sunflower. To me, the demo at times failed to make a clear enough distinction between lightweight objects.
Another challenge appeared to be that sensations focused on the fingertips and the palm registered more clearly, whereas sensations that spanned the entire finger were conveyed less distinctly. For example, I felt the sunflower’s stem at the tips of my fingers and in the center of my palm, but lost my sense of its rigidity in the space between.
At one point, it started to rain on the virtual farm, so I held my hand under a cloud to catch raindrops. It turned out to be one of the most pleasing parts of the experience—I felt big, heavy droplets press against my palm (without getting wet).
The version of this technology that I tried at CES also included an ice cube and a fire-breathing dragon that produced very realistic sensations of cold and heat. But Mitrak says the companies they are now targeting don’t need thermal changes as much as they need tactile and force feedback. Removing those features—and whatever mysterious technology was powering them—helped HaptX reduce the size of its system.
Without a doubt, this startup has achieved something impressive in shrinking its bulky hardware into a portable haptics system that will soon be available for commercial applications. Along the way, the team has made tough decisions about which features to keep and which to cut. Most importantly, they’ve managed to render realistic haptic sensations suitable for the purposes they’ve described—training, simulation, and location-based entertainment.
The technology still isn’t perfect, and it’s certainly not immersive. Even now, it’s a very long way from the dream that some people harbor of losing oneself in a virtual world (I, admittedly, have never been one of those people).
But HaptX knows this about its own technology, and the parts of the demo that proved underwhelming shouldn’t prevent the company from achieving its commercial goals in 2018. It has wisely backed away from promising immersive VR and settled on finding practical uses for its technology across various enterprises.
As a training tool, the HaptX Glove is perfectly sufficient. It will allow wearers to get a feel for the tasks at hand, building muscle memory and completing repetitions in a realistic virtual environment.
Meanwhile, HaptX hasn’t given up on the ultimate vision of immersive VR, which is what originally inspired Jake Rubin and Robert Crockett to start the company. In fact, the company is still working on a “full-body haptic platform.”
“The point at which virtual reality experiences are indistinguishable from real life—that’s still our goal,” Mitrak says.