I’ve tried out a number of virtual reality and augmented reality systems over the years that let me interact with a computer-generated world or overlay images on the real world. These typically involved putting on opaque glasses that replaced lenses with video screens; any real world images came from a camera and video feed. Because I’m susceptible to motion sickness, I couldn’t wear these for more than five minutes or so without feeling seriously queasy. And I couldn’t visualize walking down the street in them, anyway, although some researchers certainly did.
They did find use in industry, helping mechanics view complex wiring diagrams, for example, while looking at what they were trying to repair. And augmented reality itself, without the glasses, is booming. Thanks to smartphones with built-in cameras and GPS receivers, it's not hard to build an app these days that overlays computer-generated information on a live video image. Apps today let you identify constellations when you point your phone at the sky or help you find your car in a crowded parking lot.
Still, staring down at a small screen while you walk around trying to point the phone's camera in the right direction isn't likely the vision wearable augmented reality researcher Tom Caudell had in mind when he coined the term back in 1992. For me, though, it was a lot easier to stomach than any glasses I'd tried.
Until last month. I sat down with a researcher from France, on his first visit to Silicon Valley, courtesy of the French Tech Tour, sponsored by UbiFrance, a French government agency. Zile Liu, an optoelectronics engineer, is one of the founders of Laster Technologies, based in Gif-sur-Yvette, France. This five-year-old company makes eyeglasses that connect to PCs or mobile devices to create what the company calls a "visual walkman." The company has industrial prototypes being piloted in France this year, and expects to bring a consumer version to market within two years.
The Laster glasses are not opaque—they have clear glass lenses, looking like ordinary glasses with the exception of a thicker-than-normal earpiece. The earpiece houses a microdisplay projection system that sends the computer-generated image out onto the glass lens. A camera near the eye captures the real-world view and sends it back to the computer for analysis.
Sitting in a café overlooking Stanford University's Rodin Sculpture Garden, an appropriate place to meet with a French researcher, I tried on the glasses. First, I looked at an industrial application: an overlay of data. I found I could easily ignore the text and see the real world outside, or concentrate on the text and read it with little difficulty. More fun was playing with a little Tinkerbell-sized dancer, tossing her from hand to hand. (I have a feeling the folks at the next table were seriously starting to wonder about me at this point.) Key for me: I didn't get that queasy feeling I've gotten trying video display glasses in the past.
As cell phone use increased, hands-free headsets became ubiquitous. As augmented reality apps proliferate, it's likely that we'll be looking for a hands-free way to use these as well, which would be good news for little Laster. And perhaps soon nobody will know if I'm wearing glasses because I have bad eyes, or because I'm just an early adopter.
Photo: Laster Technologies
Tekla S. Perry is a senior editor at IEEE Spectrum. Based in Palo Alto, Calif., she's been covering the people, companies, and technology that make Silicon Valley a special place for more than 30 years. An IEEE member, she holds a bachelor's degree in journalism from Michigan State University.