Consumer Electronics

This Is the Year for Apple’s AR Glasses—Maybe

With head-up displays, cameras, inertial sensors, and lidar on board, Apple’s long-expected augmented-reality glasses could be a game-changer


Photo-illustration of a man with a shining Apple icon. Photo-illustration: Edmon de Haro

Apple didn't invent the portable music player, although I challenge you to name one of the approximately 50 digital-music gadgets that preceded the iPod. Apple didn't invent the smartphone either—it just produced the first one that made people line up overnight to buy it.

And Apple isn't first out of the gate with augmented-reality (AR) glasses, which use built-in sensors, processors, and displays to overlay information on the world as you look at it. Google introduced its Glass in 2013, but it generated more controversy and criticism than revenues. More recently, Magic Leap promised floating elephants and delivered file sharing. And Epson has been quietly selling its Moverio AR glasses for niche applications like closed captioning for theatergoers and video monitoring for drone pilots, while steering clear of the consumer market. The point is, although they were pioneering, none of these efforts managed to put augmented reality into comfortable, useful, affordable glasses that appealed to an ordinary person.

And now comes Apple. For years, Apple has been filing patents for AR and virtual-reality (VR) technology, acquiring related startups, and hiring AR experts from the Jet Propulsion Laboratory, Magic Leap, Oculus, and others. The company has been tilling this soil for quite a while, and speculation has for years been intense about when all this cultivation would bear fruit. Though Apple has carefully shrouded its AR efforts since their origins around 2015, a few signs, such as a declaration from a legendary Apple leaker, suggest that an unveiling could come as soon as March of this year.

It's a giant project for Apple. Some analysts suggest it could give the company a jump on a market that could swell from US $7.6 billion to $29.5 billion over the next five years. Published reports indicate that Apple has around 1,000 people working on the effort. And now, after working on various designs for years, those engineers have likely made dozens and dozens of prototypes, according to Benedict Evans, an analyst who also produces an influential newsletter on technology. Before long, we'll find out whether Apple can do for AR glasses what it did for portable music players, smartphones, and smartwatches.

“It's the threshold moment that all of the AR community have been waiting for,” says David Rose, a researcher in the MIT Media Lab and former CEO of Ambient Devices. “AR glasses hold so much promise for learning, and navigating, and simply getting someone to see through your eyes. The uses are mind-blowing…. You could see a city through the eyes of an architect or an urban planner; find out about the history of a place; how something was made; or how the landscape you are seeing could be made more sustainable in the future.”

Rumors of a 2021 launch flared up last May, when Jon Prosser, who hosts the YouTube channel Front Page Tech and has made a career out of reporting leaks from Apple and others, said that an announcement of what he expected to be called Apple Glass would likely come at a March 2021 event. Prosser predicted displays for both eyes, a gesture-control system, and a $500 price point. Other pundits have chimed in with different release dates and specifications. But 2021 remains the popular favorite, at least for an unveiling.

Early Entries: Microsoft (top), Magic Leap (middle), and Epson (bottom) are already selling AR glasses in niche markets, but none of the products have broad appeal. Photos, from top: Daniel Reinhardt/picture alliance/Getty Images; Chesnot/Getty Images; Toshifumi Kitamura/AFP/Getty Images

What technology will be packed inside Apple's first generation of AR glasses? That depends on the experience Apple has chosen to provide, and there are two main possibilities. One is simply displaying information about what's in front of the wearer as text or icons in a corner of the visual field, effectively fixed to the glasses themselves: the text stays put as you swivel your head. The alternative is placing data or graphics so that they appear attached to, or overlaid upon, objects and people in the environment. With this setup, turning your head carries the data out of view along with the objects it belongs to, and new data appears for whatever swings into your field of view. The second scheme is much harder to pull off, but it's more in line with what people expect when they think about AR.
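The distinction already shows up in Apple's shipping frameworks for the iPhone and iPad, and a rough sketch makes it concrete. The code below is purely illustrative: it uses today's ARKit and RealityKit on a phone, not anything known about the glasses, and the class name, labels, and text are invented for the example. A head-locked readout is just a 2-D view drawn over the camera feed, while a world-locked label hangs off an anchor that ARKit keeps registered to a spot in the room.

```swift
import UIKit
import ARKit
import RealityKit

// Hypothetical demo: contrasts a head-locked overlay (a plain UIKit label
// pinned to the screen) with a world-locked label (an entity attached to
// an anchor that ARKit tracks in space).
final class OverlayDemoViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // 1. Head-locked ("attached to the glasses"): a 2-D view drawn on
        //    top of the camera feed. It never moves when the device turns.
        let hudLabel = UILabel(frame: CGRect(x: 16, y: 40, width: 260, height: 24))
        hudLabel.text = "Battery 87% · 3 notifications"
        hudLabel.textColor = .white
        view.addSubview(hudLabel)

        // 2. World-locked ("attached to the environment"): an anchor placed
        //    2 meters in front of the camera's starting pose. ARKit keeps it
        //    registered to that spot, so it slides out of view as you turn.
        var transform = matrix_identity_float4x4
        transform.columns.3.z = -2.0
        let worldAnchor = AnchorEntity(world: transform)
        let textMesh = MeshResource.generateText("Built in 1912",
                                                 extrusionDepth: 0.01,
                                                 font: .systemFont(ofSize: 0.1))
        let textEntity = ModelEntity(mesh: textMesh,
                                     materials: [SimpleMaterial(color: .white,
                                                                isMetallic: false)])
        worldAnchor.addChild(textEntity)
        arView.scene.addAnchor(worldAnchor)
    }
}
```

On hypothetical glasses the same split would apply: head-locked content is cheap to render and always legible, while world-locked content demands continuous tracking of where the wearer is and what is around them.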

Evans is betting on that second approach. “If they were just going to do a head-up display, they could have done it already for $100,” he points out.

Evans won't guess whether Apple will launch AR glasses in 2021 or later, but when it does, he says, the product won't be a prototype or an experiment aimed at a niche market, like Magic Leap or HoloLens. “Apple sells things that they think have a reason for a normal person to buy. It will be a consumer product and have a mass-market price. There will be stuff to develop further, but it won't be $2,000 and weigh 3 kilos.”

Evans expects the first version will include eye tracking, so the glasses can tell what part of the broader field of view is attracting the user's attention, along with inertial sensors to monitor head motion. Head gestures may well be part of the interface, and the glasses will likely have a lidar sensor on board, enabling them to create a depth map of the wearer's surroundings. In fact, Apple's top-of-the-line tablet and phone, the iPad Pro and iPhone 12 Pro, already incorporate lidar for tracking motion and calculating distances to objects in a scene. “It's pretty obvious,” Evans says, “that lidar in the iPad is a building block” for the glasses.
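That building block is already in developers' hands. The sketch below, again purely illustrative (the DepthReader class is invented for the example), shows how ARKit on a lidar-equipped iPad Pro or iPhone 12 Pro delivers a per-pixel depth map of the scene on every frame; glasses would need something comparable to pin graphics to real surfaces.

```swift
import Foundation
import CoreVideo
import ARKit

// Reads ARKit's lidar-derived scene depth on a supported device.
// Illustrative only; it says nothing about how unannounced glasses
// would expose this data.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is offered only on lidar-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("No lidar on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depthMap is a buffer of 32-bit floats, one distance (in meters)
        // per pixel; confidenceMap rates how reliable each value is.
        let buffer = depth.depthMap
        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        print("Depth map: \(width) x \(height) pixels")
    }
}
```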

One big question about the glasses' display, Evans says, is whether it will take a new approach to presenting an image that can be visible in daylight. The most common approach to date has been using a microLED to project the image onto the glass; in daylight conditions this approach requires that the added-in graphics be limited to the brightest of colors. Recent rumors suggest that Apple will use Sony's OLED microdisplay as a source for the projected image. But although the luminance of OLED displays is impressive, MIT's Rose says, rendering a full spectrum of color in daylight will still be challenging.
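A simple way to see why daylight is the hard case: under a basic additive model of see-through optics (the symbols here are generic, not Apple's numbers), the contrast of the projected graphics against the real scene is

\[
C \;=\; \frac{L_{\text{graphics}} + T\,L_{\text{scene}}}{T\,L_{\text{scene}}} \;=\; 1 + \frac{L_{\text{graphics}}}{T\,L_{\text{scene}}},
\]

where \(L_{\text{graphics}}\) is the luminance the microdisplay delivers to the eye, \(L_{\text{scene}}\) is the luminance of the world behind the lens, and \(T\) is the fraction of that light the combiner lets through. A sunlit scene can easily exceed several thousand candelas per square meter, so unless the display is extraordinarily bright, C stays close to 1 and the overlay washes out, which is why today's systems fall back on the brightest colors.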

The glasses will contain one or two visible-light cameras to collect images of people and places for analysis. The main function of that camera won't be to record video, because the backlash against Google Glass made that function pretty much a nonstarter. Rather, its purpose will be to let the software recognize what the wearer is looking at so it can supply the relevant contextual information.

Patent Pileup: Apple's many patent filings show years of effort in developing the technological building blocks for augmented-reality glasses. Images: Apple/U.S. Patent and Trademark Office

“Apple will try hard not to use words like 'video camera,'” says Rose. “Rather, they will call it, say, a 'full-spectrum sensor,'” he adds. “Lifelogging as a use case has become pretty abhorrent to our society.” If an option to store video clips does exist, Apple will likely design the glasses to prominently warn observers exactly when video or still images are being recorded, Rose believes.

The data processing, at least for this first generation of glasses, is widely expected to take place on the user's phone. Otherwise, says Rose, “the battery requirements will be too high." And off-board processing means the designers don't have to worry about the problem of heat dissipation just yet.

What will Apple call the gadget? Prosser is saying “Glass”; others say anything but, given that Google Glass became the subject of many jokes.

Whether or not Apple will ship AR glasses in 2021—and whether or not the product will be successful—comes down to one question, says analyst Evans. “Whose job at Apple is it to look at this and say 'This is sh-t' or 'This is not sh-t'? In the past it was Steve Jobs. Then it was Jonathan Ive. Who now will look at version 86 or version 118 and say, 'Yes, this is great now. This is it!'?”

This article appears in the January 2021 print issue as “Look Out for Apple's AR Glasses.”