Wow, Is Apple’s Vision Pro Loaded With Pixels

Cupertino’s pace-setter AR/VR display leaves the metaverse in the metadust

The Apple Vision Pro headset turned towards the user. A display inside the headset shows two mirrored images of a mountain landscape.

It may look like a pair of nerdy ski goggles, but don’t scoff at the specs: Apple’s Vision Pro packs an impressive 23 million pixels—nearly three times as many as you’d find in a 4K display.


Apple announced the Vision Pro, its first augmented reality (AR) headset, at its 5 June WWDC keynote. The headset contains two leading-edge micro-OLED displays packing an incredible 23 million pixels—nearly three times as many as in a 4K display.

AR/VR display quality has improved in leaps and bounds since the original Oculus Rift, which had a single 1,280 × 800 display. But the need to balance price with quality has forced AR/VR leaders, such as Meta and HTC Corp., to compromise. Apple opts for a different strategy, pushing the device’s display technology, and the headset’s US $3,499 manufacturer’s suggested retail price, to the extreme.

“[Apple] ends up having three displays on this headset, which are not cheap by any measure,” says Anshel Sag, principal analyst at Moor Insights & Strategy. “The two micro-OLEDs in the inside, I’ve never seen implemented in anything. This is an extremely low volume, high-cost display with a three-element lens and optical system, which was obviously custom designed for this headset.”

A new frontier for pixel density

A grid of micro-OLED pixels magnified so that each pixel’s red, blue, and green subpixels are defined. The Vision Pro has a pair of 1.41-inch micro-OLED displays. The size of each pixel is comparable to the size of a human red blood cell. Apple

Apple offered few specifics about the Vision Pro’s display at WWDC 2023, aside from its pixel count, but analysts have filled in the details. Ross Young, CEO of Display Supply Chain Consultants, says the Vision Pro packs a pair of 1.41-inch micro-OLED displays that combine an OLED frontplane from Sony with a silicon backplane from chip foundry TSMC. Each pixel measures a mere 7.5 micrometers in size—similar to the diameter of a human red blood cell. “It is by far the highest resolution micro-OLED on the market,” says Young.

The resolution, though impressive, doesn’t tell the whole story. AR/VR engineers measure pixel density by the number of pixels per degree of vision (PPD), a metric that accounts for a headset’s field of view and is counted in one dimension: horizontal pixels per eye divided by horizontal field of view. Apple hasn’t published the Vision Pro’s FOV, but journalists who’ve tried the device say it’s competitive with other AR/VR headsets, which offer a FOV between 100 and 120 degrees. With 23 million pixels split across two displays—roughly 3,400 to 3,700 pixels per row per eye—that should place the headset somewhere around 30 to 37 PPD.
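Because PPD is a one-dimensional measure, the back-of-the-envelope arithmetic is short. The per-eye horizontal resolutions below are assumptions inferred from the 23-million-pixel total, since Apple hasn’t published exact panel dimensions:

```python
# Rough pixels-per-degree (PPD) estimate for an AR/VR headset.
# PPD is conventionally counted in one dimension: horizontal
# pixels per eye divided by horizontal field of view in degrees.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """One-dimensional pixel density across the field of view."""
    return horizontal_pixels / fov_degrees

# Assumed per-eye horizontal resolution: 23 million total pixels
# split across two displays suggests roughly 3,400-3,700 pixels
# per row (the exact panel dimensions are unpublished).
for h_pixels in (3400, 3700):
    for fov in (100, 120):
        ppd = pixels_per_degree(h_pixels, fov)
        print(f"{h_pixels} px over {fov} degrees: {ppd:.0f} PPD")
```

Across those assumed parameters, the estimate lands in the high-20s to high-30s PPD, depending mostly on the true field of view.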

“The resolution of the fovea, the highest resolution portion of the eye, is considered to be 60 pixels per degree. And if you have a display like 60 pixels per degree, probably like 99.9 percent of people wouldn’t perceive the pixels,” says Michael Miller, the augmented reality hardware lead at Niantic. “Reducing it to 40 pixels per degree, instead of 60, is still considered to be OK. And for [Niantic’s] reference design, we tried 30 pixels per degree, and so long as users don’t see the gap between the pixels, it was still fine.”


Most AR/VR headsets make do with a much lower pixel density. Meta’s Quest Pro delivers 22 PPD, the HTC Vive XR Elite provides 19 PPD, and the Microsoft HoloLens 2 quotes 47 PPD (though some dispute that figure). The widely praised Varjo VR-3, which was released in 2021 and retails for $3,645, achieves 70 PPD, but does so with a “bionic display” that combines a high-resolution focus area with peripheral displays that achieve a lower, though still impressive, 30 PPD.

“I feel like we’ve got some headsets already that almost got to that point, or already crossed it a little bit, like the Varjo headsets,” says Sag. “So I think [Apple has] gone high resolution enough that you’re not going to be able to see the pixels.”

Apple Silicon and foveated rendering share the load

Apple’s incredible pixel count presents challenges beyond producing the display itself. Any increase in pixel count increases the graphics performance required to render an image, which in turn increases power consumption and heat generation.
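A rough pixel-throughput comparison puts that rendering load in perspective. The refresh rates below are illustrative assumptions, not published specifications:

```python
# Rough pixel-throughput comparison: total pixels rendered per second.
# The refresh rates are illustrative assumptions, not published specs.

def gigapixels_per_second(pixel_count: int, refresh_hz: int) -> float:
    """Total pixels the GPU must fill per second, in billions."""
    return pixel_count * refresh_hz / 1e9

vision_pro = gigapixels_per_second(23_000_000, 90)    # assumed 90 Hz refresh
uhd_monitor = gigapixels_per_second(3840 * 2160, 60)  # a 4K monitor at 60 Hz

print(f"Vision Pro (assumed 90 Hz): {vision_pro:.2f} Gpx/s")
print(f"4K monitor at 60 Hz:        {uhd_monitor:.2f} Gpx/s")
print(f"Ratio: {vision_pro / uhd_monitor:.1f}x")
```

Under those assumptions, the headset must fill roughly four times as many pixels per second as a 4K monitor—before accounting for stereo geometry, lens-distortion correction, or pass-through compositing.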

Every Vision Pro headset will ship with Apple’s M2, which includes CPU, GPU, and AI accelerators on a single chip, to drive its displays and handle compute. Yet even the M2, a chip most frequently found in mid-range Apple laptops such as the MacBook Air, would be sorely taxed if forced to drive the Vision Pro’s displays at full resolution.

The Varjo VR-3 makes this clear. Unlike the Apple Vision Pro, which is an entirely self-contained computer, the VR-3 doesn’t include a CPU or GPU, instead functioning as a display for a Windows PC. Varjo recommends a range of bulky high-end laptops and desktops with Intel Core i7 and i9 processors and Nvidia RTX 3080 or 3090 graphics.

The Apple Vision Pro headset photographed from the side on a white background. The entire headset, including the battery pack, is contained in the photograph. Apple’s Vision Pro uses two on-board Apple Silicon chips to handle compute and graphics. Apple

Apple tackles this problem with an all-new chip, the Apple R1, which takes on the burden of processing input from the headset’s cameras. A portion of that data is used for foveated rendering, a technique that dynamically varies resolution in response to the user’s gaze.

“I think foveated rendering is a huge, huge factor in how much of the GPU is actually being utilized, and how much of the display they need to run at full resolution,” says Sag. “They could theoretically run the display with foveated rendering, where they’re only running full resolution in the middle, and everything else is a lower resolution.”
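A toy calculation shows why this matters. The foveal-region size and peripheral downscale factor below are illustrative assumptions, not Apple’s actual parameters:

```python
# Toy estimate of the shading savings from foveated rendering: render a
# small gaze-centered region at full resolution and the rest of the frame
# at reduced resolution. All parameters here are illustrative assumptions.

def foveated_shading_fraction(foveal_area_fraction: float,
                              peripheral_scale: float) -> float:
    """Fraction of full-resolution shading work actually performed.

    foveal_area_fraction: share of the frame rendered at full resolution.
    peripheral_scale: linear resolution scale for the periphery
                      (0.5 means half resolution in each dimension,
                      i.e. one quarter of the pixels shaded).
    """
    peripheral = (1.0 - foveal_area_fraction) * peripheral_scale ** 2
    return foveal_area_fraction + peripheral

# Assume 20 percent of the frame is foveal and the periphery is shaded
# at half resolution per axis:
work = foveated_shading_fraction(0.20, 0.5)
print(f"Shading work vs. naive full-resolution rendering: {work:.0%}")
```

Even with these conservative assumptions, the GPU shades well under half the pixels a naive full-resolution renderer would—savings that grow as the foveal region shrinks or the periphery is downscaled further.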

Foveated rendering is not a new idea—researchers at Microsoft explored the technique more than a decade ago—but it has only recently appeared in consumer headsets like the Meta Quest Pro and Sony’s PlayStation VR 2. Its rarity is due to the difficulty of tracking a user’s eyes to determine where their attention is focused. Apple purchased SensoMotoric Instruments (SMI), a leading provider of eye-tracking solutions, in 2017, forcing others to lean on newer alternatives or devise in-house solutions.

Pixel density is a must-have for augmented reality

The display technology in Apple’s Vision Pro could be overkill for enthusiasts who prefer VR over AR. “If we dig into the screen tech itself, the best way to summarize it is that, for many use cases, it doesn’t matter,” says Jeremy Dalton, spatial computing advisor and author of “Reality Check.” Dalton notes that most consumer AR/VR headsets are used for games, movies, and fully immersive 3D experiences. Improved sharpness, while nice to have, isn’t a necessity when thumping through a track in Beat Saber or exploring Meta’s Horizon Worlds.

Apple has a different audience in mind. Its demos said nothing of the metaverse and only briefly mentioned gaming. The company instead focused on video calls, photography, office productivity, and AR entertainment. These scenarios drastically increase the need for resolution, as users will interact with real and virtual objects simultaneously.

“They definitely wanted to support the pass-through use case. You need to present the user their own environment using the displays. This is crucial,” says Miller. “To create this experience, and make it believable, this is the most important reason why you need all this high resolution.”

A man stands at his desk in a large, open office. He is wearing an Apple Vision Pro headset, which displays several screens in front of him. The screens don’t exist in real space but only within the headset’s virtual space. Extreme pixel density opens the door to useful AR applications without the eye strain of previous-generation AR displays. Apple

Current headsets fail to deliver on realism. The HTC Vive XR Elite, which I tried at CES 2023, offered the most attractive pass-through mode I’d encountered so far, but interacting with real-world objects remained awkward. Meta’s Quest Pro is even less convincing. Adi Robertson, who reviewed the headset for The Verge, complained that “Meta’s color pass-through doesn’t look remotely like the real world.”

“The high-end screen makes the most sense when you have, for example, high-end training to deliver. NASA uses the Varjo for astronaut training,” says Dalton. Astronauts need sharpness to see and read the small gauges and dials found in a spacecraft’s cockpit. Apple didn’t build the Vision Pro for astronauts, but high pixel density remains useful in more mundane tasks, too: everyone else needs it to comfortably read printed documents or use a smartphone.

“I would expect the headset to perform beautifully in these sorts of scenarios,” says Dalton. “Productivity tends to be very text based. The better the viewing experience, the less the strain, and the longer you’ll be able to last.”
