Is Privacy Just No Longer a Thing in Augmented Reality?

Skepticism mounts over Apple’s modest claims on Vision Pro data


The Apple Vision Pro is not just a camera strapped to your face—it's twelve cameras strapped to your face.

Apple

The Apple Vision Pro teleports you to a parallel universe, one where you can summon apps and flick through web pages with a wave of your hands. It’s a feat made possible by an array of cameras, microphones, and gyroscopes that monitor your every move. And the Cupertino, Calif.-based tech giant has recently outlined, in broad strokes, how its famously privacy-forward stance applies to the reams of sometimes quite personal data the Vision Pro’s many sensors generate as a matter of course.

Where does that data go, and what could it be used for? The latest wave of interest in augmented reality (AR) makes that a pressing question. But with trust in technology companies running low, experts are skeptical the industry will treat the issue seriously.

“It’s at least theoretically possible for an AR/VR implementation to mislead you, to make judgements not only about who you are and what you want, but also to manipulate you.” —Mike Godwin, technology policy attorney

“I think with virtual reality technologies, because they’re sensing so much of what you do, it necessarily will create troves of data people will want to have access to,” said technology policy lawyer and author Mike Godwin. “Government’s hunger for what you do in cyberspace is already unlimited [...] so, I would expect VR and AR to raise even more questions, though perhaps in the same character of questions as mobile devices already raised.”

A portal to your life

Meta’s Quest 3 headset has four external cameras. Apple ups that count to twelve: six “world-facing” tracking cameras, two high-resolution video passthrough cameras, and four eye-tracking cameras. It also has a TrueDepth 3D sensor and Lidar.

However, the privacy experts I spoke with felt the risks associated with AR/VR headsets come not from the number of cameras or sensors, but from the perspective they offer into a user’s life.

Data privacy expert Maritza Johnson said it’s “very common for most people in the U.S. to be surrounded by cameras and microphones.” But those devices, though ever-present, aren’t physically attached to the user. “A phone is not mounted to my head, [...] a big difference is in the intimacy of the device,” she said.

It’s not enough for each company to set its own policies as it sees fit and react to problems as they occur.

Johnson recalled a problem with Reactions in FaceTime, a feature Apple released alongside iOS 17 in September 2023. Reactions trigger special effects, like fireworks, based on user gestures. They can be fun on calls with friends, but people attending medical or therapy appointments through SimplePractice, a telehealth app, saw Reactions activate when they weren’t wanted. Many users, Johnson included, were unaware of the feature until it appeared in an appointment.

Reactions are a fun feature recently added to Apple’s FaceTime, but they’re also an example of how tech can intrude in private spaces.

Apple

Johnson’s experience didn’t occur on the Vision Pro, and SimplePractice has not yet released a Vision Pro app. Still, it offers relevant insights. Reactions were a problem not because private data was collected and shared, but because the feature intruded on a private space. AR/VR headsets, designed to accompany a user through every moment of the day, offer even more opportunities for unintended interactions.

The experience soured Johnson’s trust in Apple. “[I saw the Reactions] and I’m like, what just happened? Why did it happen? Who put it there? How do I make it stop? At the end of the day, that’s on Apple.”

Whom can you trust?

Apple has stated “privacy is a fundamental human right,” an idea repeated in the company’s Apple Vision Pro Privacy Overview, which it released last month. The white paper detailed efforts to keep user data private through on-device processing and data minimization. Some information the headset collects, such as eye-tracking data, is processed on-device and not shared with Apple or third-party apps.

Yet there are caveats, loopholes, and possible end-runs. Apps don’t need hand- or head-tracking data to function, but they can request it in certain situations. And while detailed eye-tracking data isn’t shared, the gaze-driven design of visionOS can’t fully eliminate the problem: every button a user selects reveals where the user was looking at the moment of selection.
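To see why, consider a minimal SwiftUI sketch of a visionOS screen. This is hypothetical app code of my own (the SurveyView and logSelection names are invented for illustration), not Apple’s implementation: the system tracks gaze and renders hover highlights privately, but the instant the user pinches to tap, the app’s own action handler runs.

```swift
import SwiftUI

// Hypothetical visionOS view. The app never receives a stream of eye
// positions; the system handles gaze and hover effects on-device.
struct SurveyView: View {
    let choices = ["Cardiology", "Oncology", "Psychiatry"]

    var body: some View {
        VStack(spacing: 12) {
            ForEach(choices, id: \.self) { choice in
                Button(choice) {
                    // A tap in visionOS is gaze plus a pinch, so this
                    // callback tells the app exactly which element the
                    // user was looking at, and when. Continuous gaze
                    // stays private; gaze-at-selection does not.
                    logSelection(choice)
                }
            }
        }
    }

    // Illustrative only; a real app could just as easily forward this
    // to its own analytics backend.
    private func logSelection(_ choice: String) {
        print("User selected \(choice) at \(Date())")
    }
}
```

The app sees no gaze stream, only one confirmed gaze point per selection, and that is exactly the residual signal described above.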

And, of course, Apple isn’t the only company in this space. Meta, the other major player in consumer AR/VR headsets, notified Quest users in February 2024 of plans to begin collecting additional data from its headsets. The company’s Supplemental Meta Platforms Technologies Privacy Policy states that collected data may include “your avatar’s lip and face movement” and “abstracted hand and body data.”


Handing over that data would seem to imply trust—and Godwin finds that difficult to accept. “I think it’s the wrong approach to say we should trust [companies] to do the right thing, because their incentives do not align with consumers.”

He points out that user data, once collected and stored, is often accessed in ways the user didn’t intend. Governments can legally compel companies to hand over data, and data that seems useless today may eventually prove useful for identifying, or even manipulating, individuals. The Cambridge Analytica scandal demonstrated as much: no single data point the organization collected was critical on its own, but in sum, and over time, the data allowed it to micro-target individuals.

“It’s at least theoretically possible for an AR/VR implementation to mislead you, to make judgements not only about who you are and what you want, but also to manipulate you,” said Godwin.

Ultimately, Godwin said the tech industry has a lot of work to do before users can “trust” companies with the data AR/VR headsets can collect. He pointed to other professions, like medicine and law, that have specific codes of ethics which, in some cases, are reinforced by law. It’s not enough for each company to set its own policies as it sees fit and react to problems as they occur.

“Today, even companies that are more trusted in the tech space are still trusted less because of other companies that have acted badly,” he said. “The only way to rebuild trust is to proactively do things that build trust.”
