
Virtually Exposed: Privacy + VR

Updated: Apr 19, 2022

Virtual Reality is forging its way into the public consciousness, showing up in everyday apps more and more as we embrace immersive technology. The problem is, our private information is not as protected as we think.

Much like with augmented reality, which I covered in my previous post, we are used to allowing tracking of our location, who we interact with within a game, and even scanning of our faces to play with filters. We're not worried because, at first glance, the type of information a simple game collects doesn't seem that telling. It's not your address or credit card. Those have layers of security. In fact, the physical tracking of your eye, head, and body movements isn't that valuable to the common thief.

However, there is big money in metrics for advertising. The tracking of user reactions and interaction with ads is incredibly valuable. Here's why. According to a report by the Future of Privacy Forum, "Twenty minutes of VR use can generate approximately two million data points and unique recordings of body language."
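To put that figure in perspective, here is a back-of-the-envelope calculation (a simple sketch, using only the numbers quoted from the Future of Privacy Forum report):

```python
# Rate implied by the Future of Privacy Forum figure:
# roughly 2,000,000 data points over 20 minutes of VR use.
data_points = 2_000_000
minutes = 20
per_second = data_points / (minutes * 60)
print(f"{per_second:.0f} data points per second")  # prints "1667 data points per second"
```

That works out to well over a thousand recorded measurements every second you wear a headset.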

Consider how much that reveals about you when you watch an ad like the kind being floated for virtual games. A few years ago, HTC developed in-game VR ad services. The opt-in ads offered a discount on games and in turn recorded every eye twitch, hand swipe, and glance away to better target their users. It turns out where you look or don't look, what holds your attention, and what loses it all make for more curated ads aimed at you and your wallet. At what point does constant advertising, tailored to your psychological profile and beamed directly into your eyes, become manipulation?

In that same report, it was noted that "Gaze or eye-tracking and gait analysis have been singled out as especially sensitive." Headsets can tell a lot more than where you look. Pupil dilation is a good indicator of sobriety, fatigue, attention issues, and more. Diagnostic information from your gait can be used to deduce medical conditions like heart disease. This is all private medical data that should be protected. And because cloud computing is needed for raw processing power, the more secure option of local storage is used less and less. Remember the Google Glass fiasco, where people who weren't even using the device were recorded in bikinis or picking their noses? It doesn't matter if you don't use the technology yourself. VR privacy is an everyone issue.

But wait, there's more! According to the ACLU, there are software programs currently in development that purport to be able to recognize emotions and cognitive states with this data. Pretty private stuff. Without proper, transparent security for our data, we risk our most vulnerable and private moments being exposed.

Think about the notorious naked celebrity selfies that captured everybody's attention when stars' cloud accounts were compromised. How secure would, say, a virtual session with my therapist be? What bothers most watchdog groups is the lack of transparency about not only what's being collected, but how it's transferred, stored, and shared. There is no guarantee that the data collected is encrypted on its way from your device to where it's stored, or that it's properly secured at rest. And the customer has no control over who can purchase that information. It's not out of the realm of possibility that something you want to keep private gets sold to inquiring employers, divorce lawyers, or others.

Don't get me wrong. I love technology. I like using it and find it incredibly fun. And VR has many advantages. For instance, the medical community is fast embracing virtual reality to rehearse complicated surgeries before ever opening up a patient. Entire simulated control rooms, such as those used for nuclear plant emergency training, can be replicated with VR at a fraction of the cost. Virtual sessions to treat post-traumatic stress are proving incredibly helpful. All of these innovations are valuable. But massive collection and processing of our very personal data is required to deliver that experience. This is why the privacy issue is so important.

What can we do to protect ourselves? We need to push for legislation and regulations that give consumers rights over their information. For instance, we need to have the right to request reports on what companies have collected about us. There needs to be a process to have inaccurate information corrected. We need to be able to have that data deleted if we request. And, we should be able to opt out of the collection of our metrics as much as possible, for starters.

For now, the advice is to secure your device as soon as you get it. Change any default passwords. Use a separate, dedicated email address not connected to your identity or your finances in any way. And be aware that though the world rendering around you may seem private, it most definitely is not.


Never miss a new release, sale, or exciting update. Sign up here for my once-a-month newsletter: Electric Dreams Newsletter

I share news on tech innovations and how they affect us, publishing industry news, updates on my near-future thriller books, and recipes from my characters.
