The metaverse is going to suck for privacy
More thought – or at least some thought – needs to be given to privacy protection in the promised metaverse of connected 3D virtual-reality worlds, experts have concluded.
In a paper distributed via arXiv, titled “Exploring the Unprecedented Privacy Risks of the Metaverse,” boffins at UC Berkeley in the US and the Technical University of Munich in Germany play-tested an “escape room” virtual reality (VR) game to better understand just how much data a potential attacker could access.
Through a 30-person study of VR usage, the researchers – Vivek Nair (UCB), Gonzalo Munilla Garrido (TUM), and Dawn Song (UCB) – created a framework for assessing and analyzing potential privacy threats. They identified more than 25 examples of private data attributes available to potential attackers, some of which would be difficult or impossible to obtain from traditional mobile or web applications.
The wealth of information available through augmented reality (AR) and VR hardware and software has been known for years. For example, a 2012 article in New Scientist described Ingress, an AR game from Google spin-off Niantic Labs, as “a data gold mine.” That’s why data-monetization firms like Meta are willing to invest billions to make the market for head-hugging hardware and AR/VR apps more than just a sadness of tech enthusiasts with no use for torsos.
Similarly, the trust and safety issues of online social interaction have vexed online services since the days of dial-up modems and bulletin boards, before web browsers were even a thing. And now that Apple, Google, Microsoft, Meta, and other players see a chance to remake Second Life under their own gatekeeping, corporate consultancies are again reminding clients that privacy will be a problem.
“Advanced technologies, especially in VR headsets and smart glasses, will track behavioral and biometric information at a record scale,” explains The Everest Group in its recent report: “Taming the Hydra: Trust and Safety in the Metaverse”.
“At present, digital technologies can capture data regarding facial expressions, hand movements, and gestures. Hence, personal and sensitive information that will leak through the metaverse in the future will include real-world information about user habits and physiological characteristics.”
Not only is privacy an unsolved metaverse issue, but hardware security also leaves something to be desired. A related recent study of AR/VR hardware, “Security and Privacy Evaluation of Popular Augmented and Virtual Reality Technologies,” found vendor websites full of potential security vulnerabilities, their hardware and software lacking in multifactor authentication, and their privacy policies obtuse.
The escape room study enumerates the specific data points available to attackers of various sorts – hardware, client, server, and user adversaries. It’s worth noting that “attacker,” as defined by the researchers, encompasses not only external threat actors but participants and the companies running the show.
The potential data points identified by the researchers include: Geospatial Telemetry (Height, Arm Length, Interpupillary Distance, and Room Dimensions); Device Specifications (Refresh Rate, Tracking Rate, Resolution, Device Field-of-View, GPU, and CPU); Network (Bandwidth, Proximity); Behavioral Observations (Languages, Handedness, Voice, Reaction Time, Close Vision, Distance Vision, Color Vision, Cognitive Acuity, and Fitness).
From these metrics, various inferences can be made about a VR participant’s gender, wealth, ethnicity, age, and disabilities.
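None of these attributes need be asked for directly; they fall out of the raw tracking stream. As a minimal sketch, with fabricated numbers and no real VR SDK, here is how headset and controller positions alone could yield height and arm length:

```python
# Illustrative only: estimating anthropometrics from raw tracking positions.
# Coordinates are in metres; y is the vertical axis.

def estimate_height_m(headset_y_samples):
    """Headset height approximates eye height; standing eye height is
    roughly 94% of stature, so divide to estimate full height."""
    return max(headset_y_samples) / 0.94

def estimate_arm_length_m(headset_pos, controller_positions):
    """The farthest headset-to-controller distance observed during play
    approximates arm reach."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return max(dist(headset_pos, p) for p in controller_positions)

# A few seconds of (made-up) tracking data:
height = estimate_height_m([1.58, 1.60, 1.59])   # roughly 1.70 m tall
reach = estimate_arm_length_m((0.0, 1.60, 0.0),
                              [(0.3, 1.4, 0.1), (0.7, 1.2, 0.2)])
```

A game never has to request these values; a few seconds of ordinary play is enough to derive them.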
“The alarming accuracy and covertness of these attacks and the push of data-hungry companies towards metaverse technologies indicate that data collection and inference practices in VR environments will soon become more pervasive in our daily lives,” the paper concludes.
“We want to start by saying that these ‘attacks’ are theoretical and we don’t have evidence that anyone is actually using them currently, although it would be quite hard to know if they were,” wrote Nair and Munilla Garrido in an email to The Register. “Also, we use ‘attacks’ as a term of art, but in reality, if this data harvesting were to be deployed, consent would likely be buried in an agreement somewhere and be in theory entirely above board.”
However, the two researchers say there’s reason to believe that companies investing in the metaverse do so at least in part on the expectation that post-sales advertising will make up for losses like the $12.5 billion spent by Meta’s Reality Labs group last year to earn a mere $2.3 billion in revenue.
“Now, assuming a company of that size knows how to calculate a bill of materials, this loss-leading approach must be a strategic decision that they believe will eventually pay for itself,” argued Nair and Munilla Garrido. “And if we look at who these companies are, and which revenue methods they have already perfected, we suppose it will be at least somewhat tempting to deploy those same methods to recoup hardware losses. But again, this is speculative.
“All our research shows is that if a company wanted to do data harvesting, it could get vastly more information about users in VR than it could from mobile apps for example, and that pivoting towards VR would make perfect sense in that context.”
Asked whether existing privacy rules adequately address metaverse data collection, the two eggheads replied that they believe so, provided those rules are not written to cover only mobile apps.
“But we do have a unique challenge with respect to metaverse apps, in that there is a plausible reason to be broadcasting this data to central servers,” they explained. “Fundamentally, metaverse applications work by tracking all of your body movements and streaming all of this data to a server so a representation of yourself can be rendered for other users around the world.
“So for example, while a company would struggle to argue that tracking your movements is required for their mobile app, it’s actually an integral part of the metaverse experience! And at that point, it is much easier to argue that logs about it need to be stored for troubleshooting, etc. So in theory, even if the same privacy laws apply, they could be interpreted in dramatically different ways due to the fundamental data needs of the platform being so different.”
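To make that data need concrete, here is a minimal sketch, with invented field names rather than any real metaverse protocol, of the kind of per-frame message a client must broadcast so other users can see your avatar:

```python
import json

# Hypothetical per-frame pose update; field names are illustrative,
# not taken from any real platform.
def pose_update(frame, head, left_hand, right_hand):
    return json.dumps({
        "frame": frame,
        "head": head,            # (x, y, z, yaw, pitch, roll)
        "left_hand": left_hand,  # (x, y, z)
        "right_hand": right_hand,
    })

# At a typical 72-120 Hz tracking rate, a client emits thousands of
# these precise body measurements per minute -- the same stream the
# researchers mine for identifying attributes.
msg = pose_update(1, (0.0, 1.6, 0.0, 0.0, 0.0, 0.0),
                  (0.3, 1.4, 0.1), (0.35, 1.4, 0.1))
```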
Nair and Munilla Garrido acknowledged that some of the 25 or so collectible attributes they identified in their research may be obtainable through mobile phones or other online interactions. But metaverse apps represent a one-stop shop for data.
“We have a situation where all of these categories of information can be collected at once, within a few minutes,” they explained.
“And because you need to combine multiple attributes to make inferences (e.g., height and voice to infer gender), the presence of all these data collection methods in the same place at the same time is what makes VR a unique risk in terms of being able to infer user data attributes highly accurately.”
The sheer volume of information available through the metaverse is enough to de-anonymize any VR user, they claimed. They argue this is not the case for apps or websites.
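A back-of-envelope calculation shows why stacking modest attributes de-anonymizes. The per-attribute value counts below are our illustrative guesses, not figures from the paper:

```python
import math

# Rough, illustrative counts of distinguishable values per attribute.
attribute_values = {
    "height (cm, rounded)": 40,
    "interpupillary distance (mm)": 20,
    "handedness": 2,
    "headset refresh rate": 4,
    "reaction-time bucket": 10,
}

profiles = math.prod(attribute_values.values())  # distinct combined profiles
identity_bits = math.log2(profiles)              # bits of identifying signal
```

Even with these conservative counts, the combination yields 64,000 distinct profiles, roughly 16 bits of identifying signal, although no single attribute comes close to singling a user out on its own.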
As for countermeasures, they already have one in mind: a plugin for the Unity game engine called MetaGuard. The name makes clear the source of the privacy threat.
“Think of it like ‘incognito mode for VR,’” wrote Nair and Munilla Garrido. “It works by adding noise, using a statistical technique known as differential privacy, to certain VR tracking measurements, such that they are no longer accurate enough to identify users, but without significantly impacting the user experience. Like incognito mode in browsers, it’s something users could toggle on and off and adjust as they please depending on the environment and their level of trust.”
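The mechanism the researchers describe can be sketched in a few lines. This is not MetaGuard’s actual code, and the sensitivity and epsilon values are placeholders, but it shows the Laplace mechanism from differential privacy that the quote refers to:

```python
import math
import random

def laplace_noise(scale):
    """Draw from Laplace(0, scale) by inverse transform sampling."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_measurement(true_value, sensitivity, epsilon):
    """Report a tracked value (e.g. height in metres) with noise scaled
    to sensitivity/epsilon, the standard epsilon-DP calibration."""
    return true_value + laplace_noise(sensitivity / epsilon)

# Each individual report is blurred by tens of centimetres, so it no
# longer pins down the user, while remaining plausible for rendering.
noisy_height = private_measurement(1.75, sensitivity=0.5, epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; the toggle the researchers describe amounts to letting the user pick that trade-off per environment.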
Here’s hoping metaverse privacy will be that simple.
Source: The Register