What could possibly go wrong?

While all this sounds great, data privacy and ethics are key concerns. It’s understandable that when arcane technological discussions around mixed reality elicit phrases like “eyeball tracking,” people get nervous. The prospect of someone learning about you by measuring each scintilla of eyeball movement during a VR interaction takes concerns about digital surveillance to a whole new level.

A virtual window into your soul

Perhaps some of these fears are overblown; Microsoft, for example, maintains that the latest versions of HoloLens use on-device eyeball tracking to compensate for the herky-jerky, nausea-inducing latency that comes with having centralized servers render images to the device.

But mixed reality raises the stakes for privacy and ethics, since wearables (like smartphones before them) can track user behavior down to the most minute detail. In fact, VR and AR (like AI) might one day understand us better than we know ourselves.

In the end, VR and AR alike “work” by having real humans (with thoughts, feelings, desires and reactions) looking around inside the medium, as fast as their minds can go.

As the saying goes, the eyes are the windows into the soul. End users of the technology need to feel safe when they’re using it, so privacy and ethical guardrails of use are necessary, proper and essential concerns that are everyone’s responsibility.

The privacy and ethics issue isn’t just about the user of VR; it’s also about how the user sees others. Consider VR tourism: Would it be appropriate to erase the squalid street scenes of, say, India, Mexico or San Francisco to augment scenes of virtual splendor? If so, how might it reduce our capacity for innate, real-reality empathy and communication? In the words of author Douglas Rushkoff, “Instead of figuring out how to get away from the rest of us… [we] might want to focus on making the world a place from which [we] wouldn’t have to retreat.”

Addiction to Virtual Space is another concern. Imagine a virtual world so fine-tuned to your personal preferences that you never want to leave. (The Japanese call this subculture “hikikomori,” which in English roughly translates to “pulling inward.”)

Given the mixed track record of digital ethics in, say, social media to date, coercion and exploitation in Virtual Space are legitimate worries. Matty Healy of the pop group The 1975 (creators of the cutting-edge VR experience “Mindshower,” about digital detox) openly worries: “I think we’re going to essentially create a digital world where you won’t be able to tell the difference between reality and non-reality.”

As with whiskey, too much of a good thing can be a bad thing. Building breaks into our weekly routines (taking digital sabbaths, entering Wi-Fi-free zones) will be necessary to give our addled brains a rest from Virtual Space.

Business leaders, futurists and policymakers need to look at the interplay of these issues in Virtual Space – and do it often. It’s critical to get ahead of the curve now, before business models of the future are hardcoded and established.