The metaverse has arrived — or that was the vibe last week, anyway, at the 2021 installment of the Augmented World Expo (AWE) conference. Many speakers acknowledged that the metaverse is still in its early days, but the consensus was that, after decades of gestation, it has now been born.

Since 2013, AWE has brought together (annually, except in 2020) organizations and individuals working on extended reality (XR) — augmented reality (AR) and virtual reality (VR). This year, it included tech behemoths like Amazon and Microsoft; XR giants like Magic Leap, Niantic, Roblox, Unity, and Unreal; many XR startups; and (crucially) companies creating XR experiences for their customers, employees, patients, and other users, such as Boeing, Disney, Kohler, and Mayo Clinic.

In his opening keynote, Niantic CEO John Hanke commented that “If only one person is seeing a vision, it’s a hallucination, but if we all see it together, it’s magic” — hinting that the metaverse magic is starting to happen. And the many demos in the expo hall did deliver some remarkable experiences of vision, touch, and even smell — for example:

  • Vision: Varjo’s VR flight simulator felt real in part because its headset’s resolution matches the human eye’s in the fovea (the center of the field of vision, where acuity is highest), and its life-size car in AR felt that way, too, even when I leaned close to details like the grain of the leather upholstery.
  • Touch: HaptX’s gloves tricked me into feeling raindrops on my hand, then the paws of a tiny animal (a miniature fox) walking across my palm, thanks to compressed air fed onto the skin through tiny tubes into apertures millimeters apart. (On day three, HaptX won AWE’s VR Best in Show award.)
  • Smell: OVR’s technology adds the olfactory dimension to headsets — the smell of a virtual rose I picked, for example, or of woodsmoke from a campfire at my feet, and even of a marshmallow roasting on a stick I held close to the embers.

Some of these examples may seem frivolous, but the talks and other demos also emphasized how these technologies are already being used today to make brain surgery more reliable, design collaboration more productive, and PTSD therapy more effective, among many other valuable applications.

Amid the excitement about these advances, however, Hanke also offered words of caution. He pointed out that in most of the sci-fi novels and films that have inspired many to want to help create the metaverse, its purpose is to let people live in an increasingly augmented or virtual reality because the real world has become so broken that it’s a dystopia. He then challenged the audience: “Is that the world you want?” But he was mostly preaching to the choir: It was clear from the more than 200 mainstage and track sessions (as well as the myriad demos) that the companies at the event are striving to improve the world we know, not replace it.

The biggest obstacle I saw come up again and again, both in what the presenters and panelists said and in the demos, is that although XR has attracted millions of users, there are billions who don’t know about, don’t like, or simply don’t believe in XR as a relevant, desirable future. Reaching an adoption tipping point will require that the industry shift its focus from technology to people:

  • Yes, recent advances in XR technology are pretty amazing given the enormous technical challenges involved.
  • But for XR’s potential to be realized at scale, beyond early enthusiasts and niche applications, the user experience (UX) of XR will need a radical leap forward. As real as the demos felt at certain moments, they were still jarring and glitchy in many ways likely to disappoint or even repel the not-yet-converted. And big questions about ethics, inclusion, and privacy loom large and remain mostly unaddressed.

For some upcoming research, I’ll be doing a deep dive into the principal UX obstacles and emerging best practices for organizations aiming to design XR experiences. I’ll interview people who’ve been at it long enough to have valuable insights to share, and I’ll distill what I learn into a stream of reports about the UX of XR. If that describes you, would you be willing to tell me your story and share what you’ve learned to help your peers in the field design better AR and VR experiences for everyone? If so, message me — thanks!