Apple has unveiled its highly anticipated next-generation spatial computing device, Apple Vision Pro. The device will retail at $3,499 and be available in early 2024. It is primarily an augmented reality (AR) device, with graduated levels of immersion that come very close to virtual reality (VR) at times. I had the chance to try the headset for about 20 minutes. The optics and interface are true “wow” factors. Here’s what is interesting about the announcement. Apple:
- Announced a new computing device, not a headset. First, the device stands alone: a consumer need not tether it to a computer or smartphone. Entry-level Apple computers start around $1,000 and range up to nearly $7,000. If a consumer is buying an accessory for a smartphone or computer such as AirPods, they expect to pay hundreds of dollars. If a consumer is buying a computer, they will pay more. $3,500 would be a midtier price point for an Apple computer, though above the average selling price.
- Bypasses interface education challenges with an intuitive, well-understood interface. Most existing extended reality (XR) systems depend upon handheld controllers such as those used with gaming systems. These controllers typically have from four to eight buttons or triggers. The complexity of using the controllers, not to mention keeping them charged, creates a barrier to entry and usage for nongaming consumers today. Apple uses simple gestures that extend the most popular tactics for interacting with its touchscreens: a combination of eye-tracking, flicks, and pinching to select, scroll, and pinch and zoom on icons or media on the virtual screen. This can be done without a second device such as a smartwatch.
- Tackles privacy head-on by letting consumers know that their eye movements won’t be recorded. Apple uses Optic ID to log users into the device but encrypts that data and stores it only on the device. In addition, eye-tracking is used to optimize rendering of content on a screen as well as for navigation. Privacy advocates have expressed concern that eye-tracking will become yet another way that companies selling advertising collect data on consumers. This data remains on the device unless the consumer chooses to share it.
- Taps the Apple TV and app ecosystem to offer broad utility from the start. Apple primarily showed use cases around media consumption: photos plus its Apple TV media. One of the challenges of VR has been limited use cases for consumers; the top use cases are gaming, education, fitness, and perhaps social media. Vision Pro offers far broader utility from the start and allows consumers to tap into media and apps they already own.
- Focuses on comfort. When consumers talk about virtual reality headsets, they speak of the discomfort of wearing the headset for long periods of time (more than 30 to 60 minutes) and nausea or disorientation. Apple didn’t announce the weight of its headset but took substantial time to describe the knit band — similar to my lightweight Nike Flyknit sneakers. Apple also talked about the chips that it’ll use to minimize latency between sensors and the display.
- Tries to break down the barrier between a headset wearer and the people around them. An unexpected feature is that the front of the headset has a screen that can mimic the “eyes” of the person behind it, showing their eye movements in augmented reality mode so that they can interact with people around them. In virtual reality mode, the screen shifts to a light, indicating to others nearby that the wearer is not looking at them and is unlikely to notice them. It’s the next best thing to a transparent headset, which isn’t technically feasible yet.
- Starts with easier use cases. When consumers use devices, they often begin with media consumption, followed by communication, shopping/transacting, and then control. Apple focused on media: movies, photos, music, gaming, TV, and more. The company also demoed some communication, collaboration, and productivity applications (e.g., Microsoft applications). I expect the traction on these use cases to take a bit longer — creating content on these virtual screens takes a tad more collaboration and getting used to “the feel” than sitting back and watching your favorite movie. Also, the lack of a specific enterprise-first focus, like what Microsoft had when it debuted HoloLens, is probably fine given that XR devices have yet to change work for most people.
Those are the device details. But the top question I’ve received from the press today is, “How does this position Apple versus the competition — specifically Meta?” It’s an important question, and it should come as no surprise that Meta preannounced its Meta Quest 3 headset last week to get ahead of this announcement. While some may find similarities between the two companies, Apple’s approach feels very different. It is focused on augmented reality (where there are more compelling use cases in the long run) rather than virtual reality. It’s public about how it uses eye-tracking. Apple is selling a computing device rather than a headset. And it will enhance and extend its own ecosystem (other devices, media, fitness services, and more), just as Meta will, but its ecosystem is very different. And it’s not specifically tied to anything like Horizon, Meta’s metaverse experience, which requires that users buy into not only the device but also the bigger concept of virtual worlds, a concept that is still not ready for prime time.
In summary, this is not a brand-new day for XR technologies, but it is a commitment from one of the most important companies in the world to this new mode of computing — at a time when Microsoft and Meta are both laying off entire teams that have labored on similar devices for years. If it remains committed to the market, Apple will likely force the hand of others to continue investing, stimulating further innovation.
For more information on consumer adoption, usage, and comfort with AR and VR devices, please see Forrester’s Moments Map research. Forrester has data for 10 geographies across North America, Europe, and APAC.