Mass Market Spatial Computing In Sight With Meta In VR & Apple In AR

Meta and Apple are building, from different directions, a path to the same future: virtual worlds to visit, spatial calling with hyper-realistic avatars, and all-day augmentation.

What we call the intersection these tech giants arrive at is a matter of branding, but it's time to be honest about the overall shape of our future and recognize that Meta and Apple are building toward the same destination.

VR: Starting With Meta Quest

Meta started with VR in 2014, back when it was still Facebook, by acquiring Oculus. From Gear VR to Go to Rift to Rift S to Quest and Quest 2, Meta Quest 3 continues a tradition of VR-first devices, despite Meta's marketing department attempting to brand it otherwise. Following its predecessor, Quest 3 looks like the world's best standalone VR headset.

Right out of the $500 or $650 box, the XR2 Gen 2 processor in Quest 3 delivers sharper, clearer visuals across practically all VR games by default, and its pancake lens technology is a sight to behold after looking through any VR lens released before 2022's Quest Pro and Pico 4.

Meta might be exploring making the controllers optional, but they now have the added benefit of being combined with hand tracking for dual-mode input. While Apple called controllers "clumsy" during its announcement, Meta's AI program has meanwhile reduced its own to little more than slimline haptic grips with precision buttons. Over time, controllers may actually be a major differentiator for games on Quest.

With Link to PC VR and direct touch for Android apps, it's hard to argue anything else comes close in value as an overall VR console gaming system, even though Quest 3 starts $200 more expensive than Quest 2 did in 2020.

AR: Starting With Apple Vision Pro

Apple begins in 2024 with Vision Pro, an AR-first device starting at $3,500. In the same way iPod formed a foundation for iPhone, iPhone forms a foundation for Vision Pro. Yes, you'll have your iPhone apps on Vision Pro, but the entire stack of technologies driving iPhone will be visible every moment you wear it too.

The Vision product category is also building toward the audacious idea that Apple is ready to take on the task of augmenting your life continuously. In iOS today, accessibility features add an assortment of "detection" modes for people, doors, or text, with options for audio or haptic feedback. There's even a "Point and Speak" mode where you point at something with the camera and the iPhone will "describe what is being pointed at by the finger in the camera view." Apple notes, "Point and Speak should not be used for navigation or in circumstances where you could be harmed or injured." That sentence reads as much as a warning to users as a challenge to Apple's engineers working on Vision Pro and future Apple headsets.

Apple's more established wearable product lines like Apple Watch and iPhone already have alert features and even satellite contact systems to call for help anywhere in the most dire emergencies, but in daily life they're also becoming more seamless for everyday interactions. You can double pinch the air to answer a call from the latest Apple Watch, while AirPods Pro can keep your audio private with no iPhone in the middle. The latest Apple features for audio can even stay aware of conversations you might need to pay attention to in your physical surroundings.

Augmentation by, and dependency on, a wearable object we'd feel naked without might not be what people have in mind when they hear "augmented reality," but that's how we should think about our connection to technology with this level of embedded intelligence accompanying us through most of the day and night.

Meeting In The Middle

In the past, Apple CEO Tim Cook has called AR one of the "very few profound technologies" that can impact your day-to-day life, while Meta CEO Mark Zuckerberg pointed out in 2017 that "the biggest trend in transportation is that it's a lot easier to move bits around than atoms."

In the present, Apple is letting you talk to whoever you want, wherever you want, without even requiring you to look at a screen, while releasing iPhone 15 Pro as its first spatial camera. Meanwhile, Meta is shipping Quest 3, bringing your full upper body into VR with generative legs, and getting people comfortable with wearing Meta livestreaming cameras.

In the near future, Meta plans to ship persistent digital objects you can hang in your environment (Augments), while Apple plans to have FaceTime calls that pull in hyper-realistic digital avatars (Personas). Meta has realistic avatars too – though creating them is still enormously time-consuming – as well as its own form of the technology for pinching in mid-air to interact with your wearables. In fact, Zuckerberg thinks Meta's neural wristband technology will be faster than typing on a keyboard within just a couple of years.

Mixed Reality & Spatial Computing

I think of Quest 3 and Vision Pro as two extremes on the same continuum, but it's not the one our technical readers think about with transparent AR optics on one end and opaque VR optics on the other.

Instead, I put Quest on the left side of a chart with VR and Vision Pro on the right side of the chart with AR. In the middle between them sits a future mixed reality all-purpose spatial computer that we wear for most of the day.

I'd wager these headsets will cost somewhere between $1,000 and $3,000, replace every computer you own, and have you spending about as much time visiting virtual spaces with friends as you do with friends alongside digital objects floating in your physical environment.

And with each passing keynote and each new headset release from here, I'd expect the headsets drawn up by both Meta and Apple engineers to increasingly look like one another.
