Comments from Oculus Consulting CTO John Carmack suggest Facebook could pursue a multi-year effort to reduce simulator sickness in its VR headsets.
The comments come from an audio chat on Twitter between Carmack and Facebook Reality Labs Vice President Andrew Bosworth. Twitter’s Spaces platform for audio chat doesn’t yet archive its broadcasts, but we do have a recording we’ve been transcribing. We’ll soon post both an edited recording (with sentence fragments and unrelated discussion removed) and the transcription.
For now, we’ve been pulling out headline-worthy segments. First, we noted that a Quest Pro with more onboard sensors won’t ship in 2021 while Quest 2 stays on the market for “a long while.” We also posted about Carmack’s comments on making Facebook’s VR headsets more fully functional personal computers that could go after the market for Chromebooks and tablets. And there was the comment about a multi-year evolution that could one day offer “a controller-free SKU in the future where we rely just on hand tracking for people that want to use keyboard and mouse and don’t want to pay for the controllers.”
Today, we’re noting Carmack’s response to a question from Bosworth asking him to share the progress Facebook has made in reducing motion sickness caused by VR. Here’s the full transcription of Carmack’s response, in which he discusses what might be a multi-year effort to offer a system-wide method for reducing simulator sickness across apps:
John Carmack: We had an interesting discussion about this recently where someone in research made some comment about ‘what causes discomfort is when something’s moving in your visual field in a way that’s inconsistent with your vestibular system.’ Some of the mitigations that we tell people about, well, like a cockpit around the edges, can help because that’s the furthest out in your periphery, that largest motion; if you cover that up, it helps, and people do the shrinking vignettes when you’re moving.
I’m kind of excited about this prospect: if we do a little bit of changes to the engines and pass some more depth information, which we want to do for positional time warp things anyways, it’s possible for us to do a systems-level approach that’s actually aware of the depth next to it. So in a game like POPULATION: ONE, they offer multiple comfort levels that determine, like, how much you pull in at the sides, and it’s helpful for people. But it’s wasteful in some sense, because if you’re outside, you’ve got the sky at effectively infinity, and that causes no impact to your comfort at all when it’s moving there. But the vignette winds up covering it anyways.

What we need to do is look at the depth of things relative to your view, how much it’s moving incorrectly relative to the inertial stuff, and only fade out things proportional to that relationship. So I think that if you’re ducking for cover right behind a wall, that’s the stuff that can really make you sick if you translate next to it; the sky doesn’t have any impact on it whatsoever.

So I think that we could do something system-level that could then be uniform across games, which would be great, because right now each game has its own mitigation method, and it would be good for users if they just realize that, okay, this is the way VR worlds behave: when you’re close to something and you slide with the controller, you can expect that to kind of vignette out on the side. So that’s the type of thing that, heck, it’ll probably take us two years to sort of work something out and push it through developers and get buy-in and get people to agree to it. But I think that’s a long-term direction that’s got some real potential.
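To make the idea concrete: the fade Carmack describes would scale with how fast each part of the scene appears to move during artificial locomotion, and for a pure translation that apparent angular motion scales roughly with speed divided by depth. Below is a minimal, hypothetical sketch of that relationship in Python; the function name, parameters, and constants are our own illustrative assumptions, not anything Facebook has described implementing.

```python
import numpy as np

def comfort_fade(depth_m, virtual_speed, sensitivity=0.5, max_fade=0.9):
    """Hypothetical per-pixel vignette opacity for artificial translation.

    Apparent angular motion from sliding sideways scales roughly with
    speed / depth, so distant pixels (sky, depth -> infinity) get almost
    no fade, while nearby geometry (a wall you're ducking behind) fades
    out strongly. All tuning constants here are illustrative assumptions.
    """
    # Avoid division by zero for degenerate depth values.
    angular_motion = virtual_speed / np.maximum(depth_m, 1e-3)
    return np.clip(sensitivity * angular_motion, 0.0, max_fade)

# Toy depth buffer: a near wall (1 m), mid-range object (2 m), distant sky (1000 m).
depth = np.array([1.0, 2.0, 1000.0])
fade = comfort_fade(depth, virtual_speed=3.0)  # sliding with the thumbstick at 3 m/s
```

In this toy example the near wall hits the fade cap while the sky is left essentially untouched, which matches Carmack's point that a uniform vignette wastes comfort budget on distant scenery that wasn't causing discomfort in the first place.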