Almost two years of scientific research and development went into creating Ubisoft’s new VR gaming experience, Eagle Flight. At a VRDC presentation this past month, Game Director Olivier Palmieri gave an in-depth look at Eagle Flight’s creation journey.
In September 2014, Palmieri gathered a small team of VR enthusiasts within the company and got to work. Their goal? To build the most intuitive and comfortable VR gaming experience possible. Most people who have put on a VR headset have had at least one nauseating experience, so to better understand what triggers nausea in VR, the team looked at several causes.
They began by studying the vestibulo-ocular reflex, the link between motion sensed by the inner ear and what the eyes see. We have three semicircular canals per ear, which signal head rotation to the brain: nodding up and down, turning side to side, and tilting.
In addition, they researched saccadic eye motion, the rapid jumps the eyes make to refocus on a target without the head moving. A direct connection between the inner ear and the eyes triggers this saccadic motion, allowing your eyes to refocus.
When there are disagreements between what motion you see and what your inner ear perceives, your brain has a physiological defense mechanism. It assumes a toxin has entered your body—hence the nausea. Palmieri explained it happens in two ways: when you feel motion, but don’t see it (like when you are reading in a car), and when you see motion, but don’t feel it (like in space, because there is no gravity).
Finally, there was the issue of going through virtual walls. Because of the immersive nature of VR, Palmieri said when you go through a wall or obstacle, your brain expects a reaction it doesn’t get, so it “freaks out.”
Based on this research, prototypes focused on three main things: VR comfort, intuitive controls, and precise input with no latency. The first prototype was an exploration inside Notre Dame Cathedral in Paris. They took it to a couple of conferences in Montreal to playtest with a large audience.
“VR for a lot of people is still magic,” said Palmieri, “So it’s a great way to get a lot of people to test.” More than 300 people playtested the prototype, gamers and non-gamers alike.
Then came the Eagle Flight prototype. The team joined Ubisoft Fun House and took it to E3 in 2015, where they confirmed what had long been suspected: everyone wants to fly. The positive response was their green light, and a pre-alpha version of Eagle Flight made its first public appearance at GDC 2016.
Using your head as a controller was a risky move, but their research and playtests proved it was intuitive—no need to learn hand controls or a joystick. “In Eagle Flight, as soon as you move your head, the eagle is going to move his without any…latency…which is good for comfort,” said Palmieri.
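The idea of steering with your head can be sketched in a few lines. This is purely an illustrative sketch, not Ubisoft’s actual code: it assumes the headset’s yaw and pitch map directly to the eagle’s turn and climb, applied every frame so there is no perceptible lag between head and bird. All names and tuning constants are invented.

```python
# Hypothetical head-orientation steering, in the spirit of the talk.
# head_yaw_deg / head_pitch_deg: degrees the head is turned off-center.
# The eagle's heading and climb angle change a little every frame,
# proportionally to how far the player is looking away from center.

def steer_from_head(head_yaw_deg, head_pitch_deg,
                    turn_rate=90.0, climb_rate=45.0, dt=1/90):
    """Convert head deflection (degrees) into per-frame changes in the
    eagle's heading and pitch. turn_rate/climb_rate are illustrative
    maximum rates in degrees per second; dt assumes a 90 Hz headset."""
    d_heading = (head_yaw_deg / 45.0) * turn_rate * dt
    d_pitch = (head_pitch_deg / 45.0) * climb_rate * dt
    return d_heading, d_pitch
```

Looking 45 degrees to the left at a 90 Hz refresh rate would turn the eagle about one degree that frame, with the rotation applied continuously rather than in jumps.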
Their choice was also informed by studying the peripheral vision of humans and animals. The team looked at how eagles perceive the world as predators and recreated that focused, predatory vision. The winding flight path through the catacombs, where exact precision is needed to navigate constantly weaving terrain, is an example of what is possible with head controls.
“The devil is in the details—we worked hard on refining and doing tests,” said Palmieri, in service of their number one goal of creating a comfortable VR experience. They even considered minute details like noses. Our brains are used to seeing our noses in real life, and in VR, with an HMD on, the brain notices the disconnect. “This is why we added…the beak for Eagle Flight,” said Palmieri–to give the brain an extra reference point.
Next was ensuring continuous forward motion. Perceived acceleration is tightly controlled in Eagle Flight: players can adjust their speed, but effects and visuals such as particles and wind tunnels smooth over the transitions between speeds.
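One common way to keep acceleration gentle, and a plausible reading of what the article describes, is to treat the player’s input as a target speed and ease the actual speed toward it over several frames, never jumping instantly. The function and constants below are assumptions for illustration, not the game’s real tuning.

```python
# Illustrative sketch: exponential smoothing toward a target speed.
# The player's input sets `target`; the eagle's actual speed closes a
# fraction of the gap each frame, so the vestibular system never feels
# an instantaneous change. Visual cues (particles, wind) would play
# while the speed is still in transition.

def ease_speed(current, target, smoothing=4.0, dt=1/90):
    """Move `current` toward `target` by a frame-rate-aware fraction."""
    return current + (target - current) * min(1.0, smoothing * dt)
```

Each frame closes roughly 4.4% of the remaining gap at 90 Hz, so a speed change plays out over a second or so instead of a single frame.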
Another effect was the controversial “dynamic blinders.” Palmieri acknowledged that when the team first showed these to people, testers were skeptical. “Many people think it’s going to be problematic or disturbing,” said Palmieri. “They were playing the prototype, we were asking them if they saw anything in particular, and they didn’t even notice.”
He said your brain naturally focuses on where you want to go—like the hyper focus of a Formula One racecar driver. Limiting peripheral vision forces this kind of focus.
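The dynamic blinders can be sketched as a vignette whose strength scales with how fast the view is rotating: fast turns narrow the field of view, while gentle flight leaves it fully open. The thresholds below are invented for illustration; the game’s actual tuning values were not given in the talk.

```python
# Illustrative vignette curve for "dynamic blinders".
# angular_speed_deg_s: how fast the camera is rotating, in deg/s.
# Below `onset` there is no vignette at all (which is why calm testers
# never noticed it); strength then ramps linearly to full blinders.

def blinder_strength(angular_speed_deg_s, onset=30.0, full=120.0):
    """Return vignette strength from 0.0 (no blinders) to 1.0 (max)."""
    if angular_speed_deg_s <= onset:
        return 0.0
    t = (angular_speed_deg_s - onset) / (full - onset)
    return min(1.0, t)
```

A renderer would use this value to darken or mask the edges of the frame, restricting peripheral vision only during the sharp turns that would otherwise provoke discomfort.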
Their collision system design also had to be rethought. The team avoided instant speed boosts or stops, which would trigger nausea. Inspired by NASA’s research on space sickness solutions, they employed fade-to-black screens in the game when running into obstacles and virtual walls.
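The fade-to-black collision response might look something like the following timeline: on impact the screen fades out over a fraction of a second, holds black while the player is repositioned, then fades back in. The durations here are assumptions for illustration, not values from the game.

```python
# Illustrative fade-to-black collision response.
# time_since_hit: seconds since the player struck an obstacle.
# Returns screen opacity: 0.0 = fully visible, 1.0 = fully black.
# Avoids the instant stop/bounce that would trigger nausea.

def collision_fade(time_since_hit, fade_out=0.15, hold=0.1, fade_in=0.3):
    """Opacity over one fade cycle: ramp to black, hold, ramp back."""
    if time_since_hit < fade_out:
        return time_since_hit / fade_out        # fading to black
    if time_since_hit < fade_out + hold:
        return 1.0                              # black; reposition player
    t = time_since_hit - fade_out - hold
    return max(0.0, 1.0 - t / fade_in)          # fading back in
```

Because the player never sees an abrupt change in velocity, the inner ear and the eyes stay in agreement throughout the collision.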
Lastly, Palmieri noted during the Q&A that sound is key for VR immersion. “[We] went with a theatrical exaggerated sound for comfort,” said Palmieri. Leaving no part of the final game unrefined, they also balanced the sound between the ears to help convince the brain that the movement is really happening.
Looking into the future of VR game development, Palmieri had three main takeaways for those wanting to tackle VR comfort: One, try to reduce the conflicts that can happen between your inner ear and vision. Two, never move the camera without player input—the player should control what they see at all times. Three, “Try to think function before the form—see the strength of VR and build on it,” said Palmieri.
“Everybody wants to fly, and that was also my personal dream—to have a VR experience where you fly, but we really started studying the medium first…and building everything around that.”
Form follows function, or so the saying goes.