At CES 2019, HTC revealed the “Vive Pro Eye” featuring eye tracking from Tobii. Several demos from Vive partner developers showcased potential use cases for the Vive Pro Eye.
A company called ZeroLight, for example, showed how Vive Pro Eye could be paired with foveated rendering to increase clarity in the tiny details of a virtual car’s interior. By its nature, foveated rendering should be invisible to the eyes, so ZeroLight toggled various debug modes to visualize the eye tracking and rendering regions. The area directly in front of the eye was supersampled at a resolution many times that of the panel. The demo ran on a Quadro RTX 6000, though the supersampling improvements weren’t readily apparent to my eyes on the Vive Pro’s display. One of the modes overlaid green, yellow and red areas to indicate where the eye was pointed, and in VR this mode appeared to reflect where each eye was pointed very accurately. Members of the team also used it while wearing glasses and it worked fine.
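The idea behind that debug overlay can be sketched in a few lines. This is a hypothetical illustration, not ZeroLight’s actual implementation: the radii, scale factors and color bands are made-up values chosen to mimic the green/yellow/red visualization described above.

```python
import math

# Hypothetical tile-based foveation map: a render-scale factor per screen
# tile, chosen by distance from the tracked gaze point (normalized 0..1
# screen coordinates). All thresholds here are illustrative assumptions.
def shading_scale(tile_center, gaze, inner=0.10, outer=0.25):
    """Return a render-scale factor for a tile (1.0 = native panel res)."""
    d = math.dist(tile_center, gaze)
    if d < inner:
        return 2.0   # foveal region: supersampled above panel resolution
    if d < outer:
        return 1.0   # mid region: native resolution
    return 0.5       # periphery: rendered at reduced resolution

def debug_color(scale):
    """Mimic a debug overlay: green/yellow/red bands around the gaze point."""
    return {2.0: "green", 1.0: "yellow", 0.5: "red"}[scale]
```

A tile under the gaze point gets the supersampled “green” treatment, while tiles in the periphery drop to the cheap “red” rate, which is why the effect should go unnoticed when the overlay is off.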
A separate demo, from Tobii itself, ran on a Vive Pro and showed a simple interactive game with creatures coming toward me. Eye-tracked aim assist could be toggled just by shooting a box in the game. Once I realized how much eye tracking helped with targeting the creatures, I lasted only 10-15 seconds with the feature off before turning it on and leaving it on for the duration of the demo. This demo also used adjustable lines to show me the lower- and higher-resolution rendering regions.
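A gaze-based aim assist of the kind described above might work roughly like this. This is a toy sketch under my own assumptions (function names, the snap radius, and the target format are all invented for illustration, not Tobii’s API): when assist is on, the shot resolves against whatever target the eyes are fixated on rather than the controller’s crosshair.

```python
# Toy gaze-assisted aim: with assist on, target selection follows the
# tracked gaze point instead of the controller crosshair. Positions are
# normalized 2D screen coordinates; snap_radius is an assumed tolerance.
def aimed_target(crosshair, gaze, targets, assist_on, snap_radius=0.15):
    """Return the target hit by a shot, or None if nothing is in range."""
    point = gaze if assist_on else crosshair
    best, best_d = None, snap_radius
    for t in targets:
        d = ((t[0] - point[0]) ** 2 + (t[1] - point[1]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = t, d
    return best
```

With assist off, a shot only lands if the crosshair itself is near a target; with assist on, merely looking at a creature is enough, which matches how dramatic the difference felt in the demo.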
Another demo from HTC’s partners at CES 2019 showed how eye tracking could assist in teaching an aircraft takeoff procedure: the software highlighted each switch, and I “selected” it by gazing continuously at the tiny switch for a brief time. Some simple interactions in VR are already handled this way on headsets with no other input methods, using a few smallish buttons and a few seconds of head gaze to indicate intent. Most often, this is used to play videos.
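The dwell-to-select pattern behind both the head-gaze UIs and this eye-tracked cockpit is simple enough to sketch. This is a minimal, hypothetical version (the class name and the 1.5-second threshold are my assumptions): keep a timer while the gaze rests on one widget and fire a selection once it has stayed there long enough.

```python
# Minimal dwell-to-select state machine: a selection fires only after the
# gaze has rested on the same widget continuously for `dwell_seconds`.
# Looking away, or at a different widget, resets the timer.
class DwellSelector:
    def __init__(self, dwell_seconds=1.5):  # threshold is an assumed value
        self.dwell = dwell_seconds
        self.current = None
        self.elapsed = 0.0

    def update(self, gazed_widget, dt):
        """Call once per frame with the gazed widget id (or None) and the
        frame time in seconds; returns the widget id when selection fires."""
        if gazed_widget != self.current:
            self.current, self.elapsed = gazed_widget, 0.0
            return None
        if self.current is None:
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell:
            self.elapsed = 0.0
            return self.current
        return None
```

The same loop works whether the “gaze” comes from head orientation or from per-eye tracking; the eye-tracked version just lets the targets be much smaller, like individual cockpit switches.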
Eye tracking like the Tobii system shipping in Vive Pro Eye, though, uses sensors inside the headset to pinpoint the wearer’s area of interest more precisely than ever before. Incorporated into game design, that signal of player intent could be used to hone the reactivity of characters or the environment. Tracked over the length of a play session, the same gaze data can yield deeper insights that drive bigger changes in software design, or reveal patterns in player behavior.
For instance, below is a screenshot I took after the aircraft training demo showing a record of where my eyes were focused throughout my flight. If I were training to become a pilot and spending too much time looking out the windows instead of at the controls, this data could help inform and improve my next trip in the simulator.
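The kind of gaze log behind a screenshot like that could be summarized as below. This is a sketch under assumed inputs (the region labels and the fixation-sample format are invented for illustration): bucket each gaze fixation into a labeled cockpit region and report the share of time spent in each.

```python
from collections import defaultdict

# Sketch of a post-session gaze summary: total the seconds of fixation
# per labeled region, then normalize to fractions of the whole session.
def gaze_summary(samples):
    """samples: iterable of (region_label, seconds) gaze fixations."""
    totals = defaultdict(float)
    for region, seconds in samples:
        totals[region] += seconds
    whole = sum(totals.values()) or 1.0  # avoid division by zero
    return {region: t / whole for region, t in totals.items()}
```

A trainee who spent 80% of the session on the “window” region and 20% on the instruments would see exactly the kind of imbalance an instructor could act on before the next simulator run.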
Tobii’s eye tracking requires a very brief calibration session when putting the headset on. There’s also an on-screen dialog that shows the wearer how to turn a dial on the Vive Pro to better align the panels directly in front of the eyes.
Overall, Tobii’s eye tracking seemed to work well across a variety of demos, but its real-world benefits are still unclear. Vive Pro Eye is likely to be a high-end version of an already high-end headset, used by developers for very specific cases like interaction research.