The eye tracking company Tobii had some VR demos that they were showing on the GDC Expo Hall floor as well as within Valve’s booth. They were primarily focusing on the new user interaction paradigms made available by eye gaze, including selecting specific objects, directing actions, and even driving locomotion. I had a chance to catch up with Johan Hellqvist, VP of products and integrations at Tobii, where we discussed some of the eye tracking applications being demoed. We also had a deeper discussion about what types of eye tracking data should be recorded and the consent that application developers should secure before capturing and storing it.
LISTEN TO THE VOICES OF VR PODCAST
One potential application that Hellqvist suggested was amplifying someone’s pupil dilation in a social VR context as a way of broadcasting engagement and interest. He said that there isn’t established science connecting pupil dilation to someone’s feelings, but this example brought up an interesting point about what types of data from an eye tracker should or should not be shared or recorded.
Hellqvist says that, from Tobii’s perspective, application developers should get explicit consent for any type of eye tracking data that they want to capture and store. He says, “From Tobii’s side, we should be really, really cautious about using eye tracking data to spread around. We separate using eye tracking data for interaction… it’s important for the user to know that’s just being consumed in the device and it’s not being sent [and stored]. But if they want to send it, then there should be user acceptance.”
Hellqvist says our eye gaze is semi-conscious data that we have limited control over, and that it will ultimately be up to each application developer to decide what to do with that data. Tobii has a separate part of their business that does market research with eye tracking data, but he cautions that using eye tracking within consumer applications is a completely different context than market research, and one that should require explicit consent.
Hellqvist says, “It’s important to realize that when you do consumer equipment and consumer programs that the consumer knows that his or her gaze information is kept under control. So we really want from Tobii’s side, if you use the gaze for interaction then you don’t need the user’s approval, but then it needs to be kept on the device so it’s not getting sent away. But it should be possible that if the user wants to use their data for more things, then that’s something that Tobii is working on in parallel.”
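To make the consent model Hellqvist describes more concrete, here is a minimal, purely hypothetical sketch of how an application might gate gaze data by purpose: real-time interaction is always handled on the device, while storing or transmitting samples requires an explicit opt-in. All of the types and functions below are invented for illustration and are not part of the Tobii SDK or the OpenXR specification.

```cpp
// Hypothetical illustration only: these names are invented for this article
// and do not come from the Tobii SDK or the OpenXR API.
#include <iostream>
#include <vector>

struct GazeSample {
    float x = 0.0f;       // normalized gaze point on the display
    float y = 0.0f;
    double timestamp = 0.0;
};

// Consent flags the user would set explicitly; everything defaults to "off".
struct GazeConsent {
    bool allow_local_recording = false;     // store samples on the device
    bool allow_remote_transmission = false; // send samples off the device
};

// Placeholder for real-time interaction handling (e.g. gaze-based selection).
// This path consumes the sample immediately and never persists it.
void updateInteraction(const GazeSample& s) {
    std::cout << "gaze select at (" << s.x << ", " << s.y << ")\n";
}

void handleGazeSample(const GazeSample& s,
                      const GazeConsent& consent,
                      std::vector<GazeSample>& localLog) {
    // Interaction needs no approval because the data stays on the device.
    updateInteraction(s);

    // Anything beyond interaction requires an explicit per-purpose opt-in.
    if (consent.allow_local_recording) {
        localLog.push_back(s);
    }
    if (consent.allow_remote_transmission) {
        // transmit(s);  // only reached after the user has opted in
    }
}

int main() {
    GazeConsent consent;               // no consent granted: interaction only
    std::vector<GazeSample> log;
    handleGazeSample({0.5f, 0.5f, 0.0}, consent, log);
    std::cout << "stored samples: " << log.size() << "\n";  // prints 0
    return 0;
}
```

The design choice this sketch tries to capture is that consent is scoped per purpose rather than being a single on/off switch, which mirrors the distinction Hellqvist draws between on-device interaction and data that gets sent away.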
Tobii will be actively working with the OpenXR standardization initiative to see if it makes sense to put some of these user consent flags within the OpenXR API. In talking with other representatives from OpenXR about privacy, I got the sense that the OpenXR APIs will be a lot lower level than these types of application-specific requirements. So we’ll have to wait for OpenXR’s next update in the next 6-12 months to see whether or not Tobii is able to formalize any privacy protocols and controls within the OpenXR standard.
Overall, the Tobii and SMI VR demos that I saw at GDC proved to me that there are a lot of really compelling social presence, user interface, and rendering applications of eye tracking. However, there are still a lot of open questions around the intimate data that will be available to application developers, and around the privacy and consent protocols that will inform users and provide them with some level of transparency and control. It’s an important topic, and I’m glad that Tobii is leading an effort to bring more awareness to this issue within the OpenXR standardization process.
Support Voices of VR
Music: Fatality & Summer Trip