A privacy-preserving approach to streaming eye-tracking data
Eye-tracking technology is increasingly being integrated into mixed reality devices. While this enables critical applications, it also creates significant opportunities for violating users' privacy expectations. We show that there is an appreciable risk of unique user identification even under natural viewing conditions in virtual reality. Such identification would, for example, allow an app to connect a user's personal identity with their work identity without their consent. To mitigate these risks, we propose a framework that incorporates gatekeeping both via the design of the application programming interface and via software-implemented privacy mechanisms (a code sketch of this idea follows the abstract). Our results indicate
that these mechanisms can reduce the rate of identification from as much as 85%
to as low as 30%. The impact of introducing these mechanisms is less than 1.5° of error in gaze position for gaze prediction. Gaze data streams can
thus be made private while still allowing for gaze prediction, for example,
during foveated rendering. Our approach is the first to support
privacy-by-design in the flow of eye-tracking data within mixed reality use
cases.
Comment: 12 pages, 4 figures, to appear in IEEE TVCG Special Issue on IEEE VR 202
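The abstract does not spell out the privacy mechanisms, so the following is only a minimal, hypothetical sketch of the gatekeeping idea, assuming additive Gaussian noise as the software privacy mechanism and a simple streaming API. The names GatedGazeStream and noise_sigma_deg are illustrative, not the paper's actual interface.

import numpy as np

class GatedGazeStream:
    """Hypothetical gatekeeping wrapper: applications iterate over this
    stream and never see raw gaze samples. A sketch under the assumption
    that additive Gaussian noise is the privacy mechanism; the paper's
    actual mechanisms and API may differ."""

    def __init__(self, raw_stream, noise_sigma_deg=0.5):
        self._raw = raw_stream          # iterable of (x, y) gaze angles in degrees
        self._sigma = noise_sigma_deg   # noise magnitude, a hypothetical knob

    def __iter__(self):
        rng = np.random.default_rng()
        for x, y in self._raw:
            # Perturb each sample so fine-grained, identifying gaze patterns
            # are masked while coarse gaze position (e.g., for foveated
            # rendering) is preserved.
            yield (x + rng.normal(0.0, self._sigma),
                   y + rng.normal(0.0, self._sigma))

# Usage: an app is handed only the gated stream, never the raw samples.
raw_samples = [(1.20, -0.40), (1.30, -0.50), (1.25, -0.45)]
for sample in GatedGazeStream(raw_samples, noise_sigma_deg=0.5):
    print(sample)

The design point illustrated here is that the privacy mechanism sits on the API side of the boundary, so identifying structure is removed before any application code ever runs.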
The Security-Utility Trade-off for Iris Authentication and Eye Animation for Social Virtual Avatars
The gaze behavior of virtual avatars is critical to social presence and
perceived eye contact during social interactions in Virtual Reality. Virtual
Reality headsets are being designed with integrated eye tracking to enable
compelling virtual social interactions. This paper shows that the near-infrared cameras used in eye tracking capture eye images that contain the user's iris patterns. Because iris patterns are a gold-standard biometric, the current technology places the user's biometric identity at risk. Our first contribution is a hardware solution based on optical defocus that removes the iris biometric from the stream of eye-tracking images (a rough software simulation follows this abstract). We characterize the
performance of this solution with different internal parameters. Our second
contribution is a psychophysical experiment with a same-different task that
investigates the sensitivity of users to a virtual avatar's eye movements when
this solution is applied. By deriving detection thresholds, we identify a range of defocus parameters for which the change in eye movements would go unnoticed in a conversational setting. Our third contribution is a
perceptual study to determine the impact of defocus parameters on the perceived
eye contact, attentiveness, naturalness, and truthfulness of the avatar. Thus, if a user wishes to protect their iris biometric, our approach provides that protection while preventing their conversation partner from perceiving a difference in the user's virtual avatar. This work is
the first to develop secure eye-tracking configurations for VR/AR/XR applications and motivates future work in the area.
Comment: 11 pages, 10 figures, to appear in IEEE TVCG Special Issue on IEEE VR 202
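The defocus in the paper is optical, implemented in the camera hardware; the following is only a rough software stand-in, assuming a Gaussian blur as a proxy for the defocus kernel. The function simulate_defocus and the parameter blur_sigma_px are hypothetical, playing the role of the internal defocus parameters the paper characterizes: large enough to destroy fine iris texture, small enough that pupil position, and hence gaze, remains recoverable.

import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_defocus(eye_image, blur_sigma_px=4.0):
    # Software approximation of optical defocus (hypothetical; the paper's
    # solution defocuses the camera optics, not the captured image).
    # The blur removes the high-frequency iris texture that iris-recognition
    # systems depend on, while the pupil remains localizable.
    return gaussian_filter(eye_image.astype(float), sigma=blur_sigma_px)

# Toy example: a synthetic 64x64 "eye image" with high-frequency texture.
rng = np.random.default_rng(0)
eye = rng.random((64, 64))
blurred = simulate_defocus(eye, blur_sigma_px=4.0)
print(round(eye.std(), 3), round(blurred.std(), 3))  # texture contrast drops sharply

The trade-off the paper studies is visible even in this toy version: increasing blur_sigma_px suppresses more iris detail but also degrades the eye images from which gaze, and thus the avatar's eye animation, is derived.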