2,397 research outputs found
Towards Secure and Usable Authentication for Augmented and Virtual Reality Head-Mounted Displays
Immersive technologies, including augmented and virtual reality (AR & VR) devices, have enhanced digital communication along with a considerable increase in digital threats. Thus, authentication becomes critical in AR & VR technology, particularly in shared spaces. In this paper, we propose applying the ZeTA protocol, which allows secure authentication even in shared spaces, to the AR & VR context. We explain how it can be used with the available interaction methods provided by Head-Mounted Displays. In future work, our research goal is to evaluate different designs of ZeTA (e.g., interaction modes) concerning their usability and users' risk perception regarding their security, while using a cross-cultural approach.
Biomove: Biometric user identification from human kinesiological movements for virtual reality systems
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. Virtual reality (VR) has advanced rapidly and is used for many entertainment and business purposes. The need for secure, transparent and non-intrusive identification mechanisms is important to facilitate users’ safe participation and secure experience. People are kinesiologically unique, having individual behavioral and movement characteristics, which can be leveraged and used in security-sensitive VR applications to compensate for users’ inability to detect potential observational attackers in the physical world. Additionally, such a method of identification using a user’s kinesiological data is valuable in common scenarios where multiple users simultaneously participate in a VR environment. In this paper, we present a user study (n = 15) where our participants performed a series of controlled tasks that require physical movements (such as grabbing, rotating and dropping) that could be decomposed into unique kinesiological patterns while we monitored and captured their hand, head and eye gaze data within the VR environment. We present an analysis of the data and show that these data can be used as a biometric discriminant of high confidence using machine learning classification methods such as kNN or SVM, thereby adding a layer of security in terms of identification or dynamically adapting the VR environment to the users’ preferences. We also performed white-box penetration testing with 12 attackers, some of whom were physically similar to the participants. We could obtain an average identification confidence value of 0.98 from the actual participants’ test data after the initial study and also a trained model classification accuracy of 98.6%. Penetration testing indicated that all attackers resulted in confidence values of less than 50%, although physically similar attackers had higher confidence values. These findings can help the design and development of secure VR systems.
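The kNN classification the abstract mentions can be sketched in a few lines. The sketch below is a minimal, stdlib-only illustration, not the paper's pipeline: the user names and the three-dimensional feature vectors (imagined as, e.g., mean hand speed, head-yaw range, gaze dispersion) are invented for illustration, whereas the study derives features from actual hand, head and eye-gaze recordings.

```python
import math
from collections import Counter

def knn_identify(enrolled, sample, k=3):
    """Identify a user by a k-nearest-neighbour vote over enrolled feature vectors.

    enrolled: list of (user_id, feature_vector) pairs from earlier sessions;
    sample:   feature vector of the session to identify.
    """
    dists = sorted((math.dist(vec, sample), uid) for uid, vec in enrolled)
    votes = Counter(uid for _, uid in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical per-session motion features (values invented for this sketch).
enrolled = [
    ("alice", (0.90, 0.20, 0.40)), ("alice", (1.00, 0.25, 0.35)),
    ("alice", (0.95, 0.22, 0.38)),
    ("bob",   (0.30, 0.80, 0.10)), ("bob",   (0.35, 0.75, 0.15)),
    ("bob",   (0.32, 0.82, 0.12)),
]
print(knn_identify(enrolled, (0.97, 0.21, 0.39)))  # → alice
```

An SVM, as also named in the abstract, would replace the distance vote with a learned decision boundary but consume the same feature vectors.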
Who Is Alyx? A new Behavioral Biometric Dataset for User Identification in XR
This article presents a new dataset containing motion and physiological data
of users playing the game "Half-Life: Alyx". The dataset specifically targets
behavioral and biometric identification of XR users. It includes motion and
eye-tracking data captured by an HTC Vive Pro of 71 users playing the game on
two separate days for 45 minutes. Additionally, we collected physiological data
from 31 of these users. We provide benchmark performances for the task of
motion-based identification of XR users with two prominent state-of-the-art
deep learning architectures (GRU and CNN). After training on the first session
of each user, the best model can identify the 71 users in the second session
with a mean accuracy of 95% within 2 minutes. The dataset is freely available
at https://github.com/cschell/who-is-aly
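A GRU or CNN of the kind benchmarked here typically classifies short motion windows, and the per-window predictions are then aggregated over the evaluation period (here, 2 minutes) into one session-level identity. A minimal sketch of that aggregation step, assuming the window predictions already exist (the model itself is out of scope for a few lines):

```python
from collections import Counter

def identify_session(window_predictions):
    """Aggregate per-window identity predictions (e.g. from a GRU or CNN
    classifier over motion windows) into one session-level decision by
    majority vote, returning the winning user id and its vote share."""
    votes = Counter(window_predictions)
    user, count = votes.most_common(1)[0]
    return user, count / len(window_predictions)

# Hypothetical per-window outputs for one 2-minute session.
preds = ["u07", "u07", "u42", "u07", "u07", "u07"]
print(identify_session(preds))  # → ('u07', 0.8333333333333334)
```

The vote share doubles as a crude confidence score: sessions where no user dominates the windows can be rejected instead of misidentified.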
Active User Authentication for Smartphones: A Challenge Data Set and Benchmark Results
In this paper, automated user verification techniques for smartphones are
investigated. A unique non-commercial dataset, the University of Maryland
Active Authentication Dataset 02 (UMDAA-02) for multi-modal user authentication
research is introduced. This paper focuses on three sensors - front camera,
touch sensor and location service while providing a general description for
other modalities. Benchmark results for face detection, face verification,
touch-based user identification and location-based next-place prediction are
presented, which indicate that more robust methods fine-tuned to the mobile
platform are needed to achieve satisfactory verification accuracy. The dataset
will be made available to the research community for promoting additional
research.
Comment: 8 pages, 12 figures, 6 tables. Best poster award at BTAS 201
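Of the benchmarked tasks, location-based next-place prediction has a particularly compact classical baseline: a first-order Markov model over the user's visit history. The sketch below is a generic illustration of that baseline, not the paper's benchmark method; the place names are invented.

```python
from collections import Counter, defaultdict

def train_next_place(visits):
    """First-order Markov model: count transitions between consecutive places."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(visits, visits[1:]):
        transitions[prev][nxt] += 1
    return transitions

def predict_next(transitions, current):
    """Predict the most frequently observed successor of the current place."""
    if current not in transitions:
        return None  # unseen place: no prediction possible
    return transitions[current].most_common(1)[0][0]

# Hypothetical visit sequence from a location service log.
visits = ["home", "work", "gym", "home", "work", "home", "work", "gym"]
model = train_next_place(visits)
print(predict_next(model, "work"))  # → gym
```

Stronger predictors condition on more context (time of day, longer histories), but this counting baseline is the usual reference point such benchmarks are measured against.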
A Secure Authentication Framework to Guarantee the Traceability of Avatars in Metaverse
Metaverse is a vast virtual environment parallel to the physical world in
which users enjoy a variety of services acting as an avatar. To build a secure
living habitat, it is vital to ensure virtual-physical traceability, i.e., the
ability to track a malicious player in the physical world via his avatars in virtual
space. In this paper, we propose a two-factor authentication framework based on
chameleon signature and biometric-based authentication. First, aiming at
disguise in virtual space, we propose a chameleon collision signature algorithm
to achieve the verifiability of the avatar's virtual identity. Second, to counter
impersonation in the physical world, we construct an avatar's identity model
based on the player's biometric template and the chameleon key to realize the
verifiability of the avatar's physical identity. Finally, we design two
decentralized authentication protocols based on the avatar's identity model to
ensure the consistency of the avatar's virtual and physical identities.
Security analysis indicates that the proposed authentication framework
guarantees the consistency and traceability of the avatar's identity. Simulation
experiments show that the framework not only completes the decentralized
authentication between avatars but also achieves virtual-physical tracking.
Comment: 12 pages, 9 figure
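The key primitive here, a chameleon signature, rests on a chameleon hash: anyone can evaluate it, but only the trapdoor holder can produce collisions. A toy discrete-log instantiation, with deliberately tiny parameters that are insecure and chosen only to make the arithmetic visible (the paper's actual algorithm and parameters are not reproduced here):

```python
# Toy discrete-log chameleon hash. p = 2q + 1 with g generating the
# order-q subgroup; x is the trapdoor, y = g^x the public key.
p, q, g = 23, 11, 4
x = 7                 # trapdoor, held by the identity owner
y = pow(g, x, p)      # public key

def chameleon_hash(m, r):
    """CH(m, r) = g^m * y^r mod p; computable from public values alone."""
    return (pow(g, m, p) * pow(y, r, p)) % p

def collide(m, r, m_new):
    """With the trapdoor x, find r_new so CH(m_new, r_new) == CH(m, r):
    solve m + x*r = m_new + x*r_new (mod q)."""
    return ((m + x * r - m_new) * pow(x, -1, q)) % q

m, r = 5, 3
r_new = collide(m, r, 9)
assert chameleon_hash(9, r_new) == chameleon_hash(m, r)
```

This trapdoor-only collision ability is what lets an authority bind an avatar's virtual identity to its owner: forged collisions by anyone else are infeasible, while the trapdoor holder can always demonstrate control.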
Extensible Motion-based Identification of XR Users with Non-Specific Motion
Recently emerged solutions demonstrate that the movements of users
interacting with extended reality (XR) applications carry identifying
information and can be leveraged for identification. While such solutions can
identify XR users within a few seconds, current systems face a trade-off: either
they apply simple distance-based approaches that can only be used for specific,
predetermined motions, or they use classification-based approaches that rely on
more powerful machine learning models and thus also work for arbitrary motions,
but require full retraining to enroll new users, which can be prohibitively
expensive. In this paper, we propose to
combine the strengths of both approaches by using an embedding-based approach
that leverages deep metric learning. We train the model on a dataset of users
playing the VR game "Half-Life: Alyx" and conduct multiple experiments and
analyses. The results show that the embedding-based method 1) is able to
identify new users from non-specific movements using only a few minutes of
reference data, 2) can enroll new users within seconds, while retraining a
comparable classification-based approach takes almost a day, 3) is more
reliable than a baseline classification-based approach when only little
reference data is available, 4) can be used to identify new users from another
dataset recorded with different VR devices. Altogether, our solution is a
foundation for easily extensible XR user identification systems, applicable
even to non-specific movements. It also paves the way for production-ready
models that could be used by XR practitioners without the requirements of
expertise, hardware, or data for training deep learning models.
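The enrollment-without-retraining property of such embedding-based systems can be sketched without the deep model itself: once a trained network maps motion windows to embedding vectors, enrolling a user means storing a centroid of their reference embeddings, and identification is a nearest-centroid lookup. The sketch below assumes the embeddings already exist; the vectors and names are invented.

```python
import math

def centroid(vectors):
    """Mean embedding of a user's reference windows."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def cosine(a, b):
    """Cosine similarity, the usual metric in deep metric learning."""
    dot = sum(u * v for u, v in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def enroll(gallery, user_id, embeddings):
    """Enrolling a new user is just storing a centroid: no retraining."""
    gallery[user_id] = centroid(embeddings)

def identify(gallery, embedding):
    """Return the enrolled user whose centroid is nearest in embedding space."""
    return max(gallery, key=lambda uid: cosine(gallery[uid], embedding))

# Hypothetical 2-D embeddings produced by a trained motion encoder.
gallery = {}
enroll(gallery, "alice", [[0.9, 0.1], [1.0, 0.2]])
enroll(gallery, "bob",   [[0.1, 0.9], [0.2, 1.0]])
print(identify(gallery, [0.8, 0.3]))  # → alice
```

Because the deep model is frozen, adding a user touches only the gallery dictionary, which is what makes seconds-scale enrollment possible where classifier retraining takes hours.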