
    Knowledge-driven Biometric Authentication in Virtual Reality

    With the increasing adoption of virtual reality (VR) in public spaces, protecting users from observation attacks is becoming essential to prevent attackers from accessing context-sensitive data or performing malicious payment transactions in VR. In this work, we propose RubikBiom, a knowledge-driven behavioural biometric scheme for authentication in VR. We show that hand movement patterns performed while interacting with a knowledge-based authentication scheme (e.g., when entering a PIN) can be leveraged to establish an additional security layer. Based on a dataset gathered in a lab study with 23 participants, we show that knowledge-driven behavioural biometric authentication increases security in an unobtrusive way. We achieve an accuracy of up to 98.91% by applying a Fully Convolutional Network (FCN) to 32 authentications per subject. Our results pave the way for further investigations into knowledge-driven behavioural biometric authentication in VR.
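Before a network like the FCN above can classify hand movements, variable-duration PIN-entry recordings must be turned into fixed-size inputs. The sketch below is illustrative only (function names, the 128-sample target length, and the synthetic trajectory are assumptions, not details from the paper): it resamples a hand trajectory to a fixed length and normalizes away absolute position and amplitude.

```python
import numpy as np

def resample_trajectory(xyz, target_len=128):
    """Linearly resample a (T, 3) hand trajectory to a fixed length,
    so variable-duration PIN entries become equal-size network inputs.
    target_len=128 is an illustrative choice, not from the paper."""
    t_src = np.linspace(0.0, 1.0, len(xyz))
    t_dst = np.linspace(0.0, 1.0, target_len)
    return np.stack([np.interp(t_dst, t_src, xyz[:, d]) for d in range(3)], axis=1)

def normalize(xyz):
    """Center on the trajectory mean and scale to unit max extent,
    removing absolute position and coarse amplitude differences."""
    centered = xyz - xyz.mean(axis=0)
    scale = float(np.abs(centered).max()) or 1.0
    return centered / scale

# Example: a synthetic 300-sample recording becomes a 128x3 input.
raw = np.cumsum(np.random.default_rng(0).normal(size=(300, 3)), axis=0)
x = normalize(resample_trajectory(raw))
print(x.shape)  # (128, 3)
```

Per-sample preprocessing like this is what lets a single convolutional architecture compare authentications of different durations.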

    A systematic literature review on Virtual Reality and Augmented Reality in terms of privacy, authorization and data-leaks

    In recent years, VR and AR have exploded into a multimillion-dollar market. As this emerging technology spreads to a variety of businesses and its user base grows rapidly, it is critical to address the potential privacy and security concerns that these technologies might pose. In this study, we discuss the current status of privacy and security in VR and AR and analyse possible problems and risks. In addition, we look in detail at several of the major concerns and related security solutions for AR and VR. Finally, as VR and AR authentication is the most thoroughly studied aspect of the problem, we concentrate on the research that has already been done in this area.
    Comment: 9 pages, 4 figures

    Who Is Alyx? A new Behavioral Biometric Dataset for User Identification in XR

    This article presents a new dataset containing motion and physiological data of users playing the game "Half-Life: Alyx". The dataset specifically targets behavioral and biometric identification of XR users. It includes motion and eye-tracking data captured by an HTC Vive Pro from 71 users playing the game on two separate days for 45 minutes each. Additionally, we collected physiological data from 31 of these users. We provide benchmark performances for the task of motion-based identification of XR users with two prominent state-of-the-art deep learning architectures (GRU and CNN). After training on the first session of each user, the best model can identify the 71 users in the second session with a mean accuracy of 95% within 2 minutes. The dataset is freely available at https://github.com/cschell/who-is-aly
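Identifying a user "within 2 minutes" typically means aggregating many short per-window classifier decisions into one session-level identity. The paper does not publish its aggregation rule here; the following is a minimal sketch of one common approach, majority voting, with hypothetical user labels.

```python
from collections import Counter

def identify(window_predictions):
    """Aggregate per-window user predictions into a single session-level
    identity by majority vote; accumulating more windows over the
    session makes the decision increasingly stable."""
    counts = Counter(window_predictions)
    user, _ = counts.most_common(1)[0]
    return user

# Hypothetical per-window classifier outputs over ~2 minutes of motion:
preds = ["u17", "u17", "u03", "u17", "u42", "u17"]
print(identify(preds))  # u17
```

A single misclassified window is outvoted, which is why accuracy reported over a time budget can exceed per-window accuracy.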

    Comparison of Data Representations and Machine Learning Architectures for User Identification on Arbitrary Motion Sequences

    Reliable and robust user identification and authentication are important and often necessary requirements for many digital services. They become paramount in social virtual reality (VR) to ensure trust, specifically in digital encounters with lifelike, realistic-looking avatars as faithful replications of real persons. Recent research has shown that the movements of users in extended reality (XR) systems carry user-specific information and can thus be used to verify their identities. This article compares three different potential encodings of the motion data from head and hands (scene-relative, body-relative, and body-relative velocities) and the performances of five different machine learning architectures (random forest, multi-layer perceptron, fully recurrent neural network, long short-term memory, gated recurrent unit). We use the publicly available dataset "Talking with Hands" and publish all code to allow reproducibility and to provide baselines for future work. After hyperparameter optimization, the combination of a long short-term memory architecture and body-relative data outperformed competing combinations: the model correctly identifies any of the 34 subjects with an accuracy of 100% within 150 seconds. Altogether, our approach provides an effective foundation for behaviometric-based identification and authentication to guide researchers and practitioners. Data and code are published under https://go.uniwue.de/58w1r.
    Comment: in press at IEEE VRAI 202
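The three motion encodings compared above can be sketched concretely. This is an illustrative simplification under stated assumptions (a 90 Hz frame rate, position-only channels, and no rotation into the head's yaw frame, which a full body-relative encoding would include):

```python
import numpy as np

def scene_relative(head, hand):
    # Raw tracker coordinates in the world frame, concatenated per frame.
    return np.concatenate([head, hand], axis=1)

def body_relative(head, hand):
    # Express the hand relative to the head, discarding where in the
    # room the user stands (positions only in this sketch).
    return hand - head

def body_relative_velocity(head, hand, dt=1 / 90):
    # Frame-to-frame differences of the body-relative signal,
    # assuming a 90 Hz tracking rate.
    rel = body_relative(head, hand)
    return np.diff(rel, axis=0) / dt

rng = np.random.default_rng(1)
head, hand = rng.normal(size=(90, 3)), rng.normal(size=(90, 3))
print(scene_relative(head, hand).shape)         # (90, 6)
print(body_relative(head, hand).shape)          # (90, 3)
print(body_relative_velocity(head, hand).shape) # (89, 3)
```

The intuition behind the finding: body-relative signals remove room layout and starting position, leaving the user-specific coordination between head and hands that an LSTM can model.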

    Emerging ExG-based NUI Inputs in Extended Realities: A Bottom-up Survey

    Incremental and quantitative improvements of two-way interactions with extended realities (XR) are contributing toward a qualitative leap into a state of XR ecosystems being efficient, user-friendly, and widely adopted. However, there are multiple barriers on the way toward the omnipresence of XR, among them: computational and power limitations of portable hardware, social acceptance of novel interaction protocols, and usability and efficiency of interfaces. In this article, we overview and analyse novel natural user interfaces based on sensing electrical bio-signals that can be leveraged to tackle the challenges of XR input interactions. Electroencephalography-based brain-machine interfaces that enable thought-only hands-free interaction, myoelectric input methods that track body gestures employing electromyography, and gaze-tracking electrooculography input interfaces are examples of electrical bio-signal sensing technologies united under the collective concept of ExG. ExG signal acquisition modalities provide a way to interact with computing systems using natural, intuitive actions, enriching interactions with XR. This survey provides a bottom-up overview starting from (i) underlying biological aspects and signal acquisition techniques, (ii) ExG hardware solutions, (iii) ExG-enabled applications, (iv) discussion on social acceptance of such applications and technologies, as well as (v) research challenges, application directions, and open problems; evidencing the benefits that ExG-based Natural User Interface inputs can introduce to the area of XR.
    Peer reviewed
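To make the myoelectric-input idea concrete, a classic first processing step for EMG signals is rectify-and-smooth to obtain an activation envelope, which can then be thresholded into a binary gesture input. This is a generic textbook sketch, not from the survey; the window length, threshold, and synthetic signal are all illustrative assumptions.

```python
import numpy as np

def emg_envelope(signal, win=50):
    """Rectify the raw EMG and smooth it with a moving average to
    obtain an activation envelope (win=50 samples is arbitrary here)."""
    rectified = np.abs(signal)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def gesture_active(envelope, threshold=0.5):
    """Binary on/off gesture signal from the envelope; the threshold
    would be calibrated per user and electrode placement in practice."""
    return envelope > threshold

# Synthetic signal: low-amplitude noise with one muscle burst at t=100..199.
rng = np.random.default_rng(2)
t = np.arange(300)
sig = np.where((t >= 100) & (t < 200),
               2.0 * np.sin(0.5 * t),
               0.05 * rng.normal(size=300))
env = emg_envelope(sig)
print(gesture_active(env)[150], gesture_active(env)[20])  # True False
```

Real pipelines add band-pass filtering and per-channel calibration before this stage, but the rectify-smooth-threshold chain is the core of turning a raw bio-signal into an input event.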