2,674 research outputs found

    Predicting sex as a soft-biometrics from device interaction swipe gestures

    Touch and multi-touch gestures are becoming the most common way to interact with technology such as smartphones, tablets and other mobile devices. The latest touch-screen input capabilities have tremendously increased the quantity and quality of available gesture data, which has led to the exploration of its use in multiple disciplines, from psychology to biometrics. Following research undertaken in similar modalities such as keystroke and mouse-usage biometrics, the present work proposes the use of swipe gesture data for the prediction of soft biometrics, specifically the user's sex. This paper details the software and protocol used for the data collection, the feature set extracted and the subsequent machine learning analysis. Within this analysis, the BestFirst feature selection technique and four classification algorithms (naïve Bayes, logistic regression, support vector machine and decision tree) have been tested. The results of this exploratory analysis confirm the possibility of sex prediction from swipe gesture data, obtaining an encouraging 78% accuracy rate using swipe gestures from two different directions. These results will hopefully encourage further research in this area, where the prediction of soft-biometric traits from swipe gesture data can play an important role in enhancing authentication processes on touch-screen devices.
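The feature-extraction step this abstract describes can be sketched in a few lines. The sample format, feature names and formulas below are illustrative assumptions for this sketch, not the paper's actual feature set:

```python
import math

def swipe_features(points):
    """Extract simple per-swipe features from a list of (x, y, t) samples.

    Illustrative features only (the paper's own set is richer):
    duration, path length, mean speed, and net direction angle.
    """
    duration = points[-1][2] - points[0][2]
    length = sum(
        math.hypot(points[i + 1][0] - points[i][0],
                   points[i + 1][1] - points[i][1])
        for i in range(len(points) - 1)
    )
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    return {
        "duration": duration,
        "path_length": length,
        "mean_speed": length / duration if duration else 0.0,
        "direction": math.atan2(dy, dx),
    }

# A right-upward swipe sampled three times over one second:
feats = swipe_features([(0, 0, 0.0), (3, 4, 0.5), (6, 8, 1.0)])
```

Per-swipe vectors like this would then be fed through feature selection and into the classifiers the paper compares.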

    GazeTouchPIN: Protecting Sensitive Data on Mobile Devices Using Secure Multimodal Authentication

    Although mobile devices provide access to a plethora of sensitive data, most users still only protect them with PINs or patterns, which are vulnerable to side-channel attacks (e.g., shoulder surfing). However, prior research has shown that privacy-aware users are willing to take further steps to protect their private data. We propose GazeTouchPIN, a novel secure authentication scheme for mobile devices that combines gaze and touch input. Our multimodal approach complicates shoulder-surfing attacks by requiring attackers to observe both the screen and the user's eyes to uncover the password. We evaluate the security and usability of GazeTouchPIN in two user studies (N=30). We found that while GazeTouchPIN requires longer entry times, privacy-aware users would use it on demand when feeling observed or when accessing sensitive data. The results show that the successful shoulder-surfing attack rate drops from 68% to 10.4% when using GazeTouchPIN.
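A toy illustration of why the multimodal split raises the bar for shoulder surfers: if each digit requires both a gaze direction and a touch, an observer watching only one channel recovers nothing. The 2×5 layout and function names below are invented for this sketch and are not the paper's actual interface:

```python
# Hypothetical keypad: gaze left selects the top row, gaze right the bottom.
LAYOUT = [[0, 1, 2, 3, 4],
          [5, 6, 7, 8, 9]]

def decode_digit(gaze, touch_col):
    """Combine a gaze direction ('left'/'right') with a touched column (0-4)
    into one PIN digit. Either channel alone leaves 2 or 5 candidates."""
    row = 0 if gaze == "left" else 1
    return LAYOUT[row][touch_col]

def decode_pin(events):
    """Decode a full PIN from a sequence of (gaze, touch_col) events."""
    return "".join(str(decode_digit(g, c)) for g, c in events)

pin = decode_pin([("left", 1), ("right", 2), ("left", 0), ("right", 4)])
```

An attacker who sees only the touches observes columns 1, 2, 0, 4 and must still guess one of 2^4 gaze combinations.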

    GTmoPass: Two-factor Authentication on Public Displays Using Gaze-touch Passwords and Personal Mobile Devices

    As public displays continue to deliver increasingly private and personalized content, there is a need to ensure that only legitimate users can access private information in sensitive contexts. While public displays can adopt authentication concepts similar to those used on public terminals (e.g., ATMs), authentication in public is subject to a number of risks. Namely, adversaries can uncover a user's password through (1) shoulder surfing, (2) thermal attacks, or (3) smudge attacks. To address this problem we propose GTmoPass, an authentication architecture that enables multi-factor user authentication on public displays. The first factor is a knowledge factor: we employ a shoulder-surfing-resilient multimodal scheme that combines gaze and touch input for password entry. The second factor is a possession factor: users utilize their personal mobile devices, on which they enter the password. Credentials are securely transmitted to a server via Bluetooth beacons. We describe the implementation of GTmoPass and report on an evaluation of its usability and security, which shows that although authentication using GTmoPass is slightly slower than traditional methods, it protects against the three aforementioned threats.
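The two-factor check described above might be sketched server-side as follows. The record layout, names, and SHA-256 hashing are assumptions made for illustration only; the paper specifies credential transport via Bluetooth beacons, not this exact verification routine:

```python
import hashlib
import hmac

# Illustrative enrollment store; a real deployment would use salted,
# slow password hashing and authenticated device registration.
REGISTERED = {
    "alice": {
        "pw_hash": hashlib.sha256(b"gaze-touch-secret").hexdigest(),
        "device_id": "phone-42",  # possession factor: her personal device
    }
}

def authenticate(user, password, device_id):
    """Accept only if BOTH factors check out: the gaze-touch password
    (knowledge) and the registered mobile device (possession)."""
    rec = REGISTERED.get(user)
    if rec is None:
        return False
    pw_ok = hmac.compare_digest(
        rec["pw_hash"],
        hashlib.sha256(password.encode()).hexdigest(),
    )
    return pw_ok and device_id == rec["device_id"]
```

Requiring both factors means a stolen password alone, or a stolen phone alone, is insufficient.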

    DragID: A Gesture Based Authentication System

    Department of Electrical Engineering
    The use of mobile computing devices with touch screens is becoming widespread, and sensitive personal information is often stored on them: smart-device users run applications with sensitive personal data, such as online banking. To protect this information, code-based screen-unlock methods have been used so far; however, these methods are vulnerable to shoulder surfing and smudge attacks. To build a secure unlocking method we propose DragID, a flexible gesture- and biometrics-based user authentication scheme. Based on a model of the human hand, DragID authenticates users using six input sources from the touch screen, from which we build 25 fine-grained features such as origin of the hand, finger radius, velocity, gravity, and perpendicularity. Because they derive from the modeling of the human hand, features such as radius or origin are difficult to imitate, which makes them useful for authentication. For classification we use a popular machine learning method, the support vector machine; this prevents attackers from succeeding even by reproducing the exact same drag patterns. In the experiments, we implemented DragID on a Samsung Galaxy Note 2, collected 147,379 drag samples from 17 volunteers, and conducted real-world evaluations. Our method outperforms Luca's method, achieving a true positive rate of 89.49% at a false positive rate of 0.36%; the true positive rate rises to 92.33% when we apply our sequence technique.
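The verification idea behind a system like DragID can be sketched with a deliberately simplified stand-in: DragID itself trains a support vector machine over 25 hand-model features, but a nearest-centroid threshold check over a small feature vector shows the same accept/reject structure. All names and numbers below are assumptions for illustration:

```python
import math

def centroid(samples):
    """Mean feature vector of the user's enrolled drag samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def verify(enrolled, attempt, threshold):
    """Accept if the attempt's feature vector lies within `threshold`
    (Euclidean distance) of the enrolled centroid.

    Stand-in only: DragID uses an SVM over 25 features (finger radius,
    velocity, origin of hand, ...); this check merely illustrates the
    enroll-then-verify flow.
    """
    c = centroid(enrolled)
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(attempt, c)))
    return dist <= threshold

# Hypothetical 2-D feature vectors (e.g., finger radius, mean velocity):
enrolled = [[1.0, 2.0], [3.0, 2.0]]           # centroid is [2.0, 2.0]
genuine = verify(enrolled, [2.0, 2.5], 1.0)   # close to the centroid
impostor = verify(enrolled, [10.0, 10.0], 1.0)
```

A trained classifier replaces the fixed threshold in practice, which is what lets DragID reject attackers who replay the exact drag trajectory but with the wrong hand geometry.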