
    Gesture passwords: concepts, methods and challenges

    Biometrics are a convenient alternative to traditional forms of access control such as passwords and pass-cards since they rely solely on user-specific traits. Unlike alphanumeric passwords, biometrics cannot be given or told to another person, and unlike pass-cards, they are always “on-hand.” Perhaps the most well-known biometrics with these properties are face, speech, iris, and gait. This dissertation proposes a new biometric modality: gestures. A gesture is a short body motion that contains static anatomical information and changing behavioral (dynamic) information. This work considers both full-body gestures, such as a large wave of the arms, and hand gestures, such as a subtle curl of the fingers and palm. For access control, a specific gesture can be selected as a “password” and used for identification and authentication of a user. If this particular motion were somehow compromised, a user could readily select a new motion as a “password,” effectively changing and renewing the behavioral aspect of the biometric. This thesis describes a novel framework for acquiring, representing, and evaluating gesture passwords for the purpose of general access control. The framework uses depth sensors, such as the Kinect, to record gesture information from which depth maps or pose features are estimated. First, various distance measures, such as the log-Euclidean distance between feature covariance matrices and distances based on feature sequence alignment via dynamic time warping, are used to compare two gestures and to train a classifier to either authenticate or identify a user. In authentication, this framework yields an equal error rate on the order of 1-2% for body and hand gestures in non-adversarial scenarios. Next, through a novel decomposition of gestures into posture, build, and dynamic components, the relative importance of each component is studied. The dynamic portion of a gesture is shown to have the largest impact on biometric performance, with its removal causing a significant increase in error. In addition, the effects of two types of threats are investigated: one due to self-induced degradations (personal effects and the passage of time) and the other due to spoof attacks. For body gestures, both spoof attacks (with only the dynamic component) and self-induced degradations increase the equal error rate, as expected. Further, the benefits of adding additional sensor viewpoints to this modality are empirically evaluated. Finally, a novel framework that leverages deep convolutional neural networks to learn a user-specific “style” representation from a set of known gestures is proposed and compared to a similar representation for gesture recognition. This deep convolutional neural network yields significantly improved performance over prior methods. A byproduct of this work is the creation and release of multiple publicly available, user-centric (as opposed to gesture-centric) datasets based on both body and hand gestures.
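    The distance measures named above lend themselves to a compact illustration. The sketch below is a minimal example assuming each gesture is a T-by-d array of per-frame pose features; it shows a log-Euclidean distance between feature covariance matrices and a basic dynamic-time-warping alignment cost, and is illustrative only, not the dissertation's actual code.

```python
# A minimal sketch of the two distance measures named above, assuming each
# gesture is a T-by-d NumPy array of per-frame pose features. Function names
# and the regularization constant are illustrative, not the dissertation's code.
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(gesture_a, gesture_b, eps=1e-6):
    """Frobenius distance between matrix logarithms of feature covariances."""
    cov_a = np.cov(gesture_a, rowvar=False) + eps * np.eye(gesture_a.shape[1])
    cov_b = np.cov(gesture_b, rowvar=False) + eps * np.eye(gesture_b.shape[1])
    return np.linalg.norm(logm(cov_a) - logm(cov_b), ord="fro")

def dtw_distance(gesture_a, gesture_b):
    """Dynamic-time-warping alignment cost between two feature sequences."""
    n, m = len(gesture_a), len(gesture_b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(gesture_a[i - 1] - gesture_b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]
```

    Either distance can feed a threshold-based verifier or, as the abstract describes, be used to train a classifier that authenticates or identifies the user.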

    DragID: A Gesture Based Authentication System

    With touch-screen mobile computing devices now widespread, sensitive personal information is often stored on them, and users run applications that handle sensitive data such as online banking. To protect this information, code-based screen-unlock methods have been used so far; however, these methods are vulnerable to shoulder-surfing and smudge attacks. To build a more secure unlocking method, we propose DragID, a flexible gesture- and biometric-based user authentication scheme. Based on a model of the human hand, DragID authenticates users using six input sources from the touch screen. From these input sources, we build 25 fine-grained features such as hand origin, finger radius, velocity, gravity, and perpendicular components. Because our method models the human hand, features such as finger radius or hand origin are difficult to imitate, which makes them useful for authentication. For authentication we use a popular machine learning method, the support vector machine, which prevents attackers from reproducing the exact same drag patterns. In our experiments, we implemented DragID on a Samsung Galaxy Note 2, collected 147,379 drag samples from 17 volunteers, and conducted real-world evaluations. Our method outperforms Luca's method and achieves a true positive rate of 89.49% with a false positive rate of 0.36%. In addition, we achieve a TPR of 92.33% when the sequence technique is applied.
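    As a rough illustration of the classification step described above, the sketch below trains a support vector machine to separate one user's drag samples from everyone else's. The synthetic 25-dimensional vectors stand in for DragID's touch features; the kernel, data split, and score threshold are assumptions rather than the paper's reported configuration.

```python
# Illustrative sketch of SVM-based drag authentication in the spirit of DragID.
# The synthetic 25-dimensional vectors stand in for the paper's touch features;
# the RBF kernel, split, and 0.5 threshold are assumptions, not reported settings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 25))      # one feature vector per drag gesture
y = rng.integers(0, 2, size=1000)    # 1 = genuine user, 0 = impostor samples

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X_train, y_train)

accept = model.predict_proba(X_test)[:, 1] > 0.5   # accept above a score threshold
tpr = accept[y_test == 1].mean()   # fraction of genuine attempts accepted
fpr = accept[y_test == 0].mean()   # fraction of impostor attempts accepted
```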

    Keystroke and Touch-dynamics Based Authentication for Desktop and Mobile Devices

    The most commonly used authentication system on desktop computers is a simple username and password approach, which assumes that only genuine users know their own credentials. Once broken, the system will accept every authentication trial using compromised credentials until the breach is detected. Mobile devices, such as smartphones and tablets, have seen an explosive increase in use for personal computing and internet browsing. While the primary mode of interaction on such devices is through the touch screen via gestures, their authentication procedures have been inherited from keyboard-based computers, e.g. a Personal Identification Number or a gesture-based password. This work contributes to advancing two types of behavioral biometrics applicable to desktop and mobile computers: keystroke dynamics and touch dynamics. Keystroke dynamics relies upon the manner of typing rather than what is typed to authenticate users. Similarly, continual touch-based authentication that actively authenticates the user is a more natural alternative for mobile devices. Within the keystroke dynamics domain, habituation refers to the evolution of a user's typing pattern over time. This work details the significant impact of habituation on user behavior: it offers empirical evidence of the impact on authentication systems attempting to identify a genuine user affected by habituation, and of the effect of habituation on similarities between users and impostors. It also proposes a novel, effective feature for the keystroke dynamics domain called event sequences, and shows empirically that, unlike features from the traditional keystroke dynamics literature, event sequences are independent of typing speed. This provides a unique advantage in distinguishing between users when typing complex text. With respect to touch dynamics, an immense variety of mobile devices is available to consumers, differing in size, aspect ratio, operating system, and hardware and software specifications, to name a few. An effective touch-based authentication system must be able to work with one user model across a spectrum of devices and user postures. This work uses a locally collected dataset to provide empirical evidence of the significant effect of posture, device size, and manufacturer on user authentication performance. Based on the results of this strand of research, we suggest strategies to improve the performance of continual touch-based authentication systems.
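    For readers unfamiliar with keystroke dynamics, the sketch below extracts the conventional timing features (dwell and flight times) from a stream of key events; these are the speed-dependent features that the proposed event-sequence feature is contrasted with. The event format and field names are assumptions for illustration, not the thesis's implementation.

```python
# Sketch of the conventional keystroke-dynamics timing features (dwell and
# flight times) extracted from key-down/key-up events. The event format is an
# assumption; the thesis's "event sequences" feature is not reproduced here.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class KeyEvent:
    key: str
    action: str       # "down" or "up"
    timestamp: float  # seconds

def timing_features(events: List[KeyEvent]) -> Tuple[List[float], List[float]]:
    downs, dwell, flight = {}, [], []
    last_up = None
    for ev in events:
        if ev.action == "down":
            downs[ev.key] = ev.timestamp
            if last_up is not None:
                flight.append(ev.timestamp - last_up)  # previous key-up to this key-down
        elif ev.key in downs:
            dwell.append(ev.timestamp - downs.pop(ev.key))  # how long the key was held
            last_up = ev.timestamp
    return dwell, flight
```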

    Soft biometrics through hand gestures driven by visual stimuli

    We present a novel biometric solution that exploits hand gestures, tracked by the Microsoft Kinect sensor, performed in response to a circle randomly appearing in five predefined screen positions. Features of both the hand and the screen pointer are used for classification purposes, considering both the whole 20-path trajectory and shorter routes. In particular, we search for the "optimal" trajectory length which assures a good trade-off between precision and user effort. For identification, the approach achieves classification accuracies ranging from 0.748 to 0.942. For verification, accuracy is still satisfactory (always higher than 0.962), despite moderate specificity values. Keywords: Soft biometrics, Gestures, Visual stimuli
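    The trajectory-length trade-off described above can be explored with a simple sweep: truncate each pointer trajectory to its first k samples, extract summary features, and estimate identification accuracy by cross-validation. The sketch below is a hypothetical illustration; the feature set and random-forest classifier are assumptions, not the method used in the paper.

```python
# Sketch of a trajectory-length sweep: truncate each pointer trajectory to its
# first k samples, extract simple summary features, and score identification
# accuracy by cross-validation. Features and classifier are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def summary_features(trajectory: np.ndarray) -> np.ndarray:
    """trajectory: (k, 2) array of pointer positions over time."""
    speeds = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    return np.concatenate([trajectory.mean(axis=0), trajectory.std(axis=0),
                           [speeds.mean(), speeds.std()]])

def accuracy_for_length(trajectories, labels, k):
    X = np.stack([summary_features(t[:k]) for t in trajectories])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, X, labels, cv=5).mean()  # mean identification accuracy
```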

    Empirical techniques and algorithms to develop a resilient non-supervised touch-based authentication system

    Touch dynamics (or touch-based authentication) refers to a behavioral biometric for touchscreen devices wherein a user is authenticated based on his/her executed touch gestures. This work addresses two research topics. We first present a series of empirical techniques to detect habituation in the user's touch profile, quantify its detrimental effect on authentication accuracy, and devise strategies to overcome these effects. Habituation here refers to changes in the user's profile, and/or noise within it, due to the user's familiarization with the device and software application. We show that habituation causes the user's touch profile to evolve significantly and irrevocably over time, even after the user is familiar with the device and software application, and that this phenomenon considerably degrades classifier accuracy. We demonstrate techniques that lower the error rate to 3.68% and set the benchmark in this field for a realistic test setup. Finally, we quantify the benefits of vote-based reclassification of predicted class labels and show that this technique is vital for achieving high accuracy in realistic touch-based authentication systems. In the second half, we implement the first non-supervised classification algorithm in touch-based continual authentication. This scheme incorporates clustering into the traditional supervised algorithm. We reduce the misclassification rate by fusing the supervised random forest algorithm with non-supervised clustering (either Bayesian learning or a simple rule of combination). Fusing with Bayesian clustering reduced the misclassification rate by 50%, while fusing with the simple rule of combination reduced it by as much as 59.5%, averaged over all users. Master of Science, Computer Science & Information Systems, University of Michigan-Flint. http://deepblue.lib.umich.edu/bitstream/2027.42/134750/1/Palaskar2016.pdf
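    Vote-based reclassification, as referred to above, can be illustrated with a simple majority vote over a sliding window of predicted class labels. The sketch below is a generic version under an assumed window size; the original work's exact voting scheme may differ.

```python
# Generic sketch of vote-based reclassification: smooth the classifier's per-swipe
# labels with a majority vote over a sliding window. Window size is an assumption;
# the thesis's exact voting scheme may differ.
from collections import Counter, deque
from typing import Iterable, List

def vote_reclassify(predictions: Iterable[int], window: int = 7) -> List[int]:
    recent, smoothed = deque(maxlen=window), []
    for label in predictions:
        recent.append(label)
        smoothed.append(Counter(recent).most_common(1)[0][0])  # majority label so far
    return smoothed

# Isolated misclassifications are voted away once the window fills with genuine labels.
print(vote_reclassify([1, 1, 0, 1, 1, 1, 0, 1], window=5))
```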

    Exploration of Machine Learning Classification Models Used for Behavioral Biometrics Authentication

    Mobile devices have been manufactured and enhanced at growing rates over the past decades. While this growth has significantly advanced the capability of these devices, their security has been falling behind. This contrast between the capability and the security of mobile devices is a significant problem that puts the public's sensitive information at risk. Continuing previous work in this field, this study identifies key machine learning algorithms currently being used for behavioral biometric mobile authentication schemes and aims to provide a comprehensive review of these algorithms when used with touch dynamics and phone movement. Throughout this paper, the benefits, limitations, and recommendations for future work are discussed.

    CGAMES'2009


    Touch-screen Behavioural Biometrics on Mobile Devices

    Robust user verification on mobile devices is one of the top priorities globally from a financial security and privacy viewpoint and has led to biometric verification complementing or replacing PIN and password methods. Research has shown that behavioural biometric methods, with their promise of improved security due to their inimitable nature and the lure of unintrusive, implicit, continuous verification, could define the future of privacy and cyber security in an increasingly mobile world. Considering the real-life nature of problems relating to mobility, this study aims to determine the impact of user-interaction factors that affect verification performance and usability for behavioural biometric modalities on mobile devices. Building on existing work on biometric performance assessment, it asks: to what extent does biometric performance remain stable when faced with movement, changes of environment, the passage of time, and other device-related factors influencing the usage of mobile devices in real-life applications? Further, it seeks to answer: what could further improve the performance of behavioural biometric modalities? Based on a review of the literature, a series of experiments was executed to collect a dataset of touch-dynamics-based behavioural data mirroring various real-life usage scenarios of a mobile device. Responses were analysed using various uni-modal and multi-modal frameworks. The analysis demonstrated that existing verification methods using the touch modalities of swipes, signatures and keystroke dynamics adapt poorly when faced with a variety of usage scenarios and have challenges related to time persistence. The results indicate that a multi-modal solution has a positive impact on verification performance. On this basis, it is recommended to explore alternatives in the form of dynamic, variable thresholds and smarter template-selection strategies, which hold promise. We believe that the evaluation results presented in this thesis will streamline the development of future solutions for improving the security of behavioural-based modalities on mobile biometrics.
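    The multi-modal direction recommended above typically amounts to score-level fusion with a tunable decision threshold. The sketch below shows one generic form: a weighted combination of per-modality match scores compared against a threshold that could be made dynamic per user or context. The modality names, weights, and threshold are placeholders, not values from this thesis.

```python
# Generic score-level fusion sketch: weighted combination of per-modality match
# scores against a threshold that could be adapted per user or context. Modality
# names, weights, and the threshold are placeholders, not values from this study.
def fused_decision(scores: dict, weights: dict, threshold: float) -> bool:
    """scores and weights are keyed by modality; higher score means a better match."""
    total = sum(weights[m] for m in scores)
    fused = sum(weights[m] * s for m, s in scores.items()) / total
    return fused >= threshold

accept = fused_decision(
    scores={"swipe": 0.71, "signature": 0.64, "keystroke": 0.58},
    weights={"swipe": 0.5, "signature": 0.3, "keystroke": 0.2},
    threshold=0.6,
)
```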