
    Keystroke and Touch-dynamics Based Authentication for Desktop and Mobile Devices

    The most commonly used authentication system on desktop computers is a simple username and password scheme, which assumes that only genuine users know their own credentials. Once broken, the system will accept every authentication attempt made with compromised credentials until the breach is detected. Mobile devices, such as smartphones and tablets, have seen explosive growth in personal computing and internet browsing. While the primary mode of interaction on such devices is the touchscreen via gestures, their authentication procedures have been inherited from keyboard-based computers, e.g. a Personal Identification Number or a gesture-based password.

    This work contributes to two types of behavioral biometrics applicable to desktop and mobile computers: keystroke dynamics and touch dynamics. Keystroke dynamics authenticates users by the manner of typing rather than what is typed. Similarly, continual touch-based authentication, which actively authenticates the user, is a more natural alternative for mobile devices.

    Within the keystroke dynamics domain, habituation refers to the evolution of a user's typing pattern over time. This work details the significant impact of habituation on user behavior. It offers empirical evidence of habituation's impact on authentication systems attempting to identify a genuine user, and of its effect on the similarity between users and impostors. It also proposes a novel, effective feature for keystroke dynamics called event sequences. We show empirically that, unlike features from the traditional keystroke dynamics literature, event sequences are independent of typing speed. This provides a unique advantage in distinguishing between users typing complex text.

    With respect to touch dynamics, an immense variety of mobile devices is available to consumers, differing in size, aspect ratio, operating system, and hardware and software specifications, to name a few. An effective touch-based authentication system must work with one user model across a spectrum of devices and user postures. This work uses a locally collected dataset to provide empirical evidence of the significant effect of posture, device size, and manufacturer on authentication performance. Based on the results of this strand of research, we suggest strategies to improve the performance of continual touch-based authentication systems.
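    As a point of reference for the timing-based features that the abstract contrasts with its event-sequence proposal, the following is a minimal sketch, not the thesis's code, of conventional keystroke-dynamics features: per-key hold time and between-key flight time, computed from invented press/release timestamps.

```python
# Minimal sketch of conventional keystroke-dynamics timing features.
# The timestamps below are invented purely for illustration.
from typing import List, Tuple

# Each event: (key, press_time_ms, release_time_ms)
events: List[Tuple[str, float, float]] = [
    ("p", 0.0, 95.0),
    ("a", 140.0, 230.0),
    ("s", 290.0, 370.0),
    ("s", 430.0, 520.0),
]

def timing_features(evts):
    """Return per-event hold times and between-key flight times."""
    holds = [release - press for _, press, release in evts]
    flights = [evts[i][1] - evts[i - 1][2] for i in range(1, len(evts))]
    return holds, flights

holds, flights = timing_features(events)
print("hold times:", holds)      # [95.0, 90.0, 80.0, 90.0]
print("flight times:", flights)  # [45.0, 60.0, 60.0]
```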

    Multimodal Behavioral Biometric Authentication in Smartphones for Covid-19 Pandemic

    The usage of mobile phones has increased many-fold in recent decades, mostly because of their utility in most aspects of daily life, such as communication, entertainment, and financial transactions. In use cases where users' information is at risk from impostor attacks, biometrics-based authentication systems such as fingerprint or facial recognition are considered more trustworthy than PIN-, password-, or pattern-based authentication on smartphones. Biometrics need to be presented at the time of power-on; they cannot be guessed or attacked through brute force, and they eliminate the possibility of shoulder surfing. However, fingerprint- or facial-recognition-based systems on smartphones may not be applicable in a pandemic situation like Covid-19, where hand gloves or face masks are mandatory to protect against unwanted exposure of body parts. This paper investigates situations in which fingerprints cannot be used because of hand gloves, and presents an alternative biometric system using multimodal touchscreen-swipe and keystroke-dynamics patterns. We propose a HandGlove mode of authentication in which the system is automatically triggered to authenticate a user based on touchscreen-swipe and keystroke-dynamics patterns. Our experimental results suggest that the proposed multimodal biometric system can operate with high accuracy. We experiment with different classifiers, such as the Isolation Forest classifier, SVM, k-NN, and a fuzzy logic classifier combined with SVM, and obtain a best authentication accuracy of 99.55% with 197 users on the Samsung Galaxy S20. We further study the problem of untrained external factors that can impact the user experience of the authentication system, and propose a fuzzy-logic-based model that extends the system to cope with novel external effects. In this experiment, we considered the untrained external factor of 'sanitized hands' with which the user tries to authenticate, and achieved 93.5% accuracy in this scenario. The proposed multimodal system could be one of the most sought-after approaches for biometrics-based authentication on smartphones in a Covid-19 pandemic situation.
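    The abstract describes combining touchscreen-swipe and keystroke-dynamics classifiers. The sketch below shows one generic way such a combination is often realised, namely score-level fusion of two SVMs with a weighted average. The feature dimensions, weights, threshold, and synthetic data are illustrative assumptions, not the authors' actual pipeline (which also involves Isolation Forest, k-NN, and fuzzy logic).

```python
# Hedged sketch of score-level fusion for a two-modality authenticator
# (swipe features + keystroke features). All data and weights are made up.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
swipe_X = rng.normal(size=(n, 6))   # e.g. swipe length, speed, pressure, ...
key_X = rng.normal(size=(n, 4))     # e.g. hold times, flight times
y = rng.integers(0, 2, size=n)      # 1 = genuine user, 0 = impostor

swipe_clf = SVC(probability=True).fit(swipe_X, y)
key_clf = SVC(probability=True).fit(key_X, y)

def fused_genuine_score(swipe_feat, key_feat, w_swipe=0.6, w_key=0.4):
    """Weighted score-level fusion of per-modality genuine probabilities."""
    p_swipe = swipe_clf.predict_proba(swipe_feat.reshape(1, -1))[0, 1]
    p_key = key_clf.predict_proba(key_feat.reshape(1, -1))[0, 1]
    return w_swipe * p_swipe + w_key * p_key

score = fused_genuine_score(swipe_X[0], key_X[0])
print("accept" if score > 0.5 else "reject", round(score, 3))
```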

    Empirical techniques and algorithms to develop a resilient non-supervised touch-based authentication system

    Touch dynamics (or touch-based authentication) refers to a behavioral biometric for touchscreen devices wherein a user is authenticated based on his or her executed touch gestures. This work addresses two research topics. We first present a series of empirical techniques to detect habituation in the user's touch profile, its detrimental effect on authentication accuracy, and strategies to overcome these effects. Habituation here refers to changes in the user's profile, and/or noise within it, due to the user's familiarization with the device and software application. With respect to habituation, we show that the user's touch profile evolves significantly and irrevocably over time even after the user is familiar with the device and software application. This phenomenon considerably degrades classifier accuracy. We demonstrate techniques that lower the error rate to 3.68%, setting the benchmark in this field for a realistic test setup. Finally, we quantify the benefits of vote-based reclassification of predicted class labels and show that this technique is vital for achieving high accuracy in realistic touch-based authentication systems. In the second half, we implement the first non-supervised classification algorithm in touch-based continual authentication. This scheme incorporates clustering into the traditional supervised algorithm. We reduce the misclassification rate by fusing a supervised random forest algorithm with non-supervised clustering (either Bayesian learning or a simple rule of combination). Fusing with Bayesian clustering reduced the misclassification rate by 50%, while fusing with the simple rule of combination reduced it by as much as 59.5%, averaged over all users.
    Master of Science, Computer Science & Information Systems, University of Michigan-Flint. http://deepblue.lib.umich.edu/bitstream/2027.42/134750/1/Palaskar2016.pdf
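    Vote-based reclassification, described above as vital for realistic accuracy, can be illustrated as a sliding majority vote over the raw per-gesture predictions. The window size and labels below are assumptions for illustration, not the thesis's parameters.

```python
# Minimal sketch of vote-based reclassification: each raw per-gesture
# prediction is replaced by the majority label over a window of recent
# predictions, smoothing out isolated misclassifications.
from collections import Counter, deque

def vote_reclassify(raw_labels, window=5):
    """Smooth a stream of per-gesture labels with a sliding majority vote."""
    recent = deque(maxlen=window)
    smoothed = []
    for label in raw_labels:
        recent.append(label)
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return smoothed

# One misclassified gesture ('impostor') is corrected by its neighbours.
raw = ["user", "user", "impostor", "user", "user", "user"]
print(vote_reclassify(raw))  # ['user', 'user', 'user', 'user', 'user', 'user']
```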

    Continuous touchscreen biometrics: authentication and privacy concerns

    In the age of instant communication, smartphones have become an integral part of our daily lives, with a significant portion of the population using them for a variety of tasks such as messaging, banking, and even recording sensitive health information. However, the increasing reliance on smartphones has also made them a prime target for cybercriminals, who can use various tactics to gain access to our sensitive data. In light of this, it is crucial that individuals and organisations prioritise the security of their smartphones to protect against the abundance of threats around us. While there are dozens of methods to verify the identity of users before granting them access to a device, many of them fall short in usability or carry potential vulnerabilities. In this thesis, we aim to advance the field of touchscreen biometrics, which promises to alleviate some of these recurring issues. This area of research deals with the use of touch interactions, such as gestures and finger movements, as a means of identifying or authenticating individuals. First, we provide a detailed explanation of the common procedure for evaluating touch-based authentication systems and examine the potential pitfalls and concerns that can arise during this process. The impact of these pitfalls is evaluated and quantified on a newly collected large-scale dataset. We also discuss the prevalence of these issues in the related literature and provide recommendations for best practice when developing continuous touch-based authentication systems. Then we provide a comprehensive overview of the techniques that are commonly used for modelling touch-based authentication, including the various features, classifiers, and aggregation methods employed in this field. We compare the approaches under controlled, fair conditions in order to determine the top-performing techniques, and based on our findings we introduce methods that outperform the current state of the art. Finally, as a conclusion to our advancements in the development of touchscreen authentication technology, we explore any negative effects our work may cause for an ordinary user of mobile websites and applications. In particular, we look into threats that can affect the privacy of the user, such as tracking them or revealing their personal information based on their behaviour on smartphones.
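    One of the aggregation methods mentioned above can be sketched as follows: per-swipe genuine scores are averaged over a recent window before an accept/reject decision, rather than deciding on each swipe in isolation. The window size, threshold, and scores below are illustrative assumptions, not the thesis's configuration.

```python
# Hedged illustration of score aggregation in continuous touch authentication.
import numpy as np

def windowed_decision(swipe_scores, window=10, threshold=0.5):
    """Accept if the mean genuine-score over the last `window` swipes
    exceeds `threshold`; otherwise lock the session."""
    recent = np.asarray(swipe_scores[-window:])
    return recent.mean() >= threshold

# Made-up per-swipe scores from a genuine session.
scores = [0.8, 0.7, 0.9, 0.4, 0.6, 0.75, 0.85, 0.3, 0.7, 0.65]
print("session accepted" if windowed_decision(scores) else "session locked")
```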

    A Survey of Applications and Human Motion Recognition with Microsoft Kinect

    Microsoft Kinect, a low-cost motion-sensing device, enables users to interact with computers or game consoles naturally through gestures and spoken commands without any other peripheral equipment. As such, it has attracted intense interest in research and development on the Kinect technology. In this paper, we present a comprehensive survey of Kinect applications and the latest research and development on motion recognition using data captured by the Kinect sensor. On the applications front, we review the applications of the Kinect technology in a variety of areas, including healthcare, education and performing arts, robotics, sign language recognition, retail services, workplace safety training, and 3D reconstruction. On the technology front, we provide an overview of the main features of both versions of the Kinect sensor together with the depth-sensing technologies used, and review the literature on human motion recognition techniques used in Kinect applications. We provide a classification of motion recognition techniques to highlight the different approaches used in human motion recognition. Furthermore, we compile a list of publicly available Kinect datasets. These datasets are valuable resources for researchers investigating better methods for human motion recognition and lower-level computer vision tasks such as segmentation, object detection, and human pose estimation.
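    The survey itself contains no code, but a typical feature used by the skeleton-based motion-recognition methods it reviews is a joint angle computed from three tracked joints. The coordinates below are hypothetical stand-ins for what a Kinect SDK would report, not real sensor output.

```python
# Hedged illustration of a joint-angle feature from skeleton data.
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c."""
    ba, bc = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical shoulder, elbow, and wrist positions (metres) from one frame.
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.25, 1.15, 0.05), (0.45, 1.3, 0.1)
print(round(joint_angle(shoulder, elbow, wrist), 1), "degrees at the elbow")
```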

    Machine learning techniques for implicit interaction using mobile sensors

    Interactions on mobile devices normally happen in an explicit manner, meaning that they are initiated by the users. Yet users are typically unaware that they also interact implicitly with their devices. For instance, our hand pose changes naturally when we type text messages. Whilst the touchscreen captures finger touches, the hand movements during this interaction go unused. If this implicit hand movement is observed, it can be used as additional information to support or enhance the users' text-entry experience. This thesis investigates how implicit sensing can be used to improve the quality of existing, standard interaction techniques. In particular, it looks into enhancing front-of-device interaction through back-of-device and hand-movement implicit sensing. We propose to investigate this through machine learning techniques. We examine how sensor data obtained via implicit sensing can be used to predict a certain aspect of an interaction. For instance, one of the questions this thesis attempts to answer is whether hand movement during a touch-targeting task correlates with the touch position. This is a complex relationship to understand, but it is best explained through machine learning. Using machine learning as a tool, such correlation can be measured, quantified, understood, and used to make predictions about future touch positions. Furthermore, this thesis also evaluates the predictive power of the sensor data. We show this through a number of studies. In Chapter 5 we show that probabilistic modelling of sensor inputs and recorded touch locations can be used to predict the general area of future touches on the touchscreen. In Chapter 7, using SVM classifiers, we show that data from implicit sensing of general mobile interactions is user-specific and can be used to identify users implicitly. In Chapter 6, we also show that touch interaction errors can be detected from sensor data. In our experiment, we show that there are sufficient distinguishable patterns between normal interaction signals and signals strongly correlated with interaction errors. In all studies, we show that a performance gain can be achieved by combining sensor inputs.
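    As a rough sketch of the Chapter 7 idea (implicit motion-sensor data being user-specific), the following trains an SVM on simple per-window statistics of synthetic accelerometer data to identify which of several users produced each window. The feature choices, per-user offsets, and data are assumptions for illustration only, not the thesis's setup.

```python
# Hedged sketch: user identification from windowed motion-sensor statistics.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def window_features(acc_window):
    """Mean, standard deviation, and range per accelerometer axis."""
    return np.concatenate([acc_window.mean(axis=0),
                           acc_window.std(axis=0),
                           acc_window.max(axis=0) - acc_window.min(axis=0)])

# Synthetic data: 3 users, 100 windows each of 50 x/y/z accelerometer samples,
# with a per-user offset standing in for user-specific hand movement.
X, y = [], []
for user in range(3):
    for _ in range(100):
        win = rng.normal(loc=user * 0.3, scale=1.0, size=(50, 3))
        X.append(window_features(win))
        y.append(user)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("user-identification accuracy:", round(clf.score(X_te, y_te), 3))
```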