143 research outputs found

    Active detection of age groups based on touch interaction

    This paper studies the classification of users into children and adults according to their interaction with touchscreen devices. We analyse the performance of two sets of features: one derived from the Sigma-Lognormal theory of rapid human movements and one based on a global characterization of touchscreen interaction. We propose an active detection approach aimed at continuously monitoring user patterns. The experimentation is conducted on a publicly available database with samples obtained from 89 children between 3 and 6 years old and 30 adults. We use a Support Vector Machine (SVM) to classify the resulting features into age groups. The two sets of features are fused at score level using data from smartphones and tablets. The results, with correct classification rates above 96%, show the discriminative ability of the proposed neuromotor-inspired features to classify age groups according to the interaction with touch devices. In the active detection setup, our method is able to identify a child using only 4 gestures on average. This work was funded by the project CogniMetrics (TEC2015-70627-R) and Bio-Guard (Ayudas Fundación BBVA a Equipos de Investigación Científica 2017).
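    As a rough illustration of the score-level fusion described above (not the authors' code: the data, feature sizes, and equal fusion weights here are placeholder assumptions), the sketch below trains one SVM per feature set and averages their posterior scores.

```python
# Minimal sketch: two hypothetical feature sets per gesture are scored by
# separate SVMs and fused at score level by averaging posterior probabilities.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: X_lognormal / X_global are two feature sets per gesture,
# y is the age-group label (0 = child, 1 = adult).
X_lognormal = rng.normal(size=(200, 10))
X_global = rng.normal(size=(200, 6))
y = rng.integers(0, 2, size=200)

svm_lognormal = SVC(probability=True).fit(X_lognormal, y)
svm_global = SVC(probability=True).fit(X_global, y)

# Score-level fusion: average the per-classifier scores for the "adult" class.
scores = 0.5 * (svm_lognormal.predict_proba(X_lognormal)[:, 1]
                + svm_global.predict_proba(X_global)[:, 1])
predicted_adult = scores > 0.5
```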

    Predicting sex as a soft-biometrics from device interaction swipe gestures

    Touch and multi-touch gestures are becoming the most common way to interact with technology such as smartphones, tablets and other mobile devices. The latest touch-screen input capabilities have tremendously increased the quantity and quality of available gesture data, which has led to the exploration of its use in multiple disciplines, from psychology to biometrics. Following research studies undertaken in similar modalities such as keystroke and mouse-usage biometrics, the present work proposes the use of swipe gesture data for the prediction of soft biometrics, specifically the user's sex. This paper details the software and protocol used for the data collection, the feature set extracted and the subsequent machine learning analysis. Within this analysis, the BestFirst feature selection technique and four classification algorithms (naïve Bayes, logistic regression, support vector machine and decision tree) have been tested. The results of this exploratory analysis confirm the possibility of sex prediction from swipe gesture data, obtaining an encouraging 78% accuracy rate using swipe gestures from two different directions. These results will hopefully encourage further research in this area, where the prediction of soft-biometric traits from swipe gesture data can play an important role in enhancing authentication processes based on touch-screen devices.
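    The sketch below illustrates the classifier comparison mentioned in this abstract; the data, feature dimensions, and cross-validation setup are assumptions, and the Weka-style BestFirst feature selection step is omitted.

```python
# Minimal sketch: compare the four classifier families named above on
# hypothetical swipe-gesture features using cross-validated accuracy.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))       # swipe features (duration, length, ...)
y = rng.integers(0, 2, size=300)     # sex label

classifiers = {
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    "decision tree": DecisionTreeClassifier(),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.3f}")
```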

    Continuous Smartphone Authentication using Wristbands

    Many users find current smartphone authentication methods (PINs, swipe patterns) to be burdensome, leading them to weaken or disable the authentication. Although some phones support methods to ease the burden (such as fingerprint readers), these methods require active participation by the user and do not verify the user’s identity after the phone is unlocked. We propose CSAW, a continuous smartphone authentication method that leverages wristbands to verify that the phone is in the hands of its owner. In CSAW, users wear a wristband (a smartwatch or a fitness band) with built-in motion sensors, and by comparing the wristband’s motion with the phone’s motion, CSAW continuously produces a score indicating its confidence that the person holding (and using) the phone is the person wearing the wristband. This score provides the foundation for a wide range of authentication decisions (e.g., unlocking the phone, deauthentication, or limiting phone access). Through two user studies (N=27 and N=11) we evaluated CSAW’s accuracy, usability, and security. Our experimental evaluation demonstrates that CSAW was able to conduct initial authentication with over 99% accuracy and continuous authentication with over 96.5% accuracy.
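    A minimal sketch of the general idea (not the CSAW implementation): correlate the wristband's and the phone's accelerometer magnitude over a sliding window to obtain a running confidence score. The window length, step size, and the use of Pearson correlation are assumptions.

```python
# Sketch: sliding-window correlation between wristband and phone motion
# as a simple stand-in for a continuous authentication confidence score.
import numpy as np

def magnitude(acc):
    """acc: (N, 3) accelerometer samples -> (N,) magnitude signal."""
    return np.linalg.norm(acc, axis=1)

def confidence_scores(wrist_acc, phone_acc, window=100, step=50):
    w, p = magnitude(wrist_acc), magnitude(phone_acc)
    n = min(len(w), len(p))
    scores = []
    for start in range(0, n - window + 1, step):
        a, b = w[start:start + window], p[start:start + window]
        # Pearson correlation as a similarity score in [-1, 1].
        scores.append(np.corrcoef(a, b)[0, 1])
    return np.array(scores)
```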

    Different strokes for different folks? Revealing the physical characteristics of smartphone users from their swipe gestures

    Anthropometrics show that the lengths of many human body segments follow a common proportional relationship. To know the length of one body segment – such as a thumb – potentially provides a predictive route to other physical characteristics, such as overall standing height. In this study, we examined whether it is feasible that the length of a person's thumb could be revealed from the way in which they complete swipe gestures on a touchscreen-based smartphone. From a corpus of approx. 19,000 swipe gestures captured from 178 volunteers, we found that people with longer thumbs complete swipe gestures with shorter completion times, higher speeds and higher accelerations than people with shorter thumbs. These differences were also observed to exist between our male and female volunteers, along with additional differences in the amount of touch pressure applied to the screen. Results are discussed in terms of linking behavioural and physical biometrics.
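    For illustration, a small sketch of how the swipe descriptors discussed above (completion time, speed, acceleration) could be derived from raw (t, x, y) touch samples; the field names and units are assumptions, and touch pressure is omitted.

```python
# Sketch: per-swipe kinematic descriptors from hypothetical raw samples.
import numpy as np

def swipe_features(t, x, y):
    """t in seconds, x/y in pixels; equal-length arrays with >= 3 samples."""
    dt = np.diff(t)
    dist = np.hypot(np.diff(x), np.diff(y))
    speed = dist / dt                       # px/s between consecutive samples
    accel = np.diff(speed) / dt[1:]         # px/s^2
    return {
        "completion_time": t[-1] - t[0],
        "mean_speed": speed.mean(),
        "peak_acceleration": np.abs(accel).max(),
    }
```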

    Active Authentication using an Autoencoder regularized CNN-based One-Class Classifier

    Active authentication refers to the process in which users are unobtrusively monitored and authenticated continuously throughout their interactions with mobile devices. Generally, an active authentication problem is modelled as a one-class classification problem due to the unavailability of data from impostor users. Normally, the enrolled user is considered the target class (genuine) and unauthorized users are considered unknown classes (impostors). We propose a convolutional neural network (CNN) based approach for one-class classification in which zero-centered Gaussian noise and an autoencoder are used to model the pseudo-negative class and to regularize the network to learn meaningful feature representations for one-class data, respectively. The overall network is trained using a combination of the cross-entropy and reconstruction error losses. A key feature of the proposed approach is that any pre-trained CNN can be used as the base network for one-class classification. The effectiveness of the proposed framework is demonstrated using three publicly available face-based active authentication datasets, and it is shown that the proposed method achieves superior performance compared to traditional one-class classification methods. The source code is available at: github.com/otkupjnoz/oc-acnn. Comment: accepted and to appear at AFGR 201
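    A minimal PyTorch sketch of the idea described above; the feature dimension, noise scale, and loss weighting are placeholders rather than the paper's actual configuration. Features from a pre-trained base CNN are classified against zero-centered Gaussian noise acting as the pseudo-negative class, while an autoencoder branch reconstructs them; the loss combines cross-entropy and reconstruction error.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OneClassHead(nn.Module):
    def __init__(self, feat_dim=512, hidden=128):
        super().__init__()
        # Binary classifier: genuine features vs. Gaussian pseudo-negatives.
        self.classifier = nn.Linear(feat_dim, 2)
        # Autoencoder branch used as a regularizer on the features.
        self.autoencoder = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, feat_dim),
        )

    def forward(self, feats):
        return self.classifier(feats), self.autoencoder(feats)

def one_class_loss(head, feats, lam=1.0, noise_std=0.1):
    """feats: (batch, feat_dim) features from any pre-trained base CNN."""
    noise = torch.randn_like(feats) * noise_std          # zero-centered Gaussian
    logits, recon = head(torch.cat([feats, noise], dim=0))
    labels = torch.cat([torch.ones(len(feats)), torch.zeros(len(feats))])
    labels = labels.long().to(feats.device)
    ce = F.cross_entropy(logits, labels)                 # genuine vs. pseudo-negative
    rec = F.mse_loss(recon[: len(feats)], feats)         # reconstruct genuine only
    return ce + lam * rec
```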

    Mobile Device Background Sensors: Authentication vs Privacy

    The increasing number of mobile devices in recent years has led to the collection of a large amount of personal information that needs to be protected. To this aim, behavioural biometrics has become very popular. But what is the discriminative power of mobile behavioural biometrics in real scenarios? With the success of Deep Learning (DL), architectures based on Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), such as Long Short-Term Memory (LSTM), have shown improvements compared to traditional machine learning methods. However, these DL architectures still have limitations that need to be addressed. In response, new DL architectures like Transformers have emerged. The question is, can these new Transformers outperform previous biometric approaches? To answer these questions, this thesis focuses on behavioural biometric authentication with data acquired from mobile background sensors (i.e., accelerometers and gyroscopes). In addition, to the best of our knowledge, this is the first thesis that explores and proposes novel behavioural biometric systems based on Transformers, achieving state-of-the-art results in gait, swipe, and keystroke biometrics. The adoption of biometrics requires a balance between security and privacy. Biometric modalities provide a unique and inherently personal approach to authentication. Nevertheless, biometrics also give rise to concerns regarding the invasion of personal privacy. According to the General Data Protection Regulation (GDPR) introduced by the European Union, personal data such as biometric data are sensitive and must be used and protected properly. This thesis analyses the impact of sensitive data on the performance of biometric systems and proposes a novel unsupervised privacy-preserving approach. The research conducted in this thesis makes significant contributions, including: i) a comprehensive review of the privacy vulnerabilities of mobile device sensors, covering metrics for quantifying privacy in relation to sensitive data, along with protection methods for safeguarding sensitive information; ii) an analysis of authentication systems for behavioural biometrics on mobile devices (i.e., gait, swipe, and keystroke), this being the first thesis that explores the potential of Transformers for behavioural biometrics and introduces novel architectures that outperform the state of the art; and iii) a novel privacy-preserving approach for mobile biometric gait verification using unsupervised learning techniques, ensuring the protection of sensitive data during the verification process.

    SwipeFormer: Transformers for mobile touchscreen biometrics

    The growing number of mobile devices over the past few years has brought with it a large amount of personal information that needs to be properly protected. As a result, several mobile authentication methods have been developed. In particular, behavioural biometrics has become one of the most relevant methods due to its ability to capture the uniqueness of each subject in a secure, non-intrusive, and continuous way. This article presents SwipeFormer, a novel Transformer-based system for mobile subject authentication by means of swipe gestures in an unconstrained scenario (i.e., subjects could use their personal devices freely, without restrictions on the direction of swipe gestures or the position of the device). Our proposed system contains two modules: (i) a Transformer-based feature extractor, and (ii) a similarity computation module. Mobile data from the touchscreen and different background sensors (accelerometer and gyroscope) have been studied, with both the Android and iOS operating systems included in the analysis. A complete analysis of SwipeFormer is carried out using an in-house large-scale database acquired in unconstrained scenarios. In these operational conditions, SwipeFormer achieves Equal Error Rate (EER) values of 6.6% and 3.6% on Android and iOS respectively, outperforming the state of the art. In addition, we evaluate SwipeFormer on the popular publicly available databases Frank DB and HuMIdb, achieving EER values of 11.0% and 5.0% respectively, again outperforming previous approaches under the same experimental setup.
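    A minimal sketch of the two-module layout described above (the input format, pooling, and all hyperparameters are assumptions, not SwipeFormer's actual configuration): a Transformer encoder maps a swipe sequence of touchscreen and sensor channels to a fixed-length embedding, and verification compares embeddings by cosine similarity.

```python
import torch
import torch.nn as nn

class SwipeEncoder(nn.Module):
    """Transformer feature extractor over per-sample swipe channels."""
    def __init__(self, in_channels=7, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.proj = nn.Linear(in_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, x):                  # x: (batch, seq_len, in_channels)
        h = self.encoder(self.proj(x))
        return h.mean(dim=1)               # mean-pool to a fixed-length embedding

def swipe_similarity(encoder, enrolled_swipe, probe_swipe):
    """Both swipes: (seq_len, in_channels) tensors of equal length."""
    with torch.no_grad():
        emb = encoder(torch.stack([enrolled_swipe, probe_swipe]))
    return torch.cosine_similarity(emb[0:1], emb[1:2]).item()
```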

    Continuous touchscreen biometrics: authentication and privacy concerns

    In the age of instant communication, smartphones have become an integral part of our daily lives, with a significant portion of the population using them for a variety of tasks such as messaging, banking, and even recording sensitive health information. However, the increasing reliance on smartphones has also made them a prime target for cybercriminals, who can use various tactics to gain access to our sensitive data. In light of this, it is crucial that individuals and organisations prioritise the security of their smartphones to protect against the abundance of threats around us. While there are dozens of methods to verify the identity of users before granting them access to a device, many of them lack effectiveness in terms of usability and potential vulnerabilities. In this thesis, we aim to advance the field of touchscreen biometrics, which promises to alleviate some of these recurring issues. This area of research deals with the use of touch interactions, such as gestures and finger movements, as a means of identifying or authenticating individuals. First, we provide a detailed explanation of the common procedure for evaluating touch-based authentication systems and examine the potential pitfalls and concerns that can arise during this process. The impact of these pitfalls is evaluated and quantified on a newly collected large-scale dataset. We also discuss the prevalence of these issues in the related literature and provide recommendations for best practices when developing continuous touch-based authentication systems. We then provide a comprehensive overview of the techniques that are commonly used for modelling touch-based authentication, including the various features, classifiers, and aggregation methods employed in this field. We compare the approaches under controlled, fair conditions in order to determine the top-performing techniques. Based on our findings, we introduce methods that outperform the current state of the art. Finally, as a conclusion to our advancements in the development of touchscreen authentication technology, we explore any negative effects our work may have for an ordinary user of mobile websites and applications. In particular, we look into threats that can affect the privacy of the user, such as tracking users and revealing their personal information based on their behaviour on smartphones.

    User Authentication Across Devices, Modalities and Representation: Behavioral Biometric Methods

    Biometrics eliminate the need for a person to remember and reproduce complex secret information or carry additional hardware in order to authenticate oneself. Behavioral biometrics is a branch of biometrics that focuses on using a person’s behavior, or way of doing a task, as a means of authentication. These tasks can be any common, day-to-day tasks like walking, sleeping, talking, typing and so on. As interactions with computers and other smart devices like phones and tablets have become an essential part of modern life, a person’s style of interaction with them can be used as a powerful means of behavioral biometrics. In this dissertation, we present insights from the analysis of our proposed set of context-sensitive or word-specific keystroke features on desktop, tablet and phone. We show that the conventional features are not highly discriminatory on desktops and are only marginally better on hand-held devices for user identification. By using information about the context, our proposed word-specific features offer superior discrimination among users on all devices. Classifiers built using our proposed features perform user identification with high accuracies in the range of 90% to 97%, with average precision and recall values of 0.914 and 0.901 respectively. Analysis of the word-based impact factors reveals that four- or five-character words, words with about 50% vowels, and words ranked higher on frequency lists might give better results for the extraction and use of the proposed features for user identification. We also examine a large umbrella of behavioral biometric data, such as keystroke latencies, gait and swipe data on desktop, phone and tablet, for the assumption of an underlying normal distribution, which is common in many research works. Using suitable nonparametric normality tests (the Lilliefors test and the Shapiro-Wilk test), we show that a majority of the features from all activities and all devices do not follow a normal distribution; in most cases fewer than 25% of the samples tested had p-values > 0.05. We discuss alternate solutions to address the non-normality in behavioral biometric data. Openly available datasets did not provide the wide range of modalities and activities required for our research. Therefore, we have collected and shared an open-access, large benchmark dataset for behavioral biometrics on IEEE DataPort. We describe the collection and analysis of our Syracuse University and Assured Information Security - Behavioral Biometrics Multi-device and multi-Activity data from Same users (SU-AIS BB-MAS) dataset, an open-access dataset on IEEE DataPort with data from 117 subjects for typing (both fixed and free text), gait (walking, upstairs and downstairs) and touch on desktop, tablet and phone. The dataset consists of a total of about 3.5 million keystroke events, 57.1 million data points each for the accelerometer and gyroscope, and 1.7 million data points for swipes, and is listed as one of the most popular datasets on the portal (per IEEE emails to all members on 05/13/2020 and 07/21/2020). We also show that keystroke dynamics (KD) on a desktop can be used to classify the type of activity, either benign or adversarial, that a text sample originates from. We show the inefficiencies of popular temporal features for this task. With our proposed set of 14 features we achieve high accuracies (93% to 97%) and low Type 1 and Type 2 errors (3% to 8%) in classifying text samples of different sizes. We also present exploratory research in (a) authenticating users through musical notes generated by mapping their keystroke latencies to music, and (b) authenticating users through the relationship between their keystroke latencies on multiple devices.
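    A small sketch of the normality check described in this abstract, using hypothetical latency data: the Shapiro-Wilk test (scipy) and the Lilliefors test (statsmodels) are applied to a vector of keystroke latencies and non-normality is flagged at p < 0.05.

```python
# Sketch: test a feature vector for normality with Shapiro-Wilk and Lilliefors.
import numpy as np
from scipy.stats import shapiro
from statsmodels.stats.diagnostic import lilliefors

rng = np.random.default_rng(0)
latencies = rng.lognormal(mean=5.0, sigma=0.4, size=500)   # ms, right-skewed on purpose

sw_stat, sw_p = shapiro(latencies)
lf_stat, lf_p = lilliefors(latencies, dist="norm")

print(f"Shapiro-Wilk p = {sw_p:.4f} -> normal at 0.05? {sw_p > 0.05}")
print(f"Lilliefors   p = {lf_p:.4f} -> normal at 0.05? {lf_p > 0.05}")
```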