Touchalytics: On the Applicability of Touchscreen Input as a Behavioral Biometric for Continuous Authentication
We investigate whether a classifier can continuously authenticate users based
on the way they interact with the touchscreen of a smart phone. We propose a
set of 30 behavioral touch features that can be extracted from raw touchscreen
logs and demonstrate that different users populate distinct subspaces of this
feature space. In a systematic experiment designed to test how this behavioral
pattern exhibits consistency over time, we collected touch data from users
interacting with a smart phone using basic navigation maneuvers, i.e., up-down
and left-right scrolling. We propose a classification framework that learns the
touch behavior of a user during an enrollment phase and is able to accept or
reject the current user by monitoring interaction with the touch screen. The
classifier achieves a median equal error rate of 0% for intra-session
authentication, 2%-3% for inter-session authentication and below 4% when the
authentication test was carried out one week after the enrollment phase. While
our experimental findings disqualify this method as a standalone authentication
mechanism for long-term authentication, it could be implemented as a means to
extend screen-lock time or as a part of a multi-modal biometric authentication
system.
Comment: to appear in IEEE Transactions on Information Forensics & Security. Download data from http://www.mariofrank.net/touchalytics
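The equal error rate (EER) reported above is the operating point at which the false-acceptance rate (impostors accepted) equals the false-rejection rate (the genuine user rejected). A minimal sketch of how an EER can be approximated from classifier match scores, assuming higher scores mean a closer match (this is a generic helper, not the paper's evaluation code):

```python
def equal_error_rate(genuine, impostor):
    """Approximate the EER from lists of match scores.

    Sweeps every observed score as a candidate threshold and returns
    the smallest achievable max(FAR, FRR), which converges on the
    point where the two error rates cross.
    """
    best = 1.0
    for t in genuine + impostor:  # candidate thresholds
        far = sum(s >= t for s in impostor) / len(impostor)  # impostors accepted
        frr = sum(s < t for s in genuine) / len(genuine)     # genuine rejected
        best = min(best, max(far, frr))
    return best
```

With perfectly separated score distributions this returns 0, matching the intra-session result; overlapping distributions push it upward, as in the one-week-gap setting.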
GTmoPass: Two-factor Authentication on Public Displays Using Gaze-touch Passwords and Personal Mobile Devices
As public displays continue to deliver increasingly private and personalized content, there is a need to ensure that only legitimate users can access private information in sensitive contexts. While public displays can adopt authentication concepts similar to those used on public terminals (e.g., ATMs), authentication in public is subject to a number of risks. Namely, adversaries can uncover a user's password through (1) shoulder surfing, (2) thermal attacks, or (3) smudge attacks. To address this problem we propose GTmoPass, an authentication architecture that enables multi-factor user authentication on public displays. The first factor is a knowledge factor: we employ a shoulder-surfing-resilient multimodal scheme that combines gaze and touch input for password entry. The second factor is a possession factor: users utilize their personal mobile devices, on which they enter the password. Credentials are securely transmitted to a server via Bluetooth beacons. We describe the implementation of GTmoPass and report on an evaluation of its usability and security, which shows that although authentication using GTmoPass is slightly slower than traditional methods, it protects against the three aforementioned threats.
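The core of the two-factor check described above can be sketched server-side: the possession factor requires the request to come from an enrolled device, and the knowledge factor requires the gaze-touch password to match. This is a minimal illustration under assumed names and storage format (salted PBKDF2 hashes); the actual Bluetooth-beacon transport and GTmoPass protocol details are not reproduced here:

```python
import hashlib
import hmac

def verify_two_factor(device_id, password, registered):
    """Check both factors: the device is enrolled (possession) and
    the submitted gaze-touch password matches the stored salted
    hash (knowledge). `registered` maps device_id -> (salt, digest)."""
    record = registered.get(device_id)
    if record is None:
        return False  # unknown device: possession factor fails
    salt, stored_digest = record
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking digest prefixes.
    return hmac.compare_digest(candidate, stored_digest)
```

An attacker who shoulder-surfs the password still fails the possession check, and a stolen device alone fails the knowledge check, which is the point of combining the two factors.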
A novel word-independent gesture-typing continuous authentication scheme for mobile devices
In this study, we produce a new continuous authentication scheme for gesture typing on mobile devices. Our scheme is the first to authenticate gesture-typing interactions in a word-independent format. The scheme relies on groupings of features extracted from the word gesture after it has been reduced to parts common to all gestures. We show that movement sensors are also important in differentiating between users. We describe the feature extraction processes and analyse our proposed feature set. The unique process of our authentication scheme is presented and described. We collect our own gesture-typing dataset, including data collected during sitting, standing and walking activities for realism. We test our features against state-of-the-art touchscreen interaction features and compare feature extraction times on real mobile devices. Our scheme authenticates users with an equal error rate of 3.58% for a single word gesture. The equal error rate is reduced to 0.81% when 3 word gestures are used to authenticate.
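The drop in EER from one word gesture (3.58%) to three (0.81%) is characteristic of score fusion: averaging several noisy per-gesture similarity scores before thresholding. A minimal sketch of that idea, assuming scores in [0, 1] where higher means a closer match (the paper's actual fusion rule and threshold are not specified here):

```python
def authenticate_gestures(gesture_scores, threshold=0.5):
    """Accept the user if the mean similarity across the observed
    word gestures clears the decision threshold. With independent
    errors, averaging shrinks the score variance, so genuine and
    impostor distributions overlap less than for a single gesture."""
    mean_score = sum(gesture_scores) / len(gesture_scores)
    return mean_score >= threshold
```

A single borderline gesture can flip the decision either way; three gestures make an impostor's occasional lucky score much less likely to dominate the mean.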
Adaptive threshold scheme for touchscreen gesture continuous authentication using sensor trust
In this study we produce a continuous authentication scheme for mobile devices that adjusts an adaptive threshold for touchscreen interactions based on trust in passively collected sensor data. Our framework unobtrusively compares real-time sensor data of a user to historic data and adjusts a trust parameter based on the similarity. We show that the trust parameter can be used to adjust an adaptive threshold in continuous authentication schemes. The framework passively models temporal, spatial and activity scenarios using sensor data such as location, surrounding devices, Wi-Fi networks, ambient noise, movements, user activity, ambient light, proximity to objects and atmospheric pressure from study participants. Deviations from the models increase the level of threat the device perceives in the scenario. We also model the user's touchscreen interactions. The touchscreen interactions are authenticated against a threshold that is continually adjusted based on the perceived trust. This scheme provides greater nuance between security and usability, enabling more refined decisions. We present our novel framework and threshold adjustment criteria and validate our framework on two state-of-the-art sensor datasets. Our framework more than halves the false acceptance and false rejection rates of a static threshold system.
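The trust-adjusted threshold described above can be sketched as follows: context-sensor similarity to historic models nudges a trust parameter, and trust in turn relaxes or tightens the touchscreen decision threshold. All names, ranges and update rates here are assumptions, not the paper's actual criteria:

```python
class AdaptiveThreshold:
    """Decision threshold that loosens under high contextual trust
    (usability) and tightens under low trust (security)."""

    def __init__(self, base=0.5, max_shift=0.2):
        self.base = base            # threshold at neutral trust
        self.max_shift = max_shift  # largest deviation from base
        self.trust = 0.5            # neutral starting trust in [0, 1]

    def update_trust(self, sensor_similarity, rate=0.1):
        # Move trust toward how closely current context sensors
        # (location, Wi-Fi, ambient light, ...) match historic models.
        self.trust += rate * (sensor_similarity - self.trust)

    def threshold(self):
        # Trust above 0.5 lowers the threshold; below 0.5 raises it.
        return self.base - self.max_shift * (self.trust - 0.5) * 2

    def accept(self, touch_score):
        return touch_score >= self.threshold()
```

In a familiar context the same touchscreen interaction is accepted more readily; in an anomalous one, the raised threshold demands a stronger behavioral match.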
Effective Identity Management on Mobile Devices Using Multi-Sensor Measurements
Due to the dramatic increase in the popularity of mobile devices over the past decade, sensitive user information is stored and accessed on these devices every day. Securing sensitive data stored on and accessed from mobile devices makes user-identity management a problem of paramount importance. The tension between security and usability renders the task of user-identity verification on mobile devices challenging. Meanwhile, an appropriate identity management approach is missing, since most existing technologies for user-identity verification are either one-shot user verification or only work in restricted, controlled environments.
To solve the aforementioned problems, we investigated and sought approaches from the sensor data generated by human-mobile interactions. The data are collected from the on-board sensors, including voice data from the microphone, acceleration data from the accelerometer, angular velocity data from the gyroscope, magnetic-field data from the magnetometer, and multi-touch gesture input from the touchscreen. We studied the feasibility of extracting biometric and behavioural features from the on-board sensor data and how to efficiently employ the extracted features to perform user-identity verification on the smartphone. Based on the experimental results of the single-sensor modalities, we further investigated how to integrate them with hardware such as fingerprint sensors and TrustZone to practically fulfil a usable identity management system for both local applications and remote service control. User studies and on-device testing sessions were held for privacy and usability evaluation.
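Combining the single-sensor modalities above into one decision is commonly done by weighted score-level fusion. A minimal sketch under assumed weights and scores in [0, 1]; the thesis's actual integration with fingerprint hardware and TrustZone is not modeled:

```python
def fuse_modalities(scores, weights):
    """Weighted-sum fusion of per-modality match scores
    (e.g. voice, accelerometer, gyroscope, magnetometer, touch).
    Weights let stronger modalities dominate the fused score."""
    if len(scores) != len(weights):
        raise ValueError("one weight per modality score")
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total
```

A weak match on one modality (say, voice in a noisy environment) can then be compensated by strong matches on the others instead of failing the user outright.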
GazeTouchPIN: Protecting Sensitive Data on Mobile Devices Using Secure Multimodal Authentication
Although mobile devices provide access to a plethora of sensitive data, most users still only protect them with PINs or patterns, which are vulnerable to side-channel attacks (e.g., shoulder surfing). However, prior research has shown that privacy-aware users are willing to take further steps to protect their private data. We propose GazeTouchPIN, a novel secure authentication scheme for mobile devices that combines gaze and touch input. Our multimodal approach complicates shoulder-surfing attacks by requiring attackers to observe the screen as well as the user's eyes to find the password. We evaluate the security and usability of GazeTouchPIN in two user studies (N=30). We found that while GazeTouchPIN requires longer entry times, privacy-aware users would use it on demand when feeling observed or when accessing sensitive data. The results show that the successful shoulder-surfing attack rate drops from 68% to 10.4% when using GazeTouchPIN.