Quantifying the Security of Recognition Passwords: Gestures and Signatures
Gesture and signature passwords are two-dimensional figures created by
drawing on the surface of a touchscreen with one or more fingers. Prior results
about their security have measured resilience to either shoulder surfing (a
human observation attack) or dictionary attacks. These evaluations limit
generalizability because their results are non-comparable to other password
systems (e.g., PINs), hard to reproduce, and attacker-dependent. Strong
statements about the security of a password system use an analysis of the
statistical distribution of the password space, which models a best-case
attacker who guesses passwords in order of most likely to least likely.
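This best-case attacker can be sketched in a few lines: sort the password distribution by descending probability and count how many guesses are needed to cover a target fraction of the probability mass. This is a minimal illustration; the function name and the toy distribution are hypothetical, not from the paper.

```python
def guesses_to_cover(probs, alpha):
    """Guesses a best-case attacker needs, trying passwords from most
    to least likely, to cover at least probability mass alpha."""
    covered = 0.0
    for guesses, p in enumerate(sorted(probs, reverse=True), start=1):
        covered += p
        if covered >= alpha:
            return guesses
    return len(probs)
```

For a toy distribution [0.5, 0.25, 0.125, 0.125], covering 70% of the probability mass takes two guesses, since the attacker tries the two most likely passwords first.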
Estimating the distribution of recognition passwords is challenging because
many different trials need to map to one password. In this paper, we solve this
difficult problem by: (1) representing a recognition password of continuous
data as a discrete alphabet set, and (2) estimating the password distribution
through modeling the unseen passwords. We use Symbolic Aggregate approXimation
(SAX) to represent time series data as symbols and develop Markov chains to
model recognition passwords. We use a partial guessing metric, which
demonstrates how many guesses an attacker needs to crack a percentage of the
entire space, to compare the security of the distributions for gestures,
signatures, and Android unlock patterns. We found the lower bounds of the
partial guessing metric of gestures and signatures are much higher than the
upper bound of the partial guessing metric of Android unlock patterns
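The SAX step used above to discretize continuous touchscreen traces can be sketched as follows: z-normalize the series, reduce it with Piecewise Aggregate Approximation (PAA), and map each segment mean to a symbol via equiprobable Gaussian breakpoints. This is a minimal sketch of standard SAX; the segment count, alphabet size, and function signature are illustrative assumptions, not the paper's exact parameters.

```python
import statistics

def sax(series, n_segments=8, alphabet_size=4):
    """Convert a numeric time series into a SAX symbol string."""
    # z-normalize the series (constant series map to all zeros)
    mu = statistics.fmean(series)
    sd = statistics.pstdev(series)
    z = [(x - mu) / sd for x in series] if sd > 0 else [0.0] * len(series)
    # PAA: mean value of each of n_segments equal-width segments
    seg_len = len(z) / n_segments
    paa = [statistics.fmean(z[round(i * seg_len):round((i + 1) * seg_len)])
           for i in range(n_segments)]
    # Breakpoints that split the standard normal into equiprobable regions
    nd = statistics.NormalDist()
    breaks = [nd.inv_cdf(k / alphabet_size) for k in range(1, alphabet_size)]
    symbols = "abcdefghijklmnopqrstuvwxyz"[:alphabet_size]
    # Each segment mean falls into one region, giving one symbol
    return "".join(symbols[sum(v > b for b in breaks)] for v in paa)
```

A steadily increasing series such as 0..15 with four segments and a four-letter alphabet yields "abcd": each successive segment mean lands in the next Gaussian region. Once traces are symbolized this way, transition frequencies between symbols can be counted to fit the Markov chains described above.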