
    A systematic review

    This study was conducted at the Psychology Research Centre (PSI/01662), School of Psychology, University of Minho, supported by the Foundation for Science and Technology (FCT) through the Portuguese State Budget (UID/PSI/01662/2020) and by FCT project PTDC/MHC/PCN/1530/2014. We thank the anonymous reviewers for their insightful comments and recommendations, which helped us improve the quality and presentation of this article. Publisher Copyright: © 2023 The Authors.
    The caregiver's touch behavior during early infancy is linked to multiple developmental outcomes. However, social touch remains a challenging construct to operationalize, and although observational tools have been the gold standard for measuring touch in caregiver-infant interactions, no systematic review had been conducted before. Following the PRISMA guidelines, we reviewed the literature to describe and classify the main characteristics of the available observational instruments. Of the 3042 publications found, we selected 45 that included an observational measure, and from those we identified 12 instruments. Most of the studies involved infants younger than six months of age and assessed touch in two laboratory tasks: face-to-face interaction and the still-face procedure. We identified three approaches to evaluating the caregiver's touch behavior: strictly behavioral (the observable touch behavior), functional (the functional role of the touch behavior), or mixed (a combination of the previous two). Half of the instruments were classified as functional, 25% as strictly behavioral, and 25% as mixed. The lack of conceptual and operational uniformity and consistency between instruments is discussed.

    Touchalytics: On the Applicability of Touchscreen Input as a Behavioral Biometric for Continuous Authentication

    We investigate whether a classifier can continuously authenticate users based on the way they interact with the touchscreen of a smart phone. We propose a set of 30 behavioral touch features that can be extracted from raw touchscreen logs and demonstrate that different users populate distinct subspaces of this feature space. In a systematic experiment designed to test how this behavioral pattern exhibits consistency over time, we collected touch data from users interacting with a smart phone using basic navigation maneuvers, i.e., up-down and left-right scrolling. We propose a classification framework that learns the touch behavior of a user during an enrollment phase and is able to accept or reject the current user by monitoring interaction with the touchscreen. The classifier achieves a median equal error rate of 0% for intra-session authentication, 2%-3% for inter-session authentication, and below 4% when the authentication test was carried out one week after the enrollment phase. While our experimental findings disqualify this method as a standalone authentication mechanism for long-term authentication, it could be implemented as a means to extend screen-lock time or as a part of a multi-modal biometric authentication system.
    Comment: to appear at IEEE Transactions on Information Forensics & Security; download data from http://www.mariofrank.net/touchalytics
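    The abstract above rests on two ideas: extracting per-stroke behavioral features from raw touchscreen logs, and scoring authentication quality by equal error rate (EER). A minimal Python sketch of both, assuming a tiny illustrative feature set (not the paper's 30 features) and a brute-force EER routine:

    ```python
    import numpy as np

    def stroke_features(xs, ys, ts):
        """Toy per-stroke features in the spirit of the paper's feature
        space; the specific choices here are illustrative only."""
        xs, ys, ts = map(np.asarray, (xs, ys, ts))
        dx, dy, dt = np.diff(xs), np.diff(ys), np.diff(ts)
        dist = np.hypot(dx, dy)               # per-segment distance
        speed = dist / np.maximum(dt, 1e-9)   # guard against dt == 0
        return {
            "duration": ts[-1] - ts[0],
            "length": dist.sum(),
            "mean_speed": speed.mean(),
            "direction": np.arctan2(ys[-1] - ys[0], xs[-1] - xs[0]),
        }

    def equal_error_rate(genuine, impostor):
        """EER: the operating point where the false-accept rate (FAR)
        equals the false-reject rate (FRR). Scores are similarities:
        higher means 'more like the enrolled user'."""
        thresholds = np.sort(np.concatenate([genuine, impostor]))
        best_gap, best_eer = 1.0, 0.0
        for t in thresholds:
            far = np.mean(impostor >= t)  # impostors wrongly accepted
            frr = np.mean(genuine < t)    # genuine user wrongly rejected
            if abs(far - frr) < best_gap:
                best_gap, best_eer = abs(far - frr), (far + frr) / 2
        return best_eer
    ```

    With perfectly separated genuine and impostor scores this yields an EER of 0, matching the intra-session figure quoted above; overlapping score distributions push the EER up, as in the inter-session and one-week cases.
    
    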

    Forgery-Resistant Touch-based Authentication on Mobile Devices

    Mobile devices store a diverse set of private user data and have gradually become a hub for controlling users' other personal Internet-of-Things devices. Access control on mobile devices is therefore highly important. The widely accepted solution is to protect access by asking for a password. However, password authentication is tedious, e.g., a user needs to input a password every time she wants to use the device. Moreover, existing biometrics such as face, fingerprint, and touch behaviors are vulnerable to forgery attacks. We propose a new touch-based biometric authentication system that is passive and secure against forgery attacks. In our touch-based authentication, a user's touch behaviors are a function of some random "secret". The user can subconsciously know the secret while touching the device's screen. However, an attacker cannot know the secret at the time of attack, which makes it challenging to perform forgery attacks even if the attacker has already obtained the user's touch behaviors. We evaluate our touch-based authentication system by collecting data from 25 subjects. Results are promising: the random secrets do not influence user experience and, for targeted forgery attacks, our system achieves Equal Error Rates (EERs) 0.18 smaller than previous touch-based authentication.
    Comment: Accepted for publication by ASIACCS'1

    What does touch tell us about emotions in touchscreen-based gameplay?

    This is the post-print version of the Article. The official published version can be accessed from the link below - Copyright @ 2012 ACM. It is posted here by permission of ACM for your personal use. Not for redistribution.
    Nowadays, more and more people play games on touch-screen mobile phones. This phenomenon raises a very interesting question: does touch behaviour reflect the player's emotional state? If so, this would be a valuable evaluation indicator not only for game designers, but also for real-time personalization of the game experience. Psychology studies on acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. Based on touch behaviour, machine learning algorithms were used to build systems for automatically discriminating between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal, and two levels of valence. The results were very interesting, reaching between 69% and 77% correct discrimination between the four emotional states. Higher results (~89%) were obtained for discriminating between two levels of arousal and two levels of valence.
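    The pipeline sketched in that abstract (stroke features in, one of four emotional states out) can be pictured with a deliberately tiny stand-in classifier. The feature choices and the nearest-centroid rule below are illustrative assumptions, not the paper's actual algorithms:

    ```python
    import numpy as np

    class NearestCentroid:
        """Minimal stand-in classifier: assign each stroke-feature vector
        (e.g. [mean pressure, mean speed] -- an invented feature pair,
        not the paper's exact set) to the emotional state whose training
        centroid is closest in Euclidean distance."""

        def fit(self, X, y):
            y = np.asarray(y)
            self.classes_ = sorted(set(y))
            self.centroids_ = np.array(
                [X[y == c].mean(axis=0) for c in self.classes_])
            return self

        def predict(self, X):
            # Distance from every sample to every class centroid,
            # shape (n_samples, n_classes); pick the nearest class.
            d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
            return [self.classes_[i] for i in d.argmin(axis=1)]
    ```

    Fitting this on labeled gameplay strokes (Excited/Relaxed/Frustrated/Bored) and predicting held-out strokes is the shape of the experiment; the real systems behind the 69-77% figures would use stronger learners and cross-validation.
    
    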

    Using sound in multi-touch interfaces to change materiality and touch behavior

    Current development in multimodal interfaces allows us to interact with digitally represented objects. Sadly, these representations are often poor due to technical limitations in representing some of the sensorial properties. Here we explore the possibility of overcoming these limitations by exploiting multisensory integration processes and propose a sound-based interaction technique to alter the perceived materiality of a surface being touched and to shape users' touch behavior. The latter can be seen both as a cue of, and as a means to reinforce, the altered perception. We designed a prototype that dynamically alters the texture-related sound feedback based on touch behavior, as in natural surface touch interactions. A user study showed that the frequency of the sound feedback alters texture perception (coldness and material type) and touch behavior (velocity and pressure). We conclude by discussing lessons learnt from this work in terms of HCI applications and questions opened by this research. Copyright is held by the owner/author(s).
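    The dynamic feedback loop described above can be sketched as a mapping from measured touch behavior to the pitch of the texture sound. All function names, parameters, and values here are invented for illustration; the prototype's actual mapping is not specified in the abstract:

    ```python
    def feedback_frequency(velocity, pressure,
                           base_hz=200.0, vel_gain=50.0, press_gain=100.0):
        """Illustrative linear mapping: faster or firmer touches raise
        the frequency of the texture-related sound feedback, which in
        turn can shift perceived coldness and material type."""
        return base_hz + vel_gain * velocity + press_gain * pressure
    ```

    In a real prototype this value would drive the audio synthesis in real time, closing the loop in which altered feedback reshapes the user's subsequent velocity and pressure.
    
    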