559 research outputs found

    The One Hand Wonder: A Framework for Enhancing One-handed Website Operation on Touchscreen Smartphones

    Get PDF
    Operating a website with one hand on a touchscreen mobile phone remains a challenging task: solutions that adapt websites for mobile users do not address the ergonomic peculiarities of one-handed operation. We present the design and evaluation of the One Hand Wonder (OHW) -- an easily adaptable, cross-platform JavaScript framework to support one-handed website navigation on touchscreen smartphones. It enhances usability without requiring a redesign of the existing website or overwriting any CSS styles. User testing and quantitative evaluation confirm learnability and efficiency, with clear advantages over non-enhanced browsing, and we discuss the OHW's versatility.

    A Client-Side Approach to Improving One-Handed Web Surfing on a Smartphone

    Get PDF
    Our work examines the impact of natural movements on the usability and efficiency of one-handed website operation on touchscreen smartphones. We present and evaluate a JavaScript framework which transforms the controls of all page elements into an interface that corresponds to the natural arcing swipe of the thumb. This interface can be operated via swipe and tap, without having to stretch the thumb or change the grip on the phone. It can be toggled on or off as required and keeps the original presentation of the website intact. Two user studies show that this choice of interface and its simplified interaction model can be applied to a diverse range of interactive elements and websites and provide a high degree of usability and efficiency.

    Continuous touchscreen biometrics: authentication and privacy concerns

    Get PDF
    In the age of instant communication, smartphones have become an integral part of our daily lives, with a significant portion of the population using them for a variety of tasks such as messaging, banking, and even recording sensitive health information. However, the increasing reliance on smartphones has also made them a prime target for cybercriminals, who can use various tactics to gain access to our sensitive data. In light of this, it is crucial that individuals and organisations prioritise the security of their smartphones to protect against the abundance of threats around us. While there are dozens of methods to verify the identity of users before granting them access to a device, many of them lack effectiveness in terms of usability and potential vulnerabilities. In this thesis, we aim to advance the field of touchscreen biometrics which promises to alleviate some of the recurring issues. This area of research deals with the use of touch interactions, such as gestures and finger movements, as a means of identifying or authenticating individuals. First, we provide a detailed explanation of the common procedure for evaluating touch-based authentication systems and examine the potential pitfalls and concerns that can arise during this process. The impact of the pitfalls is evaluated and quantified on a newly collected large-scale dataset. We also discuss the prevalence of these issues in the related literature and provide recommendations for best practices when developing continuous touch-based authentication systems. Then we provide a comprehensive overview of the techniques that are commonly used for modelling touch-based authentication, including the various features, classifiers, and aggregation methods that are employed in this field. We compare the approaches under controlled, fair conditions in order to determine the top-performing techniques. Based on our findings, we introduce methods that outperform the current state-of-the-art. 
Finally, as a conclusion to our advancements in the development of touchscreen authentication technology, we explore any negative effects our work may have for an ordinary user of mobile websites and applications. In particular, we look into threats that can affect the privacy of the user, such as tracking them and revealing their personal information based on their behaviour on smartphones.
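Touch-based authentication systems such as those evaluated in this thesis are conventionally summarised by the equal error rate (EER), the operating point where the false accept rate (impostors admitted) equals the false reject rate (genuine users locked out). A minimal sketch of estimating the EER by sweeping a decision threshold — the scores below are synthetic placeholders, not data from the thesis:

```python
# Estimate the equal error rate (EER) from authentication scores.
# Genuine scores should be high, impostor scores low; we sweep every
# candidate threshold and pick the one where FAR and FRR are closest.
# Scores here are synthetic illustrations only.

def far_frr(genuine, impostor, threshold):
    far = sum(s >= threshold for s in impostor) / len(impostor)  # impostors accepted
    frr = sum(s < threshold for s in genuine) / len(genuine)     # genuine rejected
    return far, frr

def equal_error_rate(genuine, impostor):
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far, frr = far_frr(genuine, impostor, t)
        gap = abs(far - frr)
        if best is None or gap < best[0]:
            best = (gap, (far + frr) / 2, t)
    return best[1], best[2]  # (EER estimate, threshold)

genuine = [0.91, 0.85, 0.78, 0.88, 0.95, 0.70, 0.82]
impostor = [0.30, 0.45, 0.52, 0.61, 0.28, 0.75, 0.40]
eer, thr = equal_error_rate(genuine, impostor)
print(f"EER ~ {eer:.2%} at threshold {thr}")
```

A continuous system would recompute such scores over a sliding window of touch strokes rather than once per session; the threshold sweep itself is unchanged.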

    Investigation of dynamic three-dimensional tangible touchscreens: Usability and feasibility

    Get PDF
    The ability for touchscreen controls to move from two physical dimensions to three may soon be possible. Though solutions exist for enhanced tactile touchscreen interaction using vibrotactile devices, no definitive commercial solution yet exists for providing real, physical shape to the virtual buttons on a touchscreen display. Of the many next steps in interface technology, this paper concentrates on the path leading to tangible, dynamic touchscreen surfaces. An experiment was performed that explores the usage differences between a flat-surface touchscreen and one augmented with raised surface controls. The results were mixed. The combination of tactile-visual modalities had a negative effect on task completion time when visual attention was focused on a single task (single-target task time increased by 8% and serial-target task time increased by 6%). On the other hand, the dual modality had a positive effect on error rate when visual attention was divided between two tasks (the serial-target error rate decreased by 50%). In addition to the experiment, this study also investigated the feasibility of creating a dynamic, three-dimensional, tangible touchscreen. A new interface solution may be possible by inverting the traditional touchscreen architecture and integrating emerging technologies such as organic light-emitting diode (OLED) displays and electrorheological-fluid-based tactile pins.

    Exploiting behavioral biometrics for user security enhancements

    Get PDF
    As online business has become very popular in the past decade, the tasks of providing user authentication and verification have become more important than before to protect sensitive user information from malicious hands. The most common approach to user authentication and verification is the use of passwords. However, the dilemma users face with traditional passwords has become more and more evident: users tend to choose easy-to-remember passwords, which are often weak passwords that are easy to crack. Meanwhile, behavioral biometrics have promising potential to meet both security and usability demands, since they authenticate users by "who you are" instead of "what you have". In this dissertation, we first develop two such user verification applications based on behavioral biometrics: the first via mouse movements, and the second via tapping behaviors on smartphones; then we focus on modeling user web browsing behaviors by Fitts' law. Specifically, we develop a user verification system by exploiting the uniqueness of people's mouse movements. The key feature of our system lies in using much more fine-grained (point-by-point) angle-based metrics of mouse movements for user verification. These new metrics are relatively unique from person to person and independent of the computing platform. We conduct a series of experiments to show that the proposed system can verify a user in an accurate and timely manner, and the induced system overhead is minor. Similar to mouse movements, the tapping behaviors of smartphone users on a touchscreen also vary from person to person. We propose a non-intrusive user verification mechanism to substantiate whether an authenticating user is the true owner of the smartphone or an impostor who happens to know the passcode. The effectiveness of the proposed approach is validated through real experiments.
To further understand user pointing behaviors, we attempt to stress-test Fitts' law "in the wild", namely, under natural web browsing environments, instead of the restricted laboratory settings of previous studies. Our analysis shows that, while the averaged pointing times follow Fitts' law very well, there are considerable deviations from Fitts' law. We observe that, in natural browsing, a fast movement has a different error model from the other two movements. Therefore, a complete profiling of user pointing performance should be done in more detail, for example, constructing different error models for slow and fast movements. As future work, we plan to exploit multi-finger tapping for smartphone user verification, and to evaluate user privacy issues in Amazon wish lists.
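Fitts' law predicts pointing time from target distance D and width W, commonly via the Shannon formulation MT = a + b · log2(D/W + 1). A brief sketch of the prediction — the coefficients a and b below are illustrative placeholders, not values fitted in the dissertation:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time (seconds) via the Shannon formulation.

    a and b are device- and user-specific regression coefficients;
    the defaults here are illustrative placeholders, not fitted values.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A distant, small link is predicted to take longer than a near, wide button.
print(fitts_movement_time(distance=800, width=20))
print(fitts_movement_time(distance=100, width=100))
```

Profiling "in the wild", as the abstract describes, amounts to regressing observed pointing times against the index of difficulty and then examining where the residuals depart from this line.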

    Detecting Surface Interactions via a Wearable Microphone to Improve Augmented Reality Text Entry

    Get PDF
    This thesis investigates whether we can detect and distinguish between surface interaction events, such as tapping or swiping, using a wearable microphone. It also asks what advantages new text entry methods offer, such as tapping with two fingers simultaneously to enter capital letters and punctuation. For this purpose, we conducted a remote study to collect audio and video of three different ways people might interact with a surface. We also built a CNN classifier to detect taps. Our results show that we can detect and distinguish between surface interaction events such as tap or swipe via a wearable microphone on the user's head.
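The thesis uses a CNN for tap detection; as a much simpler illustration of the underlying signal property (taps appear as short, high-energy bursts in the microphone stream), here is a naive frame-energy detector — a hedged stand-in, not the thesis's classifier. The frame length and threshold factor are arbitrary assumptions:

```python
import statistics

# Naive tap detector: flag frames whose short-time energy jumps well
# above the median background level. This is a simplified stand-in for
# the CNN classifier described above; frame size and threshold factor
# are arbitrary assumptions.

def frame_energy(samples, frame_len=64):
    """Mean squared amplitude per non-overlapping frame."""
    return [sum(x * x for x in samples[i:i + frame_len]) / frame_len
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def detect_taps(samples, frame_len=64, factor=8.0):
    """Return indices of frames whose energy exceeds factor * median energy."""
    energies = frame_energy(samples, frame_len)
    background = statistics.median(energies)  # robust to the bursts themselves
    return [i for i, e in enumerate(energies) if e > factor * background]

# Synthetic signal: quiet noise with one loud burst in the third frame.
signal = [0.01] * 128 + [0.8] * 64 + [0.01] * 128
print(detect_taps(signal))  # -> [2]
```

Distinguishing a tap from a swipe additionally requires looking at burst duration and spectral shape, which is where a learned classifier such as the thesis's CNN earns its keep.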

    Creating Mobile Gesture-based Interaction Design Patterns for Older Adults: A Study of Tap and Swipe Gestures with Portuguese Seniors

    Get PDF
    Master's thesis. Multimedia. Faculdade de Engenharia, Universidade do Porto. 201

    A Study on a System Architecture for Supporting the Mobility of Visually Impaired People Using Smartphones

    Get PDF
    Degree type: Doctorate by coursework. Examination committee: (Chair) Professor Ken Sakamura, The University of Tokyo; Professor Noboru Koshizuka, The University of Tokyo; Professor Jun Rekimoto, The University of Tokyo; Professor Akihiro Nakao, The University of Tokyo; Professor Toru Ishikawa, The University of Tokyo. University of Tokyo (東京大学)