1,056 research outputs found

    Interaction techniques for older adults using touchscreen devices: a literature review

    Several studies have investigated different interaction techniques and input devices for older adults using touchscreens. This literature review analyses the populations involved, the kinds of tasks that were executed, the apparatus, the input techniques, the feedback provided, the data collected, and the authors' findings and recommendations. In conclusion, this review shows that age-related changes, previous experience with technologies, characteristics of handheld devices, and use situations need to be studied.

    Press-n-Paste: Copy-and-Paste Operations with Pressure-sensitive Caret Navigation for Miniaturized Surface in Mobile Augmented Reality

    Copy-and-paste operations are the most popular features on computing devices such as desktop computers, smartphones and tablets. However, copy-and-paste operations are not sufficiently addressed on Augmented Reality (AR) smartglasses designed for real-time interaction with text in physical environments. This paper proposes two system solutions, namely Granularity Scrolling (GS) and Two Ends (TE), for copy-and-paste operations on AR smartglasses. By leveraging a thumb-size button on a touch-sensitive and pressure-sensitive surface, both multi-step solutions can capture the target text through indirect manipulation and subsequently enable copy-and-paste operations. Based on these solutions, we implemented an experimental prototype named Press-n-Paste (PnP). After an eight-session evaluation capturing 1,296 copy-and-paste operations, 18 participants using GS and TE achieved peak performance of 17,574 ms and 13,951 ms per copy-and-paste operation, with accuracy rates of 93.21% and 98.15% respectively, which is as good as commercial solutions using direct manipulation on touchscreen devices. The user footprints also show that PnP has a distinctive feature: a miniaturized interaction area within 12.65 mm × 14.48 mm. PnP not only proves the feasibility of copy-and-paste operations with the flexibility of various granularities on AR smartglasses, but also has significant implications for the design space of pressure widgets as well as input design on smart wearables.
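    The abstract does not describe how Granularity Scrolling maps pressure to caret granularity. As a purely illustrative sketch, assuming a normalized pressure reading is bucketed into granularity levels (the thresholds, levels, and names below are assumptions, not taken from the paper), such a mapping might look like this:

        # Hypothetical sketch of a pressure-to-granularity mapping for caret navigation.
        # Thresholds, granularity levels, and function names are illustrative assumptions;
        # they are not taken from the Press-n-Paste paper.

        GRANULARITIES = ["character", "word", "sentence", "paragraph"]

        def granularity_for_pressure(pressure: float) -> str:
            """Map a normalized pressure value (0.0-1.0) to a caret-movement granularity."""
            if pressure < 0.25:
                return GRANULARITIES[0]   # light press: move caret by character
            elif pressure < 0.5:
                return GRANULARITIES[1]   # medium press: move by word
            elif pressure < 0.75:
                return GRANULARITIES[2]   # firm press: move by sentence
            return GRANULARITIES[3]       # hard press: move by paragraph

        # Example: a swipe at ~0.6 normalized pressure would scroll the caret by sentence.
        print(granularity_for_pressure(0.6))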

    UsaGame – A new methodology to support user-centered design of touchscreen game applications

    Dissertation for the degree of Master in Industrial Engineering and Management. The growth of touchscreen mobile devices has resulted in an explosion of mobile applications. Focusing on touchscreen mobile game applications, this study aims to fill a research gap by creating appropriate usability guidelines for these applications. Concerns about usability, touch technologies, mobile devices and game testing provided the background for this study. Initial game application tests allowed the proposed usability guidelines to be created and implemented in a support checklist (UsaGame) designed to help application developers. An evaluation test was performed with 20 users to assess the validity of the proposed guidelines. Results from testing two builds of the same game application allowed comparisons that assessed the importance of some of the guidelines implemented in the application. Results suggested a usability improvement in the build of the game application that implemented the guidelines. Furthermore, the results allowed comments on all of the proposed usability guidelines.

    Conservation of Limited Resources: Design Principles for Security and Usability on Mobile Devices

    Mobile devices have evolved from an accessory to the primary computing device for an increasing portion of the general population. Not only is mobile the primary device; consumers on average also have multiple Internet-connected devices. The trend towards mobile has resulted in a shift to “mobile-first” strategies for delivering information and services in business organizations, universities, and government agencies. Though principles for good security design exist, those principles were formulated based upon the traditional workstation configuration rather than the mobile platform. Security design needs to follow the shift to a “mobile-first” emphasis to ensure the usability of the security interface. The mobile platform has constraints on resources that can adversely impact the usability of security. This research sought to identify design principles for usable security on mobile devices that address the constraints of the mobile platform. Security and usability have been seen as mutually exclusive. To accurately identify design principles, the relationship between principles for good security design and usability design must be understood. The constraints of the mobile environment must also be identified and then evaluated for their impact on a consumer's interaction with a security interface. To understand how users perceive the application of the proposed mobile security design principles, an artifact was built to instantiate the principles. Through a series of guided interactions, the importance of the proposed design principles was measured in a simulation, in human-computer interaction, and in user perception. The measures showed a resounding difference in the usability of the same security design delivered on a mobile versus a workstation platform. The results also revealed that acknowledging the constraints of an environment and compensating for them yields mobile security that is both usable and secure. Finally, the hidden costs of security design choices that distract the user from the surrounding environment were examined from both a security and a public safety perspective.

    Improving everyday computing tasks with head-mounted displays

    The proliferation of consumer-affordable head-mounted displays (HMDs) has brought a rash of entertainment applications for this burgeoning technology, but relatively little research has been devoted to exploring its potential home and office productivity applications. Can the unique characteristics of HMDs be leveraged to improve users’ ability to perform everyday computing tasks? My work strives to explore this question. One significant obstacle to using HMDs for everyday tasks is the fact that the real world is occluded while wearing them. Physical keyboards remain the most performant devices for text input, yet using a physical keyboard is difficult when the user can’t see it. I developed a system for aiding users typing on physical keyboards while wearing HMDs and performed a user study demonstrating the efficacy of my system. Building on this foundation, I developed a window manager optimized for use with HMDs and conducted a user survey to gather feedback. This survey provided evidence that HMD-optimized window managers can provide advantages that are difficult or impossible to achieve with standard desktop monitors. Participants also provided suggestions for improvements and extensions to future versions of this window manager. I explored the issue of distance compression, wherein users tend to underestimate distances in virtual environments relative to the real world, which could be problematic for window managers or other productivity applications seeking to leverage the depth dimension through stereoscopy. I also investigated a mitigation technique for distance compression called minification. I conducted multiple user studies, providing evidence that minification makes users’ distance judgments in HMDs more accurate without causing detrimental perceptual side effects. This work also provided some valuable insight into the human perceptual system. Taken together, this work represents valuable steps toward leveraging HMDs for everyday home and office productivity applications. I developed functioning software for this purpose, demonstrated its efficacy through multiple user studies, and also gathered feedback for future directions by having participants use this software in simulated productivity tasks.

    The Effect of Tactile and Audio Feedback in Handheld Mobile Text Entry

    Effects of tactile and audio feedback are examined in the context of touchscreen and mobile use. Prior experimental research is graphically summarized by task type (handheld text entry, tabletop text entry, non-text input), tactile feedback type (active, passive), and significant findings, revealing a research gap in evaluating passive tactile feedback for handheld text entry (a.k.a. texting). A passive custom tactile overlay is evaluated in a new experiment wherein 24 participants perform a handheld text entry task on an iPhone under four tactile and audio feedback conditions, with measures of text entry speed and accuracy. Results indicate audio feedback produces better performance, while the tactile overlay degrades performance, consistent with the reviewed literature. Contrary to previous findings, the combined feedback condition did not produce improved performance. Findings are discussed in light of skill-based behavior and feed-forward control principles described by Gibson (1966) and Rasmussen (1983).

    WearPut: Designing Dexterous Wearable Input based on the Characteristics of Human Finger Motions

    Department of Biomedical Engineering (Human Factors Engineering). Powerful microchips for computing and networking allow a wide range of wearable devices to be miniaturized with high fidelity and availability. In particular, commercially successful smartwatches worn on the wrist drive market growth by sharing the roles of smartphones and health management. The emerging Head-Mounted Displays (HMDs) for Augmented Reality (AR) and Virtual Reality (VR) also impact various application areas in video games, education, simulation, and productivity tools. However, these powerful wearables face challenges in interaction because the specialized form factors that fit the body leave inevitably limited space for input and output. To compensate for the constrained interaction experience, many wearable devices still rely on other, larger form factor devices (e.g., smartphones or hand-held controllers). Despite their usefulness, these additional devices can constrain the viability of wearables in many usage scenarios by tethering users' hands to the physical devices. This thesis argues that developing novel human-computer interaction techniques for specialized wearable form factors is vital for wearables to be reliable standalone products.

    This thesis seeks to address the issue of constrained interaction experience with novel interaction techniques that exploit finger motions during input for the specialized form factors of wearable devices. Several characteristics of finger input motions promise to increase the expressiveness of input on the physically limited input space of wearable devices. First, finger input techniques are prevalent on many large form factor devices (e.g., touchscreens or physical keyboards) due to their fast and accurate performance and high familiarity. Second, many commercial wearable products provide built-in sensors (e.g., a touchscreen or hand tracking system) to detect finger motions, which enables the implementation of novel interaction systems without any additional sensors or devices. Third, the specialized form factors of wearable devices can create unique input contexts as the fingers approach their locations, shapes, and components. Finally, the dexterity of fingers, with their distinctive appearance, high degrees of freedom, and high sensitivity of joint-angle perception, has the potential to widen the range of input available through various movement features on the surface and in the air. Accordingly, the general claim of this thesis is that understanding how users move their fingers during input will enable increases in the expressiveness of the interaction techniques we can create for resource-limited wearable devices.

    This thesis demonstrates the general claim by providing evidence in various wearable scenarios with smartwatches and HMDs. First, this thesis explored the comfort range of static and dynamic touch input at different angles on the touchscreen of smartwatches. The results showed specific comfort ranges that vary with the finger, finger region, and pose, due to the unique input context in which the touching hand approaches a small, fixed touchscreen within a limited range of angles. Finger region-aware systems that recognize the flat and side of the finger were then constructed based on the contact areas on the touchscreen to enhance the expressiveness of angle-based touch input. In the second scenario, this thesis revealed distinctive touch profiles of different fingers caused by the unique input context of the smartwatch touchscreen. The results led to the implementation of finger identification systems for distinguishing two or three fingers, and two virtual keyboards with 12 and 16 keys showed the feasibility of touch-based finger identification that increases the expressiveness of touch input techniques.

    In addition, this thesis supports the general claim with a range of wearable scenarios by exploring finger input motions in the air. In the third scenario, this thesis investigated the motions of in-air finger stroking during unconstrained in-air typing for HMDs. The observation study revealed details of in-air finger motions during fast sequential input, such as strategies, kinematics, correlated movements, inter-finger-stroke relationships, and individual in-air keys. The in-depth analysis led to a practical guideline for developing robust in-air typing systems with finger stroking. Lastly, this thesis examined the viable locations of in-air thumb touch input on virtual targets above the palm. It confirmed that fast and accurate sequential thumb touches can be achieved at a total of 8 key locations using the built-in hand tracking system of a commercial HMD. Final typing studies with a novel in-air thumb typing system verified increases in the expressiveness of virtual target selection on HMDs.

    This thesis argues that the objective and subjective results and the novel interaction techniques across these wearable scenarios support the general claim that understanding how users move their fingers during input will enable increases in the expressiveness of the interaction techniques we can create for resource-limited wearable devices. Finally, this thesis concludes with its contributions, design considerations, and the scope of future research, to help future researchers and developers implement robust finger-based interaction systems on various types of wearable devices.
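    The abstract does not specify how the finger region-aware systems distinguish the flat of the finger from its side. As a purely illustrative sketch, assuming the touchscreen reports a contact-area estimate per touch (the feature, threshold, and names below are assumptions, not taken from the thesis), a minimal contact-area classifier might look like this:

        # Hypothetical sketch of contact-area-based finger-region classification.
        # The feature (contact area), threshold, and names are illustrative assumptions;
        # they are not taken from the WearPut thesis.

        def classify_finger_region(contact_area_mm2: float,
                                   flat_threshold_mm2: float = 40.0) -> str:
            """Classify a touch as the flat or the side of the finger from its contact area.

            A flat (pad) touch typically covers a larger area than an edge (side) touch,
            so a single threshold on the reported contact area can separate the two.
            """
            return "flat" if contact_area_mm2 >= flat_threshold_mm2 else "side"

        # Example: a 55 mm^2 contact is labelled "flat", a 20 mm^2 contact "side".
        print(classify_finger_region(55.0), classify_finger_region(20.0))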