686 research outputs found
Nomadic input on mobile devices: the influence of touch input technique and walking speed on performance and offset modeling
In everyday life people use their mobile phones on-the-go at different walking speeds and with different touch input techniques. Unfortunately, much of the published research in mobile interaction does not quantify the influence of these variables. In this paper, we analyze the influence of walking speed, gait pattern and input technique on commonly used performance parameters such as error rate, accuracy and tapping speed, and we compare the results to the static condition. We examine the influence of these factors on the machine-learned offset model used to correct user input, and we make design recommendations. The results show that all performance parameters degraded when the subject started to move, for all input techniques. Index-finger pointing techniques demonstrated better overall performance than thumb-pointing techniques. The influence of gait phase on tap event likelihood and accuracy was demonstrated for all input techniques and all walking speeds. Finally, it was shown that the offset model built on static data did not perform as well as models inferred from dynamic data, which indicates the speed-specific nature of the models. Also, models identified using specific input techniques did not perform well when tested in other conditions, demonstrating that offset models are valid only for a particular input technique. The model was therefore calibrated using data recorded with the appropriate input technique, at 75% of preferred walking speed, which is the speed to which users spontaneously slow down when they use a mobile device and which presents a tradeoff between accuracy and usability. This led to an increase in accuracy compared to models built on static data. The error rate was reduced by between 0.05% and 5.3% for landscape-based methods and between 5.3% and 11.9% for portrait-based methods.
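The abstract does not specify the form of the offset model; a minimal sketch of one common approach from the touch-offset literature, a linear least-squares model mapping raw tap positions to intended target positions (all data and the constant offset below are hypothetical), might look like:

```python
import numpy as np

def fit_offset_model(taps, targets):
    """Fit a linear model mapping raw tap positions to intended
    target positions by least squares.
    taps, targets: (n, 2) arrays of (x, y) screen coordinates."""
    # Augment taps with a bias column so the model can learn a
    # constant offset in addition to a linear scaling.
    X = np.hstack([taps, np.ones((len(taps), 1))])
    W, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return W  # (3, 2) weight matrix

def correct_tap(W, tap):
    """Apply the fitted model to a single raw tap (x, y)."""
    return np.array([tap[0], tap[1], 1.0]) @ W

# Hypothetical calibration data: the intended targets are offset
# from the raw taps by a constant (+5, -10) px.
taps = np.array([[100.0, 105.0], [200.0, 190.0], [300.0, 320.0]])
targets = taps + np.array([5.0, -10.0])
W = fit_offset_model(taps, targets)
```

Per the abstract's findings, such a model would be fit separately per input technique and walking speed, since models calibrated in one condition transferred poorly to others.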
Study of Touch Gesture Performance by Four and Five Year-Old Children: Point-and-Touch, Drag-and-Drop, Zoom-in and Zoom-out, and Rotate
Past research has focused on children's interaction with computers through mouse clicks, and mouse research studies focused on point-and-click and drag-and-drop. However, more research is necessary regarding children's ability to perform touch gestures such as point-and-touch, drag-and-drop, zoom-in and zoom-out, and rotate. Furthermore, research should consider specific gestures such as zoom-in, zoom-out, and rotate tasks for young children. The aim of this thesis is to study the ability of 4 and 5 year-old children to interact with touch devices and perform tasks such as point-and-touch, drag-and-drop, zoom-in and zoom-out, and rotate. This thesis tests an iPad application with four experiments on 17 four and five-year-old children, 16 without motor impairment and 1 with a motor impairment disability. The results show that 5-year-old children perform better than 4-year-old children in the four experiments. Results indicate that interaction design for young children using point-and-touch gestures should consider the distance between targets, and designs using drag-and-drop gestures should consider the size of targets, as these have significant effects on the way children perform these gestures. Also, designers should consider size and rotation direction in rotate tasks, as young children rotate objects clockwise more smoothly. The results of the four touch gesture tasks show that time was not an important factor in children's performance.
Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices
This work poses the question "Could eye and head based assistive technology device
interaction performance approach that of basic hand mouse interaction?" To this aim, the
work constructs, validates, and applies a detailed and comprehensive pointing device
assessment method suitable for assistive technology direct pointing devices; it then uses
this method to enhance these devices, and finally demonstrates that such enhanced eye
or head based pointing can approach basic hand mouse interaction and be a viable and
usable interaction method for people with high-level motor disabilities.
Eye and head based pointing devices, or eye and head mice, are often used by people
with high-level motor disabilities to enable computer interaction in place of a standard
desktop hand mouse. The performance of these devices when used for direct manipulation
on a standard graphical user interface has generally been regarded as poor in comparison
to that of a standard desktop hand mouse, thus putting users of head and eye mice at a
disadvantage when interacting with computers.
The performance of eye and head based pointing devices during direct manipulation on a
standard graphical user interface has not previously been investigated in depth, and the
reasons why these devices seem to demonstrate poor performance have not been
determined in detail. Few proven methods have been demonstrated and investigated that
enhance the performance of these devices based on their performance during direct
manipulation. Importantly, and key to this work, no validated assessment method has
been constructed to allow such an investigation.
This work seeks to investigate the performance of eye and head based pointing devices
during direct manipulation by constructing and verifying a test method suitable for the
detailed performance assessment of eye and head based assistive technology pointing
devices. It then uses this method to determine the factors influencing the performance of eye and head mice during direct manipulation. Finally, after identifying these factors, this
work hypothesises, and then demonstrates that applying suitable methods for addressing
these factors can result in enhanced performance for eye and head mice. It shows that,
for highly experienced users, the performance of these enhanced devices, together with a
supporting modality for object manipulation and an interface enhancement for object size
magnification, can approach the performance of standard desktop hand mice; thus
demonstrating that these devices can approach and equal the performance of basic hand
mouse interaction.
Characterizing the Effects of Local Latency on Aim Performance in First Person Shooters
Real-time games such as first-person shooters (FPS) are sensitive to even small amounts of lag. The effects of network latency have been studied, but less is known about local latency -- that is, the lag caused by local sources such as input devices, displays, and the application. While local latency is important to gamers, we do not know how it affects aiming performance and whether we can reduce its negative effects. To explore these issues, we tested local latency in a variety of real-world gaming systems and carried out a controlled study focusing on targeting and tracking activities in an FPS game with varying degrees of local latency. In addition, we tested the ability of a lag compensation technique (based on aim assistance) to mitigate the negative effects. To motivate the need for these studies, we also examined how aim in FPS differs from pointing in standard 2D tasks, showing significant differences in performance metrics. Our studies found local latencies in real-world gaming systems ranging from 23 to 243 ms, which cause significant and substantial degradation in performance (even for latencies as low as 41 ms). The studies also showed that our compensation technique worked well, reducing the problems caused by lag in the case of targeting, and removing the problem altogether in the case of tracking. Our work shows that local latency is a real and substantial problem -- but game developers can mitigate the problem with appropriate compensation methods.
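The abstract does not detail its compensation technique; a hypothetical illustration of one common form of aim assistance, reducing aim sensitivity as the crosshair nears a target so that small corrective movements overshoot less under latency (the function, radius, and gain values below are assumptions, not the study's method), could be sketched as:

```python
def assisted_delta(raw_delta, crosshair, target,
                   assist_radius=50.0, min_gain=0.4):
    """Scale a raw aim movement (dx, dy) down when the crosshair is
    near the target, making fine corrections easier under latency.
    Positions are 2D screen coordinates; values are illustrative."""
    dx = target[0] - crosshair[0]
    dy = target[1] - crosshair[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist >= assist_radius:
        return raw_delta  # outside the assist zone: no change
    # Linearly interpolate gain from min_gain (on target) up to
    # 1.0 (at the edge of the assist zone).
    gain = min_gain + (1.0 - min_gain) * (dist / assist_radius)
    return (raw_delta[0] * gain, raw_delta[1] * gain)
```

A design like this trades raw speed for stability near the target, which matches the abstract's finding that compensation helped most for tracking, where continuous fine corrections dominate.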
Experimental Analysis of a Spatialised Audio Interface for People with Visual Impairments
Sound perception is a fundamental skill for many people with severe sight impairments. The research presented in this paper is part of an ongoing project with the aim to create a mobile guidance aid to help people with vision impairments find objects within an unknown indoor environment. This system requires an effective non-visual interface and uses bone-conduction headphones to transmit audio instructions to the user. It has been implemented and tested with spatialised audio cues, which convey the direction of a predefined target in 3D space. We present an in-depth evaluation of the audio interface with several experiments that involve a large number of participants, both blindfolded and with actual visual impairments, and analyse the pros and cons of our design choices. In addition to producing results comparable to the state-of-the-art, we found that Fitts's Law (a predictive model for human movement) provides a suitable metric that can be used to improve and refine the quality of the audio interface in future mobile navigation aids.
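Fitts's Law, mentioned above, predicts movement time from target distance and width; in its common Shannon formulation, MT = a + b * log2(D/W + 1). A small sketch (the coefficients a and b below are illustrative placeholders; real values are fit empirically per device and task):

```python
import math

def fitts_mt(distance, width, a=0.2, b=0.15):
    """Predicted movement time in seconds under Fitts's Law,
    Shannon formulation: MT = a + b * log2(D/W + 1).
    a (intercept) and b (slope) are illustrative, not fitted."""
    index_of_difficulty = math.log2(distance / width + 1)  # bits
    return a + b * index_of_difficulty
```

Doubling the distance to a target, or halving its width, raises the index of difficulty by roughly one bit, which is what makes the model useful as a comparative metric across interface designs.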
Natural User Interface Usability Research in Context of Curved Displays Systems
Continuous development of information technologies makes us review existing rules and recommendations designed to improve the efficiency of IT use, to ensure optimal working conditions for users, to increase productivity and security, and to protect human health.
Relevant research in the field of computer engineering is performed in this dissertation. The thesis analyzes natural user interfaces and their usability (the efficiency, productivity and satisfaction with which a particular user can reach specific goals in a specific environment) for performing various functions. This dissertation examines the factors that determine usability and how usability is influenced by a curved display. The problem is relevant, and the stated goal and objectives are new from a scientific point of view. First, the thesis examines how to improve working conditions by developing the graphical user interface of information systems. Secondly, the influence of information presentation on a person performing general and domain-specific tasks using a graphical user interface is examined. As there is no common opinion on how to create natural user interfaces and no definite set of parameters that determine usability, the experimental research performed is an important contribution to the solution of these problems.
Exploring Users' Pointing Performance on Virtual and Physical Large Curved Displays
Large curved displays have emerged as a powerful platform for collaboration,
data visualization, and entertainment. These displays provide highly immersive
experiences, a wider field of view, and higher satisfaction levels. Yet, large
curved displays are not commonly available due to their high costs. With the
recent advancement of Head Mounted Displays (HMDs), large curved displays can
be simulated in Virtual Reality (VR) with minimal cost and space requirements.
However, to consider the virtual display as an alternative to the physical
display, it is necessary to uncover user performance differences (e.g.,
pointing speed and accuracy) between these two platforms. In this paper, we
explored users' pointing performance on both physical and virtual large curved
displays. Specifically, with two studies, we investigate users' performance
between the two platforms for standard pointing factors such as target width,
target amplitude as well as users' position relative to the screen. Results
from user studies reveal no significant difference in pointing performance
between the two platforms when users are located at the same position relative
to the screen. In addition, we observe users' pointing performance improves
when they are located at the center of a semi-circular display compared to
off-centered positions. We conclude by outlining design implications for
pointing on large curved virtual displays. These findings show that large
curved virtual displays are a viable alternative to physical displays for
pointing tasks.
(In the 29th ACM Symposium on Virtual Reality Software and Technology, VRST 2023.)