Observations on typing from 136 million keystrokes
We report on typing behaviour and performance of 168,000 volunteers in an online study. The large dataset allows detailed statistical analyses of keystroking patterns, linking them to typing performance. Besides reporting distributions and confirming some earlier findings, we report two new findings. First, letter pairs typed by different hands or fingers are more predictive of typing speed than, for example, letter repetitions. Second, rollover-typing, wherein the next key is pressed before the previous one is released, is surprisingly prevalent. Notwithstanding considerable variation in typing patterns, unsupervised clustering using normalised inter-key intervals reveals that most users can be divided into eight groups of typists that differ in performance, accuracy, hand and finger usage, and rollover. The code and dataset are released for scientific use.
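Rollover as described above (the next key going down before the previous key comes up) can be measured directly from a timestamped key-event log. A minimal sketch, assuming a hypothetical event format of `(time_ms, key, action)` tuples rather than the study's actual log format:

```python
# Detect rollover from a timestamped keystroke log.
# Event format is an assumption: (time_ms, key, "down"/"up") tuples.
def rollover_pairs(events):
    """Count consecutive key pairs where the second key goes down
    before the first key has been released (rollover typing)."""
    held = {}            # keys currently pressed -> key-down time
    last_down_key = None
    rollover, pairs = 0, 0
    for t, key, action in sorted(events):
        if action == "down":
            if last_down_key is not None:
                pairs += 1
                if last_down_key in held:  # previous key still pressed
                    rollover += 1
            held[key] = t
            last_down_key = key
        else:
            held.pop(key, None)
    return rollover, pairs

# "t"->"h" overlaps (rollover); "h"->"e" does not.
events = [(0, "t", "down"), (80, "h", "down"), (95, "t", "up"),
          (160, "h", "up"), (200, "e", "down"), (260, "e", "up")]
print(rollover_pairs(events))  # (1, 2)
```

Normalising the resulting inter-key intervals per typist (e.g. dividing by the typist's mean interval) is what makes clusters comparable across slow and fast typists.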
Fast and Robust Hand Tracking Using Detection-Guided Optimization
Markerless tracking of hands and fingers is a promising enabler for human-computer interaction. However, adoption has been limited because of tracking inaccuracies, incomplete coverage of motions, low framerate, complex camera setups, and high computational requirements. In this paper, we present a fast method for accurately tracking rapid and complex articulations of the hand using a single depth camera. Our algorithm uses a novel detection-guided optimization strategy that increases the robustness and speed of pose estimation. In the detection step, a randomized decision forest classifies pixels into parts of the hand. In the optimization step, a novel objective function combines the detected part labels and a Gaussian mixture representation of the depth to estimate a pose that best fits the depth. Our approach requires comparatively few computational resources, which makes it extremely fast (50 fps without GPU support). The approach also supports varying static, or moving, camera-to-scene arrangements. We show the benefits of our method by evaluating it on public datasets and comparing against previous work.
The Emergence of Interactive Behaviour: A Model of Rational Menu Search
One reason that human interaction with technology is difficult to understand is that the way in which people perform interactive tasks is highly adaptive. One such interactive task is menu search. In the current article we test the hypothesis that menu search is rationally adapted to (1) the ecological structure of interaction, (2) cognitive and perceptual limits, and (3) the goal to maximise the trade-off between speed and accuracy. Unlike in previous models, no assumptions are made about the strategies available to or adopted by users; rather, the menu search problem is specified as a reinforcement learning problem and behaviour emerges by finding the optimal policy. The model is tested against existing empirical findings concerning the effect of menu organisation and menu length. The model predicts the effect of these variables on task completion time and eye movements. The discussion considers the pros and cons of the modelling approach relative to other well-known modelling approaches.
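The core idea, behaviour emerging as the optimal policy of a reinforcement learning problem rather than a hand-coded strategy, can be illustrated with a deliberately tiny sketch. The menu environment, costs, and tabular Q-learning below are illustrative assumptions, not the paper's actual model:

```python
import random

# Toy menu search as RL (hypothetical simplification): state = currently
# fixated item; actions = fixate another item or select the fixated one.
# Each fixation costs time; selecting the target ends the episode.
random.seed(0)
MENU_LEN, TARGET = 5, 3
ACTIONS = list(range(MENU_LEN)) + ["select"]

def step(state, action):
    if action == "select":
        return None, (10.0 if state == TARGET else -10.0)  # terminal
    return action, -1.0  # time cost of a fixation

Q = {(s, a): 0.0 for s in range(MENU_LEN) for a in ACTIONS}
for _ in range(5000):
    s = random.randrange(MENU_LEN)
    while s is not None:
        # epsilon-greedy action selection
        a = (random.choice(ACTIONS) if random.random() < 0.2
             else max(ACTIONS, key=lambda a: Q[s, a]))
        s2, r = step(s, a)
        target = r if s2 is None else r + 0.9 * max(Q[s2, b] for b in ACTIONS)
        Q[s, a] += 0.5 * (target - Q[s, a])
        s = s2

# The learned policy selects when fixating the target and moves
# toward it otherwise; no search strategy was specified by hand.
print(max(ACTIONS, key=lambda a: Q[TARGET, a]))
```

The emergent policy is the point: speed-accuracy trade-offs fall out of the reward structure instead of being assumed, which is the property the abstract contrasts with earlier models.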
How do people type on mobile devices? Observations from a study with 37,000 volunteers
This paper presents a large-scale dataset on mobile text entry collected via a web-based transcription task performed by 37,370 volunteers. The average typing speed was 36.2 WPM with 2.3% uncorrected errors. The scale of the data enables powerful statistical analyses on the correlation between typing performance and various factors, such as demographics, finger usage, and use of intelligent text entry techniques. We report effects of age and finger usage on performance that correspond to previous studies. We also find evidence of relationships between performance and use of intelligent text entry techniques: auto-correct usage correlates positively with entry rates, whereas word prediction usage has a negative correlation. To aid further work on modeling, machine learning and design improvements in mobile text entry, we make the code and dataset openly available.
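The reported figures (36.2 WPM, 2.3% uncorrected errors) follow conventions that are standard in text-entry research. A sketch of how such transcription-task metrics are commonly computed; the formulas are the usual conventions, not the study's released code:

```python
# Conventional text-entry metrics (assumed, not the paper's exact code).
def wpm(transcribed, seconds):
    """Words per minute, with one 'word' defined as 5 characters."""
    return (len(transcribed) - 1) / 5.0 * (60.0 / seconds)

def uncorrected_error_rate(presented, transcribed):
    """Levenshtein distance over the longer string, as a percentage:
    errors left in the final transcription, not corrected ones."""
    m, n = len(presented), len(transcribed)
    d = list(range(n + 1))
    for i in range(1, m + 1):
        prev, d[0] = d[0], i
        for j in range(1, n + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,          # deletion
                       d[j - 1] + 1,      # insertion
                       prev + (presented[i - 1] != transcribed[j - 1]))
            prev = cur
    return 100.0 * d[n] / max(m, n)

print(round(wpm("the quick brown fox", 6.0), 1))   # 36.0
print(uncorrected_error_rate("hello", "hwllo"))    # 20.0
```

Corrected errors (mistakes fixed with backspace during entry) require the full input stream rather than just the final string, which is one reason releasing the raw dataset matters.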
Describing UI Screenshots in Natural Language
Being able to describe any user interface (UI) screenshot in natural language can promote understanding of the main purpose of the UI, yet currently it cannot be accomplished with state-of-the-art captioning systems. We introduce XUI, a novel method inspired by the global precedence effect to create informative descriptions of UIs, starting with an overview and then providing fine-grained descriptions about the most salient elements. XUI builds upon computational models for topic classification, visual saliency prediction, and natural language generation (NLG). XUI provides descriptions with up to three different granularity levels that, together, describe what is in the interface and what the user can do with it. We found that XUI descriptions are highly readable, are perceived to accurately describe the UI, and score similarly to human-generated UI descriptions. XUI is available as open-source software.
The influence of age and gender in the interaction with touch screens
Touch screens are nowadays one of the major interfaces in the interaction between humans and technology, mostly due to the significant growth in the use of smartphones and tablets in recent years. This broad use, which reaches people from all strata of society, makes touch screens a relevant tool to study the mechanisms that influence the way we interact with electronic devices. In this paper we collect data regarding the interaction patterns of different users with mobile devices. We present a way to formalize these interaction patterns and analyze how aspects such as age and gender influence them. The results of this research may be relevant for developing mobile applications that identify and adapt to users or their characteristics, including impairments in fine motor skills or in cognitive function.