193 research outputs found

    Investigating how the hand interacts with different mobile phones

    In this paper we investigate the physical interaction between the hand and three types of mobile device input: touchscreen, physical keyboard and stylus. Through a controlled study using video observational analysis, we observed, first, how the participants gripped the three devices and how these grips were device dependent. Second, we looked closely at these grips to uncover how participants performed what we call micro-movements to facilitate a greater range of interaction, e.g. reaching across the keyboard. The results extend current knowledge by comparing three handheld device input methods and observing the movements the hand makes in five grips. The paper concludes by describing the development of a conceptual design, proposed as a provocation to open a dialogue on how we conceive of hand usage and how it might be optimized when designing for mobile devices.

    Kindertivity: Usability and Communicability Strategies for Interactive Surfaces and Pre-Kindergarten Children

    Thesis by compendium. Multi-touch technology has become one of the most emergent technologies, growing enormously from its initial steps in the eighties to its widespread acceptance and use today. On the one hand, multi-touch technology relies on the direct manipulation interaction style, which gives users the advantage of seeing the objects and actions of interest, replaces typed commands with pointing actions, and allows rapid, reversible and incremental actions to be performed without complex instructions. On the other hand, several works have evaluated the virtues of combining direct manipulation with direct touch, showing that it avoids the problems inherent in other interaction devices, such as the mouse or keyboard. Hence, with the intuitive and natural interaction it provides, multi-touch technology seems an ideal way to support educational scenarios targeted at kindergarten children. However, although several works have assessed the suitability of the direct manipulation style for children, there is a lack of work addressing the use of touchscreen devices by this specific type of user. Moreover, there is a growing trend of designing educational and playful applications for kindergarten children based on touchscreen devices such as smartphones and tablets, and several reports point out that children are frequent users of such devices even before they are able to speak. Despite this growth in children's use of multi-touch technology and its apparent suitability for developing applications targeted at young children, there are no standardized, universally accepted interactions for young children on touchscreen devices, since only two basic gestures are commonly used (essentially, a one-finger touch for selection and a one-finger drag for movement). Hence, there is a need to carry out empirical studies that help advance the design of applications that adequately support and fit children's development and skills. Therefore, this thesis proposes, designs and evaluates several usability and communicability strategies tailored to children in their early development stage, to establish the basis for the design and development of future touchscreen applications targeted at kindergarten children. These strategies will lead to the definition of appropriate design guidelines that enable young children to take full advantage of multi-touch technology, make it possible to develop attractive new applications and, eventually, could also aid children's cognitive and motor development.
    Nácher Soler, VE. (2019). Kindertivity: Usability and Communicability Strategies for Interactive Surfaces and Pre-Kindergarten Children [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/116833
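
    To make the "two basic gestures" point above concrete, the sketch below separates a one-finger tap (selection) from a one-finger drag (movement) using a simple displacement threshold. The threshold value and the input representation are illustrative assumptions, not details taken from the thesis.

        import math

        def classify_basic_gesture(touch_points, drag_threshold_px=20.0):
            """Label a single-finger touch as 'tap' (selection) or 'drag' (movement)
            from a list of (x, y) samples. The pixel threshold is an illustrative
            assumption rather than a value reported in the thesis."""
            (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
            displacement = math.hypot(x1 - x0, y1 - y0)
            return "drag" if displacement >= drag_threshold_px else "tap"

        print(classify_basic_gesture([(100, 100), (103, 101)]))  # tap: finger barely moved
        print(classify_basic_gesture([(100, 100), (180, 140)]))  # drag: clear displacement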

    Understanding Grip Shifts: How Form Factors Impact Hand Movements on Mobile Phones

    In this paper we present an investigation into how hand usage is affected by different mobile phone form factors. Our initial (qualitative) study explored how users interact with various mobile phone types (touchscreen, physical keyboard and stylus). The analysis of the videos revealed that each type of mobile phone affords specific handgrips and that the user shifts these grips, and consequently the tilt and rotation of the phone, depending on the context of interaction. In order to further investigate the tilt and rotation effects, we conducted a controlled quantitative study in which we varied the size of the phone and the type of grip (Symmetric bimanual, Asymmetric bimanual with finger, Asymmetric bimanual with thumb and Single handed) to better understand how they affect the tilt and rotation during a dual pointing task. The results showed that the size of the phone does have an effect and that the distance needed to reach action items affects the phone's tilt and rotation. Additionally, we found that the amount of tilt, rotation and reach required corresponded with the participants' grip preferences. We finish the paper by discussing the design lessons for mobile UI and proposing design guidelines and applications for these insights.
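
    The tilt and rotation discussed above can be approximated from a phone's built-in motion sensors. Below is a minimal sketch of one common approach, deriving the two tilt angles from the accelerometer's gravity vector; the study's actual instrumentation is not described in the abstract, so the function and its axis conventions are assumptions for illustration.

        import math

        def tilt_from_gravity(ax, ay, az):
            """Estimate how far a phone is tilted from lying flat, using the gravity
            vector (m/s^2) reported by its accelerometer: the rotation around the
            device's x-axis (top edge raised or lowered) and around its y-axis
            (side edge raised or lowered). One common way to log grip-related tilt."""
            around_x = math.degrees(math.atan2(ay, az))
            around_y = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
            return around_x, around_y

        # Phone lying flat, screen up: gravity is along +z, so no tilt on either axis.
        print(tilt_from_gravity(0.0, 0.0, 9.81))  # (0.0, 0.0)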

    WearPut: Designing Dexterous Wearable Input based on the Characteristics of Human Finger Motions

    Department of Biomedical Engineering (Human Factors Engineering)
    Powerful microchips for computing and networking allow a wide range of wearable devices to be miniaturized with high fidelity and availability. In particular, commercially successful smartwatches worn on the wrist drive market growth by sharing the roles of smartphones and health management. The emerging Head Mounted Displays (HMDs) for Augmented Reality (AR) and Virtual Reality (VR) also impact various application areas such as video games, education, simulation, and productivity tools. However, these powerful wearables face interaction challenges because their specialized form factors, shaped to fit body parts, inevitably limit the space available for input and output. To complement the constrained interaction experience, many wearable devices still rely on other large form factor devices (e.g., smartphones or hand-held controllers). Despite their usefulness, these additional interaction devices can constrain the viability of wearables in many usage scenarios by tethering users' hands to physical devices. This thesis argues that developing novel human-computer interaction techniques for specialized wearable form factors is vital for wearables to become reliable standalone products. It seeks to address the constrained interaction experience with novel interaction techniques that exploit finger motions during input on the specialized form factors of wearable devices. Several characteristics of finger input motions are promising for increasing the expressiveness of input on the physically limited input space of wearable devices. First, finger input techniques are prevalent on many large form factor devices (e.g., touchscreens or physical keyboards) thanks to their fast and accurate performance and high familiarity. Second, many commercial wearable products provide built-in sensors (e.g., a touchscreen or hand tracking system) to detect finger motions, which enables the implementation of novel interaction systems without any additional sensors or devices. Third, the specialized form factors of wearable devices can create unique input contexts as the fingers approach their locations, shapes, and components. Finally, the dexterity of the fingers, with their distinctive appearance, high degrees of freedom, and high sensitivity of joint angle perception, has the potential to widen the range of input available through various movement features on the surface and in the air. Accordingly, the general claim of this thesis is that understanding how users move their fingers during input will enable increases in the expressiveness of the interaction techniques we can create for resource-limited wearable devices. This thesis demonstrates the general claim by providing evidence in various wearable scenarios with smartwatches and HMDs. First, it explored the comfort range of static and dynamic angle-based touch input on the touchscreen of smartwatches. The results showed specific comfort ranges across variations in fingers, finger regions, and poses, owing to the unique input context in which the touching hand approaches a small, fixed touchscreen within a limited range of angles. Then, finger region-aware systems that recognize the flat and the side of the finger were constructed based on the contact areas on the touchscreen to enhance the expressiveness of angle-based touch input.
    In the second scenario, this thesis revealed distinctive touch profiles of different fingers caused by the unique input context of the smartwatch touchscreen. The results led to the implementation of finger identification systems for distinguishing two or three fingers. Two virtual keyboards with 12 and 16 keys showed the feasibility of touch-based finger identification, which enables increases in the expressiveness of touch input techniques. In addition, this thesis supports the general claim in a further range of wearable scenarios by exploring finger input motions in the air. In the third scenario, it investigated the motions of in-air finger stroking during unconstrained in-air typing for HMDs. The results of the observation study revealed details of in-air finger motions during fast sequential input, such as strategies, kinematics, correlated movements, inter-finger-stroke relationships, and individual in-air keys. The in-depth analysis led to a practical guideline for developing robust in-air typing systems based on finger stroking. Lastly, this thesis examined the viable locations of in-air thumb touch input on virtual targets above the palm. It was confirmed that fast and accurate sequential thumb touches can be achieved at a total of 8 key locations with the built-in hand tracking system of a commercial HMD. Final typing studies with a novel in-air thumb typing system verified increases in the expressiveness of virtual target selection on HMDs. This thesis argues that the objective and subjective results, together with the novel interaction techniques across these wearable scenarios, support the general claim that understanding how users move their fingers during input will enable increases in the expressiveness of the interaction techniques we can create for resource-limited wearable devices. Finally, the thesis concludes with its contributions, design considerations, and the scope of future work, to help future researchers and developers implement robust finger-based interaction systems on various types of wearable devices.
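
    The finger region-aware idea above (telling the flat of the finger from its side by contact area) can be sketched with the contact ellipse that most touchscreens already report. The area threshold and the ellipse-based model are illustrative assumptions; the thesis's actual classifier is not described in this abstract.

        import math

        def classify_finger_region(touch_major_mm, touch_minor_mm,
                                   flat_area_threshold_mm2=55.0):
            """Label a smartwatch touch as made with the flat (pad) or the side of
            the finger, approximating the reported contact patch as an ellipse.
            The threshold is a placeholder to be calibrated per user and device."""
            contact_area = math.pi * (touch_major_mm / 2.0) * (touch_minor_mm / 2.0)
            return "flat" if contact_area >= flat_area_threshold_mm2 else "side"

        print(classify_finger_region(11.0, 8.0))  # large patch -> "flat"
        print(classify_finger_region(9.0, 5.0))   # narrow patch -> "side"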

    Investigating How Smartphone Movement is Affected by Body Posture

    A computational approach to gestural interactions of the upper limb on planar surfaces

    There are many compelling reasons for proposing new gestural interactions: one might want to use a novel sensor that affords access to data that couldn't previously be captured, or transpose a well-known task into a different, unexplored scenario. After an initial design phase, the creation, optimisation or understanding of new interactions remains, however, a challenge. Models have been used to foresee interaction properties: Fitts' law, for example, accurately predicts movement time in pointing and steering tasks. But what happens when no existing models apply? The core assertion of this work is that a computational approach provides the frameworks and associated tools needed to model such interactions. This is supported through three research projects, in which discriminative models are used to enable interactions, optimisation is included as an integral part of their design, and reinforcement learning is used to explore the motions users produce in such interactions.
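
    For readers unfamiliar with the model cited above, the sketch below shows the Shannon formulation of Fitts' law, which predicts movement time from target distance and width. The coefficients are placeholders fitted from pointing data in practice; the thesis's actual formulation and fitted values are not given in this abstract.

        import math

        def fitts_movement_time(distance, width, a=0.2, b=0.1):
            """Predict movement time in seconds with the Shannon formulation of
            Fitts' law, MT = a + b * log2(D / W + 1). The intercept a and slope b
            are illustrative placeholders; in practice they are obtained by linear
            regression on observed pointing times."""
            index_of_difficulty = math.log2(distance / width + 1.0)  # in bits
            return a + b * index_of_difficulty

        # A 160 mm reach to a 10 mm wide target has ID = log2(17), about 4.09 bits.
        print(round(fitts_movement_time(160.0, 10.0), 3))  # 0.609 with the placeholder coefficients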

    Touch-Enhanced Gesture Control Scheme

    We present an approach for improving gesture control by combining it with touch input, addressing a key shortcoming of gesture control, the "live mic" syndrome, by using touchscreen commands as a virtual clutch. The touch-enhanced gesture control scheme is designed and developed using a generic smartphone. For performance evaluation, this scheme was compared to the commercially available Myo armband device. Two tasks designed to measure selection accuracy and speed in a within-subject user study (n=30) reveal that our touch-enhanced control scheme is faster and more accurate when executing selection commands. Additionally, qualitative results from a post-study questionnaire showed that a majority of participants rated the touch-enhanced scheme as easier to use than the Myo.
    M.S., Digital Media -- Drexel University, 201
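
    The virtual-clutch idea described above can be captured in a few lines: motion samples reach the gesture recogniser only while a touch is held, so incidental movement is never interpreted. The class, the event names and the stand-in recogniser below are illustrative assumptions, not the paper's actual smartphone implementation.

        class TouchClutchedGestureInput:
            """Forward motion samples to a gesture recogniser only while the
            touchscreen clutch is engaged, sidestepping the 'live mic' problem of
            always-on gesture sensing."""

            def __init__(self, recognizer):
                self.recognizer = recognizer
                self.clutch_engaged = False

            def on_touch_down(self):
                self.clutch_engaged = True      # the user opens the gesture channel

            def on_touch_up(self):
                self.clutch_engaged = False     # releasing the clutch mutes gestures

            def on_motion_sample(self, sample):
                if self.clutch_engaged:         # incidental movement is dropped otherwise
                    self.recognizer.feed(sample)

        class PrintRecognizer:                  # stand-in recogniser for illustration
            def feed(self, sample):
                print("gesture sample:", sample)

        gestures = TouchClutchedGestureInput(PrintRecognizer())
        gestures.on_motion_sample((0.1, 0.0, 0.9))  # ignored: clutch not engaged
        gestures.on_touch_down()
        gestures.on_motion_sample((0.2, 0.1, 0.8))  # forwarded while the screen is touched
        gestures.on_touch_up()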

    Investigating Data Exploration Techniques Involving Map Based Geotagged Data in a Collaborative Sensemaking Environment

    The recent advancement of Global Positioning Systems (GPS) using satellites, and of geotagging, has opened many opportunities for data-driven decision-making in fields such as emergency response, military intelligence, oil exploration and urban planning. The sheer volume and rapid growth of geospatial data necessitate the development of improved tools to support analysis and decision-making around this complex data, a process often known as sensemaking. A typical geotagged map can have hundreds of multi-dimensional data points, with each point carrying meaningful information associated with its location as well as project-specific information, e.g., photographs, graphs, charts and bulletin data, among many other parameters. Sensemaking activities involving such complex data often involve a team of trained professionals who aim to make sense of the data to answer specific sets of questions and make key decisions. Researchers are currently exploring the use of surface computing technology, such as interactive digital tabletops and touch-based tablets, to develop methodologies that enhance collaborative sensemaking. This thesis examined the impact of two multi-surface interaction techniques that allowed individual group members to explore detailed geotagged data on separate peripheral tablets while sharing a large geographical overview on a digital tabletop. The two interaction techniques differed in the type of user input needed to control the location, on the tabletop overview, of a bounded “region of interest” (ROI) corresponding to the geotagged data displayed on the personal tablets. One technique (TOUCH) required the ROI to be positioned on the tabletop using direct touch interaction. The other technique (TILT) required the ROI to be positioned via a 3-dimensional (up-down, left-right) tilt gesture made with the personal tablet. Findings from the study revealed that the effectiveness of the respective interaction techniques depended on the stage of the sensemaking process and on the collaboration strategy that groups employed during collaborative sensemaking.
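
    As a rough illustration of the TILT technique above, the sketch below maps a tablet's tilt angles to the position of the region of interest on the tabletop overview, with a dead zone so the ROI stays put while the tablet is held roughly flat. The gain, dead zone and display dimensions are illustrative assumptions rather than parameters from the study.

        import math

        def update_roi(roi_x, roi_y, pitch_deg, roll_deg,
                       gain=4.0, dead_zone_deg=5.0,
                       table_width=1920, table_height=1080):
            """Nudge the region of interest on the shared tabletop overview from the
            personal tablet's tilt: pitch (up-down) moves it vertically, roll
            (left-right) moves it horizontally, clamped to the overview bounds."""
            def rate(angle_deg):
                if abs(angle_deg) < dead_zone_deg:
                    return 0.0                   # small tilts count as holding still
                return gain * (angle_deg - math.copysign(dead_zone_deg, angle_deg))

            roi_x = min(max(roi_x + rate(roll_deg), 0.0), table_width)
            roi_y = min(max(roi_y + rate(pitch_deg), 0.0), table_height)
            return roi_x, roi_y

        # Roll within the dead zone leaves x unchanged; a 12-degree pitch scrolls y.
        print(update_roi(960.0, 540.0, pitch_deg=12.0, roll_deg=-3.0))  # (960.0, 568.0)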