24 research outputs found

    From head to toe: body movement for human-computer interaction

    Our bodies are the medium through which we experience the world around us, so human-computer interaction can benefit greatly from the richness of body movements and postures as an input modality. In recent years, the widespread availability of inertial measurement units and depth sensors has led to the development of a plethora of applications for the body in human-computer interaction. However, the main focus of these works has been on using the upper body for explicit input. This thesis investigates the research space of full-body human-computer interaction through three propositions.

    The first proposition is that there is more to be inferred from users' natural movements and postures, such as the quality of activities and psychological states. We develop this proposition in two domains. First, we explore how to support users in performing weight-lifting activities. We propose a system that classifies different ways of performing the same activity; an object-oriented, model-based framework for formally specifying activities; and a system that automatically extracts an activity model by demonstration. Second, we explore how to automatically capture nonverbal cues for affective computing. We developed a system that annotates motion and gaze data according to the Body Action and Posture coding system. We show that quality analysis can add another layer of information to activity recognition, and that systems that support the communication of quality information should strive to support how we implicitly communicate movement through nonverbal communication. Further, we argue that by working at a higher level of abstraction, affect recognition systems can more directly translate findings from other areas into their algorithms, and can also contribute new knowledge back to these fields.

    The second proposition is that the lower limbs can provide an effective means of interacting with computers beyond assistive technology. To address the problem of the dispersed literature on the topic, we conducted a comprehensive survey of the lower body in HCI through the lenses of users, systems and interactions. To address the lack of a fundamental understanding of foot-based interactions, we conducted a series of studies that quantitatively characterise several aspects of foot-based interaction, including Fitts's Law performance models, the effects of movement direction, foot dominance and visual feedback, and the overhead incurred by using the feet together with the hands. To enable these studies, we developed a foot tracker based on a Kinect mounted under the desk. We show that the lower body can serve as a valuable complementary modality for computer input.

    Our third proposition is that by treating body movements as multiple modalities, rather than a single one, we can enable novel user experiences. We develop this proposition in the domain of 3D user interfaces, as it requires input with multiple degrees of freedom and offers a rich set of complex tasks. We propose an approach for tracking the whole body up close by splitting the sensing of different body parts across multiple sensors. Our setup allows tracking gaze, head, mid-air gestures, multi-touch gestures, and foot movements. We investigate specific applications for multimodal combinations in the domain of 3DUI: how gaze and mid-air gestures can be combined to improve selection and manipulation tasks; how the feet can support the canonical 3DUI tasks; and how a multimodal sensing platform can inspire new 3D game mechanics. We show that the combination of multiple modalities can lead to enhanced task performance; that offloading certain tasks to alternative modalities not only frees the hands, but also allows simultaneous control of multiple degrees of freedom; and that by sensing different modalities separately, we achieve more detailed and precise full-body tracking.
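
    As an illustration of the kind of performance model mentioned in this abstract, the sketch below fits the Shannon formulation of Fitts's Law, MT = a + b * log2(D/W + 1), to pointing-time data with ordinary least squares. It is a minimal sketch only; the sample values and variable names are hypothetical and are not taken from the thesis.

        import numpy as np

        # Hypothetical pointing trials: target distance D and width W (same units),
        # and measured movement time MT in seconds.
        D  = np.array([100.0, 200.0, 400.0, 800.0])
        W  = np.array([ 20.0,  20.0,  40.0,  40.0])
        MT = np.array([ 0.62,  0.81,  0.95,  1.18])

        # Index of difficulty under the Shannon formulation: ID = log2(D / W + 1).
        ID = np.log2(D / W + 1)

        # Least-squares fit of MT = a + b * ID; 1/b approximates throughput in bit/s.
        b, a = np.polyfit(ID, MT, 1)
        print(f"MT = {a:.3f} + {b:.3f} * ID  (throughput ~ {1 / b:.2f} bit/s)")

    The same fitting procedure can be repeated per condition (e.g. foot vs. hand, or per movement direction) to compare the resulting slopes and intercepts.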

    Human-Computer Interaction

    In this book the reader will find a collection of 31 papers presenting different facets of Human-Computer Interaction: the results of research projects and experiments, as well as new approaches to the design of user interfaces. The book is organized according to the following main topics, in sequential order: new interaction paradigms, multimodality, usability studies of several interaction mechanisms, human factors, universal design, and development methodologies and tools.

    Proceedings of the 5th International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT 2004)

    The proceedings of the conference.

    A Taxonomy of Freehand Grasping Patterns in Virtual Reality

    Grasping is the most natural and primary interaction paradigm people perform every day; it allows us to pick up and manipulate the objects around us, such as a cup of coffee or a pen. Grasping has been explored extensively in real environments, with categories, models and theories proposed to understand and structure the way people grasp and interact with objects. Due to the complexity of the human hand, classifying grasping knowledge in a way that provides meaningful insights is challenging, which has led researchers to develop grasp taxonomies that guide emerging grasping work (in fields such as anthropology, robotics and hand surgery) in a systematic way. While this body of work exists for real grasping, whether its nuances transfer to virtual environments remains unexplored. The emerging development of robust hand-tracking sensors for virtual devices now allows the development of grasp models that enable VR to simulate real grasping interactions. However, present work has not yet explored the differences and nuances between virtual and real object grasping, which means that virtual systems building grasping models on real grasping knowledge may make unverified assumptions about how users intuitively grasp and interact with virtual objects.

    To address this, this thesis presents the first user elicitation studies to explore grasping patterns directly in VR. The first study identifies the main similarities and differences between real and virtual object grasping; the second explores how virtual object shape influences grasping patterns; the third focuses on visual thermal cues and how they influence grasp metrics; and the fourth examines how other object characteristics, such as stability and complexity, influence grasps in VR. To provide structured insights into grasping interactions in VR, the results are synthesized in the first VR Taxonomy of Grasp Types, developed following current methods for building grasping and HCI taxonomies and iterated to present an updated and more complete taxonomy.

    Results show that users appear to mimic real grasping behaviour in VR, but they also have difficulty estimating object size and use a lower variability of grasp types overall. The taxonomy shows that only five grasps account for the majority of grasp data in VR, which can be exploited by computer systems aiming to achieve natural and intuitive interactions at lower computational cost. Further, the findings show that virtual object characteristics such as shape, stability and complexity, as well as visual cues for temperature, influence grasp metrics such as aperture, category, type, location and dimension. These changes in grasping patterns, together with virtual object categorisation methods, can inform design decisions when developing intuitive interactions, virtual objects and environments, taking a step towards natural grasping interaction in VR.
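
    To make the grasp metrics mentioned above concrete, here is a minimal sketch of how one elicited grasp trial might be recorded. The field names and example categories are illustrative assumptions only and do not reproduce the coding scheme or taxonomy from the thesis.

        from dataclasses import dataclass

        @dataclass
        class GraspObservation:
            """One elicited grasp trial in VR (illustrative fields, not the thesis coding scheme)."""
            participant: int
            object_name: str        # e.g. "virtual mug"
            grasp_type: str         # e.g. "medium wrap", "precision pinch"
            aperture_cm: float      # thumb-index distance at grasp onset
            contact_location: str   # e.g. "handle", "body"
            object_stability: str   # e.g. "stable", "unstable"
            visual_cue: str         # e.g. "neutral", "hot", "cold"

        sample = GraspObservation(1, "virtual mug", "medium wrap", 7.5, "body", "stable", "hot")
        print(sample)

    Collecting trials in a structured record like this makes it straightforward to tabulate grasp-type frequencies per object characteristic, which is the kind of analysis a taxonomy is built from.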

    Towards observable haptics: Novel sensors for capturing tactile interaction patterns

    Kõiva R. Towards observable haptics: Novel sensors for capturing tactile interaction patterns. Bielefeld: Bielefeld University; 2014.

    Touch is one of the primary senses humans use when performing coordinated interaction, but the lack of a sense of touch in the majority of contemporary interactive technical systems, such as robots operating in non-deterministic environments, results in interactions that can at best be described as clumsy. Observing human haptics and extracting the salient information from the gathered data is not only relevant if we are to understand the underlying cognitive processes involved, but should also provide significant clues for designing future intelligent interactive systems. Such systems could one day help take the burden of tedious tasks off our hands, much as industrial robots revolutionized manufacturing. The aim of the work in this thesis was to provide significant advancements in tactile sensing technology, and thus move us a step closer to realizing this goal. The contributions fall into two major parts: the first investigates capturing interaction patterns in humans, with the goals of better understanding manual intelligence and improving the lives of hand amputees, while the second focuses on augmenting technical systems with a sense of touch.

    tacTiles, a wireless tactile-sensitive surface element attached to a deformable textile, was developed to capture human full-body interactions with the large surfaces we come into contact with in our daily lives, such as floors, chairs, sofas and other furniture. The Tactile Dataglove, iObject and the Tactile Pen were developed especially to observe human manual intelligence. Whereas iObject allows motion sensing and captures a higher-resolution tactile signal than the Tactile Dataglove (220 tactile cells in the first iObject prototype versus 54 cells in the glove), the wearable glove makes haptic interactions with arbitrary objects observable. The Tactile Pen was designed to measure grip force during handwriting in order to better facilitate therapeutic treatment assessments. These sensors have already been used extensively by various research groups, including our own, to gain a better understanding of human manual intelligence.

    The Finger-Force-Linear-Sensor and the Tactile Bracelet are two novel sensors developed to facilitate more natural control of dexterous multi-Degree-of-Freedom (DOF) hand prostheses. The Finger-Force-Linear-Sensor is a very accurate bidirectional single-finger force ground-truth measurement device, designed to enable the testing and development of algorithms mapping between single-finger forces and muscle activations. The Tactile Bracelet was designed to provide a more robust and intuitive means of control for multi-DOF hand prostheses by measuring the muscle bulges of the remnant muscles of lower-arm amputees. It is currently in development and will eventually cover the complete forearm circumference with high-spatial-resolution tactile-sensitive surfaces. An experiment involving a large number of lower-arm amputees has already been planned.

    The Modular flat tactile sensor system, the Fabric-based touch-sensitive artificial skin and the 3D-shaped tactile sensor were developed to cover the surfaces of technical systems and add touch-sensing capabilities to them. The rapid augmentation of systems with a sense of touch was the main goal of the modular flat tactile sensor system. The developed sensor modules can be used alone or in an array to form larger tactile-sensitive surfaces, such as tactile-sensitive tabletops. As many robots have curved surfaces, using flat rigid modules severely limits the areas that can be covered with tactile sensors. The Fabric-based tactile sensor, originally developed to form a tactile dataglove for human hands, can with minor modifications also function as an artificial skin for technical systems. Finally, the 3D-shaped tactile sensor, based on Laser-Direct-Structuring technology, is a novel tactile sensor with a true 3D shape that provides high sensitivity and high spatial resolution. These sensors take us further along the path towards creating general-purpose technical systems that in time can be of great help to us in our daily lives.

    The desired tactile sensor characteristics differ significantly according to which haptic interaction patterns we wish to measure. Large tactile sensor arrays that capture full-body haptic interactions with floors and upholstered furniture, or that cover large areas of technical system surfaces, need to be scalable, have low power consumption and should ideally have a low material cost; tacTiles and the Fabric-based sensor for curved surfaces are two such examples. At the other end of the tactile sensor development spectrum, observing manual interactions demands high spatial and temporal resolution to measure fine grasping and manipulation actions. Our fingertips contain the highest density of mechanoreceptors, the organs that sense mechanical pressure and distortion. Thus, to construct biologically inspired anthropomorphic robotic hands, artificial tactile sensors for the fingertips need similarly high fidelity, with surfaces curved under small bending radii in two dimensions and high spatial density, while simultaneously providing high sensitivity. With the fingertip tactile sensor designed to fit the fingers of the Shadow Robot Hand, I show in the 3D-shaped high-spatial-resolution tactile sensor section of my thesis that such sensors can indeed be constructed.

    With my work I have made a significant contribution towards making haptics more observable. I achieved this by developing a number of novel tactile sensors that are usable, give a deeper insight into human haptic interactions, have great potential to help amputees, and make technical systems, such as robots, more capable.
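
    As a minimal illustration of how a single frame from a tactile cell array of the kind described above might be processed, the sketch below computes total contact force and the centre of pressure from a pressure matrix. The array size, calibration constant and cell pitch are assumptions for illustration and are not specifications of the sensors in the thesis.

        import numpy as np

        # Hypothetical 16x16 tactile array frame: raw cell readings in arbitrary units.
        frame = np.random.default_rng(0).integers(0, 1024, size=(16, 16)).astype(float)

        NEWTONS_PER_COUNT = 0.01   # assumed calibration constant; sensor-specific in practice
        CELL_PITCH_MM = 5.0        # assumed centre-to-centre cell spacing

        force = frame * NEWTONS_PER_COUNT
        total_force = force.sum()

        # Centre of pressure: force-weighted mean of the cell positions.
        rows, cols = np.indices(frame.shape)
        cop_x = (cols * force).sum() / total_force * CELL_PITCH_MM
        cop_y = (rows * force).sum() / total_force * CELL_PITCH_MM
        print(f"total force ~ {total_force:.1f} N, centre of pressure ~ ({cop_x:.1f} mm, {cop_y:.1f} mm)")

    Aggregates such as these are the kind of low-level features that higher-level analyses of haptic interaction patterns can be built on, whether the array covers a fingertip or a piece of furniture.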

    Developing an Open Source Exertion Interface for Two-Handed 3D and 6DOF Motion Tracking and Visualisation
