    Review of three-dimensional human-computer interaction with focus on the Leap Motion Controller

    Get PDF
    Modern hardware and software development has led to an evolution of user interfaces from command-line to natural user interfaces for virtual immersive environments. Gestures imitating real-world interaction tasks increasingly replace classical two-dimensional interfaces based on Windows/Icons/Menus/Pointers (WIMP) or touch metaphors. The purpose of this paper is therefore to survey state-of-the-art Human-Computer Interaction (HCI) techniques with a focus on the special field of three-dimensional interaction. This includes an overview of currently available interaction devices, their typical applications, and the underlying methods for gesture design and recognition. Particular attention is given to interfaces based on the Leap Motion Controller (LMC) and the corresponding methods of gesture design and recognition. Further, a review of evaluation methods for the proposed natural user interfaces is given.
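
    To make the kind of gesture recognition surveyed here concrete, the sketch below shows a minimal velocity-threshold swipe detector operating on a stream of palm positions such as a hand-tracking device like the LMC provides. It is only an illustration: the sampling rate, thresholds, and function names are assumptions, not taken from the paper or from the Leap Motion SDK.

        # Minimal swipe detector over palm-centre samples (hypothetical data,
        # not the actual LMC API); thresholds are illustrative.
        import numpy as np

        def detect_swipe(palm_positions, dt=1.0 / 60.0, min_speed=0.5, min_travel=0.15):
            """palm_positions: (N, 3) array of palm centres in metres."""
            p = np.asarray(palm_positions, dtype=float)
            if len(p) < 2:
                return None
            speeds = np.linalg.norm(np.diff(p, axis=0) / dt, axis=1)  # frame-to-frame speed
            travel = p[-1] - p[0]                                     # net displacement
            if speeds.mean() > min_speed and np.linalg.norm(travel) > min_travel:
                axis = int(np.argmax(np.abs(travel)))                 # dominant axis of motion
                return ("+" if travel[axis] > 0 else "-") + "xyz"[axis]
            return None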

    An Abstraction Framework for Tangible Interactive Surfaces

    Get PDF
    This cumulative dissertation discusses, through four successive publications, the various layers of a tangible interaction framework that has been developed in conjunction with an electronic musical instrument with a tabletop tangible user interface. Based on the experiences collected during the design and implementation of that particular musical application, this research mainly concentrates on the definition of a general-purpose abstraction model for the encapsulation of physical interface components that are commonly employed in the context of an interactive surface environment. Along with a detailed description of the underlying abstraction model, this dissertation also describes an actual implementation in the form of a detailed protocol syntax, which constitutes the common element of a distributed architecture for the construction of surface-based tangible user interfaces. The initial implementation of the presented abstraction model within an actual application toolkit comprises the TUIO protocol and the related computer-vision based object and multi-touch tracking software reacTIVision, along with its principal application within the Reactable synthesizer. The dissertation concludes with an evaluation and extension of the initial TUIO model by presenting TUIO2, a next-generation abstraction model designed for a more comprehensive range of tangible interaction platforms and related application scenarios.
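
    Since the TUIO protocol mentioned above is transported as OSC messages (reacTIVision sends them to UDP port 3333 by default), a minimal client can be sketched as follows. This assumes the third-party python-osc package and only prints the "set" messages of the TUIO 1.1 /tuio/2Dobj profile; it illustrates the shape of the protocol and is not part of the dissertation's toolkit.

        # Minimal TUIO 1.1 listener sketch, assuming the python-osc package.
        from pythonosc.dispatcher import Dispatcher
        from pythonosc.osc_server import BlockingOSCUDPServer

        def on_2dobj(address, *args):
            # One address carries "alive", "set" and "fseq" messages per update bundle.
            if args and args[0] == "set":
                session_id, fiducial_id, x, y, angle = args[1:6]
                print(f"object {fiducial_id} (session {session_id}): "
                      f"x={x:.3f} y={y:.3f} angle={angle:.2f} rad")

        dispatcher = Dispatcher()
        dispatcher.map("/tuio/2Dobj", on_2dobj)
        # reacTIVision publishes TUIO over UDP port 3333 by default.
        BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher).serve_forever()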

    Recent developments in biofeedback for neuromotor rehabilitation

    Get PDF
    The original use of biofeedback to train single muscle activity in static positions or movement unrelated to function did not correlate well with motor function improvements in patients with central nervous system injuries. The concept of task-oriented repetitive training suggests that biofeedback therapy should be delivered during functionally related dynamic movement to optimize motor function improvement. Currently available advanced technologies facilitate the design of novel biofeedback systems that possess diverse parameters, advanced cue display, and sophisticated control systems for use in task-oriented biofeedback. In light of these advancements, this article: (1) reviews early biofeedback studies and their conclusions; (2) presents recent developments in biofeedback technologies and their applications to task-oriented biofeedback interventions; and (3) discusses considerations regarding the therapeutic system design and the clinical application of task-oriented biofeedback therapy. This review should provide a framework to further broaden the application of task-oriented biofeedback therapy in neuromotor rehabilitation.

    Understanding expressive action

    Get PDF
    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2000. Also available online at the MIT Theses Online homepage. Includes bibliographical references (p. 117-120). This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. We strain our eyes, cramp our necks, and destroy our hands trying to interact with computers on their terms. At the extreme, we strap on devices and weigh ourselves down with cables trying to re-create a sense of place inside the machine, while cutting ourselves off from the world and people around us. The alternative is to make the real environment responsive to our actions. It is not enough for environments to respond simply to the presence of people or objects: they must also be aware of the subtleties of changing situations. If all the spaces we inhabit are to be responsive, they must not require encumbering devices to be worn, and they must be adaptive to changes in the environment and changes of context. This dissertation examines a body of sophisticated perceptual mechanisms developed in response to these needs, as well as a selection of human-computer interface sketches designed to push the technology forward and explore the possibilities of this novel interface idiom. Specifically, the formulation of a fully recursive framework for computer vision called DYNA that improves the performance of human motion tracking is examined in depth. The improvement in tracking performance is accomplished by combining a three-dimensional, physics-based model of the human body with modifications to the pixel classification algorithms that enable them to take advantage of this high-level knowledge. The result is a novel vision framework that has no completely bottom-up processes and is therefore significantly faster and more stable than other approaches. By Christopher R. Wren, Ph.D.
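
    The key idea of DYNA described above, feeding the body model's high-level prediction back into low-level pixel classification, can be sketched roughly as follows. The formulation here (a Gaussian spatial prior around a predicted limb position multiplied into a colour likelihood) is a simplified stand-in for the thesis's actual recursive framework; all names and values are illustrative.

        # Combine a bottom-up colour likelihood with a top-down spatial prior
        # derived from the body model's predicted limb position (illustrative).
        import numpy as np

        def limb_support_map(color_likelihood, predicted_xy, sigma=12.0):
            """color_likelihood: (H, W) per-pixel likelihood of the limb's appearance.
            predicted_xy: (x, y) image position predicted by the body model."""
            h, w = color_likelihood.shape
            ys, xs = np.mgrid[0:h, 0:w]
            d2 = (xs - predicted_xy[0]) ** 2 + (ys - predicted_xy[1]) ** 2
            spatial_prior = np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian prior around the prediction
            posterior = color_likelihood * spatial_prior
            return posterior / (posterior.sum() + 1e-9)       # normalised support map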

    A methodology for investigation of bowed string performance through measurement of violin bowing technique

    Get PDF
    Thesis (Ph.D.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2007. Includes bibliographical references (leaves 181-186). Virtuosic bowed string performance in many ways exemplifies the incredible potential of human physical performance and expression. Today, a great deal is known about the physics of the violin family and those factors responsible for its sound capabilities. However, there remains much to be discovered about the intricacies of how players control these instruments in order to achieve their characteristic range and nuance of sound. Technology now offers the ability to study this player control under realistic, unimpeded playing conditions and so lead to a greater understanding of these performance skills. Presented here is a new methodology for investigating bowed string performance that uses a playable hardware measurement system to capture the gestures of right-hand violin bowing technique. Building upon previous Hyperstring research, this measurement system was optimized to be small, lightweight, and portable and was installed on a carbon fiber violin bow and an electric violin to enable the study of realistic, unencumbered violin performances. Included in the system are inertial and force sensors and an electric field position sensor. In order to maximize the applicability of the gesture data provided by this system to related fields of interest, all of the sensors were calibrated in SI units. The gesture data captured by these sensors are recorded together with the audio data from the violin as they are produced by violinists in typical playing scenarios. To explore the potential of the bowing measurement system created, a study of standard bowing techniques, such as détaché, martelé, and spiccato, was conducted with expert violinist participants. Gesture data from these trials were evaluated and input to a classifier to examine physical distinctions between bowing techniques, as well as between players. Results from this analysis, and their implications for this methodology, are presented. In addition to this examination of bowing techniques, applications of the measurement system to the study of bowed string acoustics and digital music instrument performance, with a focus on virtual instruments created from physical models, are discussed. By Diana Young, Ph.D.
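
    As a rough illustration of the classification step mentioned above, the sketch below summarises each bow stroke with a handful of statistics over the calibrated sensor channels and feeds them to an off-the-shelf classifier. It assumes NumPy and scikit-learn; the feature set and the random-forest choice are assumptions, not the study's actual analysis pipeline.

        # Per-stroke features from calibrated bow data, classified with scikit-learn
        # (feature choices and classifier are illustrative).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def stroke_features(accel, force):
            """accel: (N, 3) bow acceleration in m/s^2; force: (N,) bow force in N."""
            return np.concatenate([
                accel.mean(axis=0), accel.std(axis=0),
                [force.mean(), force.std(), force.max()],
            ])

        def evaluate(X, y):
            """X: one feature row per stroke; y: labels such as "detache" or "spiccato"."""
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            return cross_val_score(clf, X, y, cv=5).mean()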

    Hand Motion Tracking System using Inertial Measurement Units and Infrared Cameras

    Get PDF
    This dissertation presents a novel approach to developing a system for real-time tracking of the position and orientation of the human hand in three-dimensional space, using MEMS inertial measurement units (IMUs) and infrared cameras. This research focuses on the study and implementation of an algorithm to correct the gyroscope drift, which is a major problem in orientation tracking with commercial-grade IMUs. An algorithm to improve the orientation estimation is proposed. It consists of: (1) prediction of the bias offset error while the sensor is static; (2) estimation of a quaternion orientation from the unbiased angular velocity; (3) correction of the orientation quaternion using the gravity vector and the magnetic North vector; and (4) adaptive quaternion interpolation, which determines the final quaternion estimate based upon the current conditions of the sensor. The results verified that the orientation correction algorithm using the gravity vector and the magnetic North vector reduces the amount of drift in orientation tracking and is compatible with position tracking using infrared cameras for real-time human hand motion tracking. Thirty human subjects participated in an experiment to validate the performance of the hand motion tracking system. The statistical analysis shows that the error of position tracking is, on average, 1.7 cm in the x-axis, 1.0 cm in the y-axis, and 3.5 cm in the z-axis. Kruskal-Wallis tests show that the orientation correction algorithm using the gravity vector and the magnetic North vector significantly reduces the errors in orientation tracking in comparison to fixed offset compensation. Statistical analyses show that this orientation correction algorithm and the on-board Kalman-based orientation filtering produced orientation errors that were not significantly different in the Euler angles Phi, Theta, and Psi, with p-values of 0.632, 0.262, and 0.728, respectively. The proposed orientation correction algorithm represents a contribution to the emerging approaches to obtaining reliable orientation estimates from MEMS IMUs. The development of a hand motion tracking system using IMUs and infrared cameras in this dissertation enables future improvements in natural human-computer interaction within a 3D virtual environment.
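
    The core of the orientation algorithm summarised above (bias removal, quaternion integration of the angular velocity, and a correction that pulls the estimate toward the measured gravity direction) can be sketched as below. This is a simplified complementary-style correction using only the gravity vector; the magnetic-North correction and the dissertation's adaptive quaternion interpolation are omitted, and the gain value is illustrative.

        # Simplified gyro integration and gravity-based correction (illustrative gains).
        import numpy as np

        def quat_mul(q, r):
            w1, x1, y1, z1 = q
            w2, x2, y2, z2 = r
            return np.array([
                w1*w2 - x1*x2 - y1*y2 - z1*z2,
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2,
            ])

        def integrate_gyro(q, omega, bias, dt):
            """q: orientation [w, x, y, z]; omega: measured rad/s; bias: estimated offset."""
            w = omega - bias                              # (1) remove the bias offset
            dq = 0.5 * quat_mul(q, np.array([0.0, *w]))   # (2) quaternion rate from angular velocity
            q = q + dq * dt
            return q / np.linalg.norm(q)

        def correct_with_gravity(q, accel, gain=0.02):
            """(3) nudge q so that predicted gravity matches the accelerometer reading."""
            w, x, y, z = q
            # world "down" expressed in the sensor frame under the current estimate
            g_pred = np.array([2*(x*z - w*y), 2*(y*z + w*x), w*w - x*x - y*y + z*z])
            g_meas = accel / np.linalg.norm(accel)
            err = np.cross(g_meas, g_pred)                # small-angle error axis
            q = quat_mul(q, np.array([1.0, *(0.5 * gain * err)]))
            return q / np.linalg.norm(q)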

    Acoustic chase : designing an interactive audio environment to stimulate human body movement

    Get PDF
    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2004. Includes bibliographical references (p. 58-60). An immersive audio environment was created that explores how humans react to commands imposed by a machine generating its acoustic stimuli on the basis of tracked body movement. In this environment, different states of human and machine action are understood as a balance of power that moves back and forth between the apparatus and the human being. This system is based on spatial sounds that are designed to stimulate body movements. The physical set-up consists of headphones with attached sensors that pick up the movements of the head. Mathematical models calculate the behavior of the sound, its virtual motion path relative to the person, and how it changes over time. By Simon Karl Josef Schiessl, S.M.
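
    As one plausible reading of the head-relative sound behaviour described above, the sketch below computes a virtual source's azimuth relative to the tracked head yaw and derives constant-power stereo gains from it. The panning law and coordinate conventions are common simplifications, not the models used in the thesis.

        # Head-relative constant-power panning sketch (conventions are illustrative).
        import math

        def head_relative_gains(head_yaw, head_pos, source_pos):
            """head_yaw in radians (0 = facing +y); positions are (x, y) in metres."""
            dx = source_pos[0] - head_pos[0]
            dy = source_pos[1] - head_pos[1]
            azimuth = math.atan2(dx, dy) - head_yaw                     # source angle relative to gaze
            azimuth = math.atan2(math.sin(azimuth), math.cos(azimuth))  # wrap to [-pi, pi]
            pan = max(-1.0, min(1.0, azimuth / (math.pi / 2)))          # -1 = hard left, +1 = hard right
            left = math.cos((pan + 1.0) * math.pi / 4)
            right = math.sin((pan + 1.0) * math.pi / 4)
            return left, right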

    ThirdLight: low-cost and high-speed 3D interaction using photosensor markers

    No full text
    We present a low-cost 3D tracking system for virtual reality, gesture modeling, and robot manipulation applications that require fast and precise localization of headsets, data gloves, props, or controllers. Our system removes the need for cameras or projectors for sensing, and instead uses cheap LEDs and printed masks for illumination, together with low-cost photosensitive markers. The illumination device transmits a spatiotemporal pattern as a series of binary Gray-code patterns. Multiple illumination devices can be combined to localize each marker in 3D at high speed (333 Hz). Compared with conventional techniques, our method offers strengths in accuracy, speed, cost, performance under ambient light, working space (1-5 m), and robustness to noise. We compare with a state-of-the-art instrumented glove and with vision-based systems to demonstrate the accuracy, scalability, and robustness of our approach. We propose a fast and accurate method for hand gesture modeling using an inverse kinematics approach with six photosensitive markers. We additionally propose a passive-marker system and demonstrate various interaction scenarios as practical applications.
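
    The Gray-code localization described above ultimately reduces, at each marker, to decoding the sequence of bright/dark states observed over one series of patterns into a stripe index. A minimal sketch of that decode step is given below; the bit depth and the bits-as-list interface are assumptions, not details of the ThirdLight system.

        # Decode the bits a photosensitive marker observes over one Gray-code
        # pattern series into a position index (interface is illustrative).
        def gray_bits_to_index(bits):
            """bits: most-significant-first sequence of 0/1 observed by the marker."""
            index = 0
            prev = 0
            for g in bits:
                prev ^= g                      # binary bit b[i] = b[i-1] XOR gray bit g[i]
                index = (index << 1) | prev
            return index

        # Example: a 10-frame observation maps to one of 1024 stripe indices.
        print(gray_bits_to_index([1, 1, 0, 1, 0, 0, 1, 1, 0, 1]))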

    Design Strategies for Adaptive Social Composition: Collaborative Sound Environments

    Get PDF
    In order to develop successful collaborative music systems, a variety of subtle interactions need to be identified and integrated. Gesture capture, motion tracking, real-time synthesis, environmental parameters, and ubiquitous technologies can each be used effectively to develop innovative approaches to instrument design, sound installations, interactive music, and generative systems. Current solutions tend to prioritise one or more of these approaches, refining a particular interface technology, software design, or compositional approach developed for a specific composition, performer, or installation environment. Within this diverse field, a group of novel controllers described as ‘Tangible Interfaces’ has been developed. These are intended for use by novices and in many cases follow a simple model of interaction, controlling synthesis parameters through simple user actions. Other approaches offer sophisticated compositional frameworks, but many of these are idiosyncratic and highly personalised; as such, they are difficult to engage with and ineffective for groups of novices. The objective of this research is to develop effective design strategies for implementing collaborative sound environments, using key terms and vocabulary drawn from the available literature. This is articulated by combining an empathic design process with controlled sound perception and interaction experiments. The identified design strategies have been applied to the development of a new collaborative digital instrument. A range of technical and compositional approaches was considered to define this process, which can be described as Adaptive Social Composition. Dan Livingston