
    Visual tracking of highly articulated objects using massively parallel processors

    Hand gesture recognition has the potential to simplify human–computer interaction. However, the human hand is a highly articulated object, capable of taking on many different appearances. In this work, we consider an analysis-by-synthesis approach to this difficult tracking problem. We attempt to overcome the vast amount of computation required by implementing the algorithm on commodity GPUs. We also collect a lengthy sequence of hand motions from five cameras in order to train and test our algorithm. We show that, to achieve good tracking performance, it is most important to understand the way the hand moves; having a good estimate of the hand shape and processing frames as quickly as possible are of secondary importance. Under heavily controlled circumstances, we are able to achieve full tracking accuracy.
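The analysis-by-synthesis idea in the abstract can be illustrated with a toy sketch: hypothesize pose parameters, synthesize the observation they would produce, and keep the hypothesis whose synthesis best matches the real observation. The model below (a single two-joint "finger", segment lengths, and the brute-force grid search) is entirely hypothetical and only illustrates the principle; the paper's actual hand model and GPU implementation are far richer.

```python
import math

# Toy analysis-by-synthesis tracker (hypothetical, not the authors' model):
# the "hand" is reduced to one articulated finger with two joint angles.
L1, L2 = 1.0, 0.8  # assumed segment lengths

def synthesize(theta1, theta2):
    """Forward kinematics: synthesize the fingertip position for a pose."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def track(observed, step=0.05):
    """Exhaustive search over a coarse pose grid.

    This is the embarrassingly parallel part that a GPU would evaluate
    in parallel, one pose hypothesis per thread.
    """
    best, best_err = None, float("inf")
    angles = [i * step for i in range(int(2 * math.pi / step))]
    for t1 in angles:
        for t2 in angles:
            sx, sy = synthesize(t1, t2)
            err = (sx - observed[0]) ** 2 + (sy - observed[1]) ** 2
            if err < best_err:
                best, best_err = (t1, t2), err
    return best, best_err

# Simulate an observation from a known pose, then recover it.
true_pose = (0.6, 0.9)
obs = synthesize(*true_pose)
pose, err = track(obs)
```

With the true pose lying on the search grid, the recovered pose matches it to within the grid step; in a real tracker the grid search would be replaced or refined by local optimization from the previous frame's pose, which is why understanding how the hand moves matters so much.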

    Towards gestural understanding for intelligent robots

    Fritsch JN. Towards gestural understanding for intelligent robots. Bielefeld: Universität Bielefeld; 2012.
    A strong driving force of scientific progress in the technical sciences is the quest for systems that assist humans in their daily lives and make their lives easier and more enjoyable. Nowadays, smartphones are probably the most typical instances of such systems. Another class of systems receiving increasing attention is intelligent robots. Instead of offering a smartphone touch screen for selecting actions, these systems are intended to offer a more natural human–machine interface to their users. Of the large range of actions performed by humans, gestures performed with the hands play a very important role, especially when humans interact with their direct surroundings, e.g., pointing to an object or manipulating it. Consequently, a robot has to understand such gestures to offer an intuitive interface. Gestural understanding is, therefore, a key capability on the way to intelligent robots. This book deals with vision-based approaches to gestural understanding. Over the past two decades, this has been an intensive field of research which has produced a variety of algorithms for analyzing human hand motions. Following a categorization of different gesture types and a review of other sensing techniques, the design of vision systems that achieve hand gesture understanding for intelligent robots is analyzed. For each of the individual algorithmic steps – hand detection, hand tracking, and trajectory-based gesture recognition – a separate chapter introduces common techniques and algorithms and provides example methods. The resulting recognition algorithms consider gestures in isolation and are often not sufficient for interacting with a robot, which can only understand such gestures by incorporating context, e.g., what object was pointed at or manipulated.
    Going beyond purely trajectory-based gesture recognition by incorporating context is an important prerequisite for gesture understanding and is addressed explicitly in a separate chapter of this book. Two types of context, user-provided context and situational context, are distinguished, and existing approaches to incorporating context for gestural understanding are reviewed. Example approaches for both context types provide deeper algorithmic insight into this field of research. An overview of recent robots capable of gesture recognition and understanding summarizes the human-robot interaction quality realized so far. The approaches to gesture understanding covered in this book are manually designed, whereas humans learn to recognize gestures automatically while growing up. The last chapter highlights promising research aimed at analyzing developmental learning in children in order to mimic this capability in technical systems, as this research direction may be highly influential for future gesture understanding systems.
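The trajectory-based recognition step mentioned above is often realized by matching an observed hand trajectory against gesture templates, for example with dynamic time warping (DTW). The sketch below is a minimal, hypothetical illustration of that idea; the gesture templates and trajectories are invented, and the book surveys many alternative methods.

```python
# Minimal trajectory-based gesture recognition via dynamic time warping
# (DTW). Templates and the observed trajectory are invented examples.

def dtw(a, b):
    """DTW distance between two 2-D point trajectories."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean cost between matched points, plus best predecessor.
            cost = ((a[i-1][0] - b[j-1][0]) ** 2
                    + (a[i-1][1] - b[j-1][1]) ** 2) ** 0.5
            d[i][j] = cost + min(d[i-1][j], d[i][j-1], d[i-1][j-1])
    return d[n][m]

# Hypothetical gesture templates: normalized 2-D hand trajectories.
templates = {
    "swipe_right": [(x / 10, 0.0) for x in range(11)],
    "swipe_up": [(0.0, y / 10) for y in range(11)],
}

def recognize(trajectory):
    """Classify a trajectory as the template with the smallest DTW distance."""
    return min(templates, key=lambda g: dtw(trajectory, templates[g]))

# A slightly noisy rightward motion, sampled at a different rate than the
# template -- DTW's warping absorbs the length mismatch.
observed = [(x / 8, 0.02) for x in range(9)]
label = recognize(observed)
```

Note that this matches the trajectory shape in isolation; as the abstract points out, deciding what a pointing gesture *means* additionally requires context such as the object being pointed at.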

    Localization and recognition of dynamic hand gestures based on hierarchy of manifold classifiers

    Toward the autism motor signature: gesture patterns during smart tablet gameplay identify children with autism

    Autism is a developmental disorder evident from infancy, yet its clinical identification requires expert diagnostic training. New evidence indicates that disruption to motor timing and integration may underpin the disorder, providing a potential new computational marker for its early identification. In this study, we employed smart tablet computers with touch-sensitive screens and embedded inertial movement sensors to record the movement kinematics and gesture forces made by 37 children with autism aged 3–6 years and 45 age- and gender-matched typically developing children. Machine learning analysis of the children's motor patterns identified autism with up to 93% accuracy. Analysis revealed that these patterns consisted of greater forces at contact and a different distribution of forces within a gesture, and that gesture kinematics were faster and larger, with more distal use of space. These data support the notion that disruption to movement is a core feature of autism, and demonstrate that autism can be computationally assessed through fun, smart-device gameplay.
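The classification step described here can be sketched in miniature: extract per-gesture features (e.g., contact force and gesture speed) and fit a classifier. The sketch below uses synthetic data and a simple nearest-centroid rule purely for illustration; the study's actual features, model, and accuracy figures come from real tablet recordings and are not reproduced by this toy.

```python
import random

random.seed(0)

# Hypothetical sketch: classify synthetic (contact force, gesture speed)
# feature pairs with a nearest-centroid rule. All values are invented;
# only the direction of the group difference (greater forces, faster
# gestures in the autism group) follows the abstract.

def make_samples(n, force_mu, speed_mu, label):
    """Draw n noisy feature pairs around the given group means."""
    return [((random.gauss(force_mu, 0.1), random.gauss(speed_mu, 0.1)), label)
            for _ in range(n)]

data = (make_samples(40, 1.2, 1.3, "autism")
        + make_samples(40, 0.9, 1.0, "typical"))
random.shuffle(data)
train, test = data[:60], data[60:]

def centroid(samples):
    """Mean feature vector of a group."""
    forces = [f for (f, _), _ in samples]
    speeds = [s for (_, s), _ in samples]
    return sum(forces) / len(forces), sum(speeds) / len(speeds)

cents = {lab: centroid([d for d in train if d[1] == lab])
         for lab in ("autism", "typical")}

def predict(feat):
    """Assign the label of the nearest group centroid."""
    return min(cents, key=lambda lab: (feat[0] - cents[lab][0]) ** 2
                                      + (feat[1] - cents[lab][1]) ** 2)

accuracy = sum(predict(f) == lab for f, lab in test) / len(test)
```

Because the synthetic groups are well separated relative to their noise, this toy classifier scores well; on real kinematic data, the study reports up to 93% accuracy with a trained machine learning model.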