4 research outputs found

    A Continuous Grasp Representation for the Imitation Learning of Grasps on Humanoid Robots

    Models and methods are presented that enable a humanoid robot to learn reusable, adaptive grasping skills. Mechanisms and principles in human grasp behavior are studied, and the findings are used to develop a grasp representation capable of retaining specific motion characteristics while adapting to different objects and tasks. Based on this representation, a framework is proposed that enables the robot to observe human grasping, learn grasp representations, and infer executable grasping actions.
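    The abstract does not specify the form of the representation, so the following is only a minimal, hypothetical sketch of what a continuous grasp representation learned from one demonstration might look like: finger-joint trajectories encoded as weighted radial basis functions over a normalized phase, so a demonstrated grasp can be replayed and crudely rescaled to a new object. The names and parameters (GraspRepresentation, aperture_scale, the RBF encoding itself) are illustrative assumptions, not the authors' method.

```python
import numpy as np

class GraspRepresentation:
    """Hypothetical continuous grasp representation: finger-joint
    trajectories encoded as weighted radial basis functions over a
    normalized phase variable in [0, 1]."""

    def __init__(self, n_basis=20, n_joints=12):
        self.n_basis = n_basis
        self.n_joints = n_joints
        self.centers = np.linspace(0.0, 1.0, n_basis)  # basis centers in phase space
        self.width = float(n_basis) ** 2               # sharpness of each basis
        self.weights = np.zeros((n_joints, n_basis))   # learned weights per joint

    def _basis(self, phase):
        # Normalized Gaussian activations for each phase value
        act = np.exp(-self.width * (phase[:, None] - self.centers) ** 2)
        return act / act.sum(axis=1, keepdims=True)

    def fit(self, demo):
        """Least-squares fit to a demonstrated trajectory of shape (T, n_joints)."""
        phase = np.linspace(0.0, 1.0, len(demo))
        Phi = self._basis(phase)                        # (T, n_basis)
        w, *_ = np.linalg.lstsq(Phi, demo, rcond=None)  # (n_basis, n_joints)
        self.weights = w.T

    def generate(self, n_steps=100, aperture_scale=1.0):
        """Reproduce the grasp; aperture_scale crudely adapts the finger
        opening to a larger or smaller object."""
        phase = np.linspace(0.0, 1.0, n_steps)
        return self._basis(phase) @ self.weights.T * aperture_scale


if __name__ == "__main__":
    # Synthetic demonstration: 12 joints closing smoothly over 200 time steps
    t = np.linspace(0.0, 1.0, 200)[:, None]
    demo = 0.5 * (1.0 - np.cos(np.pi * t)) * np.ones((1, 12))
    rep = GraspRepresentation()
    rep.fit(demo)
    print(rep.generate(n_steps=150, aperture_scale=0.8).shape)  # (150, 12)
```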

    Guitarist Fingertip Tracking by Integrating a Bayesian Classifier into Particle Filters

    We propose a vision-based method for tracking the fingerings made by guitar players. We present it as a new framework for tracking colored finger markers by integrating a Bayesian classifier into particle filters, which adds automatic track initialization and recovery from tracking failures in front of a dynamic background. Furthermore, by adapting the color probabilities online, the method copes with illumination changes. An Augmented Reality Tag (ARTag) is then used to calculate the projection matrix as an online process, allowing the guitar to be moved while being played. Representative experimental results are also included. The presented method can be used to develop human-computer interaction (HCI) applications for guitar playing, such as recognizing the chord being played by a guitarist in a virtual space. Such an application would assist guitar learners by automatically identifying whether they are playing the chords required by the musical piece.
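    To make the tracking idea concrete, here is a minimal sketch of a particle filter whose observation model is a histogram-based Bayesian color classifier with online adaptation, roughly in the spirit described above. The names (BayesianColorClassifier, particle_filter_step), the histogram resolution, and the random-walk motion model are assumptions for illustration; the ARTag pose-estimation step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

class BayesianColorClassifier:
    """Hypothetical histogram-based Bayesian classifier giving
    P(marker | pixel color) for RGB values scaled to [0, 1]. Blending new
    histograms into old ones lets the color model follow illumination changes."""

    def __init__(self, bins=16, adapt_rate=0.1, prior_marker=0.05):
        self.bins = bins
        self.adapt_rate = adapt_rate
        self.prior_marker = prior_marker
        self.h_marker = np.ones((bins, bins, bins))   # counts for P(color | marker)
        self.h_backgnd = np.ones((bins, bins, bins))  # counts for P(color | background)

    def _index(self, pixels):
        # Map (N, 3) colors in [0, 1] to histogram bin indices
        return tuple((pixels * (self.bins - 1)).astype(int).T)

    def posterior(self, pixels):
        """Bayes' rule: P(marker | color) for an (N, 3) array of pixel colors."""
        idx = self._index(pixels)
        p_c_m = self.h_marker[idx] / self.h_marker.sum()
        p_c_b = self.h_backgnd[idx] / self.h_backgnd.sum()
        num = p_c_m * self.prior_marker
        return num / (num + p_c_b * (1.0 - self.prior_marker) + 1e-12)

    def adapt(self, marker_pixels):
        """Online color adaptation from pixels currently classified as marker."""
        new_hist = np.zeros_like(self.h_marker)
        np.add.at(new_hist, self._index(marker_pixels), 1.0)
        self.h_marker = (1.0 - self.adapt_rate) * self.h_marker + self.adapt_rate * new_hist


def particle_filter_step(particles, classifier, frame, noise=3.0):
    """One predict/weight/resample cycle for a single fingertip marker.
    `particles` holds (N, 2) image positions (x, y); `frame` is (H, W, 3) in [0, 1]."""
    h, w, _ = frame.shape
    # Predict: simple random-walk motion model
    particles = particles + rng.normal(scale=noise, size=particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, w - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, h - 1)
    # Weight: marker posterior of the color sampled under each particle
    colors = frame[particles[:, 1].astype(int), particles[:, 0].astype(int)]
    weights = classifier.posterior(colors) + 1e-12
    weights /= weights.sum()
    # Resample in proportion to the color evidence (multinomial resampling for brevity)
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```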