5 research outputs found

    Self-Generated Auditory Feedback as a Cue to Support Rhythmic Motor Stability

    No full text
    A goal of the SKILLS project is to develop Virtual Reality (VR)-based training simulators for different application domains, one of which is juggling. Within this context, the value of multimodal VR environments for skill acquisition is investigated. In this study, we investigated whether it was necessary to render the sounds of virtual balls hitting virtual hands within the juggling training simulator. First, we recorded sounds at the jugglers’ ears and found the sound of the balls hitting the hands to be audible. Second, we asked 24 jugglers to juggle under normal conditions (Audible) or while listening to pink noise intended to mask the juggling sounds (Inaudible). We found that although the jugglers themselves reported no difference in their juggling across these two conditions, external juggling experts rated rhythmic stability worse in the Inaudible condition than in the Audible condition. This result suggests that auditory information should be rendered in the VR juggling training simulator.
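
    As a point of reference (not part of the study itself), a masking stimulus like the one described can be approximated by shaping white noise toward a 1/f power spectrum. The sketch below is a minimal Python example of generating such pink noise; the duration, sample rate, and normalisation are illustrative assumptions, not parameters reported by the authors.

    import numpy as np

    def pink_noise(n_samples, rng=None):
        """Approximate pink (1/f) noise by shaping white noise in the frequency domain."""
        rng = rng or np.random.default_rng()
        white = rng.standard_normal(n_samples)
        spectrum = np.fft.rfft(white)
        freqs = np.fft.rfftfreq(n_samples)
        # Scale each bin by 1/sqrt(f) so power falls off as 1/f; leave the DC bin untouched.
        scale = np.ones_like(freqs)
        scale[1:] = 1.0 / np.sqrt(freqs[1:])
        pink = np.fft.irfft(spectrum * scale, n=n_samples)
        # Normalise to unit RMS so the masker level is comparable across runs.
        return pink / np.sqrt(np.mean(pink ** 2))

    # Example (assumed values): 10 s of masking noise at 44.1 kHz
    masker = pink_noise(10 * 44_100)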

    Evaluation of the light weight Juggling system

    No full text
    This paper presents the training of juggling skills, a highly complex coordination problem, under tight time and space constraints. This training is achieved with a simple training platform, a lightweight juggling (LWJ) platform, and is compared to training with real balls. The principle directing the design of the platform is to obtain a simple tool. The simplification, in comparison to real-world juggling, is based on the identification of invariant properties in the spatiotemporal coordination of intermediate and expert jugglers. Two classes of training solutions were added to the LWJ: an audio-tactile pacing (augmented multimodal) environment, and the manipulation of cognitive components of the juggling skills. The transfer to juggling with real balls was evaluated in four different experiments.
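
    The abstract does not specify how the audio-tactile pacing cues are scheduled. Purely as an illustration, the sketch below assumes a simple 3-ball cascade in which cues alternate between hands every half hand-cycle; the function name, hand-cycle period, and cue count are hypothetical and not taken from the paper.

    import numpy as np

    def pacing_schedule(hand_period_s, n_cues):
        """Hypothetical pacing schedule: one cue per throw, alternating hands
        every half hand-cycle, as in a 3-ball cascade rhythm."""
        times = np.arange(n_cues) * (hand_period_s / 2.0)
        hands = np.where(np.arange(n_cues) % 2 == 0, "L", "R")
        return list(zip(times.round(3).tolist(), hands.tolist()))

    # Example (assumed values): 600 ms hand cycle, 8 cues
    for t, hand in pacing_schedule(0.6, 8):
        print(f"{t:0.3f} s -> cue {hand} hand")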