6 research outputs found

    Synthesis of variable dancing styles based on a compact spatiotemporal representation of dance

    Dance, as a complex expressive form of motion, is able to convey emotion, meaning and social idiosyncrasies that open channels for non-verbal communication and promote rich cross-modal interactions with music and the environment. As such, realistic dancing characters may incorporate cross-modal information and variability of the dance forms through compact representations that describe the movement structure in terms of its spatial and temporal organization. In this paper, we propose a novel method for synthesizing beat-synchronous dancing motions based on a compact topological model of dance styles previously captured with a motion capture system. The model is based on Topological Gesture Analysis (TGA), which conveys a discrete three-dimensional point-cloud representation of the dance by describing the spatiotemporal variability of its gestural trajectories as uniform spherical distributions organized according to classes of the musical meter. The synthesis methodology maps the topological representations, constrained by definable metrical and spatial parameters, back into complete dance instances whose variability is controlled by stochastic processes that consider both the TGA distributions and the kinematic constraints of the body morphology. To assess the relevance and flexibility of each parameter in reproducing the style of the captured dance, we correlated captured and synthesized trajectories of samba dancing sequences at different levels of model compression, and report a subjective evaluation over a set of six tests. The results validate our approach, suggesting that a periodic dancing style, and its musical synchrony, can be feasibly reproduced from a suitably parametrized discrete spatiotemporal representation of the gestural motion trajectories, with a notable degree of compression.
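
    As a rough illustration of the kind of synthesis the abstract describes, the sketch below samples per-joint key poses from spherical distributions indexed by metrical class and interpolates them into a beat-synchronous trajectory with a crude per-frame velocity clamp. The model layout, the function names (`sample_keypose`, `synthesize`), and the toy numbers are assumptions for illustration only, not the paper's TGA implementation.

```python
import numpy as np

# Hypothetical compact model: one spherical distribution (center + radius)
# per joint and per metrical class (e.g. beats 1..4 of a samba bar).
rng = np.random.default_rng(0)

def sample_keypose(model, joint, meter_class):
    """Sample a 3D target point for `joint` at the given metrical class."""
    center, radius = model[joint][meter_class]
    # Uniform sample inside a sphere, mimicking the uniform spherical
    # distributions describing spatial variability in the abstract.
    direction = rng.normal(size=3)
    direction /= np.linalg.norm(direction)
    return center + direction * radius * rng.uniform() ** (1 / 3)

def synthesize(model, joint, meter_classes, frames_per_beat=30, max_step=0.05):
    """Interpolate sampled key poses into a beat-synchronous trajectory,
    clamping per-frame displacement as a crude kinematic constraint."""
    keyposes = [sample_keypose(model, joint, m) for m in meter_classes]
    traj = [keyposes[0]]
    for a, b in zip(keyposes[:-1], keyposes[1:]):
        for t in np.linspace(0, 1, frames_per_beat, endpoint=False)[1:]:
            target = (1 - t) * a + t * b
            step = target - traj[-1]
            norm = np.linalg.norm(step)
            if norm > max_step:               # limit joint speed
                step = step / norm * max_step
            traj.append(traj[-1] + step)
        traj.append(b)
    return np.array(traj)

# Toy model: a right-hand joint alternating between two metrical classes.
model = {"r_hand": {1: (np.array([0.3, 1.2, 0.1]), 0.05),
                    2: (np.array([0.1, 1.0, 0.3]), 0.05)}}
trajectory = synthesize(model, "r_hand", [1, 2, 1, 2])
print(trajectory.shape)   # (frames, 3)
```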

    Gamelan Animation Based on Sound Frequency (ANIMASI GAMELAN BERBASIS FREKUENSI SUARA)

    "Gamelan Animation Based Base On Sound Frequency" is a study aimed to find out how to make 3D gamelan animation automatically in accordance with the input of gamelan sound. The gamelan instrument used in this research is the type of saron instrument. The number of blades used in this study 7 pieces, according to the number of saron blades. The final result of this research is 3D animation of saron gamelan. Software used in this research is Blender 3D. This research requires several processes such as in sound frequency analysis, each saron gamelan blade has a slightly different frequency value. In the object hammer required armature or rigging that serves to move the hammer. Each armature is supplied by the driver as a link between the armature and the sound frequency graph. The object of the saron hammer, can move based on the frequency graph. The results showed that with the input frequency of saron the object of animation can move according to the time of the sound of the gamelan saron

    The effect of practice with the Beatnik Rhythmic Analyzer on rhythm accuracy of non-percussion undergraduate music majors.

    The purpose of this study was to measure the effectiveness of the Beatnik Rhythmic Analyzer on rhythmic accuracy. Non-percussion music majors (N = 19) were randomly assigned to practice with either the Beatnik Rhythmic Analyzer (n = 9) or a metronome (n = 10). Five exercises were administered for one minute each over a three-week period. Pre- and posttest scores were analyzed with a Mann-Whitney U test for differences between the groups. While the mean posttest scores of the treatment group were higher than those of the control group, results indicated no significant difference between the groups (α = .05). However, two of the five exercises showed large effect sizes in favor of the treatment group, suggesting that the Beatnik Rhythmic Analyzer may be effective for developing specific fundamental snare drum techniques.
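
    For readers unfamiliar with the analysis, the following sketch runs a two-sided Mann-Whitney U test on hypothetical gain scores (the study's actual data are not reproduced here) and adds a rank-biserial correlation as one possible effect-size estimate; the abstract does not state which effect-size measure was used.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical gain scores (posttest - pretest), for illustration only.
beatnik_gains   = np.array([4, 6, 3, 5, 7, 2, 5, 4, 6])        # n = 9
metronome_gains = np.array([3, 2, 4, 1, 5, 3, 2, 4, 3, 2])     # n = 10

# Two-sided Mann-Whitney U test at alpha = .05, as in the abstract.
u_stat, p_value = mannwhitneyu(beatnik_gains, metronome_gains,
                               alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")

# Rank-biserial correlation as a nonparametric effect-size estimate.
effect_size = 1 - 2 * u_stat / (len(beatnik_gains) * len(metronome_gains))
print(f"rank-biserial r = {effect_size:.2f}")
```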

    Automatic synchronization of background music and motion in computer animation

    We synchronize background music with an animation by changing the timing of both, an approach which minimizes the damage to either. Starting from a MIDI file and motion data, feature points are extracted from both sources, paired, and then synchronized using dynamic programming to time-scale the music and to time-warp the motion. We also introduce the music graph, a directed graph which encapsulates connections between many short music sequences. By traversing a music graph we can generate large amounts of new background music, in which we expect to find a sequence that matches the motion better than the original music.
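
    The sketch below illustrates one plausible form of the dynamic-programming pairing the abstract mentions: a DTW-style alignment of music and motion feature times whose matched pairs could then define the time-scaling of the music and the time-warping of the motion. The cost function and function names are assumptions, not the paper's method.

```python
import numpy as np

def align_features(music_times, motion_times):
    """Pair feature points so the total timing discrepancy is minimal."""
    n, m = len(music_times), len(motion_times)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(music_times[i - 1] - motion_times[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],  # one-to-one match
                                 cost[i - 1, j],      # motion feature reused
                                 cost[i, j - 1])      # music feature reused
    # Backtrack to recover the pairing.
    pairs, i, j = [], n, m
    while i > 0 and j > 0:
        pairs.append((music_times[i - 1], motion_times[j - 1]))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return pairs[::-1]

# Toy example: beat times from a MIDI file vs. footstep times from motion data.
print(align_features([0.5, 1.0, 1.5, 2.0], [0.48, 1.07, 1.46, 2.1]))
```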