6 research outputs found

    Time Synchronization in Graphic Domain - A new paradigm for Augmented Music Scores

    We propose a simple method for the synchronization of arbitrary graphic objects, based on their time relations. This method relies on segmentations and on mappings, which are relations between segmentations. The paper gives a formal description of segmentations and mappings and presents Interlude, a framework that implements the proposed method in the form of an augmented music score viewer, opening a new space for music notation.
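
    To make the formalism concrete, here is a minimal Python sketch of one possible reading of segmentations and mappings; the names (Segment, to_graphic) and the hard-coded intervals are illustrative assumptions and do not reproduce the Interlude API.

        # Minimal sketch of the segmentation/mapping idea described above.
        # All names and values are hypothetical, not the actual Interlude API.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Segment:
            """A half-open interval [begin, end) in some dimension (time, pixels, ...)."""
            begin: float
            end: float

            def contains(self, point: float) -> bool:
                return self.begin <= point < self.end

        # A segmentation is simply an ordered list of non-overlapping segments.
        time_segmentation = [Segment(0.0, 1.0), Segment(1.0, 2.5), Segment(2.5, 4.0)]   # seconds
        graphic_segmentation = [Segment(0, 120), Segment(120, 300), Segment(300, 420)]  # pixels

        # A mapping is a relation between two segmentations: here, pairs of
        # corresponding (time segment, graphic segment).
        mapping = list(zip(time_segmentation, graphic_segmentation))

        def to_graphic(t: float) -> float:
            """Convert a time position to a graphic position by locating the time
            segment containing t and interpolating linearly inside its graphic image."""
            for time_seg, gfx_seg in mapping:
                if time_seg.contains(t):
                    ratio = (t - time_seg.begin) / (time_seg.end - time_seg.begin)
                    return gfx_seg.begin + ratio * (gfx_seg.end - gfx_seg.begin)
            raise ValueError(f"time {t} is outside the segmentation")

        print(to_graphic(1.75))  # position of the playback cursor on the score image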

    Music-Related Media-Contents Synchronization over the Web: the IEEE 1599 Initiative

    IEEE 1599 is an international standard originally conceived for music, which aims at providing a comprehensive description of the media contents related to a music piece within a multi-layer and synchronized environment. A number of off-line and stand-alone software prototypes have been realized since its standardization in 2008. Recently, thanks to technological advances (e.g. the release of HTML5), the engine of the IEEE 1599 parser has been ported to the Web. Some non-trivial problems have been solved, e.g. the management of multiple simultaneous media streams in a client-server architecture. After providing an overview of the IEEE 1599 standard, this article presents a survey of the recent initiatives regarding audio-driven synchronization over the Web.
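
    As a rough illustration of audio-driven synchronization in a multi-layer setting, the sketch below keeps a table mapping score events to their onset times in one audio track and looks up the current event from the playback clock; the event identifiers and timing values are invented for the example and do not reproduce actual IEEE 1599 element names.

        import bisect

        # Hypothetical synchronization table: each score event (identified here by
        # an arbitrary id) is associated with its onset time, in seconds, in one
        # audio track. Ids and times are invented for the example.
        onsets = [
            (0.00, "ev_001"),
            (0.52, "ev_002"),
            (1.10, "ev_003"),
            (1.71, "ev_004"),
        ]
        times = [t for t, _ in onsets]

        def current_event(playback_time: float) -> str:
            """Return the id of the score event active at the given playback time."""
            i = bisect.bisect_right(times, playback_time) - 1
            return onsets[max(i, 0)][1]

        # In a Web player this lookup would run on each 'timeupdate' of the audio
        # element, highlighting the returned event on the score rendering.
        print(current_event(1.3))  # -> "ev_003"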

    Segments and Mapping for Scores and Signal Representations

    We present a general theoretical framework to describe segments and the different possible mappings that can be established between them. Each segment can be related to different music representations: graphical scores, music signals, or gesture signals. This theoretical formalism is general and compatible with a large number of problems found in sound and gesture computing. We describe some examples we developed in interactive score representation superposed with signal representation, and in the description of synchronization between gesture and sound signals.
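
    One way to picture how the same formalism covers scores, signals and gestures is to compose two mappings through a shared time segmentation; the sketch below, with invented interval values, chains a graphic-to-time mapping with a time-to-gesture-frame mapping and is not taken from the paper itself.

        # Sketch (with invented values) of composing mappings through a common time axis.
        # score_to_time: graphic x-ranges on the score mapped to time ranges (seconds).
        # time_to_gesture: the same time ranges mapped to gesture-signal sample ranges.
        score_to_time = {(0, 150): (0.0, 2.0), (150, 320): (2.0, 5.0)}
        time_to_gesture = {(0.0, 2.0): (0, 400), (2.0, 5.0): (400, 1000)}

        def compose(m1: dict, m2: dict) -> dict:
            """Relate the domains of m1 to the codomains of m2 whenever they share
            an intermediate segment (here: identical time segments)."""
            return {a: m2[b] for a, b in m1.items() if b in m2}

        score_to_gesture = compose(score_to_time, time_to_gesture)
        print(score_to_gesture)  # {(0, 150): (0, 400), (150, 320): (400, 1000)}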

    Meta-instrument and Natural User Interface: a New Paradigm in Music Education

    Music is gaining a growing role in the education of children and teenagers, but playing music is not as natural as listening to it: learning to play a musical instrument requires a large amount of time. Since a number of technical skills must be acquired through practice, this is a strong limitation in educational contexts where students' time is a limited resource. The meta-instrument is a new paradigm for music instruments that overcomes these limitations and enables the student to execute a music score instantly, without specific training. Thanks to the natural human ability to tap along with music, the meta-instrument moves the interaction level to timing, interpretation, and natural interaction, embedding pitch control in its smart logic implementation. The typical interaction of a natural user interface is applicable to a meta-instrument because it can emulate both a traditional music instrument and a virtual one. The student can play a music instrument regardless of the interaction mode, thus focusing on interpretation rather than on playing technique. The proposed meta-instrument framework refers to multiple, heterogeneous contents by applying the specification of IEEE 1599, an XML-based standard for the full representation of music.
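
    The pitch-embedding idea can be pictured as a loop in which each tap advances a cursor in a pre-loaded score while the system supplies the pitch; the toy model below (invented note list, printed output instead of real audio) is a simplified illustration, not the proposed framework.

        # Toy model of tapping through a score: the player only controls timing,
        # while the pitch of each event is taken from the pre-loaded score.
        # The note list and the printed "play" action are invented for the example.
        score = ["C4", "E4", "G4", "C5"]

        class MetaInstrument:
            def __init__(self, notes):
                self.notes = notes
                self.cursor = 0

            def tap(self):
                """Each tap plays the next scored pitch; the player decides *when*."""
                if self.cursor < len(self.notes):
                    pitch = self.notes[self.cursor]
                    self.cursor += 1
                    print(f"play {pitch}")  # a real system would drive a synthesizer

        instrument = MetaInstrument(score)
        for _ in range(4):
            instrument.tap()  # four taps -> C4, E4, G4, C5, at whatever timing the player chooses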

    Local and Global Semantic Networks for the Representation of Music Information

    In the field of music informatics, multilayer representation formats are becoming increasingly important, since they enable an integrated and synchronized representation of the various entities that describe a piece of music, from the digital encoding of score symbols to its typographic aspects and audio recordings. Often these formats are based on the eXtensible Markup Language (XML), which allows information embedding, hierarchical structuring, and interconnection within a single document. Simultaneously, the advent of the so-called Semantic Web is leading to the transformation of the World Wide Web into an environment where documents are associated with data and metadata. XML is also extensively used in the Semantic Web, since this format supports not only human- but also machine-readable tags. On the one hand, the Semantic Web aims to create a set of automatically detectable relationships among data, thus providing users with a number of non-trivial paths to navigate information in a geographically distributed framework; on the other hand, multilayer formats typically operate in a similar way, but at a “local” level. The goal of the present work is to discuss the possibilities emerging from a combined approach, namely adopting multilayer formats in the Semantic Web, addressing in particular augmented-reality applications. An XML-based international standard known as IEEE 1599 will be employed to show a number of innovative applications in music.
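
    A very rough way to picture the combined "local plus global" approach is to expose fragments of one multilayer document as triples that can also point to Web resources; the subject, predicate, and URI names below are placeholders invented for the illustration, not an official vocabulary.

        # Illustrative triples linking a "local" score fragment (inside one
        # multilayer document) to "global" Semantic Web resources. All URIs and
        # predicate names are placeholders invented for the example.
        triples = [
            ("doc1#measure12", "isSynchronizedWith", "doc1#audio_take1@t=34.2s"),
            ("doc1#measure12", "appearsIn",          "doc1#score_page3.png"),
            ("doc1",           "describesWork",      "http://example.org/works/SymphonyNo5"),
            ("http://example.org/works/SymphonyNo5", "composedBy",
             "http://example.org/people/Beethoven"),
        ]

        def neighbours(node: str):
            """Follow every triple that starts at the given node (local or global)."""
            return [(p, o) for s, p, o in triples if s == node]

        # Navigation can start from a local fragment and reach global resources.
        print(neighbours("doc1#measure12"))
        print(neighbours("doc1"))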

    IEEE 1599: Music Encoding and Interaction

    IEEE Std 1599 allows interaction with music content, such as notes and sounds, in video applications and on any interactive device.