Instrumentation-Based Music Similarity Using Sparse Representations
© 2012 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
People-Powered Music: Using User-Generated Tags and Structure in Recommendations
Music recommenders often rely on experts to classify song facets such as genre and mood, but user-generated folksonomies hold some advantages over expert classifications: folksonomies can reflect the same real-world vocabularies and categorizations that end users employ. We present an approach for using crowd-sourced common-sense knowledge to structure user-generated music tags into a folksonomy, and describe how to use this approach to make music recommendations. We then empirically evaluate our "people-powered" structured content recommender against a more traditional recommender. Our results show that participants slightly preferred the unstructured recommender, rating more of its recommendations as "perfect" than they did for our approach. An exploration of the reasons behind participants' ratings revealed that users behaved differently when tagging songs than when evaluating recommendations, and we discuss the implications of our results for future tagging and recommendation approaches.
Analyzing Visual Mappings of Traditional and Alternative Music Notation
In this paper, we postulate that combining the domains of information visualization and music studies paves the way for a more structured analysis of the design space of music notation, enabling the creation of alternative music notations that are tailored to different users and their tasks. Hence, we discuss the instantiation of a design and visualization pipeline for music notation that follows a structured approach, based on the fundamental concepts of information and data visualization. This enables practitioners and researchers of digital humanities and information visualization alike to conceptualize, create, and analyze novel music notation methods. Based on the analysis of relevant stakeholders and their usage of music notation as a means of communication, we identify a set of relevant features typically encoded in different annotations and encodings, as used by interpreters, performers, and readers of music. We analyze the visual mappings of musical dimensions for varying notation methods to highlight gaps and frequent usages of encodings, visual channels, and Gestalt laws. This detailed analysis leads us to the conclusion that such an under-researched area in information visualization holds the potential for fundamental research. This paper discusses possible research opportunities, open challenges, and arguments that can be pursued in the process of analyzing, improving, or rethinking existing music notation systems and techniques.

Comment: 5 pages including references, 3rd Workshop on Visualization for the Digital Humanities, Vis4DH, IEEE Vis 201
Towards a style-specific basis for computational beat tracking
Outlined in this paper are a number of sources of evidence, from psychological, ethnomusicological and engineering grounds, to suggest that current approaches to computational beat tracking are incomplete. It is contended that the degree to which cultural knowledge, that is, the specifics of style and associated learnt representational schema, underlies the human faculty of beat tracking has been severely underestimated. Difficulties in building general beat tracking solutions, which can provide both period and phase locking across a large corpus of styles, are highlighted. It is probable that no universal beat tracking model exists which does not utilise a switching model to recognise style and context prior to application.
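The switching architecture the abstract argues for can be sketched as a two-stage dispatcher: first recognise the style, then hand the signal to a style-specific period and phase model. Everything below is a toy illustration under stated assumptions; the style classifier and the trivial median-based period estimator stand in for the learnt representational schema a real system would need.

```python
# Input: a list of onset times in seconds, assumed sorted and >= 2 long.

def estimate_period(onsets, lo, hi):
    # Trivial period estimator: median inter-onset interval, clipped
    # to the style's plausible tempo range [lo, hi] seconds.
    iois = sorted(b - a for a, b in zip(onsets, onsets[1:]))
    median = iois[len(iois) // 2]
    return min(max(median, lo), hi)

def classify_style(onsets):
    # Placeholder style recogniser keyed on mean inter-onset interval;
    # a real switching model would use learnt, style-specific features.
    iois = [b - a for a, b in zip(onsets, onsets[1:])]
    mean_ioi = sum(iois) / len(iois)
    return "fast_dance" if mean_ioi < 0.4 else "slow_ballad"

# Each style gets its own tracker with its own tempo prior.
STYLE_TRACKERS = {
    "fast_dance": lambda onsets: estimate_period(onsets, lo=0.25, hi=0.5),
    "slow_ballad": lambda onsets: estimate_period(onsets, lo=0.5, hi=1.2),
}

def track(onsets):
    # Switching model: recognise style first, then apply the
    # style-specific period model; phase is locked to the first onset.
    style = classify_style(onsets)
    period = STYLE_TRACKERS[style](onsets)
    return style, period, onsets[0]
```

The point of the sketch is the dispatch structure, not the estimators: the same onset list can yield different period estimates depending on which style prior is selected, which is precisely the context-dependence the paper claims universal trackers ignore.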