Human Computer Music Performance
Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synthesis, and sound diffusion. Thus, HCMP is an ideal framework to motivate and integrate advanced music research. In addition, HCMP has the potential to benefit millions of practicing musicians, amateurs and professionals alike. The vision of HCMP, the problems that must be solved, and some recent progress are presented
Feature extraction for speech and music discrimination
Driven by the demands of information retrieval, video editing and human-computer interfaces, in this paper we propose a novel spectral feature for music and speech discrimination. This scheme attempts to simulate a biological model using the averaged cepstrum, where human perception tends to pick up areas of large cepstral change. Cepstrum data that lies away from the mean value is exponentially reduced in magnitude. We conduct music/speech discrimination experiments by comparing the classification performance of the proposed feature with that of previously proposed features. Dynamic-time-warping-based classification verifies that the proposed feature gives the best music/speech classification quality on the test database
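The pipeline this abstract describes — a cepstral feature whose coefficients far from the mean are exponentially attenuated, compared via dynamic-time-warping classification — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the frame length, number of coefficients, and attenuation formula are all assumptions.

```python
import numpy as np

def attenuated_cepstrum(frame, n_coeffs=20):
    """Cepstrum of one audio frame, with coefficients that deviate
    strongly from the frame's mean cepstral value exponentially
    reduced in magnitude (one plausible reading of the paper's
    perceptual weighting; the exact formula is an assumption)."""
    spectrum = np.abs(np.fft.rfft(frame)) + 1e-10
    cep = np.fft.irfft(np.log(spectrum))[:n_coeffs]
    dev = np.abs(cep - cep.mean())
    return cep * np.exp(-dev / (dev.std() + 1e-10))

def dtw_distance(a, b):
    """Plain dynamic-time-warping distance between two sequences of
    feature vectors, used here as the classification distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

In a classification setting one would frame each signal, stack the per-frame features into a sequence, and label a test clip by its DTW-nearest labelled template (music or speech).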
Understanding Music Interaction, and Why It Matters
This is the introductory chapter of a book dedicated to new research in, and emerging new understandings of, music and human-computer interaction—known for short as music interaction. Music interaction research plays a key role in innovative approaches to diverse musical activities, including performance, composition, education, analysis, production and collaborative music making. Music interaction is pivotal in new research directions in a range of activities, including audience participation, interaction between music and dancers, tools for algorithmic music, music video games, audio games, turntablism and live coding. More generally, music provides a powerful source of challenges and new ideas for human-computer interaction (HCI). This introductory chapter reviews the relationship between music and human-computer interaction and outlines research themes and issues that emerge from the collected work of researchers and practitioners in this book
A Participatory Live Music Performance with the Open Symphony System
Our Open Symphony system reimagines the music experience for a digital age, fostering alliances between performers, audiences and our digital selves. Open Symphony enables live participatory music performance in which the audience actively engages in the music creation process. This is made possible by using state-of-the-art web technologies and data visualisation techniques. Through collaborations with local performers we conduct a series of interactive music performances, revolutionizing the performance experience for both performers and audiences. The system throws open music-creating possibilities to every participant and is a genuinely novel way to demonstrate the field of Human Computer Interaction through computer-supported cooperative creation and multimodal music and visual perception
Human-Computer Music Performance: From Synchronized Accompaniment to Musical Partner
Live music performance with computers has motivated many research projects in science, engineering, and the arts. In spite of decades of work, it is surprising that there is not more technology for, and a better understanding of, the computer as music performer. We review the development of techniques for live music performance and outline our efforts to establish a new direction, Human-Computer Music Performance (HCMP), as a framework for a variety of coordinated studies. Our work in this area spans performance analysis, synchronization techniques, and interactive performance systems. Our goal is to enable musicians to incorporate computers into performances easily and effectively through a better understanding of requirements, new techniques, and practical, performance-worthy implementations. We conclude with directions for future work
A Human-Computer Duet System for Music Performance
Virtual musicians have become a remarkable phenomenon in the contemporary
multimedia arts. However, most of the virtual musicians nowadays have not been
endowed with abilities to create their own behaviors, or to perform music with
human musicians. In this paper, we first create a virtual violinist who can
collaborate with a human pianist to perform chamber music automatically without
any intervention. The system incorporates techniques from various fields,
including real-time music tracking, pose estimation, and body movement
generation. In our system, the virtual musician's behavior is generated based
on the given music audio alone, and such a system results in a low-cost,
efficient and scalable way to produce human and virtual musicians'
co-performance. The proposed system has been validated in public concerts.
Objective quality assessment approaches and possible ways to systematically
improve the system are also discussed
Complexity Measures of Music
We present a technique to search for the presence of crucial events in music,
based on the analysis of the music volume. Earlier work on this issue was based
on the assumption that crucial events correspond to the change of music notes,
with the interesting result that the complexity index of the crucial events is
mu ~ 2, which is the same inverse power-law index of the dynamics of the brain.
The search technique analyzes music volume and confirms the results of the
earlier work, thereby contributing to the explanation as to why the brain is
sensitive to music, through the phenomenon of complexity matching. Complexity
matching has recently been interpreted as the transfer of multifractality from
one complex network to another. For this reason we also examine the
multifractality of music, with the observation that the multifractal spectrum of
a computer performance is significantly narrower than the multifractal spectrum
of a human performance of the same musical score. We conjecture that although
crucial events are demonstrably important for information transmission, they
alone are not sufficient to define musicality, which is more adequately measured
by the multifractality spectrum
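The multifractal-spectrum comparison this abstract describes can be approximated numerically. The sketch below uses an MFDFA-style estimate (spread of the generalized Hurst exponent h(q) across moment orders q) as a crude proxy for spectrum width; the abstract does not specify the authors' method, and all scales and q-values here are illustrative assumptions.

```python
import numpy as np

def mfdfa_width(signal, qs=(-4, -2, 2, 4), scales=(16, 32, 64, 128)):
    """Crude multifractal-width proxy: max(h(q)) - min(h(q)) from
    multifractal detrended fluctuation analysis with linear detrending.
    A monofractal signal (e.g. white noise) should yield a small width;
    a strongly multifractal one (e.g. a multiplicative cascade) a
    larger width."""
    profile = np.cumsum(signal - np.mean(signal))
    hs = []
    for q in qs:
        log_s, log_F = [], []
        for s in scales:
            n_seg = len(profile) // s
            F2 = []
            for v in range(n_seg):
                seg = profile[v * s:(v + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                F2.append(np.mean((seg - trend) ** 2))
            Fq = np.mean(np.array(F2) ** (q / 2)) ** (1 / q)
            log_s.append(np.log(s))
            log_F.append(np.log(Fq))
        # generalized Hurst exponent h(q): slope of log F_q vs log s
        hs.append(np.polyfit(log_s, log_F, 1)[0])
    return max(hs) - min(hs)
```

On this proxy, the abstract's claim would read as: the width computed from a human performance's volume series exceeds that computed from a computer rendition of the same score.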
Primary and secondary aspects of musical gestures in live coding performance
Embodied interaction in live coding performance has been reported and is particularly present in dance performance. Here, we review studies from music psychology and perception along with studies from live coding and human-computer interaction. We examine how the live coder may experience primary aspects of musical gestures, and we discuss how music listening and musical imagery may trigger mental models of gestural unfoldings. Disciplinary gaps between music psychology and the live coding community are discussed
Paradigms of Music Software Development
On the way to a more comprehensive and integrative historiography of music software, this paper proposes a survey of the main paradigms of music software development from the 1950s to the present. Concentrating on applications for music composition, production and performance, the analysis focusses on the concept and design of the human-computer interaction as well as on the implicit user
Biomechanical Modelling of Musical Performance: A Case Study of the Guitar
Computer-generated musical performances are often criticised for being unable
to match the expressivity found in performances by humans. Much research
has been conducted in the past two decades in order to create computer
technology able to perform a given piece of music as expressively as humans,
largely without success. Two approaches have often been adopted in research
into modelling expressive music performance on computers. The first focuses
on sound; that is, on modelling patterns of deviations between a recorded
human performance and the music score. The second focuses on modelling the
cognitive processes involved in a musical performance. Both approaches are
valid and can complement each other. In this thesis we propose a third
complementary approach, focusing on the guitar, which concerns the physical
manipulation of the instrument by the performer: a biomechanical approach.
The essence of this thesis is a study on capturing, analyzing and modelling
information about motor and biomechanical processes of guitar performance.
The focus is on speed, precision, and force of a guitarist's left-hand. The
overarching questions behind our study are:
1) Do unintentional actions originating from motor and biomechanical
functions during musical performance contribute a material "human feel"
to the performance?
2) Would it be possible to determine and quantify such unintentional actions?
3) Would it be possible to model and embed such information in a computer
system?
The contributions to knowledge pursued in this thesis include:
a) An unprecedented study of guitar mechanics, ergonomics, and
playability;
b) A detailed study of how the human body performs actions when playing
the guitar;
c) A methodology to formally record quantifiable data about such actions in
performance;
d) An approach to model such information, and
e) A demonstration of how the above knowledge can be embedded in a
system for music performance