
    Computational Creativity and Music Generation Systems: An Introduction to the State of the Art

    Computational Creativity is a multidisciplinary field that seeks to obtain creative behaviors from computers. One of its most prolific subfields is Music Generation (also called Algorithmic Composition or Musical Metacreation), which uses computational means to compose music. Because of the multidisciplinary nature of this research field, it is sometimes hard to define precise goals and to keep track of which problems can be considered solved by state-of-the-art systems and which instead need further development. With this survey, we aim to give a complete introduction for those who wish to explore Computational Creativity and Music Generation. To do so, we first give a picture of the research on the definition and evaluation of creativity, both human and computational, which is needed to understand how computational means can be used to obtain creative behaviors, and of its importance within Artificial Intelligence studies. We then review the state of the art of Music Generation Systems, citing examples of all the main approaches to music generation and listing the open challenges identified by previous reviews on the subject. For each of these challenges, we cite works that have proposed solutions, describing what still needs to be done and some possible directions for further research.

    A Functional Taxonomy of Music Generation Systems

    Digital advances have transformed the face of automatic music generation since its beginnings at the dawn of computing. Despite the many breakthroughs, issues such as the musical tasks targeted by different machines and the degree to which they succeed remain open questions. We present a functional taxonomy for music generation systems with reference to existing systems. The taxonomy organizes systems according to the purposes for which they were designed. It also reveals the inter-relatedness amongst the systems. This design-centered approach contrasts with predominant methods-based surveys and facilitates the identification of grand challenges to set the stage for new breakthroughs.

    Robotic Musicianship - Musical Interactions Between Humans and Machines


    Mobile Music Development Tools for Creative Coders

    This project is a body of work that facilitates the creation of musical mobile artworks. The project includes a code toolkit that enhances and simplifies the development of mobile music iOS applications, a flexible notation system designed for mobile musical interactions, and example apps and scored compositions to demonstrate the toolkit and notation system. The code library is designed to simplify the technical aspect of user-centered design and development with a more direct connection between concept and deliverable. This simplification addresses learning problems (such as motivation, self-efficacy, and self-perceived understanding) by bridging the gap between idea and functional prototype and improving the ability to contextualize the development process for musicians and other creatives. The toolkit helps to circumvent the need to learn complex iOS development patterns and affords more readable code. CSPD (color, shape, pattern, density) notation is a pseudo-tablature that describes performance interactions. The system leverages visual density and patterns of both color and shape to describe types of gestures (physical or musical) and their relationships rather than focusing on strict rhythmic or pitch/frequency content. The primary design goal is to visualize macro musical concepts that create middleground structure.
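    The abstract names four visual dimensions of CSPD notation: color, shape, pattern, and density. One plausible, purely illustrative data model pairs a notated cell with the gesture quality it might encode; the field meanings sketched in the comments are assumptions, and the actual semantics are defined by the toolkit, not reproduced here.

```python
# Hypothetical sketch of a CSPD notation cell; the four fields come from
# the abstract, but their interpretations here are invented for illustration.
from dataclasses import dataclass

@dataclass
class CSPDCell:
    color: str      # could distinguish performers or timbre families
    shape: str      # could indicate the type of physical gesture
    pattern: str    # could indicate repetition vs. variation
    density: float  # 0.0 (sparse activity) to 1.0 (dense activity)

    def describe(self) -> str:
        """Render the cell as a human-readable gesture description."""
        level = "dense" if self.density > 0.5 else "sparse"
        return f"{level} {self.pattern} {self.shape} gesture in {self.color}"

cell = CSPDCell(color="red", shape="tap", pattern="repeating", density=0.8)
```

    Because the notation targets macro-level structure rather than exact pitches and rhythms, a score would be a grid of such cells rather than a list of notes.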

    Interfacing Jazz: A Study in Computer-Mediated Jazz Music Creation And Performance

    This dissertation focuses on the study and development of computer-mediated interfaces and algorithms for music performance and creation. It is mainly centered on traditional jazz accompaniment and explores meta-control over musical events to enrich the experience of playing jazz for musicians and non-musicians alike, both individually and collectively. It aims to complement existing research on the automatic generation of jazz music and on new interfaces for musical expression by presenting a group of specially designed algorithms and control interfaces that implement intelligent, musically informed processes to automatically produce sophisticated and stylistically correct musical events. These algorithms and control interfaces are designed to take simplified, intuitive input from the user and to coherently manage group playing by establishing integrated control over global shared parameters.
Using these algorithms, two proposals for different applications are presented, in order to illustrate the benefits and potential of this meta-control approach to extend existing paradigms for musical applications, as well as to create new ones. These proposals focus on two main perspectives where computer-mediated music can benefit from this approach, namely musical performance and creation, both of which can also be observed from an educational perspective. A core framework, implemented in the Max programming environment, integrates all the functionalities of the instrument algorithms and control strategies, as well as global control, synchronization, and communication between all the components. This platform acts as a base from which different applications can be created. For this dissertation, two main application concepts were developed. The first, PocketBand, has a single-user, one-man-band approach, where a single interface allows a single user to play up to three instruments. This prototype application, for a multi-touch tablet, was the test bed for several experiments with the user interface and playability issues that helped define and improve the mediated interface concept and the instrument algorithms. The second prototype aims at the creation of a collective experience. It is a multi-user installation for a multi-touch table, called MyJazzBand, that allows up to four users to play together as members of a virtual jazz band. Both applications allow the users to experience and effectively participate as jazz band musicians, whether they are musically trained or not. The applications can be used for educational purposes, whether as a real-time accompaniment system for any jazz instrument practitioner or singer, as a source of information for harmonic procedures, or as a practical tool for creating quick arrangement drafts or music lesson contents.
I will also demonstrate that this approach reflects a growing trend in commercial music software, which has already begun to explore and implement mediated interfaces and intelligent music algorithms.

    Interactive real-time musical systems

    This thesis focuses on the development of automatic accompaniment systems. We investigate previous systems and look at a range of approaches that have been attempted for the problem of beat tracking. Most beat trackers are intended for the purposes of music information retrieval, where a 'black box' approach is tested on a wide variety of music genres. We highlight some of the difficulties facing offline beat trackers and design a new approach for the problem of real-time drum tracking, developing a system, B-Keeper, which makes reasonable assumptions about the nature of the signal and is provided with useful prior knowledge. Having developed the system with offline studio recordings, we look to test the system with human players. Existing offline evaluation methods seem less suitable for a performance system, since we also wish to evaluate the interaction between musician and machine. Although statistical data may reveal quantifiable measurements of the system's predictions and behaviour, we also want to test how well it functions within the context of a live performance. To do so, we devise an evaluation strategy to contrast a machine-controlled accompaniment with one controlled by a human. We also present recent work on real-time multiple pitch tracking, which is then extended to provide automatic accompaniment for harmonic instruments such as guitar. By aligning salient notes in the output from a dual pitch tracking process, we make changes to the tempo of the accompaniment in order to align it with a live stream. By demonstrating the system's ability to align offline tracks, we show that under restricted initial conditions, the algorithm works well as an alignment tool.
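    The tempo-adjustment idea described above — comparing where a beat was expected against where the live onset actually landed, then nudging the accompaniment tempo — can be sketched in a few lines. This is a hypothetical proportional follower for illustration only, not the B-Keeper algorithm; the function name, gain, and 10% clamp are all assumptions.

```python
def follow_tempo(bpm, expected_beat_time, observed_onset_time, gain=0.5):
    """Return an adjusted tempo in BPM. A positive timing error (the live
    player is late) lengthens the beat period, slowing the accompaniment;
    a negative error speeds it up."""
    error = observed_onset_time - expected_beat_time  # seconds
    beat_period = 60.0 / bpm
    # Proportional correction of the beat period, clamped to +/-10% per
    # update so a single noisy onset cannot derail the tempo.
    correction = max(-0.1 * beat_period,
                     min(0.1 * beat_period, gain * error))
    return 60.0 / (beat_period + correction)

# A live onset arriving 50 ms late at 120 BPM pulls the tempo down slightly.
adjusted = follow_tempo(120.0, expected_beat_time=1.0, observed_onset_time=1.05)
```

    A real drum tracker must also decide which onsets correspond to which beats before any such correction can be applied, which is where the prior knowledge mentioned in the abstract comes in.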

    AN APPROACH TO MACHINE DEVELOPMENT OF MUSICAL ONTOGENY

    This Thesis pursues three main objectives: (i) to use computational modelling to explore how music is perceived, cognitively processed and created by human beings; (ii) to explore interactive musical systems as a method to model and achieve the transmission of musical influence in artificial worlds and between humans and machines; and (iii) to experiment with artificial and alternative developmental musical routes in order to observe the evolution of musical styles. In order to achieve these objectives, this Thesis introduces a new paradigm for the design of computer interactive musical systems called the Ontomemetical Model of Music Evolution (OMME), which combines the fields of musical ontogenesis and memetics. OMME-based systems are designed to artificially explore the evolution of music centred on human perceptive and cognitive faculties. The potential of the OMME is illustrated with two interactive musical systems, the Rhythmic Meme Generator (RGeme) and the Interactive Musical Environments (iMe), which have been tested in a series of laboratory experiments and live performances. The introduction to the OMME is preceded by an extensive and critical overview of state-of-the-art computer models that explore musical creativity and interactivity, in addition to a systematic exposition of the major issues involved in the design and implementation of these systems. This Thesis also proposes innovative solutions for (i) the representation of musical streams based on perceptive features, (ii) music segmentation, (iii) a memory-based music model, (iv) the measure of distance between musical styles, and (v) an improvisation-based creative model.
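    The abstract lists a measure of distance between musical styles among the Thesis's contributions. The actual OMME measure is not reproduced here; the sketch below is a much simpler stand-in — an L1 distance between quantized inter-onset-interval histograms — purely to illustrate the idea of comparing styles numerically. The function names and the quantization step are assumptions.

```python
from collections import Counter

def rhythm_histogram(onsets, quantum=0.25):
    """Normalized histogram of inter-onset intervals, quantized to a grid
    (default: sixteenth notes at one beat per second)."""
    intervals = [round((b - a) / quantum) * quantum
                 for a, b in zip(onsets, onsets[1:])]
    counts = Counter(intervals)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def style_distance(onsets_a, onsets_b):
    """L1 distance between two rhythm histograms; 0.0 for identical
    distributions, up to 2.0 for fully disjoint ones."""
    ha, hb = rhythm_histogram(onsets_a), rhythm_histogram(onsets_b)
    keys = set(ha) | set(hb)
    return sum(abs(ha.get(k, 0.0) - hb.get(k, 0.0)) for k in keys)

# Straight quarter notes vs. straight eighth notes share no intervals.
dist = style_distance([0.0, 0.5, 1.0, 1.5], [0.0, 0.25, 0.5, 0.75])
```

    Any feature for which a normalized histogram can be built — pitch classes, interval sizes, durations — could be compared the same way, which is what makes distribution distances a common first approximation of stylistic distance.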

    Music Learning with Massive Open Online Courses

    Editors: Luc Steels et al. Massive Open Online Courses, known as MOOCs, have arisen as the logical consequence of marrying long-distance education with the web and social media. MOOCs were confidently predicted by advanced thinkers decades ago. They are undoubtedly here to stay, and provide a valuable resource for learners and teachers alike. This book focuses on music as a domain of knowledge, and has three objectives: to introduce the phenomenon of MOOCs; to present ongoing research into making MOOCs more effective and better adapted to the needs of teachers and learners; and finally to present the first steps towards 'social MOOCs', which support the creation of learning communities in which interactions between learners go beyond correcting each other's assignments. Social MOOCs try to mimic settings for humanistic learning, such as workshops, small choirs, or groups participating in a hackathon, in which students aided by somebody acting as a tutor learn by solving problems and helping each other. The papers in this book all discuss steps towards social MOOCs: their foundational pedagogy, platforms to create learning communities, methods for assessment and social feedback, and concrete experiments. These papers are organized into five sections: background; the role of feedback; platforms for learning communities; experiences with social MOOCs; and looking backwards and looking forward. Technology is not a panacea for the enormous challenges facing today's educators and learners, but this book will be of interest to all those striving to find more effective and humane learning opportunities for a larger group of students. Funded by the European Commission's OpenAIRE2020 project. Peer reviewed.

    Algorithmic composition of music in real-time with soft constraints

    Music has been the subject of formal approaches for a long time, ranging from Pythagoras' elementary research on tonal systems to J. S. Bach's elaborate formal composition techniques. Especially in the 20th century, much music was composed based on formal techniques: algorithmic approaches for composing music were developed by composers like A. Schoenberg as well as in the scientific arena. So far, a variety of mathematical techniques have been employed for composing music, e.g. probability models, artificial neural networks, or constraint-based reasoning. More recently, interactive music systems have become popular: existing songs can be replayed with musical video games, and original music can be interactively composed with easy-to-use applications running, for example, on mobile devices. However, applications which algorithmically generate music in real-time based on user interaction are mostly experimental and limited in either interactivity or musicality. There are many enjoyable applications, but there are also many opportunities for improvements and novel approaches. The goal of this work is to provide a general and systematic approach for specifying and implementing interactive music systems. We introduce an algebraic framework for interactively composing music in real-time with a reasoning technique called 'soft constraints': this technique allows modeling and solving a large range of problems and is suited particularly well for problems with soft and concurrent optimization goals. Our framework is based on well-known theories for music and soft constraints and allows specifying interactive music systems by declaratively defining 'how the music should sound' with respect to both user interaction and musical rules. Based on this core framework, we introduce an approach for interactively generating music similar to existing melodic material.
With this approach, musical rules can be defined by playing notes (instead of writing code) in order to make interactively generated melodies comply with a certain musical style. We introduce an implementation of the algebraic framework in .NET and present several concrete applications: 'The Planets' is an application controlled by a table-based tangible interface where music can be interactively composed by arranging planet constellations. 'Fluxus' is an application geared towards musicians which allows training melodic material that can be used to define musical styles for applications geared towards non-musicians. Based on musical styles trained by the Fluxus sequencer, we introduce a general approach for transforming spatial movements to music and present two concrete applications: the first one is controlled by a touch display, the second one by a motion tracking system. Finally, we investigate how interactive music systems can be used in the area of pervasive advertising in general, and how our approach can be used to realize 'interactive advertising jingles'.
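    The core idea of soft-constraint composition as the abstract frames it — concurrent musical rules that are scored and weighed against each other rather than satisfied absolutely — can be illustrated with a toy next-note chooser. The rules, weights, and names below are invented for illustration and are not the dissertation's actual algebraic framework, which is far more general.

```python
# Two illustrative soft rules: prefer notes in C major, and prefer small
# melodic steps. Each rule maps (previous note, candidate) to a score in
# [0, 1]; the chooser maximizes the weighted sum. Pitches are MIDI numbers.
SCALE = {0, 2, 4, 5, 7, 9, 11}  # C major pitch classes

def in_scale(prev, cand):
    return 1.0 if cand % 12 in SCALE else 0.0

def small_step(prev, cand):
    return 1.0 / (1.0 + abs(cand - prev))

def choose_next(prev, candidates, rules):
    """Pick the candidate with the highest weighted rule score. Because the
    rules are soft, an out-of-scale note can still win if other rules favor
    it strongly enough."""
    def score(cand):
        return sum(weight * rule(prev, cand) for rule, weight in rules)
    return max(candidates, key=score)

rules = [(in_scale, 2.0), (small_step, 1.0)]
nxt = choose_next(60, candidates=[61, 62, 66, 67], rules=rules)  # from middle C
```

    In an interactive setting, user input would simply become another weighted rule — for example, a rule scoring candidates by proximity to a pitch region the user is touching — which is what lets such systems balance interaction against stylistic correctness.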