End-User Service Computing: Spreadsheets as a Service Composition Tool
In this paper, we show how spreadsheets, an end-user development paradigm proven to be highly productive and simple to learn and use, can be used for complex service compositions. We identify the requirements for spreadsheet-based service composition, and present our framework that implements these requirements. Our framework enables spreadsheets to send requests and retrieve results from various local and remote services. We show how our tools support different composition patterns, and how the declarative dependency style of spreadsheets can facilitate service composition. We also discuss novel issues identified by using the framework in several projects and in education.
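The declarative dependency style the abstract refers to can be illustrated with a minimal sketch (all names here are hypothetical and are not taken from the paper's framework): cells hold either literal values or formulas over other cells, and a change to one cell is transparently reflected in its dependents, the same mechanism that would let a cell wrap a call to a local or remote service.

```python
# Minimal spreadsheet-style dependency engine (illustrative sketch only;
# names are hypothetical and not taken from the paper's framework).

class Sheet:
    def __init__(self):
        self.values = {}     # cell -> literal value
        self.formulas = {}   # cell -> function of a cell getter

    def set_value(self, cell, value):
        self.values[cell] = value

    def set_formula(self, cell, func):
        # func receives a getter, mirroring a formula like "=A1 + B1";
        # a "service" cell would perform a (remote) service call here instead.
        self.formulas[cell] = func

    def get(self, cell):
        # Recompute on demand, so dependents always see fresh inputs.
        if cell in self.formulas:
            return self.formulas[cell](self.get)
        return self.values[cell]

sheet = Sheet()
sheet.set_value("A1", 2)
sheet.set_value("B1", 3)
sheet.set_formula("C1", lambda get: get("A1") + get("B1"))
sheet.set_formula("D1", lambda get: get("C1") * 10)

print(sheet.get("D1"))   # 50
sheet.set_value("A1", 7)  # a single change propagates declaratively
print(sheet.get("D1"))   # 100
```

Recomputing on demand keeps the sketch short; a production spreadsheet engine would instead track the dependency graph and recompute only affected cells.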
Algorithmic composition of music in real-time with soft constraints
Music has been the subject of formal approaches for a long time, ranging from Pythagoras' elementary research on tonal systems to J. S. Bach's elaborate formal composition techniques. Especially in the 20th century, much music was composed based on formal techniques: algorithmic approaches for composing music were developed by composers like A. Schoenberg as well as in the scientific area. So far, a variety of mathematical techniques have been employed for composing music, e.g. probability models, artificial neural networks, or constraint-based reasoning. More recently, interactive music systems have become popular: existing songs can be replayed with musical video games, and original music can be interactively composed with easy-to-use applications running e.g. on mobile devices. However, applications which algorithmically generate music in real-time based on user interaction are mostly experimental and limited in either interactivity or musicality. There are many enjoyable applications, but there are also many opportunities for improvements and novel approaches.
The goal of this work is to provide a general and systematic approach for specifying and implementing interactive music systems. We introduce an algebraic framework for interactively composing music in real-time with a reasoning technique called "soft constraints": this technique allows modeling and solving a large range of problems and is suited particularly well for problems with soft and concurrent optimization goals. Our framework is based on well-known theories for music and soft constraints and allows specifying interactive music systems by declaratively defining "how the music should sound" with respect to both user interaction and musical rules. Based on this core framework, we introduce an approach for interactively generating music similar to existing melodic material. With this approach, musical rules can be defined by playing notes (instead of writing code) in order to make interactively generated melodies comply with a certain musical style. We introduce an implementation of the algebraic framework in .NET and present several concrete applications: "The Planets" is an application controlled by a table-based tangible interface where music can be interactively composed by arranging planet constellations. "Fluxus" is an application geared towards musicians which allows training melodic material that can be used to define musical styles for applications geared towards non-musicians. Based on musical styles trained by the Fluxus sequencer, we introduce a general approach for transforming spatial movements to music and present two concrete applications: the first one is controlled by a touch display, the second one by a motion tracking system.
Finally, we investigate how interactive music systems can be used in the area of pervasive advertising in general, and how our approach can be used to realize "interactive advertising jingles".
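The idea of guiding real-time generation with soft constraints can be sketched in miniature (this toy is illustrative only and does not reproduce the thesis's algebraic framework): each soft constraint scores every candidate note, and the weighted sum is maximized, so conflicting goals such as "stay in the scale", "move in small steps", and "follow the user's input" trade off against each other instead of having to hold strictly.

```python
# Toy soft-constraint note chooser (illustrative only; the thesis defines a
# proper algebraic soft-constraint framework, which this does not reproduce).

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of the C major scale

def in_scale(prev, cand):
    return 1.0 if cand % 12 in C_MAJOR else 0.0

def small_step(prev, cand):
    return 1.0 / (1.0 + abs(cand - prev))  # prefer small melodic intervals

def near_target(target):
    def c(prev, cand):
        return 1.0 / (1.0 + abs(cand - target))  # follow user interaction
    return c

def choose_note(prev, candidates, constraints, weights):
    # Weighted sum of soft-constraint scores; the best-scoring candidate
    # wins, rather than requiring every constraint to be satisfied.
    def score(cand):
        return sum(w * c(prev, cand) for c, w in zip(constraints, weights))
    return max(candidates, key=score)

constraints = [in_scale, small_step, near_target(64)]  # user aims near E4
weights = [1.0, 1.0, 2.0]
note = choose_note(prev=60, candidates=range(55, 72),
                   constraints=constraints, weights=weights)
print(note)  # 64
```

Changing the weights shifts the balance between musical rules and interactivity, which is the kind of trade-off a real soft-constraint solver handles over richer musical structures.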
Extempore: The design, implementation and application of a cyber-physical programming language
There is a long history of experimental and exploratory programming supported by systems that expose interaction through a programming language interface. These live programming systems enable software developers to create, extend, and modify the behaviour of executing software by changing source code without perceptual breaks for recompilation. These live programming systems have taken many forms, but have generally been limited in their ability to express low-level programming concepts and the generation of efficient native machine code. These shortcomings have limited the effectiveness of live programming in domains that require highly efficient numerical processing and explicit memory management.
The most general questions addressed by this thesis are what a systems language designed for live programming might look like and how such a language might influence the development of live programming in performance-sensitive domains requiring real-time support, direct hardware control, or high performance computing. This thesis answers these questions by exploring the design, implementation and application of Extempore, a new systems programming language, designed specifically for live interactive programming.
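The core live-programming loop, modifying the behaviour of executing software by loading new source code, can be sketched in a few lines (a deliberately simplified toy; Extempore itself compiles a systems language to native machine code, which this does not model):

```python
# Minimal live-coding sketch (illustrative; Extempore compiles xtlang to
# efficient native code, while this Python toy only shows the hot-swap idea).
import types

def load(source, module):
    # "Hot-swap": executing new source mutates the running module in place,
    # so callers holding a reference see the new behaviour immediately.
    exec(source, module.__dict__)

live = types.ModuleType("live")
load("def tick():\n    return 'v1'", live)
print(live.tick())   # v1

# While the program keeps running, redefine tick() from new source code:
load("def tick():\n    return 'v2'", live)
print(live.tick())   # v2
```

The point of the sketch is that no restart or perceptual break separates the two calls; the thesis asks how to get the same property with low-level control and native performance.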
I am who I am: LGBTQ+ student experiences at a Baptist liberal arts university.
Though studies exploring the experiences of LGBTQ+ students on Christian college and university (CCU) campuses are increasingly prevalent, research continues to demonstrate that CCU environments are often unwelcoming. Gender and sexual minority students often face additional challenges or risks in attending a faith-based institution. To drive meaningful change, recommendations need to be tailored to individual institutions. This study sought to make meaning alongside LGBTQ+ students at a single institution, Baptist Heritage University (BHU), with a decidedly appreciative approach. Grounded in Bronfenbrenner's ecology of human development, we conducted an arts-based action research study to gain a deeper understanding of how LGBTQ+ students know themselves and how they relate to other people and environments. To gather data, we held focus group meetings and individual interviews, and collaborated to compose pieces of music as a way of sharing stories and elevating the voices of LGBTQ+ students. Analysis of the interviews utilized the Listening Guide approach to identifying and understanding voices within an individual's story. The data gleaned from the interviews served as the basis upon which the musical compositions were created. We found that each co-researcher had a voice of advocacy and that, although their campus environment is not perceived as welcoming, they were able to identify employees and peers who have had a positive impact on their LGBTQ+ identity development. Typically, however, LGBTQ+ co-researchers anticipate an exclusionary environment: because of BHU's identity as a Christian institution, LGBTQ+ students may enter college at BHU assuming or expecting to be excluded or marginalized because of their gender and/or sexual identities.
Still, co-researchers demonstrated resilience and remained hopeful that BHU can leverage its connections within the local community, especially those of faith communities, to promote LGBTQ+ inclusivity. In addition to their voices of advocacy, co-researchers are also exceptionally empathetic and compassionate. It is out of this responsibility for others that we collectively composed a piece of music to share with BHU as a resource for generating greater understanding and promoting empathy.
Computational Creativity and Music Generation Systems: An Introduction to the State of the Art
Computational Creativity is a multidisciplinary field that tries to obtain creative behaviors from computers. One of its most prolific subfields is Music Generation (also called Algorithmic Composition or Musical Metacreation), which uses computational means to compose music. Due to the multidisciplinary nature of this research field, it is sometimes hard to define precise goals and to keep track of which problems can be considered solved by state-of-the-art systems and which instead need further development. With this survey, we try to give a complete introduction for those who wish to explore Computational Creativity and Music Generation. To do so, we first give a picture of the research on the definition and the evaluation of creativity, both human and computational, which is needed to understand how computational means can be used to obtain creative behaviors, and of its importance within Artificial Intelligence studies. We then review the state of the art of Music Generation Systems, citing examples for all the main approaches to music generation and listing the open challenges that were identified by previous reviews on the subject. For each of these challenges, we cite works that have proposed solutions, describing what still needs to be done and some possible directions for further research.
- …