3 research outputs found

    Pipeline for recording datasets and running neural networks on the Bela embedded hardware platform

    Deploying deep learning models on embedded devices is an arduous task: oftentimes, there exist no platform-specific instructions, and compilation times can be considerable due to the limited computational resources available on-device. Moreover, many music-making applications demand real-time inference. Embedded hardware platforms for audio, such as Bela, offer an entry point for beginners into physical audio computing; however, the need for cross-compilation environments and low-level software development tools for deploying embedded deep learning models imposes high entry barriers on non-expert users. We present a pipeline for deploying neural networks on the Bela embedded hardware platform. In our pipeline, we include a tool to record a multichannel dataset of sensor signals. Additionally, we provide a dockerised cross-compilation environment for faster compilation. With this pipeline, we aim to provide a template for programmers and makers to prototype and experiment with neural networks for real-time embedded musical applications.
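    As a rough illustration only (not the authors' code), the sketch below shows where real-time inference could sit inside Bela's standard setup()/render() callbacks: analog sensor channels are read each block and fed to a placeholder dense layer (TinyModel), which stands in for whatever network the pipeline would export. The channel counts and weights are invented for the example; a real deployment would load trained parameters and likely use an inference library rather than this hand-rolled layer.

    // Hypothetical sketch of a Bela project running a tiny network on sensor input.
    #include <Bela.h>
    #include <cmath>
    #include <vector>

    // Placeholder model: a single dense layer with a tanh nonlinearity,
    // standing in for a network produced by the authors' pipeline.
    struct TinyModel {
        std::vector<float> weights; // flattened [nOut x nIn]
        std::vector<float> bias;
        int nIn = 0, nOut = 0;

        void forward(const std::vector<float>& x, std::vector<float>& y) const {
            for (int o = 0; o < nOut; ++o) {
                float acc = bias[o];
                for (int i = 0; i < nIn; ++i)
                    acc += weights[o * nIn + i] * x[i];
                y[o] = std::tanh(acc);
            }
        }
    };

    static TinyModel gModel;
    static std::vector<float> gSensorFrame;
    static std::vector<float> gOutput;

    bool setup(BelaContext *context, void *userData)
    {
        // Assumed configuration: two analog sensor channels, one control output.
        gModel.nIn = 2;
        gModel.nOut = 1;
        gModel.weights = {0.8f, -0.5f}; // invented parameters for the sketch
        gModel.bias = {0.0f};
        gSensorFrame.resize(gModel.nIn); // preallocate so render() never allocates
        gOutput.resize(gModel.nOut);
        return true;
    }

    void render(BelaContext *context, void *userData)
    {
        for (unsigned int n = 0; n < context->audioFrames; ++n) {
            // Map the audio frame index onto the (usually slower) analog frame rate.
            if (context->analogFrames) {
                unsigned int m = n * context->analogFrames / context->audioFrames;
                for (int c = 0; c < gModel.nIn; ++c)
                    gSensorFrame[c] = analogRead(context, m, c);
            }
            // Per-sample inference is only feasible for very small models; larger
            // networks would typically run once per block or on a separate thread.
            gModel.forward(gSensorFrame, gOutput);
            for (unsigned int ch = 0; ch < context->audioOutChannels; ++ch)
                audioWrite(context, n, ch, gOutput[0]);
        }
    }

    void cleanup(BelaContext *context, void *userData)
    {
    }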

    Agential Instruments Design Workshop

    No full text
    Physical and gestural musical instruments that take advantage of artificial intelligence and machine learning to explore instrumental agency are becoming more accessible due to the development of new tools and workflows specialised for mobility, portability, efficiency and low latency. This full-day, hands-on workshop will provide all of these tools to participants, along with support from their creators, enabling rapid creative exploration of their applications in musical instrument design.