
Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan

Nash, Chris



Authors

Chris Nash Chris.Nash@uwe.ac.uk
Senior Lecturer in Music Tech - Software Development



Contributors

Romain Michon
Editor

Franziska Schroeder
Editor

Abstract

This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and performance of music. The approach presented uses realtime machine vision applied to a live video feed of a scene, from which detected objects and people are fed into Manhattan (Nash, 2014), a digital music notation that integrates sequencing and programming to support the live creation of complex musical works combining static, algorithmic, and interactive elements. The paper discusses the technical details of the system and the artistic development of specific musical works, introducing novel techniques for mapping chaotic systems to musical expression and exploring issues of agency, aesthetics, accessibility, and adaptability that arise when composing interactive music for crowds and public spaces. In particular, performances as part of an installation for BBC Music Day 2018 are described. The paper subsequently details a practical workshop in crowd-driven music, delivered digitally, exploring the development of interactive performances in which the audience or general public actively or passively controls the live generation of a musical piece. Exercises support discussions of technical, aesthetic, and ontological issues arising from the identification and mapping of structure, order, and meaning in non-musical domains to analogous concepts in musical expression. Materials for the workshop are available freely within the Manhattan software.
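The abstract outlines a concrete pipeline: a live camera feed is analysed with realtime machine vision, and detected people and objects are mapped onto musical parameters that drive generative playback. Below is a minimal sketch of that idea in Python, assuming OpenCV background subtraction for detection and an OSC link to the music software; the host/port, the /crowd/note address, and the pentatonic pitch mapping are illustrative assumptions for this sketch, not Manhattan's actual interface, which is documented in the paper and in the Manhattan software itself.

    # Hypothetical sketch: detect moving people/objects in a live video feed
    # and map their positions to musical parameters sent as OSC messages.
    # The OSC address, port, and pitch mapping are assumptions for illustration.

    import cv2
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)   # assumed host/port of the music system
    capture = cv2.VideoCapture(0)                 # live camera feed
    subtractor = cv2.createBackgroundSubtractorMOG2()
    PENTATONIC = [0, 2, 4, 7, 9]                  # constrain pitches for a coherent aesthetic

    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)            # foreground = moving people/objects
        mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        height, width = mask.shape
        for contour in contours:
            area = cv2.contourArea(contour)
            if area < 500:                        # ignore small detections (noise)
                continue
            x, y, w, h = cv2.boundingRect(contour)
            cx = x + w / 2                        # horizontal position -> scale degree
            degree = int(cx / width * len(PENTATONIC) * 3)   # ~3 octaves of degrees
            pitch = 48 + 12 * (degree // len(PENTATONIC)) \
                       + PENTATONIC[degree % len(PENTATONIC)]
            velocity = min(127, int(area / (width * height) * 4000))  # size -> loudness
            client.send_message("/crowd/note", [pitch, velocity])

    capture.release()

Constraining pitches to a scale, as here, is one simple way to keep chaotic input musically coherent; the paper's own mapping techniques are richer and are developed within Manhattan's notation.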

Citation

Nash, C. (2020). Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan. In F. Schroeder & R. Michon (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 259-264).

Conference Name New Interfaces for Musical Expression
Conference Location Birmingham, UK
Start Date Jul 21, 2020
End Date Jul 25, 2020
Acceptance Date Mar 30, 2020
Online Publication Date Jul 25, 2020
Publication Date Jul 31, 2020
Deposit Date Jun 24, 2020
Publicly Available Date May 4, 2021
Pages 259-264
Series ISSN 2220-4806
Book Title Proceedings of the International Conference on New Interfaces for Musical Expression
Keywords crowd-driven music, audience participation, open works, interactive art, generative scores, machine learning, machine vision
CCS Concepts applied computing, sound and music computing, performing arts, human-centered computing, mixed / augmented reality
Public URL https://uwe-repository.worktribe.com/output/6052512
Publisher URL https://www.nime.org/proceedings/2020/nime2020_paper49.pdf
