Chris Nash Chris.Nash@uwe.ac.uk
Senior Lecturer in Music Tech - Software Development
Crowd-driven Music: Interactive and Generative Approaches using Machine Vision and Manhattan
Authors
Nash, Chris
Abstract
This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and performance of music. The approach presented uses realtime machine vision applied to a live video feed of a scene, from which detected objects and people are fed into Manhattan (Nash, 2014), a digital music notation that integrates sequencing and programming to support the live creation of complex musical works that combine static, algorithmic, and interactive elements. The paper discusses the technical details of the system and the artistic development of specific musical works, introducing novel techniques for mapping chaotic systems to musical expression and exploring issues of agency, aesthetics, accessibility, and adaptability relating to composing interactive music for crowds and public spaces. In particular, performances as part of an installation for BBC Music Day 2018 are described. The paper subsequently details a practical workshop in crowd-driven music, delivered digitally, exploring the development of interactive performances in which the audience or general public actively or passively controls the live generation of a musical piece. Exercises support discussions on technical, aesthetic, and ontological issues arising from the identification and mapping of structure, order, and meaning in non-musical domains to analogous concepts in musical expression. Materials for the workshop are available freely within the Manhattan software.
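To make the pipeline the abstract describes more concrete, the following is a minimal, hypothetical sketch of a crowd-to-music mapping: live video, machine vision to isolate moving objects and people, and a mapping from detections to note parameters. It assumes an OpenCV background-subtraction approach; the function names (e.g. `blobs_to_notes`), the pentatonic mapping, and the printed output are illustrative only and are not the paper's implementation, which feeds detections into Manhattan rather than emitting note events directly.

```python
# Illustrative sketch only: approximates the abstract's pipeline (live video ->
# machine vision -> musical parameters) using OpenCV background subtraction.
# Not the paper's actual system; Manhattan is a separate application, so
# detected activity is printed here as note parameters instead.
import cv2

PENTATONIC = [60, 62, 64, 67, 69]  # C major pentatonic MIDI pitches (assumed mapping)

def blobs_to_notes(contours, frame_width):
    """Map each sufficiently large moving blob to a (pitch, velocity) pair."""
    notes = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 500:  # ignore noise-sized blobs
            continue
        x, y, w, h = cv2.boundingRect(c)
        centre_x = x + w / 2
        # Horizontal position selects a scale degree; blob size drives velocity.
        degree = int(centre_x / frame_width * len(PENTATONIC)) % len(PENTATONIC)
        velocity = min(127, int(area / 100))
        notes.append((PENTATONIC[degree], velocity))
    return notes

def main():
    capture = cv2.VideoCapture(0)                      # live camera feed
    subtractor = cv2.createBackgroundSubtractorMOG2()  # isolates moving objects/people
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for pitch, velocity in blobs_to_notes(contours, frame.shape[1]):
            print(f"note_on pitch={pitch} velocity={velocity}")  # stand-in for MIDI/OSC out
        if cv2.waitKey(30) == 27:  # Esc to quit
            break
    capture.release()

if __name__ == "__main__":
    main()
```

Quantising unpredictable input to a fixed scale, as in this sketch, is one simple way to keep a chaotic source musically consonant, and gestures at the mapping issues of agency and aesthetics the paper explores.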
Citation
Nash, C. (in press). Crowd-driven Music: Interactive and Generative Approaches using Machine Vision and Manhattan. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2020).
| Field | Value |
| --- | --- |
| Conference Name | New Interfaces for Musical Expression |
| Start Date | Jul 27, 2020 |
| Acceptance Date | Mar 30, 2020 |
| Online Publication Date | Jul 27, 2020 |
| Deposit Date | Jun 24, 2020 |
| Publicly Available Date | Sep 28, 2020 |
| Author Keywords | crowd-driven music, audience participation, open works, interactive art, generative scores, machine learning, machine vision |
| CCS Concepts | • Applied computing → Sound and music computing; Performing arts; • Human-centered computing → Mixed |
| Public URL | https://uwe-repository.worktribe.com/output/6052512 |
Files
NIME2020-Paper-CAMERA (PDF, 5.3 MB)

Licence
http://creativecommons.org/licenses/by/4.0/

Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/