Chris Nash Chris.Nash@uwe.ac.uk
Senior Lecturer in Music Tech - Software Development
Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan
Authors
Nash, Chris
Contributors
Romain Michon (Editor)
Franziska Schroeder (Editor)
Abstract
This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and performance of music. The approach presented uses real-time machine vision applied to a live video feed of a scene, from which detected objects and people are fed into Manhattan (Nash, 2014), a digital music notation that integrates sequencing and programming to support the live creation of complex musical works combining static, algorithmic, and interactive elements. The paper discusses the technical details of the system and the artistic development of specific musical works, introducing novel techniques for mapping chaotic systems to musical expression and exploring issues of agency, aesthetics, accessibility, and adaptability that arise when composing interactive music for crowds and public spaces. In particular, performances given as part of an installation for BBC Music Day 2018 are described. The paper subsequently details a practical workshop in crowd-driven music, delivered digitally, exploring the development of interactive performances in which the audience or general public actively or passively controls the live generation of a musical piece. Exercises support discussions of the technical, aesthetic, and ontological issues arising from the identification and mapping of structure, order, and meaning in non-musical domains to analogous concepts in musical expression. Materials for the workshop are available freely within the Manhattan software.
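The detection-to-music pipeline the abstract describes (machine vision detects people in a scene; detections drive note generation) can be sketched minimally as below. This is an illustrative assumption, not the paper's actual mapping or Manhattan's API: detection itself (e.g. via a vision library) is stubbed out as bounding boxes, the scale and mapping rules are hypothetical choices, and output is plain (pitch, velocity) pairs rather than Manhattan events.

```python
# Hypothetical sketch of crowd-driven note generation: each detected
# person is a (x, y, w, h) bounding box in pixel coordinates.
C_MINOR_PENTATONIC = [0, 3, 5, 7, 10]  # semitone offsets from the root

def box_to_note(box, frame_w, frame_h, root=48, octaves=3):
    """Map one bounding box to a (pitch, velocity) pair.

    Horizontal position selects a scale degree (left = low, right = high);
    vertical position sets velocity (lower in frame = louder). Box area
    could similarly drive duration or timbre.
    """
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0
    steps = len(C_MINOR_PENTATONIC) * octaves
    step = min(int(cx / frame_w * steps), steps - 1)
    octave, degree = divmod(step, len(C_MINOR_PENTATONIC))
    pitch = root + 12 * octave + C_MINOR_PENTATONIC[degree]
    velocity = min(40 + int(cy / frame_h * 87), 127)  # clamp to 40..127
    return pitch, velocity

# Example: two "people" detected in a 640x480 frame
crowd = [(40, 100, 60, 160), (500, 300, 80, 170)]
notes = [box_to_note(b, 640, 480) for b in crowd]
print(notes)  # → [(51, 72), (77, 109)]
```

A real installation would replace the stubbed boxes with per-frame detections (e.g. background subtraction or a person detector) and feed the resulting values into the notation's interactive parameters rather than printing them.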
Presentation Conference Type | Conference Paper (published) |
---|---|
Conference Name | New Interfaces for Musical Expression |
Start Date | Jul 21, 2020 |
End Date | Jul 25, 2020 |
Acceptance Date | Mar 30, 2020 |
Online Publication Date | Jul 25, 2020 |
Publication Date | Jul 31, 2020 |
Deposit Date | Jun 24, 2020 |
Publicly Available Date | May 4, 2021 |
Pages | 259-264 |
Series ISSN | 2220-4806 |
Book Title | Proceedings of the International Conference on New Interfaces for Musical Expression |
Keywords | crowd-driven music, audience participation, open works, interactive art, generative scores, machine learning, machine vision |
CCS Concepts | applied computing, sound and music computing, performing arts; human-centered computing, mixed / augmented reality |
Public URL | https://uwe-repository.worktribe.com/output/6052512 |
Publisher URL | https://www.nime.org/proceedings/2020/nime2020_paper49.pdf |
Files
Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan
(1.9 Mb)
PDF
Licence
http://creativecommons.org/licenses/by/4.0/
Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/