Research Repository

Chris Nash

Senior Lecturer in Music Tech - Software Development


Automatic for the people: Two pieces of crowd-driven music (2021)
Exhibition / Performance
Nash, C. Automatic for the people: Two pieces of crowd-driven music. [http://nash.audio/manhattan/tenor2020/performance.mp4]. Performed in Hamburg, Germany, 10-13 May 2021. (Unpublished)

A video presenting excerpts from two live generative pieces developed for BBC Music Day 2018, performed continuously throughout the day on the busy main platform of Bristol railway station. Both pieces are generated (and played) live using th...

Creativity in children's digital music composition (2021)
Conference Proceeding
Nash, C., Ford, C., & Bryan-Kinns, N. (in press). Creativity in children's digital music composition.

Composing is a neglected area of music education. To increase participation, many technologies provide open-ended interfaces to motivate child autodidactic use, drawing influence from Papert’s LOGO philosophy (to support children’s learning through p...

Automatic for the people: Crowd-driven generative scores using Manhattan and machine vision (2021)
Conference Proceeding
Nash, C. (in press). Automatic for the people: Crowd-driven generative scores using Manhattan and machine vision.

This paper details a workshop and optional public installation based on the development of situational scores that combine music notation, AI, and code to create dynamic interactive art driven by the realtime movements of objects and people in a live...

Was that me?: Exploring the effects of error in gestural digital musical instruments (2020)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. J. (2020). Was that me?: Exploring the effects of error in gestural digital musical instruments. In AM '20: Proceedings of the 15th International Conference on Audio Mostly (pp. 168-174). https://doi.org/10.1145/3411109.3411137

Traditional Western musical instruments have evolved to be robust and predictable, responding consistently to the same player actions with the same musical response. Consequently, errors occurring in a performance scenario are typically attributed to...

Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan (2020)
Conference Proceeding
Nash, C. (2020). Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan. In F. Schroeder, & R. Michon (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 259-264).

This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and pe...

Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment (2020)
Conference Proceeding
Hunt, S. J., Mitchell, T. J., & Nash, C. (2020). Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment.

Computer-composed music remains a novel and challenging problem to solve. Despite an abundance of techniques and systems, little research has explored how these might be useful for end-users looking to compose with generative and algorithmic music tec...

Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment (2019)
Presentation / Conference
Hunt, S. J., Mitchell, T., & Nash, C. (2019, December). Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment. Paper presented at Innovation In Music 2019, University of West London.

In this paper we explore the recreation of existing musical compositions by representing the music as a series of unique musical bars, and other bars that can be replicated through various algorithmic transformations, inside the Interactive Generativ...

Simple mappings, expressive movement: a qualitative investigation into the end-user mapping design of experienced mid-air musicians (2018)
Journal Article
Brown, D., Nash, C., & Mitchell, T. (2018). Simple mappings, expressive movement: a qualitative investigation into the end-user mapping design of experienced mid-air musicians. Digital Creativity, 29(2-3), 129-148. https://doi.org/10.1080/14626268.2018.1510841

In a New Interface for Musical Expression (NIME), the design of the relationship between a musician’s actions and the instrument’s sound response is critical in creating instrume...

Understanding user-defined mapping design in mid-air musical performance (2018)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. (2018). Understanding user-defined mapping design in mid-air musical performance. https://doi.org/10.1145/3212721.3212810

Modern gestural interaction and motion capture technology is frequently incorporated into Digital Musical Instruments (DMIs) to enable n...

A cognitive dimensions approach for the design of an interactive generative score editor (2018)
Presentation / Conference
Hunt, S., Mitchell, T., & Nash, C. (2018, May). A cognitive dimensions approach for the design of an interactive generative score editor. Paper presented at the Fourth International Conference on Technologies for Music Notation and Representation, Montréal, Canada.

This paper describes how the Cognitive Dimensions of Notations can guide the design of algorithmic composition tools. Prior research has also used the cognitive dimensions for analysing interaction design for algorithmic composition software. This wor...