Research Repository

All Outputs (7)

Automatic for the people: Crowd-driven generative scores using Manhattan and machine vision (2021)
Conference Proceeding
Nash, C. (2021). Automatic for the people: Crowd-driven generative scores using Manhattan and machine vision.

This paper details a workshop and optional public installation based on the development of situational scores that combine music notation, AI, and code to create dynamic interactive art driven by the realtime movements of objects and people in a live...

Creativity in children's digital music composition (2021)
Conference Proceeding
Ford, C., Bryan-Kinns, N., & Nash, C. (2021). Creativity in children's digital music composition. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME). https://doi.org/10.21428/92fbeb44.e83deee9

Composing is a neglected area of music education. To increase participation, many technologies provide open-ended interfaces to motivate child autodidactic use, drawing influence from Papert’s LOGO philosophy to support children’s learning through pl...

Was that me?: Exploring the effects of error in gestural digital musical instruments (2020)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. J. (2020). Was that me?: Exploring the effects of error in gestural digital musical instruments. In AM '20: Proceedings of the 15th International Conference on Audio Mostly (168-174). https://doi.org/10.1145/3411109.3411137

Traditional Western musical instruments have evolved to be robust and predictable, responding consistently to the same player actions with the same musical response. Consequently, errors occurring in a performance scenario are typically attributed to...

Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan (2020)
Conference Proceeding
Nash, C. (2020). Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan. In F. Schroeder, & R. Michon (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (259-264)

This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and pe...

Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment (2020)
Conference Proceeding
Hunt, S. J., Mitchell, T. J., & Nash, C. (2020). Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment.

Computer-composed music remains a novel and challenging problem. Despite an abundance of techniques and systems, little research has explored how these might be useful for end-users looking to compose with generative and algorithmic music tec...

Understanding user-defined mapping design in mid-air musical performance (2018)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. (2018). Understanding user-defined mapping design in mid-air musical performance. https://doi.org/10.1145/3212721.3212810

Modern gestural interaction and motion capture technology is frequently incorporated into Digital Musical Instruments (DMIs) to enable n...

The ‘E’ in QWERTY: Musical expression with old computer interfaces (2016)
Conference Proceeding
Nash, C. (2016). The ‘E’ in QWERTY: Musical expression with old computer interfaces. In Proceedings of the International Conference on New Interfaces for Musical Expression (224-229). https://doi.org/10.5281/zenodo.1176088

This paper presents a development of the ubiquitous computer keyboard to capture velocity and other continuous musical properties, in order to support more expressive interaction with music software. Building on existing ‘virtual piano’ utilities, th...