Research Repository

All Outputs (27)

Automatic for the people: Crowd-driven generative scores using Manhattan and machine vision (2021)
Presentation / Conference Contribution

This paper details a workshop and optional public installation based on the development of situational scores that combine music notation, AI, and code to create dynamic interactive art driven by the real-time movements of objects and people in a live...

Crowd-driven music: Interactive and generative approaches using machine vision and Manhattan (2020)
Presentation / Conference Contribution

This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (e.g. a busy railway platform) is used to drive the automated composition and pe...

Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment (2020)
Presentation / Conference Contribution

Computer-composed music remains a novel and challenging problem to solve. Despite an abundance of techniques and systems, little research has explored how these might be useful for end-users looking to compose with generative and algorithmic music tec...

Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment (2019)
Presentation / Conference Contribution

In this paper we explore the recreation of existing musical compositions by representing the music as a series of unique musical bars, and other bars that can be replicated through various algorithmic transformations, inside the Interactive Generativ...

Simple mappings, expressive movement: a qualitative investigation into the end-user mapping design of experienced mid-air musicians (2018)
Journal Article

© 2018 Informa UK Limited, trading as Taylor & Francis Group. In a New Interface for Musical Expression (NIME), the design of the relationship between a musician’s actions and the instrument’s sound response is critical in creating instrume...

Manhattan Circus (2017)
Other

Collaboration is an interesting endeavour, particularly when it involves examining and subsequently articulating aspects of one’s compositional process that can then be expressed as computer code and realised by software through collaboration....

The trinity test: Workshop on unified notations for practices and pedagogies in music and programming (2016)
Presentation / Conference Contribution

This paper outlines a workshop to explore intersections of programming and music in digital notation. With the aid of the Manhattan music programming and sequencing environment (Nash, 2014), methods for representing both high-level processes and low-...