Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment
Hunt, Samuel J; Mitchell, Thomas; Nash, Chris
Authors
Samuel Hunt Samuel.Hunt@uwe.ac.uk, Lecturer - Graduate Tutor
Tom Mitchell Tom.Mitchell@uwe.ac.uk, Professor of Audio and Music Interaction
Chris Nash Chris.Nash@uwe.ac.uk, Senior Lecturer in Music Tech - Software Development
Abstract
In this paper we explore the recreation of existing musical compositions inside the Interactive Generative Music Environment (IGME) by representing the music as a series of unique musical bars, plus further bars that can be reproduced from them through various algorithmic transformations. This re-composition approach is intended to explore whether pre-existing music could have been created using the process-based approaches offered by the IGME software. If existing music can be expressed through algorithmic processes, we propose that original works of music can be expressed or created in the same way. Such a justification provides a rationale for the unique compositional processes and workflows that IGME affords to those looking to compose with generative and algorithmic music techniques, while avoiding many of the pitfalls of generative music. Music can be imported into IGME and automatically analysed to find unique bars, and bars that have been transformed from them. The overall timeline can be visualised to quickly demonstrate the structure of the music, using colour to differentiate unique musical ideas and arrow-arcs to show the relationships between different parts. This process reduces the overall entropy of the music data and provides an educational insight into macro-level musical structure. Each of the techniques is explained, with examples given. In addition, data sets have been pre-computed for several genres of music, showcasing the distribution of the different types of technique.
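The analysis described above — scanning the bar sequence to keep only unique bars and explaining later bars as transformations of earlier ones — can be sketched in a minimal form. This is an illustrative assumption, not IGME's actual implementation: bars are modelled as tuples of MIDI note numbers, the `analyse_bars` helper is hypothetical, and only two transformations (exact repeat and uniform transposition) are checked.

```python
# Illustrative sketch (not IGME's code): each bar is a tuple of MIDI note
# numbers; a later bar is "derived" if it exactly repeats, or uniformly
# transposes, an earlier unique bar.

def analyse_bars(bars):
    """Return (unique, links): indices of unique bars, and a mapping from
    each derived bar's index to (source_index, transformation_label)."""
    unique, links = [], {}
    for i, bar in enumerate(bars):
        for j in unique:
            src = bars[j]
            if len(src) != len(bar):
                continue
            intervals = {b - a for a, b in zip(src, bar)}
            if len(intervals) == 1:            # every note shifted equally
                shift = intervals.pop()
                kind = "repeat" if shift == 0 else f"transpose {shift:+d}"
                links[i] = (j, kind)
                break
        else:
            unique.append(i)                   # no earlier bar explains it
    return unique, links

# Four bars: bar 2 repeats bar 0; bar 3 transposes bar 1 up a tone.
bars = [(60, 64, 67), (62, 65, 69), (60, 64, 67), (64, 67, 71)]
unique, links = analyse_bars(bars)
print(unique)   # [0, 1]
print(links)    # {2: (0, 'repeat'), 3: (1, 'transpose +2')}
```

Only the two unique bars need storing; the rest of the timeline is recoverable from the links, which is the entropy reduction the abstract refers to, and the `links` mapping is what a timeline view could render as arrow-arcs.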
Citation
Hunt, S. J., Mitchell, T., & Nash, C. (2019, December). Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment. Paper presented at Innovation In Music 2019, University of West London
| Presentation Conference Type | Conference Paper (unpublished) |
| --- | --- |
| Conference Name | Innovation In Music 2019 |
| Conference Location | University of West London |
| Start Date | Dec 5, 2019 |
| End Date | Dec 7, 2019 |
| Deposit Date | Feb 25, 2020 |
| Publicly Available Date | May 5, 2020 |
| Public URL | https://uwe-repository.worktribe.com/output/5455331 |
Files
InMusic19 SamuelHunt-submission Version (PDF, 4.3 MB)