Research Repository

All Outputs (13)

A systematic review of reverberation and accessibility for B/blind users in virtual environments (2023)
Conference Proceeding
Child, L., Mitchell, T., & Ford, N. (2023). A systematic review of reverberation and accessibility for B/blind users in virtual environments.

Reverberation is often used in linear and non-linear media to convey the acoustic characteristics of a space. This information is presented alongside visual stimuli to create a multi-modal experience, assisting participants in developing visual and a...

Hear here: Sonification as a design strategy for robot teleoperation using virtual reality (2023)
Conference Proceeding
Simmons, J., Bown, A., Bremner, P., McIntosh, V., & Mitchell, T. J. (2023). Hear here: Sonification as a design strategy for robot teleoperation using virtual reality.

This paper introduces a novel methodology for the sonification of data, and shares the results of a usability study, putting the methodology into practice within an industrial use case. Working with partners at Sellafield nuclear facility, we explo...
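To illustrate the general idea of parameter-mapping sonification in a teleoperation setting (the paper's actual mappings, scales and audio engine will differ), a minimal Python sketch assuming a hypothetical distance-to-target reading:

    # Illustrative parameter-mapping sonification sketch; not the paper's method.
    # A hypothetical distance reading is mapped to the pitch of a short sine tone,
    # so an operator can hear proximity without looking at the gripper.
    import numpy as np

    SAMPLE_RATE = 44100

    def sonify_distance(distance_m, max_distance_m=1.0, duration_s=0.1):
        """Map a distance in [0, max] to a tone between 220 Hz and 880 Hz."""
        proximity = 1.0 - min(max(distance_m / max_distance_m, 0.0), 1.0)
        freq_hz = 220.0 + proximity * (880.0 - 220.0)   # closer -> higher pitch
        t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
        return 0.3 * np.sin(2 * np.pi * freq_hz * t)    # audio buffer to play back

    # e.g. stream sonify_distance(d) buffers to the audio output as d updates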

Participatory conceptual design of accessible digital musical instruments using generative AI (2023)
Conference Proceeding
Aynsley, H., Mitchell, T. J., & Meckin, D. (in press). Participatory conceptual design of accessible digital musical instruments using generative AI.

This paper explores the potential of AI text-to-image diffusion models (e.g. DALLE-2 and Midjourney) to support the early phase design of new digital musical instruments in collaboration with Disabled musicians. The paper presents initial findings fr...

Studying how digital luthiers choose their tools (2022)
Conference Proceeding
Renney, N., Renney, H., Mitchell, T. J., & Gaster, B. R. (2022). Studying how digital luthiers choose their tools. https://doi.org/10.1145/3491102.3517656

Digital lutherie is a sub-domain of digital craft focused on creating digital musical instruments: high-performance devices for musical expression. It represents a nuanced and challenging area of human-computer interaction that is well established an...

HyperModels - A framework for GPU accelerated physical modelling sound synthesis (2022)
Conference Proceeding
Renney, H., Willemsen, S., Gaster, B. R., & Mitchell, T. J. (2022). HyperModels - A framework for GPU accelerated physical modelling sound synthesis. https://doi.org/10.21428/92fbeb44.98a4210a

Physical modelling sound synthesis methods generate vast and intricate sound spaces that are navigated using meaningful parameters. Numerically based physical modelling synthesis methods provide authentic representations of the physics they model. Unf...
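As a rough illustration of the class of schemes involved (not taken from the paper), a finite-difference update for the 1D wave equation modelling an ideal string, sketched in Python with illustrative grid and parameter values:

    # Explicit finite-difference scheme for the 1D wave equation (ideal string).
    # Values are illustrative only; HyperModels targets GPU execution of such loops.
    import numpy as np

    SR = 44100                 # audio sample rate; time step k = 1/SR
    N = 100                    # spatial grid points along the string
    lam = 1.0                  # Courant number c*k/h; must be <= 1 for stability

    u_prev = np.zeros(N)       # displacement at time step n-1
    u = np.zeros(N)            # displacement at time step n
    u[N // 2] = 0.005          # crude "pluck": displace the midpoint

    out = np.zeros(SR)         # one second of output
    for n in range(SR):
        u_next = np.zeros(N)
        # update: u_next = 2u - u_prev + lam^2 * spatial Laplacian of u
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + lam**2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next  # endpoints stay zero (fixed boundaries)
        out[n] = u[3]          # tap the string near one boundary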

Was that me?: Exploring the effects of error in gestural digital musical instruments (2020)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. J. (2020). Was that me?: Exploring the effects of error in gestural digital musical instruments. In AM '20: Proceedings of the 15th International Conference on Audio Mostly (168-174). https://doi.org/10.1145/3411109.3411137

Traditional Western musical instruments have evolved to be robust and predictable, responding consistently to the same player actions with the same musical response. Consequently, errors occurring in a performance scenario are typically attributed to...

Towards molecular musical instruments: Interactive sonifications of 17-alanine, graphene and carbon nanotubes (2020)
Conference Proceeding
Mitchell, T. J., Jones, A. J., O’Connor, M. B., Wonnacott, M. D., Glowacki, D. R., & Hyde, J. (2020). Towards molecular musical instruments: Interactive sonifications of 17-alanine, graphene and carbon nanotubes. In AM '20: Proceedings of the 15th International Conference on Audio Mostly (214-221). https://doi.org/10.1145/3411109.3411143

Scientists increasingly rely on computational models of atoms and molecules to observe, understand and make predictions about the microscopic world. Atoms and molecules are in constant motion, with vibrations and structural fluctuations occurring at...

Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment (2020)
Conference Proceeding
Hunt, S. J., Mitchell, T. J., & Nash, C. (2020). Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment.

Computer composed music remains a novel and challenging problem to solve. Despite an abundance of techniques and systems, little research has explored how these might be useful for end-users looking to compose with generative and algorithmic music tec...

The use of different feedback modalities and verbal collaboration in tele-robotic assistance (2019)
Conference Proceeding
Bolarinwa, J., Eimontaite, I., Dogramadzi, S., Mitchell, T., & Caleb-Solly, P. (2019). The use of different feedback modalities and verbal collaboration in tele-robotic assistance. https://doi.org/10.1109/rose.2019.8790412

This paper explores the effects of different feedback modalities (gripper orientation via peripheral (side) vision and haptic feedback) and verbal collaboration with the service-user on the performance of tele-operators in completing a tele-robotic a...

OpenCL vs: Accelerated finite-difference digital synthesis (2019)
Conference Proceeding
Renney, H., Gaster, B. R., & Mitchell, T. (2019). OpenCL vs: Accelerated finite-difference digital synthesis. https://doi.org/10.1145/3318170.3318172

Digital audio synthesis has become an important component of modern music production with techniques that can produce realistic simulations of real instruments. Physica...

Understanding user-defined mapping design in mid-air musical performance (2018)
Conference Proceeding
Brown, D., Nash, C., & Mitchell, T. (2018). Understanding user-defined mapping design in mid-air musical performance. https://doi.org/10.1145/3212721.3212810

Modern gestural interaction and motion capture technology is frequently incorporated into Digital Musical Instruments (DMIs) to enable n...

Making the most of Wi-Fi: Optimisations for robust wireless live music performance (2014)
Conference Proceeding
Mitchell, T. J., Madgwick, S., Rankine, S., Hilton, G., Freed, A., & Nix, A. (2014). Making the most of Wi-Fi: Optimisations for robust wireless live music performance. In Proceedings of the International Conference on New Interfaces for Musical Expression (251-256). https://doi.org/10.5281/zenodo.1178875

Wireless technology is growing increasingly prevalent in the development of new interfaces for live music performance. However, with a number of different wireless technologies operating in the 2.4 GHz band, there is a high risk of interference and c...

Convergence synthesis of dynamic frequency modulation tones using an evolution strategy (2005)
Conference Proceeding
Mitchell, T. J., & Pipe, A. G. (2005). Convergence synthesis of dynamic frequency modulation tones using an evolution strategy. In F. Rothlauf, J. Branke, S. Cagnoni, D. Wolfe Corne, R. Drechsler, Y. Jin, … G. Squillero (Eds.), Applications of Evolutionary Computing (533-538). https://doi.org/10.1007/978-3-540-32003-6_54

This paper reports on steps that have been taken to enhance previously presented evolutionary sound matching work. In doing so, the convergence characteristics are shown to provide a synthesis method that produces interesting sounds. The method imple...
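For context, a minimal (1+1) evolution strategy matching the parameters of a simple FM tone to a target sound by spectral error; this is a stripped-down sketch of the matching problem described above, with all settings illustrative and the paper's synthesiser and fitness measure being richer:

    # (1+1) evolution strategy matching FM parameters (carrier, modulator, index)
    # to a target tone by magnitude-spectrum error. Illustrative sketch only.
    import numpy as np

    SR, DUR = 16000, 0.25
    t = np.arange(int(SR * DUR)) / SR

    def fm_tone(fc, fm, index):
        # simple FM: y(t) = sin(2*pi*fc*t + index*sin(2*pi*fm*t))
        return np.sin(2 * np.pi * fc * t + index * np.sin(2 * np.pi * fm * t))

    def spectral_error(a, b):
        return np.sum((np.abs(np.fft.rfft(a)) - np.abs(np.fft.rfft(b))) ** 2)

    target = fm_tone(440.0, 110.0, 3.0)      # stand-in for the target sound

    params = np.array([300.0, 80.0, 1.0])    # initial guess: fc, fm, index
    sigma = np.array([50.0, 20.0, 0.5])      # mutation step sizes per parameter
    best = spectral_error(fm_tone(*params), target)

    for _ in range(2000):
        child = params + sigma * np.random.randn(3)
        err = spectral_error(fm_tone(*child), target)
        if err < best:                       # keep the child only if it improves
            params, best = child, err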