Audiovisual resource allocation for bimodal virtual environments
Doukakis, Efstratios; Debattista, Kurt; Harvey, Carlo; Chalmers, Alan; Bashford-Rogers, T.
Authors
Efstratios Doukakis
Kurt Debattista
Carlo Harvey
Alan Chalmers
Tom Bashford-Rogers Tom.Bashford-Rogers@uwe.ac.uk
Abstract
© 2017 The Authors and The Eurographics Association and John Wiley & Sons Ltd. Fidelity is of key importance if virtual environments are to be used as authentic representations of real environments. However, simulating the multitude of senses that comprise the human sensory system is computationally challenging. With limited computational resources, it is essential to distribute them carefully in order to deliver the best possible perceptual experience. This paper investigates this balance of resources across multiple scenarios where combined audiovisual stimulation is delivered to the user. A subjective experiment was undertaken where participants (N=35) allocated five fixed resource budgets across graphics and acoustic stimuli. In the experiment, increasing the quality of one of the stimuli decreased the quality of the other. Findings demonstrate that participants allocate more resources to graphics; however, as the computational budget is increased, an approximately balanced distribution of resources between graphics and acoustics is preferred. Based on the results, an audiovisual quality prediction model is proposed and successfully validated against previously untested budgets and an untested scenario.
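The prediction model itself is not reproduced in this record, but the fixed-budget trade-off the experiment is built around can be illustrated with a short sketch. The code below is purely illustrative and assumes a hypothetical linear split of a single budget between the two stimuli; the function name and the `graphics_share` parameter are not taken from the paper.

```python
# Illustrative only: a fixed computational budget split between graphics and
# acoustics, so raising one stimulus's share necessarily lowers the other's,
# mirroring the trade-off described in the abstract. The function and its
# 'graphics_share' parameter are hypothetical, not the authors' model.

def allocate_budget(total_budget: float, graphics_share: float) -> tuple[float, float]:
    """Split a fixed budget between graphics and acoustics rendering."""
    if not 0.0 <= graphics_share <= 1.0:
        raise ValueError("graphics_share must lie in [0, 1]")
    graphics_budget = total_budget * graphics_share
    audio_budget = total_budget - graphics_budget  # remainder goes to acoustics
    return graphics_budget, audio_budget


# Example: the abstract reports that smaller budgets were skewed towards
# graphics, while larger budgets approached an even split.
print(allocate_budget(100.0, 0.7))   # -> (70.0, 30.0)
print(allocate_budget(1000.0, 0.5))  # -> (500.0, 500.0)
```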
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Jun 4, 2017 |
| Publication Date | Jan 1, 2018 |
| Deposit Date | Oct 12, 2017 |
| Publicly Available Date | Jul 14, 2018 |
| Journal | Computer Graphics Forum |
| Print ISSN | 0167-7055 |
| Electronic ISSN | 1467-8659 |
| Publisher | Wiley |
| Peer Reviewed | Peer Reviewed |
| Volume | 37 |
| Issue | 1 |
| Pages | 172-183 |
| DOI | https://doi.org/10.1111/cgf.13258 |
| Keywords | audio, visual, multi-modal, human perception |
| Public URL | https://uwe-repository.worktribe.com/output/884216 |
| Publisher URL | http://dx.doi.org/10.1111/cgf.13258 |
| Additional Information | This is the peer reviewed version of the following article: Doukakis, E., Debattista, K., Harvey, C., Bashford-Rogers, T. and Chalmers, A. (2017) Audio-visual resource allocation for bimodal virtual environments. Computer Graphics Forum. ISSN 0167-7055, which has been published in final form at http://dx.doi.org/10.1111/cgf.13258. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving. |
| Contract Date | Oct 12, 2017 |
Files
AVRABVE.pdf
(21.6 MB)
PDF