
Olfaction and selective rendering

Harvey, Carlo; Bashford-Rogers, Thomas; Debattista, Kurt; Doukakis, Efstratios; Chalmers, Alan



© 2017 The Authors. © 2017 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd. Accurate simulation of all the senses in virtual environments is a computationally expensive task. Visual saliency models have been used to improve rendering performance, but these alone are insufficient for multi-modal environments. This paper considers cross-modal perception and, in particular, whether and how olfaction affects visual attention. Two experiments are presented. In the first, eye-tracking data are gathered from a number of participants to establish where and how they view virtual objects when a smell is introduced, compared with an odourless condition. Based on the results of this experiment, a new type of saliency map for a selective-rendering pipeline is presented. A second experiment validates this approach and demonstrates that, for the same rendering budget, participants rank the resulting images as higher quality when compared with a reference.


Harvey, C., Bashford-Rogers, T., Debattista, K., Doukakis, E., & Chalmers, A. (2018). Olfaction and selective rendering. Computer Graphics Forum, 37(1), 350-362.

Journal Article Type Article
Acceptance Date Aug 19, 2017
Online Publication Date Sep 14, 2017
Publication Date Feb 1, 2018
Deposit Date Oct 3, 2017
Journal Computer Graphics Forum
Print ISSN 0167-7055
Electronic ISSN 1467-8659
Publisher Wiley
Peer Reviewed Peer Reviewed
Volume 37
Issue 1
Pages 350-362
Keywords multi-modal; cross-modal; saliency; olfaction; graphics; selective rendering
