Feel the force, see the force: Exploring visual-tactile associations of deformable surfaces with colours and shapes

Steer, Cameron; Dinca, Teodora; Jicol, Crescent; Proulx, Michael J.; Alexander, Jason

Abstract

Deformable interfaces provide unique interaction potential for force input, for example, when users physically push into a soft display surface. However, there remains limited understanding of which visual-tactile design elements signify the presence and stiffness of such deformable force-input components. In this paper, we explore how people associate surface stiffness with colours, graphical shapes, and physical shapes. We conducted a cross-modal correspondence (CC) study in which 30 participants associated different surface stiffnesses with colours and shapes. Our findings evidence CCs between stiffness levels and a subset of the 2D/3D shapes and colours used in the study. We distil our findings into three design recommendations: (1) lighter colours should be used to indicate soft surfaces, and darker colours should indicate stiff surfaces; (2) rounded shapes should be used to indicate soft surfaces, while less-curved shapes should be used to indicate stiffer surfaces; and (3) longer 2D drop-shadows should be used to indicate softer surfaces, while shorter drop-shadows should be used to indicate stiffer surfaces.
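
For illustration only, below is a minimal sketch of how recommendations (1) and (3) might be applied when styling a deformable force-input widget. The mapping function, the HSL encoding, and the numeric ranges are assumptions chosen for demonstration; they are not taken from the paper.

```python
import colorsys

def stiffness_to_style(stiffness: float) -> dict:
    """Map a normalised stiffness value (0.0 = softest, 1.0 = stiffest)
    to illustrative visual cues following the abstract's recommendations:
    softer surfaces get lighter colours and longer drop-shadows.
    The exact ranges below are assumptions for demonstration only."""
    stiffness = max(0.0, min(1.0, stiffness))

    # Recommendation (1): lighter colours for soft, darker for stiff.
    lightness = 0.9 - 0.6 * stiffness                    # 0.9 (light) -> 0.3 (dark)
    r, g, b = colorsys.hls_to_rgb(0.6, lightness, 0.5)   # fixed blue-ish hue

    # Recommendation (3): longer drop-shadows for soft, shorter for stiff.
    shadow_px = round(16 * (1.0 - stiffness))            # 16 px (soft) -> 0 px (stiff)

    return {
        "fill_rgb": (round(r * 255), round(g * 255), round(b * 255)),
        "drop_shadow_px": shadow_px,
    }

if __name__ == "__main__":
    for s in (0.0, 0.5, 1.0):
        print(s, stiffness_to_style(s))
```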

Presentation Conference Type: Conference Paper (Published)
Conference Name: Conference on Human Factors in Computing Systems - Proceedings
Start Date: Apr 17, 2023
Acceptance Date: Mar 1, 2023
Online Publication Date: Apr 19, 2023
Publication Date: Apr 19, 2023
Deposit Date: Oct 5, 2023
Publisher: Association for Computing Machinery (ACM)
Pages: 1-13
Book Title: CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems
ISBN: 9781450394215
DOI: https://doi.org/10.1145/3544548.3580830
Keywords: Multisensory Interaction, Force, Crossmodal Correspondences, Colour, Touch
Public URL: https://uwe-repository.worktribe.com/output/11123746