Predicting user confidence during visual decision making

Smith, Jim; Legg, Phil; Matovic, Milos; Kinsey, Kris

Authors

Jim Smith James.Smith@uwe.ac.uk
Professor in Interactive Artificial Intelligence

Phil Legg

Milos Matovic

Kris Kinsey Kris.Kinsey@uwe.ac.uk
Senior Lecturer in Psychology



Abstract

People are not infallible, consistent “oracles”: their confidence in decision making may vary significantly between tasks and over time. We have previously reported the benefits of using an interface and algorithms that explicitly captured and exploited users’ confidence: error rates were reduced by up to 50% for an industrial multi-class learning problem, and the number of interactions required in a design-optimisation context was reduced by 33%. Access to users’ confidence judgements could significantly benefit intelligent interactive systems in industry, in areas such as intelligent tutoring systems, and in health care. There are many reasons for wanting to capture this information implicitly. Some are ergonomic; others are more “social”, such as wishing to understand (and possibly take account of) users’ cognitive state without interrupting them.

We investigate the hypothesis that users’ confidence can be accurately predicted from measurements of their behaviour. Eye-tracking systems captured users’ gaze patterns as they undertook a series of visual decision tasks, after each of which they reported their confidence on a 5-point Likert scale. Predictive models were then built using “conventional” machine learning approaches over numerical summary features derived from users’ behaviour. We also investigate the extent to which the deep learning paradigm can reduce the need to design application-specific features, by creating “gaze maps” (visual representations of the trajectories and durations of users’ gaze fixations) and training deep convolutional networks on these images.

Treating the prediction of user confidence as a two-class problem (confident/not confident), we attained classification accuracies of 88% for new users on known tasks and 87% for known users on new tasks. Treating confidence as an ordinal variable, we produced regression models with a mean absolute error (MAE) of ≈0.7 in both cases. Capturing just a simple subset of non-task-specific numerical features gave slightly worse, but still useful, predictions (MAE ≈ 1.0). Results obtained with gaze maps and convolutional networks are competitive, despite not having access to the longer-term information about users and tasks that was vital for the “summary” feature sets. This suggests that the gaze-map approach is a viable, transferable alternative to handcrafting features for each application. These results provide significant evidence for our hypothesis, and offer a way of substantially improving many interactive artificial intelligence applications via the addition of cheap, non-intrusive hardware and computationally cheap prediction algorithms.
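
To make the gaze-map idea concrete, below is a minimal illustrative sketch in Python (not the authors' code) of how a fixation sequence might be rasterised into an image for a convolutional network: each fixation becomes a Gaussian blob whose intensity encodes dwell duration, and the saccade trajectory is drawn as faint connecting lines. The 64x64 resolution, blob width, line weight, and the demo fixation data are all assumptions chosen for illustration.

    import numpy as np

    def gaze_map(fixations, size=64, sigma=2.0):
        """Rasterise fixations [(x, y, duration), ...] with x, y in [0, 1]."""
        img = np.zeros((size, size), dtype=np.float32)
        yy, xx = np.mgrid[0:size, 0:size]
        # Fixation blobs: Gaussian bumps weighted by dwell duration.
        for x, y, dur in fixations:
            cx, cy = x * (size - 1), y * (size - 1)
            img += dur * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2)
                                / (2 * sigma ** 2))
        # Saccade trajectory: faint straight lines between consecutive fixations.
        for (x0, y0, _), (x1, y1, _) in zip(fixations, fixations[1:]):
            for t in np.linspace(0.0, 1.0, size):
                px = int(round((x0 + t * (x1 - x0)) * (size - 1)))
                py = int(round((y0 + t * (y1 - y0)) * (size - 1)))
                img[py, px] += 0.05
        return img / img.max()  # normalise to [0, 1] for the network

    # Hypothetical trial: three fixations with durations in seconds.
    demo = [(0.2, 0.3, 0.8), (0.6, 0.4, 1.5), (0.5, 0.8, 0.4)]
    print(gaze_map(demo).shape)  # (64, 64), ready to feed to a conv net

Images of this kind, one per trial, can then be treated exactly like any other single-channel image classification or regression input. Encoding duration as blob intensity rather than blob size is only one of several plausible design choices for such a representation.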

Journal Article Type Article
Acceptance Date Feb 1, 2018
Online Publication Date Jun 1, 2018
Publication Date Jun 1, 2018
Deposit Date Jan 26, 2018
Publicly Available Date Mar 6, 2018
Journal ACM Transactions on Interactive Intelligent Systems
Print ISSN 2160-6455
Electronic ISSN 2160-6463
Publisher Association for Computing Machinery (ACM)
Peer Reviewed Peer Reviewed
Volume 8
Issue 2
Article Number 10
DOI https://doi.org/10.1145/3185524
Keywords human-centred machine learning, confidence
Public URL https://uwe-repository.worktribe.com/output/867441
Publisher URL https://doi.org/10.1145/3185524
Additional Information © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Interactive Intelligent Systems, 8(2), June 2018
Contract Date Jan 26, 2018
