Dr Kait Clark Kait.Clark@uwe.ac.uk
Senior Lecturer in Psychology (Cognitive and Neuro)
Test-retest reliability for common tasks in vision science
Clark, Kait; Birch-Hurst, Kayley; Pennington, Charlotte R.; Petrie, Austin C. P.; Lee, Joshua T.; Hedge, Craig
Abstract
Research in perception and attention has typically sought to evaluate cognitive mechanisms according to the average response to a manipulation. Recently, there has been a shift toward appreciating the value of individual differences and the insight gained by exploring the impacts of between-participant variation on human cognition. However, a recent study suggests that many robust, well-established cognitive control tasks suffer from surprisingly low levels of test-retest reliability (Hedge, Powell, & Sumner, 2018b). We tested a large sample of undergraduate students (n = 160) in two sessions (separated by 1-3 weeks) on four commonly used tasks in vision science. We implemented measures that spanned a range of perceptual and attentional processes, including motion coherence (MoCo), useful field of view (UFOV), multiple-object tracking (MOT), and visual working memory (VWM). Intraclass correlations ranged from good to poor, suggesting that some task measures are more suitable for assessing individual differences than others. VWM capacity (intraclass correlation coefficient [ICC] = 0.77), MoCo threshold (ICC = 0.60), UFOV middle accuracy (ICC = 0.60), and UFOV outer accuracy (ICC = 0.74) showed good-to-excellent reliability. Other measures, namely the maximum number of items tracked in MOT (ICC = 0.41) and UFOV number accuracy (ICC = 0.48), showed moderate reliability; the MOT threshold (ICC = 0.36) and UFOV inner accuracy (ICC = 0.30) showed poor reliability. In this paper, we present these results alongside a summary of reliabilities estimated previously for other vision science tasks. We then offer useful recommendations for evaluating test-retest reliability when considering a task for use in evaluating individual differences.
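The abstract reports reliability as intraclass correlation coefficients (ICCs) across two testing sessions. The paper's own analysis code is not reproduced here; the following is a minimal sketch of one common two-way random-effects, absolute-agreement, single-measurement form, ICC(2,1), computed from the standard ANOVA mean squares. The function name and the example scores are illustrative, not taken from the study.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random-effects, absolute-agreement,
    single-measurement intraclass correlation.
    `data` is an (n subjects x k sessions) array of scores."""
    x = np.asarray(data, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-session means

    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between sessions
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

For two sessions (k = 2), each row holds one participant's session-1 and session-2 scores; an ICC near 1 indicates that individual differences are stable across sessions, while values near 0 indicate that between-session noise swamps between-participant variation.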
Journal Article Type | Article |
---|---|
Acceptance Date | May 24, 2022 |
Online Publication Date | Jul 29, 2022 |
Publication Date | Jul 29, 2022 |
Deposit Date | Aug 1, 2022 |
Publicly Available Date | Aug 1, 2022 |
Journal | Journal of Vision |
Electronic ISSN | 1534-7362 |
Publisher | Association for Research in Vision and Ophthalmology |
Peer Reviewed | Peer Reviewed |
Volume | 22 |
Issue | 8 |
Pages | 1-18 |
DOI | https://doi.org/10.1167/jov.22.8.18 |
Keywords | individual differences; perception; attention; visual cognition |
Public URL | https://uwe-repository.worktribe.com/output/9780250 |
Publisher URL | https://jov.arvojournals.org/article.aspx?articleid=2783521 |
Files
Test-retest reliability for common tasks in vision science
(1.7 MB)
PDF
Licence
http://creativecommons.org/licenses/by/4.0/
Publisher Licence URL
http://creativecommons.org/licenses/by/4.0/