Dr Kait Clark Kait.Clark@uwe.ac.uk
Senior Lecturer in Psychology (Cognitive and Neuro)
Co-authors: Kayley Birch-Hurst, Charlotte R. Pennington, Austin C. P. Petrie, Joshua T. Lee, Craig Hedge
Research in perception and attention has typically sought to evaluate cognitive mechanisms according to the average response to a manipulation. Recently, there has been a shift toward appreciating the value of individual differences and the insight gained by exploring the impacts of between-participant variation on human cognition. However, a recent study suggests that many robust, well-established cognitive control tasks suffer from surprisingly low levels of test-retest reliability (Hedge, Powell, & Sumner, 2018b). We tested a large sample of undergraduate students (n = 160) in two sessions (separated by 1-3 weeks) on four commonly used tasks in vision science. We implemented measures that spanned a range of perceptual and attentional processes, including motion coherence (MoCo), useful field of view (UFOV), multiple-object tracking (MOT), and visual working memory (VWM). Intraclass correlations ranged from good to poor, suggesting that some task measures are more suitable for assessing individual differences than others. VWM capacity (intraclass correlation coefficient [ICC] = 0.77), MoCo threshold (ICC = 0.60), UFOV middle accuracy (ICC = 0.60), and UFOV outer accuracy (ICC = 0.74) showed good-to-excellent reliability. Other measures, namely the maximum number of items tracked in MOT (ICC = 0.41) and UFOV number accuracy (ICC = 0.48), showed moderate reliability; the MOT threshold (ICC = 0.36) and UFOV inner accuracy (ICC = 0.30) showed poor reliability. In this paper, we present these results alongside a summary of reliabilities estimated previously for other vision science tasks. We then offer useful recommendations for evaluating test-retest reliability when considering a task for use in evaluating individual differences.
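The reliability values reported above are intraclass correlation coefficients (ICCs) computed over two testing sessions. As a rough illustration of how such a coefficient is derived, the sketch below implements ICC(2,1), one common two-way random-effects, absolute-agreement formulation used for test-retest data; the paper's exact ICC variant and software may differ, and the function name and example scores are illustrative only.

```python
def icc_2_1(session1, session2):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-measure.

    session1 and session2 are equal-length lists of per-participant scores
    from the two testing sessions. (Illustrative sketch; the published
    analysis may use a different ICC formulation.)
    """
    n, k = len(session1), 2  # n participants, k = 2 sessions
    data = [[a, b] for a, b in zip(session1, session2)]

    # Grand mean and marginal means
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    sess_means = [sum(col) / n for col in zip(*data)]

    # Two-way ANOVA sums of squares
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_sess = n * sum((m - grand) ** 2 for m in sess_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_subj - ss_sess

    # Mean squares
    ms_subj = ss_subj / (n - 1)
    ms_sess = ss_sess / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout-and-Fleiss-style ICC(2,1)
    return (ms_subj - ms_err) / (
        ms_subj + (k - 1) * ms_err + k * (ms_sess - ms_err) / n
    )
```

When the two sessions agree perfectly the coefficient is 1; between-session noise that is large relative to between-participant spread drives it toward 0, which is why tasks with little stable individual variation can show poor test-retest reliability despite robust group-level effects.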
Journal Article Type | Article |
---|---|
Acceptance Date | May 24, 2022 |
Online Publication Date | Jul 29, 2022 |
Publication Date | Jul 29, 2022 |
Deposit Date | Aug 1, 2022 |
Publicly Available Date | Aug 1, 2022 |
Journal | Journal of Vision |
Electronic ISSN | 1534-7362 |
Publisher | Association for Research in Vision and Ophthalmology |
Peer Reviewed | Peer Reviewed |
Volume | 22 |
Issue | 8 |
Pages | 1-18 |
DOI | https://doi.org/10.1167/jov.22.8.18 |
Keywords | individual differences; perception; attention; visual cognition |
Public URL | https://uwe-repository.worktribe.com/output/9780250 |
Publisher URL | https://jov.arvojournals.org/article.aspx?articleid=2783521 |
Test-retest reliability for common tasks in vision science
Licence: http://creativecommons.org/licenses/by/4.0/
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0/
Related outputs:
Altering facial movements abolishes neural mirroring of facial expressions (2021), Journal Article
Predicting multiple-target search performance using eye movements and individual differences (2023), Presentation / Conference Contribution
Neural mechanisms underlying enhanced visual search performance in action video game players (2021), Presentation / Conference Contribution
Test-retest reliability for multiple-target visual search: Eye-tracking and performance metrics (2024), Presentation / Conference Contribution