Dr Kait Clark Kait.Clark@uwe.ac.uk
Senior Lecturer in Psychology (Cognitive and Neuro)
Charlotte R Pennington
Craig Hedge
Joshua T Lee
Austin C P Petrie
Historically, research in cognitive psychology has sought to evaluate cognitive mechanisms according to the average response to a manipulation. Differences between individuals have been dismissed as “noise” with an aim toward characterising an overall effect and how it can inform human cognition. More recently, research has shifted toward appreciating the value of individual differences between participants and the insight gained by exploring the impacts of between-subject variation on human cognition. However, recent research has suggested that many robust, well-established cognitive tasks suffer from surprisingly low levels of test-retest reliability (Hedge, Powell, & Sumner, 2018). While the tasks may produce reliable effects at the group level (i.e., they are replicable), they may not produce a reliable measurement of a given individual. If individual performance on a task is not consistent from one time point to another, the task is therefore unfit for the assessment of individual differences. To evaluate the reliability of commonly used tasks in vision science, we tested a large sample of undergraduate students in two sessions (separated by 1-3 weeks). Our battery included tasks that spanned the range of visual processing from basic sensitivity (motion coherence) to transient spatial attention (useful field of view) to sustained attention (multiple-object tracking) to visual working memory (change detection). Reliabilities (intraclass correlations) ranged from 0.4 to 0.7, suggesting that most of these measures suffer from lower reliability than would be desired for research in individual differences. These results do not detract from the value of the tasks in an experimental setting; however, higher levels of test-retest reliability would be required for a meaningful assessment of individual differences. Implications for using tools from vision science to understand processing in both healthy and neuropsychological populations are discussed.
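The reliabilities reported above are intraclass correlations computed from the two testing sessions. As a minimal sketch of how such a coefficient can be obtained, the following implements a two-way random-effects, absolute-agreement, single-measure ICC(2,1) from a subjects × sessions score matrix; note that the poster abstract does not specify which ICC variant was used, so this particular form is an assumption for illustration.

```python
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, absolute-agreement, single-measure ICC(2,1).

    data: (n_subjects, k_sessions) array of scores, e.g. one column per
    testing session. Returns a single reliability coefficient.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)   # per-subject means
    col_means = data.mean(axis=0)   # per-session means

    # Partition the total sum of squares into subject, session, and error terms
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((data - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1): agreement of single measurements across sessions
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )
```

With two sessions (k = 2), a matrix in which every participant scores identically at both time points yields an ICC of 1, while session-to-session inconsistency in the rank ordering of participants drives the coefficient toward the 0.4–0.7 range reported here.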
Clark, K., Pennington, C. R., Hedge, C., Lee, J. T., & Petrie, A. C. P. (2019, May). Test-retest reliability for common tasks in vision science. Poster presented at Vision Sciences Society, St Pete Beach, Florida, United States.
| Presentation Conference Type | Poster |
| --- | --- |
| Conference Name | Vision Sciences Society |
| Conference Location | St Pete Beach, Florida, United States |
| Start Date | May 1, 2019 |
| End Date | May 1, 2019 |
| Acceptance Date | Jan 25, 2019 |
| Publicly Available Date | Jun 7, 2019 |
| Peer Reviewed | Not Peer Reviewed |
| Public URL | https://uwe-repository.worktribe.com/output/853293 |
| Additional Information | Title of Conference or Conference Proceedings: Vision Sciences Society |
Clark_VSS2019.pdf (1.8 MB, PDF)