See an object, hear an object file: Object correspondence transcends sensory modality
Jordan, Kerry E; Clark, Kait; Mitroff, Stephen R
Authors
Dr Kait Clark (Kait.Clark@uwe.ac.uk)
Senior Lecturer in Psychology (Cognitive and Neuro)
Stephen R Mitroff
Abstract
An important task of perceptual processing is to parse incoming information from the external world into distinct units and to subsequently keep track of those units over time as the same, persisting internal representations. Within the realm of visual perception, this concept of maintaining persisting object representations has been theorized as being mediated by “object files” - episodic representations that store (and update) information about objects' properties and track objects over time and motion via spatiotemporal information (e.g., Kahneman et al., 1992). Although object files are typically conceptualized as visual representations, here, we demonstrate that object-file correspondence can be computed across sensory modalities. We employed a novel version of the object-reviewing paradigm: Line-drawn pictures (e.g., a phone and a dog) were briefly presented within two distinct objects in a preview display. Once the pictures disappeared, the objects moved (to decouple objecthood from location) and then a sound (e.g., a dog bark) occurred. The sound was localized to the left or right of the display, corresponding to the end locations of the two objects. Participants were instructed to indicate whether the sound matched either preview picture or whether it was novel (e.g., a dog bark would “match” if either preview picture was a dog). Participants were significantly faster to respond when the sound occurred with the object originally containing the associated picture compared to when the sound occurred with the other object. This significant response time benefit provides the first evidence for visual and auditory information working in tandem to underlie object-file correspondence. An object file can be initially formed with visual input and later accessed with corresponding auditory information. Object files may thus operate at a highly abstract level of perceptual processing that is not tied to specific modalities.
Presentation Conference Type | Poster
--- | ---
Conference Name | Vision Sciences Society
Start Date | May 1, 2009
End Date | May 1, 2009
Acceptance Date | Jan 25, 2009
Publication Date | Aug 1, 2009
Peer Reviewed | Not Peer Reviewed
Public URL | https://uwe-repository.worktribe.com/output/993843
Publisher URL | http://dx.doi.org/10.1167/9.8.724
Additional Information | Title of Conference or Conference Proceedings: Vision Sciences Society