Our research aims to understand how perceptual information is selectively processed by attention and represented in working memory. We are guided by the premise that the organization of perceptual representations constrains attention and working memory processes, and that each of these processes is influenced by current contextual information as well as prior knowledge. We are also interested in how diverse sensory inputs from different modalities are integrated to form coherent, multimodal representations.
SOME CURRENT RESEARCH QUESTIONS
What are the fundamental principles of selective attention?
Our lab studies how we select relevant information from the overwhelming influx of sensory input. Our work shows that the limits and principles of selective processing are constrained by the nature of our mental representations and the neural architecture that supports them. For example, we have found that spatial processing limits arise at the level of the visual hemifield, and that the selection of visual features (e.g., color) is determined by their perceptual similarity.
Feature-based attention is constrained by similarity in feature space:
Target-distractor similarity predicts visual search efficiency but only for highly similar features. Chapman, A. F., & Störmer, V. S. (2024). Attention, Perception, & Psychophysics.
Feature-based attention warps the perception of visual features. Chapman, A.F., Chunharas, C. & Störmer, V.S. (2023). Scientific Reports.
Efficient tuning of attention to narrow and broad ranges of task-relevant feature values. Chapman, A. F., & Störmer, V. S. (2023). Visual Cognition.
Feature similarity is non-linearly related to attentional selection: evidence from visual search and sustained attention tasks. Chapman, A.F., & Störmer, V.S. (2022). Journal of Vision.
Feature-based attention is not confined by object boundaries: spatially global enhancement of irrelevant features. Chapman, A.F. & Störmer, V.S. (2021). Psychonomic Bulletin & Review.
Feature-based attention elicits surround-suppression in feature space. Störmer, V.S., & Alvarez, G.A. (2014). Current Biology.
Spatial selection is determined by the neural architecture of the visual system:
When spatial attention cannot be divided: Quadrantic enhancement of early visual processing across task-relevant and irrelevant locations. Özkan, M. & Störmer, V.S. (2024). Imaging Neuroscience.
Within-hemifield competition in early visual areas limits the ability to track multiple objects with attention. Störmer, V.S., Alvarez, G.A., & Cavanagh, P. (2014). The Journal of Neuroscience.
Review paper:
Representational structures as a unifying framework for attention. Chapman, A. F., & Störmer, V. S. (2024). Trends in Cognitive Sciences.
What are the limits of visual working memory?
Seminal models of visual working memory – the system that holds visual information in an active state – postulate that the capacity to actively maintain visual information is fixed (to 3-4 objects or a fixed pool of resources). Our work has shown that working memory capacity increases for meaningful and familiar stimuli, indicating that cognitive limits depend on the type of information that is encoded. For example, we find higher performance and stronger neural delay activity for real-world objects relative to simple colors or scrambled objects. Furthermore, in more recent studies we have found that simple features like color are better retained when they are encoded as part of a meaningful object, suggesting that conceptual knowledge and familiarity can act as an effective scaffold for holding simple and abstract features in mind.
Meaningful and familiar objects are better remembered than simple and unrecognizable objects:
Working memory is not fixed capacity: More active storage capacity for real-world objects than simple stimuli. Brady T.F., Störmer, V.S., & Alvarez, G.A. (2016). Proc Natl Acad Sci USA.
Greater visual working memory capacity for visually-matched stimuli when they are recognized as meaningful. Asp, I.E., Störmer, V.S., & Brady, T.F. (2021). Journal of Cognitive Neuroscience.
The role of meaning in visual working memory: Real-world objects, but not simple features, benefit from deeper processing. Brady, T.F. & Störmer, V.S. (2022). Journal of Experimental Psychology: Learning, Memory, and Cognition.
Comparing memory capacity across stimuli requires maximally dissimilar foils: Using deep convolutional neural networks to understand visual working memory capacity for real-world objects. Brady, T.F. & Störmer, V.S. (2023). Memory & Cognition.
Simple features are better remembered when they are encoded as parts of real-world objects:
No Fixed Limit for Storing Simple Visual Features: Realistic Objects Provide an Efficient Scaffold for Holding Features in Mind. Chung, Y. H., Brady, T. F., & Störmer, V. S. (2023). Psychological Science.
Sequential encoding aids working memory for meaningful objects’ identities but not for their colors. Chung, Y. H., Brady, T. F., & Störmer, V. S. (2023). Memory & Cognition.
Review paper:
Meaningfulness and familiarity expand visual working memory capacity. Chung, Y.H., Brady, T.F., Störmer, V.S., (2024). Current Directions in Psychological Science.
How do auditory stimuli influence visual perception?
Our research shows that sounds improve visual perception and enhance activity in visual cortex. Our work has also shown that these cross-modal effects scale up to more complex and higher-level stimuli. For example, hearing the sound of an object or an auditory scene (e.g., the ambient sound of a train station) can facilitate visual object recognition and shift visual representations towards the features of the sound, effectively resolving visual ambiguities. Broadly, this line of work shows that our senses are inextricably connected – perceptual processing in one modality influences perceptual processing in another, at least for audition and vision.
Naturalistic sounds influence visual object perception:
Cutting through the noise: Auditory scenes and their effects on visual object processing. Williams, J. & Störmer, V.S. (2024). Psychological Science.
What you see is what you hear: Sounds alter the contents of visual perception. Williams, J., Markov, Y., Tiurina, N., & Störmer, V.S. (2022). Psychological Science.
Salient sounds enhance visual processing and visual cortex activity in a spatially selective way:
Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli. Störmer, V.S., McDonald, J.J., & Hillyard, S.A. (2009). Proc Natl Acad Sci USA, 106, 22456-22461.
Salient sounds activate human visual cortex automatically. McDonald, J.J., Störmer, V.S., Martinez, A., Feng, W., & Hillyard, S.A. (2013). The Journal of Neuroscience, 33(21), 9194-9201.
Sounds activate visual cortex and improve visual discrimination. Feng, W., Störmer, V.S., Martinez, A., McDonald, J.J.,& Hillyard, S.A. (2014). The Journal of Neuroscience, 34(29), 9817-9824.
Salient, irrelevant sounds reflexively induce alpha rhythm desynchronization in parallel with slow potential shifts in visual cortex. Störmer, V.S., Feng, W., Martinez, A., McDonald, J.J.,& Hillyard, S.A. (2016). Journal of Cognitive Neuroscience, 28(3), 433-445.
Lateralized alpha activity and slow potential shifts over visual cortex track the time course of both endogenous and exogenous orienting of attention. Keefe, J.M. & Störmer, V.S. (2021). NeuroImage, 225, 117495.
Review papers:
Orienting spatial attention to sounds enhances visual processing. Störmer, V.S. (2019). Current Opinion in Psychology, 29, 193-198.
Cross-modal orienting of visual attention. Hillyard, S.A., Störmer, V.S., Feng, W., Martinez, A., & McDonald, J.J. (2015). Neuropsychologia. 83, 170-178.