
Characterizing the relationship between eye movements and memory

Given that we can only see a small portion of our visual surroundings within a single fixation, we must move our eyes continuously to explore the world around us. Where we look has critical implications not only for what we can encode into memory, but also for what we can later retrieve. Our work has shown that when we retrieve an encoded event or stimulus from memory, we tend to look at regions associated with previously encoded information, even when viewing a blank screen. Moreover, the extent to which we engage this looking pattern predicts performance across a number of memory tasks. This research suggests that beyond reflecting attention and memory processes, eye movements play an active and functional role in memory retrieval. Specifically, eye movements may support memory retrieval by reactivating stored spatiotemporal relations from memory. In other words, by projecting internal representations of context onto the external world, eye movements help to lay a foundation upon which episodes can be retrieved and (re)constructed.

Eye movements reflect memory content and integrity

Our work leverages eyetracking technology to inform models of human memory and cognitive aging. Unlike other behavioural measures, eye movements can reveal mnemonic content that is not consciously accessible. For example, we tend to look more at novel items than at items we have seen before, at regions where a previously presented item has disappeared or changed, and at items that were paired or associated. These viewing effects are present even when behavioural responses show no evidence of memory formation, making eyetracking an ideal tool for studying non-declarative processes and special populations. Accordingly, using eye movement monitoring, our research has demonstrated that the memory impairments typically seen with age may be attributed in part to differences in the ways younger and older adults view the world (Wynn et al., 2021, Trends Cogn Sci). For example, during visual search for a target (e.g., a phone) in a scene (e.g., an office), older adults tend to prioritize looking at regions consistent with prior knowledge and expectations (e.g., the desk), even when they are aware that the region does not contain the target (Wynn et al., 2020, J Exp Psychol Gen). This gaze pattern is predictive not only of search success, but also of subsequent memory for targets. Similarly, our work has shown that when tasked with encoding an image across multiple presentations, younger adults scan different image features with each presentation, allowing them to build up a complete memory representation. Older adults, however, scan the same regions over and over, reflecting poor encoding and leading to deficient memory representations (Wynn et al., 2021, Cognition). This pattern of increased gaze similarity across image presentations is even more pronounced in older adults diagnosed with mild cognitive impairment, and more pronounced still in patients with amnesia (Wynn et al., 2021; in prep). Thus, by examining gaze patterns, we can infer the contents of memory and assess the extent of memory function in different populations.
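The repetitive-scanning effect described above can be quantified in several ways; one minimal sketch (illustrative only, not the published analysis) is to divide the image into a grid, record which cells each presentation's fixations land in, and average the overlap between consecutive presentations. The screen size, grid resolution, and fixation coordinates below are all hypothetical.

```python
def fixated_cells(fixations, width=800, height=600, grid=10):
    """Map (x, y) fixations onto the set of grid cells they land in."""
    cells = set()
    for x, y in fixations:
        cells.add((int(x * grid / width), int(y * grid / height)))
    return cells

def repetition_score(presentations):
    """Mean Jaccard overlap of fixated cells across consecutive
    presentations; higher values indicate more repetitive scanning."""
    overlaps = []
    for a, b in zip(presentations, presentations[1:]):
        ca, cb = fixated_cells(a), fixated_cells(b)
        overlaps.append(len(ca & cb) / len(ca | cb))
    return sum(overlaps) / len(overlaps)

# A viewer who rescans the same regions on every presentation scores
# higher than one who explores new regions each time (toy coordinates).
same = [[(100, 100), (400, 300)]] * 3
varied = [[(100, 100), (400, 300)],
          [(700, 500), (50, 550)],
          [(250, 80), (600, 200)]]
print(repetition_score(same))    # 1.0
print(repetition_score(varied))  # 0.0
```

On this toy measure, the repetitive viewer scores 1.0 and the exploratory viewer 0.0; in real data the contrast would of course be graded rather than absolute.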

Eye movements are functional for memory


Research suggests that memory is disrupted when eye movements are restricted (e.g., to a fixation cross). When we are free to move our eyes, however, more fixations typically predict better memory. What accounts for these effects? How do gaze fixations support memory retrieval? To address this question, our work has employed gaze similarity analysis to quantify the spatial and temporal overlap between corresponding gaze patterns (e.g., at encoding and retrieval). Using this analysis, our work has shown that when maintaining or retrieving an encoded image from memory, we spontaneously shift our eyes to the empty regions of the screen in which salient image features previously appeared (Wynn et al., 2018, J Exp Psychol Hum Percept Perform; Wynn et al., 2020, Proc Natl Acad Sci USA). In other words, we use our eye movements to rehearse and reactivate image features from memory. This pattern of gaze reinstatement is predictive of memory performance across a number of tasks, suggesting that it is functional for memory retrieval (Wynn et al., 2019, Vision). More recently, our research has demonstrated that the effects of gaze reinstatement extend beyond memory retrieval to the related process of simulation: the construction and imagination of future events (Wynn et al., 2022, Conscious Cogn; Wynn et al., 2024, Cognition). When simulating a beach scene, for example, we move our eyes in a characteristic pattern that is similar across individuals and different from the patterns elicited by other scenes (e.g., a forest). The extent to which this scene-specific pattern is expressed (i.e., how similar an individual's eye movements are to the average or prototypical pattern for that scene) also predicts the success of the simulation (e.g., how much detail and specificity it contains), further suggesting that eye movements support not only the retrieval of stored memories, but also the construction of future episodes.
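Spatial gaze similarity of this kind is commonly operationalized by correlating fixation density maps. The following sketch is a simplified illustration (not the published pipeline): it bins (x, y) fixations into a 2D histogram and computes the Pearson correlation between the encoding and retrieval maps. Screen dimensions, bin counts, and coordinates are hypothetical, and published analyses typically also smooth the maps and account for temporal order.

```python
import numpy as np

def density_map(fixations, width=800, height=600, bins=20):
    """Bin (x, y) fixations into a coarse 2D fixation-count map."""
    xs = [f[0] for f in fixations]
    ys = [f[1] for f in fixations]
    hist, _, _ = np.histogram2d(xs, ys, bins=bins,
                                range=[[0, width], [0, height]])
    return hist

def gaze_similarity(enc_fix, ret_fix):
    """Pearson correlation between encoding and retrieval maps."""
    a = density_map(enc_fix).ravel()
    b = density_map(ret_fix).ravel()
    return float(np.corrcoef(a, b)[0, 1])

# A fully reinstated (identical) scanpath is maximally similar to the
# encoding pattern; fixating different regions yields a lower score.
enc = [(100, 100), (400, 300), (700, 500)]   # toy fixation coordinates
ret_same = list(enc)
ret_diff = [(50, 550), (750, 50), (200, 450)]
print(gaze_similarity(enc, ret_same))  # high (reinstatement)
print(gaze_similarity(enc, ret_diff))  # low
```

The same machinery extends to the prototype analysis mentioned below: averaging many individuals' maps for a scene yields a prototypical map, and each participant's similarity to it can then be correlated with simulation quality.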

Factors that influence eye movement-memory interactions

Converging evidence indicates that eye movements play a functional role in memory retrieval and construction. But what, specifically, is this role? How does it change across different tasks, contexts, and populations? To address these questions, our work employs a number of behavioural techniques and includes both younger and older adults, a population with documented deficits in memory function. For example, during cued retrieval, gaze reinstatement (of the encoded image) is positively associated with mnemonic performance for old images, but negatively associated with performance for novel, but similar, lure images. That is, when tasked with rejecting a lure image, erroneous retrieval of the similar encoded image increases the tendency to false alarm (i.e., to falsely endorse the new lure image as 'old') (Wynn et al., 2020, Proc Natl Acad Sci USA; Wynn et al., 2021, Cognition). These findings indicate that how gaze reinstatement relates to mnemonic performance depends on the demands of the task. Other work suggests that when gaze reinstatement supports memory also differs across tasks and age groups. For example, when holding a multi-item spatial array in mind, younger adults recruit gaze reinstatement to support maintenance of the array over long delays (Wynn et al., 2018, J Exp Psychol Hum Percept Perform). Older adults, however, show evidence of gaze reinstatement even at short delays. Together, these findings suggest that eye movements, and specifically gaze reinstatement, may be recruited to support memory when task demands exceed cognitive resources.

Neural correlates of eye movement-memory interactions

Recent research indicates that, in addition to being linked at the behavioural level, eye movements and memory share common neural substrates. While this work has largely focused on univariate gaze metrics (e.g., number of fixations), our research seeks to identify and elucidate the neural mechanisms underlying the functional reinstatement of complex gaze patterns. To this end, our work combines eyetracking and neuroimaging techniques to investigate the neural activity patterns at encoding and retrieval that predict gaze reinstatement, that is, reinstated gaze patterns that support mnemonic performance. Notably, this work has demonstrated that gaze reinstatement is associated with encoding-related neural activity in brain regions associated with visual processing and gaze control, suggesting that gaze patterns carry important sensory and motor information (Wynn et al., 2021, J Cogn Neurosci). Gaze reinstatement is also associated with retrieval-related activity in the hippocampus (Wynn et al., in prep) and with patterns of encoding activity in the hippocampus that also predict subsequent memory (Wynn et al., 2021, J Cogn Neurosci). Together, these findings suggest that gaze reinstatement and memory are supported by similar brain mechanisms.
