
Queen's University

Feature Story: Effie Pereira, August 2014
Exploring our visual and cognitive systems


By Eric Brousseau with Effie Pereira
Photo by Eric Brousseau

The amount of visual information that we get from the world around us is astounding – from basic features like colours, textures, and edges, to higher-level concepts like objects, structures, and spatial layout. The fact that we easily perceive this information to plan, coordinate, and perform a multitude of tasks on a daily basis is a testament to the efficiency of our mental processes. Effie Pereira, a Master's student in Psychology's Brain, Behaviour, and Cognitive Science program, has always been fascinated by eye movement planning, especially when it comes to specific directed tasks.
“When we are trying to find our keys as we run out the door, our visual and cognitive processes function with complete ease to guide us in our task,” Effie says. “I’m interested in exploring what aspects of our environment actually direct our eyes. For example, in searching for our keys, do we primarily seek out specific features and look for small metallic oddly-shaped items, or do we use more general information about where we typically expect keys to be? And if our expectations do play a role, does this affect what areas we decide to focus on?”

Effie works with her supervisor Dr. Monica Castelhano in the Queen’s Visual Cognition Lab to explore how human visual and cognitive systems interact to deal with information contained within scenes. Effie’s main work has focused on how prior knowledge and expectations influence search strategies while trying to find objects. Complementary to this, she’s also interested in how these expectations affect what we pay attention to within scenes, such that we naturally bias our eyes towards specific areas, sometimes even to our detriment – as when we’re searching for something that has been misplaced or put somewhere unusual.

Eye tracking systems are typically used to help answer these questions. “Our eyes move about three times per second, but most people aren’t actually aware of how much they move their eyes,” Effie explains. “Using an eye tracker to non-invasively record these movements is a powerful and unbiased way of investigating the types of information that are important across various tasks.”

When subjects participate in a study, they look at images on a computer screen and are asked to perform a specific task with these images, such as searching for an object. Effie and her team use a special infrared camera that records where the participant’s eyes move while completing this task. She can then associate where participants look, and how long they look there, with how important or interesting that information is for completing the task. “To use the example of keys again, if you were looking for them in a living room and your eyes kept darting to the tables, that speaks to your inherent knowledge and expectations about where those keys are most likely to be found in the room,” Effie says. “We can then exploit and manipulate this information by asking you to search for the keys in a bathroom to see whether similar strategies come into play even when the situation is novel or unusual.”
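The kind of analysis described above – linking where participants look and for how long to regions of a scene – can be illustrated with a minimal sketch. This is not the lab's actual pipeline; the fixation data, region names, and coordinates below are invented for illustration, and real eye trackers export far richer records.

```python
# Minimal sketch: summarizing eye-tracker fixations as total dwell time
# per region of interest (ROI) in a scene image. All data here is
# hypothetical; a real study would load fixations from tracker output.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze position on screen, in pixels
    y: float
    duration_ms: float  # how long the eyes rested at this point

# ROIs as named rectangles: (left, top, right, bottom) in pixels.
rois = {
    "table": (100, 300, 400, 500),
    "shelf": (450, 50, 700, 200),
}

def dwell_time_per_roi(fixations, rois):
    """Sum fixation durations that fall inside each region of interest."""
    totals = {name: 0.0 for name in rois}
    for fix in fixations:
        for name, (left, top, right, bottom) in rois.items():
            if left <= fix.x <= right and top <= fix.y <= bottom:
                totals[name] += fix.duration_ms
    return totals

fixations = [
    Fixation(150, 350, 220),  # lands on the table
    Fixation(500, 100, 180),  # lands on the shelf
    Fixation(160, 400, 300),  # back to the table
]

print(dwell_time_per_roi(fixations, rois))
# {'table': 520.0, 'shelf': 180.0}
```

Long dwell times on the "table" region, as in this toy data, would be the kind of signal the article describes: evidence of where a searcher expects the target to be.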

Effie has found that search strategies are often based on where we expect to find objects, and this strategy works quite successfully most of the time. The interesting questions arise in how these expectations are formed and how they adapt to changing circumstances. So when searching for keys, objects around the room are helpful in guiding our attention, but only if they also fall within the areas where we expect to find the keys.

She is also finding that our attention can be biased and restricted towards these expected areas, such that we may take longer to move our search outside of these regions, sometimes not noticing quite obvious things that appear in those outside spaces. For example, if you’re still searching for your keys on the table, you may not notice your pet suddenly appearing at your feet.

Effie has also had the opportunity to collaborate with Queen’s Psychology’s Dr. Christopher Bowie and use eye tracking to better understand cognitive impairments in patients with schizophrenia. This feedback of cognitive data informing clinical work (and vice versa) helps us gain a broader understanding of the human condition in all its complexity.

Her main goal through her research is to answer fundamental questions about how we do the complex things that we do, not only because it furthers our understanding of the interconnectedness of our cognitive system, but also because it teaches us how these connections affect the way we process information. “These types of questions have important implications for the basic theoretical understanding of how we attend to and retain information about the world around us,” Effie stresses. “Armed with this knowledge, we might also be able to better assist with clinical conditions that affect the visual system, like tunnel vision and macular degeneration.”
