Inner Explorer
Investigating the brain's innate ability to map our surroundings.
You step out of a coffee shop and scan the street, looking at buildings, parked cars, people walking down the sidewalk. You notice some trees and the angle of the sun. If you’re familiar with the area, you confidently turn toward your next destination without giving your surroundings much thought. If you’re not, you might consider each of these visuals as you decide: right or left? No matter the scenario, your brain is calling on multiple tools to build a mental map of your space and point you in the right direction.
It’s the type of scenario Russell Epstein, a psychology professor, analyzes in order to map out the brain’s navigation system. “The mind is like a Swiss Army knife with different tools to solve different problems as they come up,” he says. He wants to figure out how those tools work together to recognize places and build cognitive maps of the surrounding world.
To this end, Epstein and his research partners study the neural differences between recognition and orientation, perceptions of static and dynamic images, and the development of spatial recognition over time.
His lab tracks brain activity when people look at pictures of landscapes, city streets, buildings, or rooms, as well as pictures of faces or objects. Using functional magnetic resonance imaging (fMRI), he found that a specific region, called the parahippocampal place area (PPA), plays a key role in processing information about the spatial structure of visual scenes. The PPA was particularly active when people viewed the images of streets and landscapes, but less so when they looked at faces. This specialized response suggests that the PPA helps people orient themselves in space, an ability distinct from the ability to identify a particular object or person.
“Most people think that locating themselves spatially is a single thought process, but the cognitive logistics are more complex than that,” says Epstein. In order to navigate successfully, your mind must work out two problems simultaneously: knowing your current location and understanding which direction you are facing.
Additional evidence for the complexity of spatial orientation versus recognition comes from studying the neural activity of mice. Epstein collaborated with Isabel Muzzio, a professor at The University of Texas, San Antonio, and Penn Arts and Sciences graduate students Josh Julian, GR’22, and Alex Keinath, GR’22, to understand how mice figure out where to go when they are disoriented.
In their experiment, mice were placed in an enclosure with two rectangular rooms, each with unique wallpaper on one of the four walls. Researchers trained the mice on where in the enclosure food was located. To find the food, the mice then had to solve two problems. First, which chamber were they in? Second, which wall were they facing?
The team observed activity in a brain structure called the hippocampus and determined that the mice used the wallpaper to figure out which chamber they were in, but ignored it when figuring out which way they were facing. To solve the directional problem, they paid attention only to the shape of the room.
Recently, Epstein’s research has expanded to include mapping brain activity when viewing static and dynamic images. He is working with John Trueswell, a psychology professor who focuses on language and thought, and graduate student Alon Hafri, GR’23, on an experiment that looks at which areas of the brain are stimulated when people view actions, and whether the neural codes change depending on the visual input.
Their experiment involved two sets of stimuli. The first consists of dynamic videos of two actors completing various actions in identical indoor settings. The second, made up of static images, offers far more variety in setting and point of view. In these images, the same actors are posed in ways that suggest an action, such as an outstretched arm or a raised leg, but nothing moves.
Epstein and his collaborators designed this experiment to learn more about how the brain processes movement in space. Study participants, a group of college-age adults, viewed the stimuli in an MRI machine while the researchers examined brain activity.
They determined that in certain regions of the brain, activity shows similar patterns whether participants view dynamic or static depictions of the same action. A video of a person taking a bite of food, for instance, produces neural coding similar to that of a still image that merely suggests biting.
“The brain representations we observed,” Hafri says, “are surprisingly consistent no matter how different biting looks in the world. Even when you don’t see the whole action take place, like when you view a static image, these brain regions are encoding the visual input similarly to when you see the whole action from beginning to end, as in a video. That suggests the encoding we are observing is at a very high level.”
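To make the idea of “similar patterns” more concrete, here is a minimal sketch of a pattern-similarity analysis of the kind commonly used in fMRI studies. The array names, voxel count, simulated data, and the choice of Pearson correlation are illustrative assumptions for this sketch, not the lab’s actual analysis pipeline.

```python
# Minimal sketch of a pattern-similarity analysis (illustrative only; the
# simulated data, voxel count, and use of Pearson correlation are assumptions,
# not the researchers' actual pipeline).
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 200  # hypothetical number of voxels in a region of interest

# Simulated response patterns: each vector is the region's activity pattern
# evoked by one stimulus (a video of biting, a still image of biting, etc.).
video_bite = rng.normal(size=n_voxels)
image_bite = video_bite + rng.normal(scale=0.5, size=n_voxels)  # same action, different format
image_kick = rng.normal(size=n_voxels)                          # a different action

def pattern_similarity(a, b):
    """Pearson correlation between two voxel activity patterns."""
    return np.corrcoef(a, b)[0, 1]

# If a region encodes the action at a high level, video and still-image
# depictions of the *same* action should correlate more strongly than
# depictions of different actions.
print("bite video vs. bite image:", pattern_similarity(video_bite, image_bite))
print("bite video vs. kick image:", pattern_similarity(video_bite, image_kick))
```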
Putting his work in the larger context of current neuroscience research, Epstein says scientists now speculate that spatial orientation and the mapping of the physical world form a sort of template in the brain for more abstract concepts. The tools we use to navigate real space may be used to navigate conceptual spaces.
“People gauge if they are higher or lower on a social hierarchy,” Epstein says. “They judge how ‘close’ two people are. These might be more than just metaphors. There is growing support for the idea that people map the social and conceptual worlds using the same mechanisms that they use to map the physical world. This suggests that the idea of space might lay the groundwork for much of human thought.”
Spatial Intelligence: There’s an App for That
Psychologists can rate spatial intelligence using an assessment called judgment of relative direction (JRD). In a practical application of spatial cognition theory, Professor of Psychology Russell Epstein’s lab developed an iPhone app called iJRD, which allows people across the world to virtually navigate their way through their home cities and test their sense of direction as they go.
The project was spearheaded in winter 2016 by Joshua Julian, a doctoral candidate in psychology, with the research assistance of Peter Bryan, C’16.
“We wanted to take our research out of the lab and into the real world,” says Bryan.
Users enter their location and choose from a list of familiar landmarks. For example, a Philadelphian might choose the Liberty Bell and Constitution Hall. Users imagine standing in front of one landmark while virtually indicating the direction of other locations. Users can learn about their spatial abilities and compare them to those of others who have downloaded the app.
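A JRD trial boils down to a small geometry problem: given where you imagine standing, what you imagine facing, and the landmark you are asked to point toward, how far off is your answer from the true direction? Below is a minimal sketch of scoring one such trial. The flat x/y coordinates, landmark names, and function names are hypothetical stand-ins for this illustration, not the iJRD app’s actual code or data.

```python
# Minimal sketch of scoring one judgment-of-relative-direction (JRD) trial.
# Illustrative assumptions: flat x/y coordinates instead of real map data,
# made-up landmark positions; this is not the iJRD app's actual code.
import math

def bearing(origin, target):
    """Compass-style angle in degrees from `origin` to `target` (0 = "north", +y)."""
    dx, dy = target[0] - origin[0], target[1] - origin[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def jrd_error(standing_at, facing, pointing_to, response_deg):
    """Angular error between the user's response and the true egocentric direction.

    The true direction is the bearing to the target minus the bearing of the
    imagined facing direction, so 0 degrees means "straight ahead."
    """
    true_direction = (bearing(standing_at, pointing_to)
                      - bearing(standing_at, facing)) % 360
    error = abs(response_deg - true_direction) % 360
    return min(error, 360 - error)  # wrap so the error is at most 180 degrees

# Hypothetical landmark coordinates (arbitrary units, not real locations).
landmark_a = (0.0, 0.0)
landmark_b = (0.0, 1.0)   # due "north" of landmark_a
landmark_c = (-1.0, 1.0)  # ahead and to the left

# Standing at landmark_a and facing landmark_b, the true direction to
# landmark_c is 315 degrees (45 degrees to the left of straight ahead).
print(jrd_error(landmark_a, landmark_b, landmark_c, response_deg=300))  # -> 15.0
```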