Recently we wrote about a portable brain scanner that can record a person’s neural activity while they’re on the go. That’s pretty impressive, but as far as on-the-move brain scanning is concerned, it may have just been one-upped by research coming out of Johns Hopkins University. Researchers there have developed technology that lets them study what happens in the brains of bats as they fly.
The breakthrough is the culmination of a 25-year dream for Cynthia Moss, professor of Psychological and Brain Sciences and Neuroscience at Johns Hopkins. It relies on a tiny wireless brain-signal recording device weighing less than one ounce. The bats fly in a special “flight room” equipped with high-speed cameras and microphones that pick up their echolocation calls. By combining a bat’s brain activity with its position and the timing of its vocalizations, the team could determine which objects were triggering the bat’s neurons to fire and, as a result, what it was paying attention to.
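To give a rough sense of how that combination works, here is a minimal Python sketch of the idea. Everything in it — the timestamps, the object layout, and the nearest-object heuristic — is a hypothetical stand-in, not the team’s actual analysis pipeline: each recorded spike is matched to the bat’s most recent echolocation call, and then to the object the bat was closest to at that moment.

```python
import numpy as np

# Hypothetical synchronized data -- all timestamps on one shared clock.
# In a rig like the one described, these would come from the wireless
# neural logger, the high-speed cameras, and the microphones respectively.
spike_times = np.array([0.12, 0.31, 0.33, 0.78])   # neural events (s)
call_times = np.array([0.10, 0.30, 0.75])          # echolocation calls (s)
bat_pos_at_call = np.array([[0.0, 1.0, 1.5],       # bat's 3D position (m)
                            [0.5, 1.2, 1.4],       # at each call, from
                            [1.1, 1.5, 1.2]])      # camera tracking
objects = {"post_A": np.array([1.0, 2.0, 1.5]),    # fixed obstacles (m)
           "post_B": np.array([2.0, 0.5, 1.0])}

# For each spike, find the most recent echolocation call, then the object
# the bat was nearest to when it made that call -- a crude proxy for which
# object the neural response is "about".
for t in spike_times:
    idx = np.searchsorted(call_times, t, side="right") - 1
    if idx < 0:
        continue  # spike preceded the first call; nothing to match
    pos = bat_pos_at_call[idx]
    nearest = min(objects, key=lambda k: np.linalg.norm(objects[k] - pos))
    print(f"spike {t:.2f}s -> call {call_times[idx]:.2f}s, nearest: {nearest}")
```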
“We are interested in how the external three-dimensional environment is represented in the brain and how these representations are used by the animal as it moves through space, while attending to the location of objects to guide its path,” Moss told Digital Trends. “A vast majority of research on how the brain determines the location of an object has been conducted in restrained animals, using 2D stimuli and simplified behaviors. Our work is exciting because we use an animal performing a naturalistic real-world task.”
Among the researchers’ discoveries was that neurons in the brain represent the 3D locations of objects in space, and that when a bat focuses its attention on an object, these neural representations sharpen. This is the first time such brain activity has been recorded in an animal as it moves through 3D space, inspecting and reacting to objects in its path.
The work isn’t just of interest to bat researchers, though. Moss said that potential applications may exist in the field of autonomous sensing. “A variety of robotic and self-guided systems take in information from the outside world, and use this information to react accordingly,” she noted. “Our research shows that the brain dynamically adjusts representations of the sensory world depending on action selection, and we believe that these results would be beneficial in the development of new autonomous systems that adjust according to task demands.”
A paper describing the research was recently published in the journal eLife.