Although our sense of hearing facilitates certain high-profile abilities, such as performing an aria or discerning whether an infant’s cry is one of pain or hunger, it also assists us in subtler ways. One important role of our auditory sense is to help us better understand the space around us by detecting the direction and distance of nearby objects. To process this information, the brain recruits peripersonal neurons, or personal-space neurons. Michael S. A. Graziano, a neuroscience professor who has studied these cells for 30 years, explains that their purpose is to “monitor the space around the body.” Filtering visual, auditory, and tactile cues while also incorporating remembered locations of objects, peripersonal neurons react sensitively to intruding entities. In this passage from his recent book, The Spaces Between Us: A Story of Neuroscience, Evolution, and Human Nature, Graziano describes how the brain constructs auditory space and reveals that humans possess a skill set more typically associated with bats and dolphins.
Auditory space is probably the most difficult, computationally intensive space for the brain to construct. Tactile space is easy in comparison. If there’s a touch on the hand, then something is on your hand. There’s a one-to-one correspondence between the location of the sensory neuron and the location of the relevant object in the world. But if there’s a sound nearby, the sound waves spread everywhere and enter both ears. Auditory space requires a lot of hard preprocessing of the data. Of course, it doesn’t seem like that to our conscious minds. Perceptually, the sound seems to come from there. Behind that perception is a lot of hidden computation.
Some aspects of auditory space are reasonably well understood. To calculate whether the sound comes more from the left or the right, the brain must compare the two ears. Is the sound louder in the right ear or the left? Did it reach the right ear a few microseconds before it reached the left? Precise circuits in the brainstem perform those delicate computations.
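The left/right comparison Graziano describes can be illustrated with the textbook sine model of interaural time difference. This is a sketch of the geometry only, not the brainstem's actual circuitry; the head width and speed-of-sound constants are assumed round values:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature
HEAD_WIDTH = 0.18       # m; assumed average distance between the ears

def direction_from_itd(itd_seconds):
    """Estimate a sound source's azimuth, in degrees, from the
    interaural time difference (ITD).

    Positive ITD means the sound reached the right ear first, so the
    source lies to the right. The simple model: sin(angle) = ITD * c / d,
    where c is the speed of sound and d the distance between the ears.
    """
    ratio = itd_seconds * SPEED_OF_SOUND / HEAD_WIDTH
    ratio = max(-1.0, min(1.0, ratio))  # clamp tiny floating-point overshoot
    return math.degrees(math.asin(ratio))
```

With an 18 cm head, the largest possible delay is only about half a millisecond, which is why the text calls these brainstem comparisons delicate: the circuit must resolve differences of mere microseconds.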
Much less is known about how the brain reconstructs the distance to a sound. A range of studies suggests that we rely on reverberation. Subtle echoes tell us about the size of the space around us and the distance to a sound source. We all intuitively know the sonic difference between an echoing cathedral and a closet full of old clothes. The space is alive in one case and totally dead in the other. Even with your eyes closed you can sense that largeness or smallness. The same subtle reverberation cues are used to parse out the distance to a voice or a footstep.
One of the hints that reverberation lies at the root of distance perception comes from animals that are true experts at three-dimensional auditory perception: bats and dolphins. They evolved echolocation, which is really just a souped-up version of reverberation analysis. They generate a sound pulse and analyze the returning echo.
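The core arithmetic of echolocation is simple even though the neural analysis is not: a pulse travels out, bounces back, and the round-trip delay encodes distance. A minimal sketch of that relationship, assuming sound traveling through air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air; dolphins work in water, where sound is much faster

def distance_from_echo(delay_seconds):
    """Distance to a reflecting surface, given the round-trip delay
    between emitting a pulse and hearing its echo.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_SOUND * delay_seconds / 2.0
```

An echo returning 20 milliseconds after the pulse implies a surface a few meters away; a bat's real computation layers Doppler shifts and echo structure on top of this basic timing.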
People also have a version of echolocation, though we are novices compared to bats. It used to be called facial vision before anybody knew what it was. If you blindfold someone and tell her to walk carefully forward and stop before hitting an obstacle, she’ll probably manage to stop before flattening her nose against a wall. She won’t necessarily know how she does it. People report feeling something like a warning prickle on the face or a shadow in their minds.
I’m reminded of a short story by Roald Dahl, The Wonderful Story of Henry Sugar. It’s about a mystic who develops his facial vision to an amazing degree. In a moment of hubris he has wads of dough plopped over his eyes and then linen wrapped around and around his head until he looks like a mummy from the neck up. To the astonishment of the spectators, he gets on a bicycle and weaves effortlessly in and out of traffic. Well, that’s fiction. And it’s Roald Dahl, too, so we can expect the exaggeration.
In reality, facial sense can give you a vague impression of nearby looming objects. It can also be blocked by a simple intervention. If the mystic had put the dough in his ears, he would have been hopeless at getting around. Facial sense, when put to experimental test, depends on subtle sounds generated by the body—breathing, rustling of clothes—and the reverberation of those sounds from nearby surfaces. We don’t consciously realize that it has anything to do with sound. That processing is somehow under the surface of awareness. The experience seems more like a spooky kind of shadow vision without eyes, or like a subtle warmth on the face.
The personal-space neurons that we studied, with their auditory and visual responses extending into the space around the head, joined to a strong tactile response on the face, begin to sound particularly relevant. I suspect that they form the basis of the mysterious facial sense. The neurons can be triggered by a variety of sensory sources and, once triggered, give a rough map of where objects lie around the body.
From The Spaces Between Us: A Story of Neuroscience, Evolution, and Human Nature. Copyright © 2018 by Michael S. A. Graziano and published by Oxford University Press. All rights reserved.