November 8, 2013
Just as our eyes inform us of the position of people or objects in our environment, our ears are also directional. Young children learn to turn their heads toward a sound. Stereo recordings fill our heads with music that stretches from side to side and occupies the space in between. It is also not difficult to notice the safety implications of judging the direction of sounds that signal imminent danger. What produces directional hearing, and how do our two ears work together to identify the direction of a sound source?
Our ears sit on the sides of our head and are oriented essentially “forward”. The conical shape of the outer ear, or “pinna” (the visible portion), favors sounds coming from the front. This is not difficult to understand, given the emphasis on vision in communication: we like to see whomever we’re talking to. Sounds from directly in front of us tend to sound relatively even between the ears. Sounds from the side, however, “sound like” they’re coming from that side.
High-frequency sounds (high-pitched sounds such as whistles and the speech sounds /s/, /f/, and /th/) are localized from left to right based on the relative level of sound at each ear. Known as the “interaural level difference”, this cue pulls the perceived sound toward the more intense side. Low-frequency sounds (“bass” sounds and the vowel sounds in speech), on the other hand, are localized according to the “interaural time difference”: the brain assumes the sound originates on the side of the ear that receives the signal first. In both cases, the greater the difference from ear to ear, the more lateral the sound will appear. Sounds from the extreme left or right are also easily judged thanks to the “head shadow”, whereby the head blocks sound from reaching the farther ear. Stereo recording exploits another phenomenon called the “Stenger principle”: if both ears receive the same signal directly, as through headphones, but the sound is more intense on one side, the listener will perceive only the louder side.
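For the curious, the time differences involved are remarkably small. Here is a minimal sketch, assuming a simplified spherical-head model (the classic Woodworth approximation) and round textbook numbers for head radius and the speed of sound; it shows that even a sound directly to one side arrives well under a millisecond earlier at the near ear:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air, a common round figure
HEAD_RADIUS = 0.0875    # m, a textbook average; real heads vary

def interaural_time_difference(azimuth_deg):
    """Woodworth's spherical-head estimate of the interaural time difference.

    azimuth_deg: 0 = straight ahead, 90 = directly to one side.
    Returns the arrival-time difference between the two ears, in seconds.
    """
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

for azimuth in (0, 30, 60, 90):
    itd_us = interaural_time_difference(azimuth) * 1e6
    print(f"azimuth {azimuth:2d} degrees: ITD of roughly {itd_us:3.0f} microseconds")
```

Even at 90 degrees the gap is only about 650 microseconds, yet the auditory system resolves these tiny differences reliably for low-frequency sounds.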
Scientists have discovered another, more complex method of sound localization. We have seen the importance of using both ears for localization; however, researchers have found that the folds and contours of the outer ear not only funnel sound energy toward the eardrum, they also cause reflections of the sound waves that may reinforce or interfere with the original waves. These reflections differ based on the angle of the incoming sound. This cue is described by the head-related transfer function (HRTF); it requires only one ear and is reinforced by the listener’s experience. Sounds moving around the head produce changing patterns of reflection, and the brain learns to recognize the differences between various angles. “3D sound” technology mimics this phenomenon through selective filtering of sound, generating an illusion of three-dimensional space: sounds are localized not only left-to-right but also top-to-bottom and front-to-back. See www.youtube.com/watch?v=5oAPsP2JZ9s for an amusing demonstration that requires headphones.
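To give a feel for how that filtering works in software, here is a minimal sketch in Python/NumPy. Real 3D-audio systems convolve a mono signal with a measured pair of HRTF impulse responses for each direction; the impulse responses below are invented stand-ins (a simple delay plus attenuation, faking a “source off to the right”) purely to show the mechanics:

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second, CD-quality audio

def render_direction(mono, left_ir, right_ir):
    """Filter a mono signal through per-ear impulse responses.

    Measured HRTF impulse responses encode the ear's reflections for one
    direction; convolving with them imposes those cues on any sound.
    """
    left = np.convolve(mono, left_ir)
    right = np.convolve(mono, right_ir)
    return np.stack([left, right], axis=1)  # (samples, 2) stereo array

# A half-second, 500 Hz test tone.
t = np.arange(int(0.5 * SAMPLE_RATE)) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 500 * t)

# Hypothetical impulse responses faking a source to the right:
# the far (left) ear hears the tone ~0.6 ms later and at half volume.
delay = int(0.0006 * SAMPLE_RATE)
right_ir = np.zeros(delay + 1)
right_ir[0] = 1.0   # near ear: the sound passes straight through
left_ir = np.zeros(delay + 1)
left_ir[-1] = 0.5   # far ear: delayed and attenuated

stereo = render_direction(tone, left_ir, right_ir)
print(stereo.shape)  # (22076, 2): ready to write out as a stereo file
```

Swapping in genuinely measured impulse responses, which several research labs publish, is what lets headphone audio appear to come from above, below, or behind the listener.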
For our daily activities it is readily apparent that localizing sound is important not just for safety but for effective communication. Hearing impairments interfere with all of the localization methods we’ve discussed; therefore, it is imperative to ask your hearing professional how best to preserve and maximize directional hearing. Let us keep you on a clear path to good hearing and ear health.