Sunday 6 March 2022

Lend me your ears! (Hearing 2)

In nature, localising the source of a sound has obvious survival value: it helps to find prey, avoid predators, and locate mates, competitors, etc. The previous post started the subject, but this one deals with how mammals, humans in particular, try to solve that problem. As we shall see, localising sounds is not all that easy. Of course, some animals are much better at this than others, but there are some fundamental problems. Humans make use of no fewer than three different mechanisms. This post might be a bit technical, but I have left the mathematics to a good free review, here.

The main problem is that sound can bend around obstacles and so change direction. A sound reaching your ear can therefore come from just about anywhere. You can hear things you cannot see, which is good news in the dark or in a jungle, but it does not tell you where the sound is coming from. Sound provides a good alerting system but a poor localising one, whereas a well-developed eye is a localising organ (see here for a comparison of echolocation with vision).

Click to enlarge; copyright Gert van Dijk

If we had just one simple ear of the hole-in-the-head variety, but with an eardrum and vibration sensors, we would be stuck at that level. But bilateral symmetry provides us with two ears, and evolution made clever use of that. The image above shows that the waves from a sound source travel directly to the nearer ear but must travel further to reach the farther one. The sound therefore arrives at the two ears with a time difference: the 'interaural time difference' (ITD). That difference depends on the extra distance the sound has to travel, shown in red, which in turn depends on where the sound comes from relative to the axis between the two ears. The maximum difference occurs with sources placed on that axis, because the sound must then travel farthest around the head. The minimum is no difference at all, which occurs when the source lies on the plane of symmetry. You could make a table telling you which time difference corresponds to which angle, and that is basically what the nervous system does for you.
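
To make that 'table' concrete, here is a minimal sketch in Python. It uses the classic spherical-head (Woodworth) approximation, ITD ≈ (r/c)(θ + sin θ); the head radius and speed of sound are assumed round numbers, not measurements.

```python
import numpy as np

C = 343.0   # speed of sound in air (m/s)
R = 0.09    # assumed head radius (m); real heads vary

def itd_woodworth(angle_deg):
    """Interaural time difference for a source at 'angle_deg' from the
    plane of symmetry (0 = straight ahead, 90 = on the hearing axis),
    using the spherical-head Woodworth approximation."""
    theta = np.radians(angle_deg)
    return R / C * (theta + np.sin(theta))

# The 'table' the nervous system effectively builds:
for angle in (0, 15, 30, 45, 60, 90):
    print(f"{angle:3d} deg -> ITD = {itd_woodworth(angle) * 1e6:4.0f} microseconds")
```

The maximum, at 90 degrees, comes out at roughly 670 microseconds for this head size: the entire cue lives in well under a millisecond.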

Click to enlarge; copyright Gert van Dijk
 

Unfortunately, this is not a perfect solution. The arrival difference tells you what the angle is, but not whether the source is up, down, to the front, the back, or anywhere in between. The image above tries to explain that. The source could be anywhere on the brilliantly named 'cone of confusion': the surface you get if you rotate the angle around the 'hearing axis'. One way to get around this problem is to rotate your head and listen again, because then you get a different cone of confusion, giving you an additional clue about where the sound comes from.
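
Here is a tiny numerical sketch of that ambiguity. The geometry is made up, and straight-line distances stand in for real sound paths around a head, but the point survives: three very different source positions on one cone give exactly the same arrival-time difference.

```python
import numpy as np

C = 343.0                                        # speed of sound (m/s)
EARS = np.array([[-0.09, 0, 0], [0.09, 0, 0]])   # two ears on the x-axis

def itd(source):
    """Straight-line ITD: difference in distance to each ear, over c.
    (Ignores diffraction around the head; enough to show the ambiguity.)"""
    d_left, d_right = np.linalg.norm(EARS - source, axis=1)
    return (d_left - d_right) / C

front = np.array([1.0,  2.0, 0.0])   # source ahead of the listener
back  = np.array([1.0, -2.0, 0.0])   # mirrored behind the listener
above = np.array([1.0,  0.0, 2.0])   # on the same cone, overhead

for name, s in [("front", front), ("back", back), ("above", above)]:
    print(f"{name:6s}: ITD = {itd(s) * 1e6:6.1f} microseconds")
# All three print the same value: they lie on one cone of confusion.
```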

Mammalian ears make use of a second trick. When sound bends around an object, its volume decreases. Your head provides a 'sound shadow': the sound is louder in the nearer ear than in the farther ear, which lies in that shadow, so there is an 'interaural level difference' (ILD). The difference in volume again depends on the angle, and again the brain constructs a table telling you which difference in sound level corresponds to which angle. But that irritating cone of confusion is still there...
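
There is no simple exact formula for the ILD, because it depends on how each frequency diffracts around the head. As a toy illustration only, here is a sketch using a rough rule of thumb from simple spherical-head models, ILD ≈ 0.18·√f·sin θ in decibels; take the constant as an assumption, not gospel.

```python
import numpy as np

def ild_db(freq_hz, angle_deg):
    """Very rough ILD estimate (dB) for a source at 'angle_deg' from
    straight ahead, per a spherical-head rule of thumb. Real ILDs,
    measured on real heads, are considerably messier."""
    return 0.18 * np.sqrt(freq_hz) * np.sin(np.radians(angle_deg))

for f in (250, 1000, 4000):
    print(f"{f:5d} Hz, source at 45 deg: ILD ~ {ild_db(f, 45):4.1f} dB")
# The estimate rises with frequency: the head casts a stronger
# 'shadow' for short wavelengths than for long ones.
```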

Mammal evolution came up with a third trick: the external ear! The complex shape of the external ear alters the spectrum of a sound, and how it is altered depends on the location of the source: the 'head-related transfer function' (HRTF). This works, to an extent, with just one ear. When I first read this, I wondered how that could work, because people have such differently shaped ears. Well, the solution is that the brain learns to live with the filtering characteristics of the ears you happen to have. This has been put to the test by altering people's ears with a mould placed on the ear, and that indeed fooled the brain into making mistakes in sound localisation.
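
For readers who like to see what 'altering the spectrum' means in practice: an HRTF is commonly applied as a convolution with a measured impulse response per ear (an HRIR). The sketch below is a minimal illustration; the impulse responses are invented stand-ins, not real measurements.

```python
import numpy as np

def spatialise(mono, hrir_left, hrir_right):
    """Place a mono signal at the position where this HRIR pair was
    measured, by convolving the signal with each ear's response."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=1)   # two-channel output

# A real HRIR pair would come from a measured data set, one pair per
# source direction; here we improvise a fake pair so the sketch runs.
rng = np.random.default_rng(0)
mono = rng.standard_normal(48000)            # one second of noise at 48 kHz
hrir_left = np.concatenate([np.zeros(5), [1.0], 0.3 * rng.standard_normal(122)])
hrir_right = np.concatenate([np.zeros(25), [0.5], 0.2 * rng.standard_normal(102)])
stereo = spatialise(mono, hrir_left, hrir_right)
print(stereo.shape)                          # (48127, 2)
```

Note that even this fake pair encodes all three cues at once: the right ear's response is delayed (an ITD of 20 samples, about 0.4 ms at 48 kHz), quieter (an ILD), and differently filtered (the HRTF proper).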

Click to enlarge; source and rights here

You may wonder why you would have three systems for the same purpose. Part of the answer is that the efficacy of each system depends on sound frequency. The figure above shows that dependence: ITD ('arrival time') works best at low frequencies, while ILD ('loudness') and the HRTF ('external ear') work best at high frequencies. Together, they do a nice job across the whole frequency range. But there are strong clues that the situation is still not optimal, and those clues come from behaviour. Many animals can move their ears, and people tilt and turn their head to locate the source of a sound: that shrinks the cone of confusion. But we still start a visual search of the general area of the sound, hoping that vision will provide the ultimate localisation. That is partly because we are diurnal mammals with very good vision, but partly because localising sounds is inherently difficult.
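
A quick back-of-the-envelope calculation makes this division of labour plausible; the head diameter here is an assumed round number.

```python
C = 343.0      # speed of sound (m/s)
HEAD = 0.18    # assumed head diameter (m)

for f in (200, 1000, 5000):
    wavelength = C / f
    print(f"{f:5d} Hz: wavelength = {wavelength:5.2f} m "
          f"({wavelength / HEAD:4.1f} head diameters)")
# Long waves (low frequencies) wrap around the head almost unweakened,
# so the level difference there is tiny, but their slow cycles make
# arrival-time comparison unambiguous. Short waves (high frequencies)
# are shadowed strongly, but repeat so quickly that timing comparisons
# become ambiguous: hence the division of labour in the figure.
```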

Could animals on other planets do better? I think so, but that speculation will have to wait for another post.