Sunday, 6 March 2022

Lend me your ears! (Hearing 2)

In nature, localising the source of a sound has obvious survival value: it helps to find prey, predators, mates, competitors, etc. The previous post started the subject, but this one deals with how mammals, humans in particular, try to solve that problem. As we shall see, localising sounds is not all that easy. Of course, some animals are much better at this than others, but there are some fundamental problems. Humans make use of no fewer than three different mechanisms. This post might be a bit technical, but I left the mathematics to a good, freely available review, here.

The main problem is that sound can bend around obstacles, and so change direction. A sound reaching your ear can therefore come from just about anywhere. This means you can hear things you cannot see, which is good news in the dark or in a jungle, but it also means you do not know where the sound is coming from. Sound provides a good alerting system but a poor localising one, whereas a well-developed eye is a localising organ (see here for a comparison of echolocation with vision).

Click to enlarge; copyright Gert van Dijk

If we had just one simple ear of the hole-in-the-head variety, but with an eardrum and vibration sensors, we would be stuck at that level. But bilateral symmetry provides us with two ears, and evolution made clever use of that. The image above shows that the waves from a sound source travel directly to the nearest ear but must travel further to reach the farthest ear. The sound therefore arrives at the two ears with a time difference: the 'interaural time difference' (ITD). That difference in arrival time depends on the extra distance the sound has to travel, shown in red, which in turn depends on where the sound comes from relative to the axis between the two ears. The maximum difference occurs with sources placed on that axis, because the sound must then travel farthest around the head. The minimum is no difference at all, which occurs when the source lies on the plane of symmetry. You could make a table telling you which difference corresponds to which angle, and that is basically what the nervous system does for you. 
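
For those who like to see the arithmetic, here is a minimal Python sketch of that 'table'. It assumes a simplified spherical head (Woodworth's approximation) with a head radius of about 8.75 cm and a speed of sound of 343 m/s; those numbers are assumptions made for the sketch, not values taken from the figure.

    import math

    HEAD_RADIUS = 0.0875    # m; assumed average human head radius
    SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

    def itd_woodworth(azimuth_deg):
        # Woodworth's spherical-head approximation:
        #   ITD = (r / c) * (theta + sin(theta)),
        # with theta the azimuth from the median plane (0 = straight ahead,
        # 90 = directly in line with the ears).
        theta = math.radians(azimuth_deg)
        return HEAD_RADIUS / SPEED_OF_SOUND * (theta + math.sin(theta))

    # The 'table' the nervous system effectively keeps: which arrival-time
    # difference corresponds to which angle.
    for azimuth in range(0, 91, 15):
        print(f"{azimuth:3d} deg -> {itd_woodworth(azimuth) * 1e6:6.1f} microseconds")

For a human-sized head the maximum difference comes out at roughly 0.6 to 0.7 milliseconds, which is about the range usually quoted for people.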

Click to enlarge; copyright Gert van Dijk
 

Unfortunately, this is not a perfect solution. The arrival difference tells you what that angle is, but not whether the source is up, down, to the front, to the back, or anywhere in between. The image above tries to explain that. The source could be anywhere on the brilliantly named ‘cone of confusion’; that is the surface you get if you rotate the angle around the ‘hearing axis’. One way to get around this problem is to rotate your head and listen again, because then you get a different cone of confusion, giving you an additional clue as to where the sound comes from.
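
To see why a head turn helps, here is a toy continuation of the sketch above. It uses an even simpler model (extra path length = interaural distance × sine of the azimuth) with an assumed ear-to-ear distance of 17.5 cm; the point is only that a source 30 degrees to the front and one 150 degrees round to the back give identical time differences until the head turns.

    import math

    INTERAURAL_DISTANCE = 0.175  # m; assumed ear-to-ear distance
    SPEED_OF_SOUND = 343.0       # m/s

    def itd_simple(azimuth_deg):
        # Simplest possible model: extra path length = d * sin(azimuth).
        return INTERAURAL_DISTANCE / SPEED_OF_SOUND * math.sin(math.radians(azimuth_deg))

    front_source = 30.0   # degrees to the front-left
    back_source = 150.0   # degrees: behind and to the left, on the same cone

    print("Before turning the head:")
    print(f"  front source: {itd_simple(front_source) * 1e6:.0f} us")
    print(f"  back source:  {itd_simple(back_source) * 1e6:.0f} us")   # identical -> ambiguous

    head_turn = 20.0  # the listener turns the head 20 degrees to the left
    print("After turning the head:")
    print(f"  front source: {itd_simple(front_source - head_turn) * 1e6:.0f} us")
    print(f"  back source:  {itd_simple(back_source - head_turn) * 1e6:.0f} us")  # now different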

Mammalian ears make use of a second trick. When sound bends around an object, its volume decreases. Your head provides a 'sound shadow': the sound in the nearest ear is louder than that in the farthest ear, which lies in the shadow, so there is an 'interaural level difference' (ILD). The difference in volume depends on the angle, and again the brain constructs a table telling you which difference in sound level corresponds to which angle. But that irritating cone of confusion is still there...
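
The 'table' idea works for level differences too. Below is a toy Python sketch with made-up ILD values; real level differences depend strongly on frequency and on the shape of the head, so treat the numbers as purely illustrative.

    # A purely illustrative ILD 'table': angle from straight ahead (degrees)
    # versus level difference between the ears (dB). The values are invented
    # for the sketch; real ILDs vary strongly with frequency and head shape.
    ILD_TABLE = {0: 0.0, 15: 2.0, 30: 4.5, 45: 7.0, 60: 9.0, 75: 10.5, 90: 11.0}

    def angle_from_ild(measured_ild_db):
        # Invert the table: return the angle whose stored ILD is closest
        # to the measured level difference (nearest-neighbour lookup).
        return min(ILD_TABLE, key=lambda angle: abs(ILD_TABLE[angle] - measured_ild_db))

    print(angle_from_ild(6.2))  # -> 45, the entry closest to 6.2 dB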

Mammal evolution came up with a third trick: the external ear! The complex shape of the external ear alters the spectrum of the sound, and how it is altered depends on the location of the source: the 'head-related transfer function' (HRTF). This works to an extent with just one ear. When I first read this, I wondered how that could work, because people have such differently shaped ears. Well, the solution is that the brain learns to live with the filtering characteristics of the ears you happen to have. This has been put to the test by altering the shape of people's ears with a mould, and that indeed fooled the brain into making mistakes in sound localisation. 
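
One way to picture that learning process: the brain could, in effect, store a spectral 'fingerprint' of the ear's filtering for each direction and compare incoming sounds against those templates. The sketch below does exactly that with a handful of invented numbers; real head-related transfer functions are continuous, frequency-rich and different for every individual.

    # Toy 'HRTF templates': for a few directions, how much (in dB) the outer
    # ear boosts or cuts four frequency bands. The numbers are invented for
    # the sketch; real HRTFs are measured per individual and per direction.
    TEMPLATES = {
        "front": [0.0,  3.0, -2.0,  5.0],
        "above": [0.0, -1.0,  4.0, -3.0],
        "back":  [0.0, -4.0,  1.0, -6.0],
    }

    def best_matching_direction(observed_band_levels):
        # Pick the stored direction whose template is closest to the observed
        # band levels (smallest sum of squared differences).
        def mismatch(direction):
            template = TEMPLATES[direction]
            return sum((o - t) ** 2 for o, t in zip(observed_band_levels, template))
        return min(TEMPLATES, key=mismatch)

    # A spectrum that resembles the 'above' template plus a little noise:
    print(best_matching_direction([0.2, -0.8, 3.5, -2.9]))  # -> 'above'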

Click to enlarge; source and rights here

You may wonder why you would have three systems for the same purpose. One part of the answer is that the efficacy of each system depends on sound frequency. The figure above shows that frequency dependence: ITD ('arrival time') works best at low frequencies, while ILD ('loudness') and HRTF ('external ear') work best at high frequencies. Together, they do a nice job across the whole frequency range. But there are strong clues that the situation is still not optimal, and those clues come from behaviour. Many animals can move their ears, and people tilt and turn their heads to locate the source of a sound: that shrinks the cone of confusion. But we still start a visual search of the general area of the sound, hoping that vision will provide the ultimate localisation. Of course, that is partly because we are diurnal mammals with very good vision, but it is also partly because localising sounds is inherently difficult.
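
A very crude way to express that division of labour in code: pick the cue according to frequency, with a crossover somewhere around 1.5 kHz. That crossover value is an assumption made for the sketch, roughly where arrival-time differences become ambiguous for human-sized heads, and not a number taken from the figure.

    def preferred_cue(frequency_hz, crossover_hz=1500.0):
        # Rough rule of thumb: below the crossover rely on arrival-time
        # differences (ITD); above it, on level and spectral cues (ILD, HRTF).
        if frequency_hz < crossover_hz:
            return "ITD (arrival time)"
        return "ILD + HRTF (loudness and spectral shape)"

    for frequency in (200, 800, 3000, 8000):
        print(f"{frequency:5d} Hz -> {preferred_cue(frequency)}")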

Could animals on other planets do better? I think so, but that speculation will have to wait for another post.



14 comments:

  1. I always love the more technical posts you make on this blog.

    The most obvious solution to better localise sound is probably just to have more ears.

  2. Keavan: thanks; I am always worried that no one really likes this kind of post. And bonus points for being the first to mention the solution!

  3. These analyses of different parts of anatomy & biology are always enjoyable. A pox on anyone who makes you feel otherwise.

    Don't dolphins (and even hippos) use their lower jaw as a makeshift ear to pick up certain pitches/tones? Which, now that I'm typing it, reminds me a bit of snakes.

    -anthony docimo.

  4. Anthony: It isn't that people write unpleasant comments on this type of post, but that the number of visits is at times a bit disappointing for such posts. I saw that much of my explanation of the anatomy of giants had been used by a YouTube author. He had taken care to mention the blog explicitly, so that was done as it should be. What I learned from that is that a YouTube video can attract hundreds of times as many visitors as my blog posts. However, I prefer writing.

    1. Personally, I find that the number of visits is far overrated. Don't you want people to really read and think about your posts? For me, the thing is that these technical posts are a little more scientifically based and offer more food for thought.

  5. About "alien ears", that brings up a question related to an older post on this blog, although it wasn't really explicitly mentioned in that one; how well would the "Cloverhead"'s (from Purple Plasmid's Fentil) ear-setup work?

    You know, the one with the 2 forward-facing and 2 backward-facing ears.

  6. Ruega Suzu Re: having more than two ears should indeed solve many of these problems, and that is exactly what the next post in the 'hearing' series will be about.

  7. I am also super interested in your technical posts! You are able to explain all these processes and functions in a very accessible way and I appreciate that a lot.
    Looking forward to the next part already!

  8. Good stuff. Unfortunately for me, you have written what I was intending to write one day. Sadly, it was far down my to-do list and still in the distant future.

    It seems that you are also about to discuss acoustic cameras, which was my intent too. The use of "ear arrays" seems under-utilised in reality, so it could be a good choice for an alien sense.

    It's also interesting to consider how active echo-location combines with this. In fact this would be very similar to how some imaging radars work, so I still feel the challenge to justify how an organic radar-like sense could possibly evolve...

  9. Ruega Suzu Re and Petr: my pleasure, and thank you.
    Abbydon: On the one hand, the internet has a long memory, with old posts completely accessible; on the other hand, many people do not bother to search for themselves: I see discussions on the speculative evolution forum repeating themselves on subjects that I thought had been dealt with. So, if you wait just a few years, you can serve a brand new audience!
    I had not finalised my thoughts on how far I would take the 'multiple ears' concept, but I doubt I will take it as far as 'acoustic cameras'.

  10. Really looking forward to your thoughts on multiple ears. I fear that more than two ears would also complicate localisation by adding another dimension to the problem. Doesn't evolution on Earth prove multiple ears are not efficient? ;)

  11. Bob: You are right about the number of visits. I guess it would be nice to have the kind of high numbers that some YouTube people achieve, but I feel comfortable with the level of writing and the response I get now. ;-)
    As for evolution proving something about hearing: well, that is a question of phrasing. Evolution has certainly performed impressive feats with bird and mammal hearing, but to conclude anything about multiple ears, there should first be enough material to draw a comparison. The only animals I found with multiple ears are insects, but that is just one factor amidst a vast array of other differences, so I personally would not dare to say that the insect evidence argues for or against multiple ears. There are simply no birds or mammals with more than two ears, so there is no evidence; not for, not against...

  12. The closest example of multiple ears in a vertebrate I can think of is the lateral line in fish. It's basically hearing, but for low frequencies (<100 Hz). It's linked to the difference between the far-field and near-field in acoustic propagation. To make the same approach work on land, you'd need a similarly high speed of sound to keep the wavelengths long and therefore the near-field quite large. This might occur in a gas giant atmosphere where the speed of sound is comparable to underwater, I suppose.

    As for efficiency, I suspect multiple ears would be more efficient than the use of pinna and the HRTF effect to perform vertical localisation. It's a typical evolutionary incremental solution that produces a local optimum but not a global optimum.

    Note that we have the same problem when we design imaging MIMO radars but just adding another receiver offset in the vertical direction is the simple solution. Admittedly we have also designed complicated shaped antennas that operate in a vaguely similar way to pinna too.


Please leave a message if you find any of this of interest.