
Friday, 19 October 2012

Big Bad Flashy Fish (BBFF): the final answer to echolocation

This is -probably- the last of my series of posts on the comparison of vision with echolocation. Previously, I discussed disadvantages of echolocation (here and here). First, an animal using echolocation must send out very loud noises, and in doing so makes its presence known over a much larger distance than its own detection range. Second, echolocating animals have to provide their own signal, limiting their range, while sight takes advantage of sunlight or moonlight. Third, sight -and hearing- are passive senses, not betraying the presence of animals using them. Instead, echolocation boils down to shouting "WHERE ARE YOU!?", which probably means that only the Big and the Bad can afford its use.

I ended by assuming that the darkness of the deep sea would make it a perfect habitat for echolocators. Of course whales do exactly that, and they fit the job description of being Big and Bad. But whales have not been around all that long, and the seas have been full of fish and squid for much longer, so you would think those groups would have had the time to evolve echolocation. So where are the marine echolocators? Nothing. Silence.

So I asked a biologist, Steve Haddock, who was kind enough to enlist a colleague, Sonke Johnsen. Here is their conversation, Steve Haddock first: "I don't know of any examples. Lots of fish make sound (the midshipman), but it takes a lot of energy and seems to be largely for mating. Maybe the distance between their 'ears' is too small to be effective? Even humans underwater can't tell what direction sound is coming from. That doesn't explain bats, but different speeds of sound in air vs. water? Not sure, but it is an interesting question!"

I had not thought of that, but sound certainly travels faster through seawater than through air. At a depth of 2 km, sound travels at a speed of over 1500 m/s. Compared to about 333 m/s in air at sea level, the speed of sound in the deep sea is about 4.5 times faster. That matters, because you can tell the direction of a sound by measuring the differences in arrival time between two ears. Immersing those ears in water immediately makes the difference in arrival time 4.5 times smaller and therefore more difficult to detect. Could it still work? To find out, I first assumed a distance between the two ears of 20 cm. With that, a sound coming in from the side will arrive 0.6 ms later at the farthest ear in air, and 0.13 ms later in the deep sea. That does not seem like a lot, but Wikipedia informs us that humans can detect differences in arrival times of sound of 0.01 ms. So, given some good neural software, it should be possible to use this trick in the deep sea.
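For those who like to check the arithmetic, here is a minimal calculation of those arrival-time differences. It assumes the sound arrives exactly from the side, so the path difference equals the full ear separation:

```python
# Back-of-the-envelope check of the interaural time differences above.
# Assumes sound arriving exactly from the side, so the path difference
# equals the full separation between the ears.
EAR_SEPARATION = 0.20      # m
C_AIR = 333.0              # m/s, speed of sound in air at sea level
C_DEEP_SEA = 1500.0        # m/s, speed of sound at ~2 km depth

itd_air = EAR_SEPARATION / C_AIR          # ~0.6 ms
itd_sea = EAR_SEPARATION / C_DEEP_SEA     # ~0.13 ms

print(f"air: {itd_air * 1000:.2f} ms, deep sea: {itd_sea * 1000:.2f} ms")
# Both are well above the ~0.01 ms difference humans can reportedly detect.
```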

Anyway, Sonke Johnsen added the following to Steve's reply: "There seems to be no good reason why fish don't echolocate. There are certainly fish and sharks whose heads are wider than echolocating dolphins. It's also not a marine mammal thing, since seals don't echolocate. Many fish and mammals eat the same things, so it's not that either. Cetaceans have great hearing, but that's sort of a chicken-egg thing and there's nothing preventing fish from having better hearing. You can't even say it's a warm-blooded-only club, because certain large fish (e.g. swordfish, tuna) actually heat up their brains and eyes so that they work faster. It's probably just one of those things. One possibility is that early cetaceans may have started in muddy rivers. Muddy river animals sometimes evolve interesting sensory systems (e.g. electroreception) because it's impossible to see. Even today, some cetaceans inhabit murky rivers and lagoons. Of course, many fish do too...."

I thanked both through email, but would like to repeat my gratitude to them here.

Back to the light

So far, we have to conclude that we do not know why the deep seas are not filled with Big Bad Echolocating Fish (BBEF) or Squid (BBES).

Click to enlarge; from this source

The sea is full of bioluminescent animals, and the image above shows the ways light can be used for offensive purposes, which is what we will focus on. An amazing array of life forms, from bacteria to many diverse major animal groups, is bioluminescent. They use light for a wide variety of purposes that can be broadly divided into defensive and offensive ones. Steve Haddock has written a very comprehensive review, which can be obtained free of charge and is very readable for non-biologists. There is also an excellent website. I will focus on just one of the many uses of bioluminescence: illuminating prey using photophores.

Click to enlarge; source here

First, what are photophores? Well, the word simply means 'light bearers', so they are organs producing light. Without ever having studied them, I thought they would be just sacs with bioluminescent chemicals in them. But as the image above proves, they turn out to be much more complicated than that. Perhaps you recall the reasoning that the physics of light quickly led to the evolution of a camera-type eye, with a retina, lens and diaphragm? Well, there are lenses and shutters in photophores as well. There must have been a process very similar to the evolution of the eye, but here the question must have been how to produce the best biological flashlight possible. The image above shows a photophore from a squid. At the centre there is a light-producing mass, surrounded by a mirror, reflecting light until it exits the photophore through a lens. I have unfortunately not found a review paper comparing the optical design of photophores, but this should be enough to prove how complex they can be.

Click to enlarge. These are 'loosejaw' fish. The one on the top right sends out red light, and is called Malacosteus niger. Note that these animals have various photophores on their heads. The one it is all about is the suborbital one ('so').
From: Kenaley CP. J Morphol 2010; 271: 418-437

Now, finally, we are ready for the final twist in the comparison of vision and echolocation. There are fish, shown above, using well-developed photophores as searchlights to find their prey. This use of light is very similar to echolocation: the animals have to provide their own signal, which both limits their range and makes them rather conspicuous.


Click to enlarge. Photophores from the dragonfish Malacosteus. In the two images, 'c' is the light-emitting core, 'r' is the reflector surrounding it, and 'f' is a filter to give the emitted light a red colour. The light bounces around until it exits the photophore through the aperture 'ap'. From: Herring and Cope, Marine Biology 2005; 148: 383-394

The fish best known for this behaviour are so-called dragonfish, and their use of photophores involves the kind of wonderful bizarre features that only real evolution produces. These fish send out red light, which is unusual because red light doesn't carry very far in water. Most bioluminescent signals therefore use blue light, and accordingly most animals in the deep sea cannot see red light. They also cannot see the red light emitted by the dragonfish, which is rather cunning and makes the searchlight invisible. The snag is that some dragonfish species do not have a pigment in their retinas to see red light either...

Instead, they use a trick: there is an antenna pigment in their eyes that is sensitive to red light, and it transfers the energy to the visual pigments sensitive to blue and green light that the fish does have. That antenna pigment is related to chlorophyll, not a molecule you would expect in an animal at all. That is because the fish obtain it from their food and somehow transfer it to their retina. All this can be found on the website I mentioned. I certainly would not dare to use such outrageous traits in my fictional animals!

The oceans may not be filled with predatory BBEF, but there aren't many Big Bad Flashy Fish (BBFF) either. Neither option seems to have gained evolutionary prominence. Perhaps their characteristic conspicuousness makes these options too risky. I must say I like the option of equipping animals with flashlights.

Click to enlarge; copyright Gert van Dijk

So here is a quick and rough sketch of a possible Furahan animal with searchlights, which has just spotted a tetropter. Would the edge of better prey detection outweigh the increase in its own predation risk? I do not know. There are other worrying thoughts: why is bioluminescence on Earth so rare outside the oceans? It is hardly found on land, and does not seem to occur in fresh water either. Are there reasons for that? Is the poor animal shown above doomed already?

Saturday, 11 August 2012

Why sight is superior to echolocation

In the previous post some characteristics of echolocation were discussed, and the results were somewhat worrying as far as a comparison with vision is concerned: echolocation involves 'shouting to hear a whisper', meaning that its range is limited and the sender is loudly proclaiming its presence.


Optics of pinhole and lens eyes. Source here

In my opinion there are two other major difficulties with echolocation that favour vision. The first is the ability to locate objects: with eyes such as ours it is very easy to locate objects. Rays of light can be bent by lenses and can reflect from surfaces, but in between they follow nice straight lines. That is the reason why even a simple pinhole camera such as in the image above will produce a good image: any particular point on the retina can only be lit by rays coming from a direction specific to that point. Such a pinhole will not let much light in, and solving that by widening the pupil will blur the image. If you put in a lens you can have a large pupil for lots of light with a sharp image. Problem solved. The point of all this is that seeing an object is almost the same as knowing where it is.
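To make that geometric point concrete, here is a minimal sketch of pinhole optics. The pinhole-to-retina distance is a made-up value; the point is only that each retina position corresponds to exactly one incoming direction, and the mapping can be inverted:

```python
import numpy as np

# Minimal pinhole-camera geometry: the pinhole sits at the origin and the
# retina is a plane at distance F behind it. Each retina point can only be
# lit from one direction, which is what makes vision a localising sense.
F = 0.01  # m, pinhole-to-retina distance (made-up value)

def retina_point(direction):
    """Where a ray from 3D direction (x, y, z), z > 0, lands on the retina."""
    x, y, z = direction
    return (-F * x / z, -F * y / z)   # inverted image, as in a real pinhole

def direction_from_retina(u, v):
    """Invert the mapping: a lit retina point implies a unique direction."""
    d = np.array([-u, -v, F])
    return d / np.linalg.norm(d)
```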

From: Animal Eyes (2nd Ed.), Oxford; copyright Land & Nilsson

Above you see an image of lineages of eye design, leading to pinhole eyes and eyes with lenses. Those who wish to read more about eye evolution should read the new edition of Land and Nilsson's 'Animal Eyes'. It also describes the very high number of eye designs (there are even eyes based on mirrors!). Another very nice book is Evolution's Witness, with hardly any physics but boasting numerous examples of wonderful eye designs.


From: Animal Eyes (2nd Ed.), Oxford; copyright Land & Nilsson

In 1994 Nilsson and Pelger calculated that a good camera eye with a retina and a lens could evolve from basic elements without any localizing ability in fewer than half a million generations. With one generation a year this amounts to a geological blink of an eye (sorry for that one), meaning just half a million years. Vision can apparently evolve so quickly and confers such a large advantage that some say it explains the runaway evolution known as the Cambrian explosion. In a very short time things such as armour, speed and vision evolved. Giving animals an unobtrusive ability for precise long-range sensing may just have been the impetus to start this accelerated runaway evolution: claws, shells, teeth and brains co-evolved quickly. Perhaps vision was not the only factor jump-starting the process, but the idea is too powerful to ignore a role for vision altogether, I think. It is tempting to think that most planets with complex life would have their own 'Cambrian Explosion' in the early evolution of complex animals. Of course, the label 'Cambrian' would not apply on Furaha, Snaiad, Nereus, nor on real exoplanets. We need a more general name for the phenomenon; how about the 'Sight Spark'? (and if it sticks can I copyright it?).
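For the curious, the arithmetic behind that famous estimate can be reconstructed in a few lines. The step count and per-generation rate below are taken from Nilsson and Pelger's 1994 paper, so treat this as a toy recalculation rather than their actual model:

```python
import math

# Toy reconstruction of Nilsson & Pelger's (1994) pessimistic estimate.
# They broke the path from flat light-sensitive patch to camera eye into
# sequential 1% morphological changes, then asked how many generations
# that takes at a deliberately conservative 0.005% change per generation.
steps = 1829                        # sequential 1% changes, from the paper
total_change = 1.01 ** steps        # ~8e7-fold overall change
per_generation = 1.00005            # 0.005% change per generation

generations = math.log(total_change) / math.log(per_generation)
print(f"{generations:,.0f} generations")   # ~364,000: under half a million
```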

Back to sound

Seeing how a pinhole eye is simple and works well, how about evolving a 'pinhole ear'? Suppose we place lots of microphones on the inside of a sphere and cut a hole in front of the sphere to let sound in. Wouldn't each microphone only pick up sound from the bit of the world it 'sees' through the opening? If so, we would have an ear with perfect localising ability. Alas, no. Sound does not travel in neat straight lines but travels around corners. You can hear people talking through an open door even when you cannot see them.


Sound 'bending' around a building. Source here.

How the size of an opening affects diffraction of sound. Source here.

When sound waves hit objects, those interfaces form new sound sources, a process called diffraction. In the misbegotten 'pinhole ear' idea, the 'pupil' would simply act as a new sound source, so all microphones on the 'retina' would receive sounds from all directions. As a location device this would be utterly useless.

The physical reason why sound bends around corners while light does not is not that diffraction is limited to sound. In fact, diffraction affects light, and hence vision, too. The effects of diffraction depend on wavelength, and the wavelengths of sound range from a few cm to 15 or more meters, while those of light are measured in micrometers. The diffraction of light is therefore seen at microscopic scales, but that of sound occurs at the scale we live in, that of doors and people.
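A quick calculation makes the scale difference vivid, using round numbers for the speeds of sound and light and the limits of human hearing:

```python
# Wavelength = speed / frequency. Rough numbers showing why diffraction
# matters at everyday scales for sound but only microscopic ones for light.
C_SOUND = 333.0       # m/s, sound in air
C_LIGHT = 3.0e8       # m/s, light

for f in (20.0, 20_000.0):                    # human hearing range, Hz
    print(f"sound at {f:>7.0f} Hz: {C_SOUND / f:8.4f} m")
# sound at      20 Hz:  16.6500 m   (bends around buildings)
# sound at   20000 Hz:   0.0167 m   (bends around pebbles)

f_green = 6.0e14                              # green light, Hz
print(f"green light: {C_LIGHT / f_green * 1e6:.1f} micrometers")  # ~0.5
```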

So the physics of sound conspire against it providing an easy way to tell where a sound is coming from. Evolution solved that problem as it did others, but the solution requires combining the signals from two ears (I know that using two eyes improves distance detection, but you can do that with one eye, and locating the direction of an object needs just one eye). Tiny differences in arrival time of a sound at two ears allow a suitable brain to calculate the direction of the source of a sound in the plane of the ears, but not whether it is to the front or the back, nor up or down. Finding out things like that calls for ingenious trickery such as tilting heads or complexly shaped outer ears that subtly change the characteristics of a sound depending on where it comes from. Wikipedia has a nice article on the subject. Some animals (owls!) perfected the art of sound location, but theirs is a small niche compared to the ubiquity of good camera eyes.

Mind you, I have no idea why there are no animals with more than two ears. Having four, placed at the corners of a tetrahedron, would be nice. Then again, perhaps there are arthropods with more than two functional ears. I have never heard of any but have not looked either. Are there any?

An unfair advantage of sight over echolocation

I wrote above that I thought there were two more difficulties with echolocation. The second one is based on the fact that echolocating animals have to produce their own signal, limiting the range at which they can detect anything. How about vision? There was an omission in the discussion, and it is a glaring one: the sun! (sorry about that one too). Sunlight, free for all and there regardless of whether anything or anyone is using it, is what allows vision to work as a long-range sense. Compared to echolocation this free gift to sight is not really fair.

Copyright Gert van Dijk

The images above were made for the previous post: the right one showed what echolocation might be like, with some energy coming from the 'camera', only illuminating objects close by. Compare that to the left image, a visual scene lit by the much more powerful sun.

But vision is not always available, and echolocation has a chance when there is no light. On rotating planets like ours, sunlight is only available for half the time, so the night would seem a good time for animals to start echolocating. But that is not the case; most animals prefer to more or less shut down at night. In previous discussions on when echolocation would be better than vision some ideas came up: caves, planets with permanent fog, planets without suns and seas underneath ice caps. One region seemed to have been forgotten though: the deep dark seas, where the sun does not reach. Shouldn't they be filled with echolocating animals, squeaking and pinging away? For Earth whales, the ocean floor may be too deep to reach, but fish were there a long time before the first whale ancestor took its first dive. Why are there no echolocating fish? I asked experts, but they did not know either. Fish have suitable ears and brains, and nothing seems to stand in the way of them evolving echolocation. But they have not. Or is echolocation simply too much like a burglar who enters a silent dark house and then starts shouting 'Hellooo!'? I do not know.

However, I do know of one final twist in the comparison of echolocation and vision; but I will keep that for the last post on this subject...

Friday, 27 July 2012

Echolocation: a sound choice?

In January I wrote a post on whether detecting heat could supplant vision, and concluded that it was, in fact, just a form of sight. I wished to tackle echolocation next, but wondered where to start: with echolocating animals in fictional biology? Other possible questions would be which atmosphere would be best, which frequencies to use, how it can be compared with vision, etc. In the end I decided to start -there's more!- with a post on the nature of echolocation; so here we go...

The basic principle is simple: you send out a sound and if an echo returns, there is something out there. As everyone knows, dolphins and bats are expert echolocators, but it is less well known that some blind people are quite good at it, and that they in fact use their occipital cortex to process echoes, a brain region normally busy with analysing visual signals. That direct link between vision and echolocation is perhaps not that surprising, as both senses help build a spatial representation of the world outside: what is where?

A major difference between vision and echolocation is how distances are judged. In vision, judging distances depends on complex image analysis, but in echolocation the time between emitting a sound and the arrival of the echo directly tells you how far away an object is. The big problem here is that echoes are much fainter than the emitted sound. The reason for that is the 'inverse square law', something that holds for light as well as for sound.

Click to enlarge; copyright Gert van Dijk

The image above explains the principle. Sound waves emanate from a source near the man in the middle and spread as widening spheres (A, B and C). As the spheres get bigger, the intensity of the sound diminishes per 'unit area'. A 'unit area' can be a square meter, but can also be the size of your ear. When you are close to the source your ear corresponds to some specific part of the sphere, and when you move away your ear will correspond to a smaller part of the sphere: the sound will be less loud. Now, the area of the sphere increases with the square of the distance. If you double the distance from the source, the area of the sphere increases fourfold, and the part your ear catches will decrease fourfold. To continue: increase the distance threefold and the volume decreases ninefold. Move away ten times the original distance from the source, and the sound volume becomes 100 times smaller!
In the image above, only a tiny fraction of the original sound will hit the 'object', a man, at the left. Not all of that will bounce back, and the part that is reflected forms a new sound: the echo. The echo in turn decreases immensely before arriving at the sender, and that is the essence of echolocation: to hear a whisper you have to shout.
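Here is a small sketch of that arithmetic. Note the round trip: the outgoing sound obeys the inverse square law, and so does the returning echo, so the echo fades with the fourth power of distance. The reflectivity value is made up purely for illustration:

```python
# The inverse square law, and why echoes fade so brutally: the sound
# spreads over a sphere on the way out AND the echo spreads over a sphere
# on the way back, so echo intensity falls with the FOURTH power of
# distance. 'reflectivity' lumps together target size and reflectance.
def relative_intensity(distance, reference=1.0):
    """One-way intensity relative to the level at `reference` meters."""
    return (reference / distance) ** 2

def relative_echo(distance, reflectivity=0.001, reference=1.0):
    """Round-trip echo intensity back at the sender, same reference."""
    return reflectivity * (reference / distance) ** 4

for d in (1, 2, 3, 10):
    print(f"{d:>2} m: one-way {relative_intensity(d):.4f}, "
          f"echo {relative_echo(d):.2e}")
# Doubling the distance quarters the direct sound but cuts the echo 16-fold.
```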


Click to enlarge; copyright Gert van Dijk

As if the 'inverse square law' is not bad enough, there is another nasty characteristic of echolocation. At the left (A) you see a random predator using echolocation. Oh, all right, it's not random, but Dougal Dixon's 'nightstalker' (brilliant at the time!). It sends out sound waves (black circles) of which a tiny part will hit a suitable prey; there's that man again. As said, the echoes travel back while decreasing in strength (red circles).
There will be some distance at which a prey of this size can just be detected. Any further away and the returning echoes will be too faint to detect. Suppose that this is the case here, meaning 10 m is the limit at which a nightstalker can detect a man (as mankind is extinct in the nightstalker's universe, no-one will be hurt).
Here's the catch: most of the sound emitted by the nightstalker travels on beyond the prey. These sound waves can be picked up easily by other animals further away than 10 meters (I assume you recognise the creature listening there; it's pretty frightening). For animals out there the sound only has to travel in one direction and none of it gets lost in bouncing back from the prey. The unfortunate consequence of all this 'shouting to hear a whisper' is that the nightstalker is announcing its presence loudly to animals that it cannot detect itself!
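A toy model shows how lopsided this is. The echo the sender needs falls off with the fourth power of distance, while the outgoing call an eavesdropper hears falls off only with the square; the reflectivity factor is again made up for illustration:

```python
# Toy model of the eavesdropping problem. If the sender and an eavesdropper
# share the same detection threshold, then at the sender's maximum echo
# range r:  I0 * reflectivity / r**4 == threshold, while the eavesdropper
# still hears the call wherever  I0 / R**2 >= threshold. Solving both gives
# R = r**2 / sqrt(reflectivity).
reflectivity = 0.001          # made-up lump factor for prey size/reflectance
echo_range = 10.0             # m, the nightstalker's detection limit

overhear_range = echo_range ** 2 / reflectivity ** 0.5
print(f"{overhear_range:.0f} m")   # ~3200 m for these made-up numbers
```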
This suggests that echolocation could be a dangerous luxury. One way to use it safely would be if other predators cannot get to you anyway; is that why bats, up there in the air, can afford echolocation? Another solution would be to be big and bad, so you can afford to be noisy. If so, echolocation is not a suitable tool to find a yummy carrot if you are an inoffensive rabbit-analogue. The carrot does not care, but the wolf-analogue will.

---------------------

Getting back on topic, we now know that echolocation tells you how far away an object is. To make sense of the world you will also need to know where the object is spatially: left and right and up and down. With hearing this is more difficult than with vision, but it can be done. The spatial resolution of bats is one or two degrees (see here for that), which is impressive but still 60 to 120 times worse than human vision. For now, let's take it for granted that an echolocating animal can locate echo sources. Next, let's try to visualise what it may be like.


Click to enlarge; copyright Gert van Dijk

Here is a scene with a variety of objects on a featureless plain. The objects have transparency, colours, shadows, etc. At one glance we see them all, as well as the horizon, the clouds, etc., without restrictions regarding distance, all in high resolution. The glory of vision, for all to see.


Click to enlarge; copyright Gert van Dijk

Colour is purely visual, so to mimic echolocation it has to go. All the objects are now just white. They are also all featureless, but that is for simplicity's sake only: vision and echolocation can both carry information about things like wrinkles and bumps, so I left texture out.


Click to enlarge; copyright Gert van Dijk

In sight the main source of light is the sun shining from above, but in echolocation you have to provide your own energy. To mimic that, the only light source left is at the camera. The resulting image looks like that of a flash photograph, for good reasons: the light follows the inverse square law, as does sound. Nearby objects reflect a lot of light (sound!) for two reasons: they are close by, and part of their surfaces face the camera squarely, turning light directly back at the camera. This is an 'intensity image'.


Click to enlarge; copyright Gert van Dijk

You can see nearby and far objects at the same time, but that is not true for sound. Sound travels in air at about 333 m/s, so it takes about 3 ms to travel one meter. An object one meter away will produce an echo in 6 ms: 3 ms going to the object and 3 ms travelling back. The image above shows the same scene, but now the grey levels indicate the distance from the camera. Light areas in the image are close by, dark areas are further away. This is a 'depth image', formed courtesy of the ray tracing algorithms in Vue Infinite.
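The conversion from distance to echo delay is simple enough to write down, with the same 333 m/s used in the text:

```python
# Echo delay for an object at a given distance: the sound has to go there
# and come back.
C_AIR = 333.0                                   # m/s

def echo_delay_ms(distance_m):
    return 2.0 * distance_m / C_AIR * 1000.0    # milliseconds

print(f"{echo_delay_ms(1.0):.0f} ms")    # ~6 ms, as in the text
print(f"{echo_delay_ms(10.0):.0f} ms")   # ~60 ms
```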


Copyright Gert van Dijk

Now the scene is set to mimic echolocation. Let's send out an imaginary 'ping'; each interval in time corresponds to how far away an echo-producing object is. For instance, the interval from 6 to 12 ms after the 'ping' corresponds to objects 1 to 2 meters away. While the depth image tells us how far away objects are, the intensity image tells us how much of an echo is produced there. To make things easier for the human eye a visual cue was added: echoes returning early are shown in red, while those returning later are blue. Above is a video showing three successive 'pings'. As the echoes bounce back, areas close by will light up in red, and objects further away will produce an echo in blue, later on. I blurred the images a bit to mimic the relatively poor spatial resolution of echolocation.
I personally found it difficult to reconstruct a three-dimensional image of the world using such images, but my visual system is not used to getting its cues in such fashion.
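For what it is worth, here is a sketch of how such 'ping' frames could be computed from a depth image and an intensity image. The two random arrays merely stand in for the rendered images from Vue Infinite:

```python
import numpy as np

# Sketch of the 'ping' animation: slice a depth image into successive time
# windows and light up, in each frame, only the pixels whose echoes arrive
# during that window, weighted by the intensity image. `depth` and
# `intensity` are hypothetical stand-ins for the rendered images.
C_AIR = 333.0
rng = np.random.default_rng(0)
depth = rng.uniform(0.5, 8.0, size=(120, 160))      # meters, per pixel
intensity = rng.uniform(0.0, 1.0, size=(120, 160))  # echo strength, per pixel

delay = 2.0 * depth / C_AIR * 1000.0                # echo delay in ms
frame_ms = 6.0                                      # one frame = 1 m of range

frames = []
for k in range(int(delay.max() // frame_ms) + 1):
    window = (delay >= k * frame_ms) & (delay < (k + 1) * frame_ms)
    frames.append(np.where(window, intensity, 0.0))
# Early frames correspond to the red (nearby) echoes, late frames to blue.
```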


Copyright Gert van Dijk

One easy processing trick to improve the image is to remember the location of early echoes. The video above does that, by adding new echoes without erasing the old ones. The image is wiped as a new ping starts. More advanced neuronal analyses could take care of additional cues such as the Doppler effect, to gauge your own movement or an object's. By the way, the above is in slow motion. In real life, echoes from an object 10 m away would take only 60 ms to get back. Even without any overlap you could afford 16 pings a second for that range. That is not bad: after all, 20-25 frames a second is enough to trick our visual system into thinking that there is continuous movement.
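The ping-rate arithmetic, for completeness:

```python
# Maximum non-overlapping ping rate for a given working range: you must
# wait for the most distant echo to return before pinging again.
C_AIR = 333.0

def max_ping_rate(range_m):
    round_trip_s = 2.0 * range_m / C_AIR
    return 1.0 / round_trip_s

print(f"{max_ping_rate(10.0):.1f} pings per second")   # just over 16
```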

So there we are. Is this simple metaphor a valid indication of what echolocation is like? Probably not, but it does point out a few basic characteristics of echolocation. Echolocation must be a claustrophobic sense: no clouds, no horizon, just your immediate surroundings. It would seem the meek cannot afford it, as it may be the most abrasive and abusive of senses.
Is it therefore completely inferior to vision? Well, yes and no...