I’ve had a bad day so far, but hey, at least I got 15 words into a story yesterday. It will be a sour endnote on my birthday eve.
I’d say I hope, but I’m not gonna.
Anyways on with the story! Anybody know the nesting behaviors of gannets/oilbirds/how they express emotions with sound/wings?
Detecting Objects with Invisible Waves: Using Radar, Sonar, and Echolocation to “See”
The ability to see visible waves of light can be beneficial for determining the size, shape, distance, and speed of things in our surrounding environments. But in many situations, reliance on sight might not be the best option for the remote detection of objects. For example, most animals do not have eyes on the backs of their heads; many cannot see very well at night; and some live in the depths of the ocean where visible light doesn’t reach. Yet these conditions don’t hinder the ability to sense objects for many animals. So, how do humans and other animals “see” distant objects without depending on the use of sight?
One answer is that other types of waves outside of visible light exist and animals have developed methods for detecting them. Two of these methods, sonar and radar, are man-made detection systems that allow us to “see” what our eyes can’t. The other, echolocation, is a natural way for some animals to detect motion through sound waves.
Radar
Radar is a system used to detect, locate, track, and recognize objects from a considerable distance. "Radar" is an acronym for "radio detection and ranging." It was initially developed in the 1930s and 1940s for military use, but is now common for civilian purposes as well, including weather observation, air traffic control, and the surveillance of other planets.
Air traffic control radar.
Radar works by sending out radio waves, a type of electromagnetic wave, in pulses through a radio transmitter. The waves are reflected off of objects in their path back toward a receiver that can detect those reflections. Radar devices usually use the same antenna for transmitting and receiving, which means the device switches between being active and passive. The received radio wave information can help observers determine the distance and location of the object, how fast it is moving in relation to the receiver, the direction of travel, and sometimes the shape and size of the objects, too.
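The timing principle behind radar ranging is simple enough to sketch in a few lines of code (a minimal illustration only; the sample delay value is made up, not taken from any real system):

```python
# Estimating range from a radar pulse's round-trip time. The pulse travels
# out to the target and back, so the one-way distance is half the total path.

C = 299_792_458  # speed of light in m/s

def radar_range_m(round_trip_seconds: float) -> float:
    """Distance to the target from the echo's round-trip delay."""
    return C * round_trip_seconds / 2

# A reflection arriving 0.001 s after transmission puts the target ~150 km away.
print(radar_range_m(0.001))  # ~149896.2 m
```

The same divide-by-two echo math applies to sonar and echolocation; only the wave speed changes.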
Radio waves have the longest wavelengths and lowest frequencies of all electromagnetic waves. Because their long wavelengths are scattered far less by small particles, they travel well through adverse weather conditions like fog, rain, and snow. Detection systems like lidar, which operate with shorter-wavelength, higher-frequency infrared and visible waves, do not function well in such conditions.
While radar can effectively move through or around various environmental conditions, it is much less effective underwater: the electromagnetic waves of radar are absorbed in large bodies of water within feet of transmission. Instead, we use sonar in underwater applications.
Sonar
Sonar, an acronym for "sound navigation and ranging," is similar to radar in that it transmits and receives pulses of waves to determine distance and speed. However, it functions through the use of sound waves and is highly effective underwater.
Sound waves are mechanical waves, which means they are oscillations, or back and forth movements at regular speeds, of matter. When a mechanical wave strikes an obstacle or comes to the end of the medium it travels in, some portion of the wave is reflected back into the original medium. Water turns out to be a fantastic medium for carrying mechanical waves long distances – sound actually travels more than four times faster in water than in air – making sonar the top choice for underwater object detection.
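The echo-timing math works exactly as it does for radar, just with the speed of sound. A quick sketch (the speeds are textbook approximations, not figures from this post):

```python
# Same out-and-back echo calculation as radar, but with sound. Note that
# sound travels much faster in water (~1500 m/s) than in air (~343 m/s).

SPEED_IN_AIR_M_S = 343.0     # approximate, at room temperature
SPEED_IN_WATER_M_S = 1500.0  # approximate, typical seawater

def echo_distance_m(round_trip_seconds: float, speed_m_s: float) -> float:
    """One-way distance to the reflecting object."""
    return speed_m_s * round_trip_seconds / 2

# A 2-second echo delay underwater means the seabed is ~1500 m down...
print(echo_distance_m(2.0, SPEED_IN_WATER_M_S))  # 1500.0
# ...while the same delay in air would indicate an object only ~343 m away.
print(echo_distance_m(2.0, SPEED_IN_AIR_M_S))    # 343.0
```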
Echolocation
Echolocation is a natural sound wave transmission and detection method used by animals to accomplish the same goal of object detection. Though sometimes referred to as sonar in casual conversation, echolocation requires no human-made device to function and is used both above and below water. Animals use echolocation by sending out sound waves in the air or water before them. They can then determine information about objects in their path through the echoes produced when those sounds are reflected.
Echolocation can be utilized by any animal with sound-producing and sensing capabilities. Humans have been known to develop methods of systematically tapping canes or clicking their tongues to produce the sounds needed for echolocation. However, echolocation is more generally associated with the use of ultrasound by non-human animals. Ultrasound is sound with a frequency higher than the human ear can detect, though it behaves the same as audible sound.
Bats are among the most well-known users of echolocation. They use relatively high, mostly ultrasonic frequencies, and some can create echolocating sounds up to 140 decibels – louder than a military jet taking off only 100 feet away. To handle such intense sound wave vibrations, bats turn off their middle ears just before calling to avoid being deafened by their own calls: they use muscles in the middle ear to pull apart the bones that carry sound waves to the inner ear, leaving no path for the sound waves to damage the cochlea. Similar to radar devices switching between active transmitters and passive receivers, bats restore their full hearing a split second later to listen for echoes.
Most of the more than 1300 species of bats use echolocation to hunt and navigate in poor lighting conditions. Fossil evidence indicates that this capability developed in bats at least 52 million years ago. They can detect an insect up to 15 feet away and determine its size, shape, hardness, and direction of travel through their skillful use of echolocation.
Wave Echoes
Humans and other animals have long been able to detect objects at a distance using nonvisible waves, whether through technologies like radar and sonar or through natural echolocation. Though each of these methods operates a little differently and relies on waves of various shapes, sizes, and types, they all work by emitting waves and then determining an object's characteristics from the echoes of those waves.
Try it at Home
Go to a corner of a quiet room and close your eyes. Without moving your body too much, try turning your head while making clicking noises with your mouth. Can you tell when you are turned more toward a wall or if there are any objects near you through the way the clicking sound changes? Try holding your hand up in front of your face and moving it back and forth while you click. Can you tell how far away it is or which direction it is moving by the sound? Get creative and try it with different types of objects and different locations!
Jane Thaler is a Gallery Experience Presenter in CMNH’s Life Long Learning Department. Museum staff, volunteers, and interns are encouraged to blog about their unique experiences and knowledge gained from working at the museum.
grison-in-space · 6 years ago
that big “what the fuck is up with Matt Murdock’s senses” post I keep threatening to make
For context, my day job involves studying animal communication, where I am a PhD student in evolution/animal behavior. I don't work directly on organisms that use non-standard sensory modalities, but I'm very familiar with the adaptations that electroceptive and echolocatory systems (the latter mostly in bats) generally require.
I also spend an awful lot of time watching my cat Dent, who has been blind probably from birth and definitely since he was about ten weeks old. (We're not entirely sure whether he can see light or movement, and he definitely can't see anything else.) Dent therefore has access to certain sensory modalities that are more sensitive than a human's (cats can hear much higher into the ultrasonic than humans and have a wider range of olfactory sensitivities) without actually having vision to rely on, which makes him a good prompt for thinking about what specific sensory modalities actually bring to the table in terms of function.
What I am not is a blind person, nor have I lived or worked closely with someone who is. This is therefore going to be a discussion that focuses pretty heavily on "okay, let's assume Matty really does have ears like a bat--how does that constrain what he can and cannot do?" and less on the actual functional issues for someone who is, you know, a blind human--although if folks have comments on that, I would absolutely fucking love to hear them.
TL, DR: radar isn't fucking magic, and neither is echolocation. And physics still matters when we get down into sensation, more than you might think.
One of the things you have to understand when you're trying to study sensation and perception is that different sensory modalities--sight, touch, hearing, proprioception/balance, echolocation, etc--are good for different things. We tend to intuitively understand this in humans, but when reading experiences of characters with very different sensory toolboxes I often find that people simply... assume that the "extra-sensitive" senses can more or less perfectly compensate for the loss of vision.
The thing is, different sensory modalities are good for different things. That's why different groups of animals develop specialties in different modalities in the way they do. Some of what evolution can do is constrained by phylogenetic history--mammals are always going to have a leg up on birds when it comes to hearing in high frequencies, for example, because of a quirk of the development of the mammalian jaw--but a lot comes down to the interaction between the world around a particular animal and the needs and ecological niche that the animal takes up. Species generally specialize and hone the sensory systems that they have available which are useful to the needs of the animal in question.
What I mean by this is that you have to understand that different sensory systems are really good at different things, and sometimes you need different levels of resolution for different tasks. You can think of sensory systems as having two kinds of resolution: temporal and spatial. Vision has, generally speaking, pretty fine temporal resolution--you get a continuous "picture" of things around you and where they are at any given time. Your spatial resolution, as anyone who wears glasses (me included!) can tell you, varies based on your individual eyes and level of focus.
There's one final distinction that is important to bring up with respect to choosing sensory modalities, and that is active versus passive sensation. You can define this by asking yourself: do you have to do anything to work this sense and pick up stimuli from the environment around you? If yes, we're active; if not, we're passive. Humans don't really have any equivalent active sensory modalities with the possible exception of touch, but because Matt is almost always depicted as having access to at least one (echolocation, "radar"), I'm going to talk a little bit about those here, too.
Why does resolution matter?
Well, when we talk about modalities compensating for each other as an individual navigates the world, resolution is what lets us adapt senses to do each other's jobs. Fine resolution isn't always the most useful range for a given sense, either: olfaction has very coarse temporal resolution and moderate spatial resolution in most species, and that means that you can use it to tell where things have been even if the thing creating the signal is no longer there. Echolocation has perhaps the finest possible temporal resolution in that it is not a continuous signal--more on that in a minute--and very, very fine spatial resolution, but only for the instant of a given vocalization. Vision has very tight temporal resolution and very tight spatial resolution, depending on the level of focus a given person has.
What's the deal with active versus passive sensation?
For one thing, that means that Matt should not be able to use either echolocation or "radar" unless he's actually producing some kind of signal. I keep putting "radar" in quotes because it's not used by any known biological system; the closest analogue is probably electroception, but electroception is pretty much exclusively used and developed by aquatic or semi-aquatic animals and works quite differently from human-built radar systems. Either way, an active sense like echolocation doesn't produce continuous information the same way that passive sensory systems (like vision!) do, which means that Matt has to string together a series of disconnected "impressions" of where things are in space and time to make a "picture" of the world around him, at least with respect to that sense.
Basically, the way these sensory systems work is that you produce a signal and you "listen" to the response patterns. This means that if you aren't producing that signal, you don't get anything. This is interesting and important in the context of Daredevil because Matt very specifically does not produce any vocalizations or noises that could be used for echolocation in the human range, and it's even less likely that he's continuously emitting weak electric charges into his environment--the air just isn't a good enough conductor to give him any real distance.
So if he's doing this, he's doing it at either very high pitches, outside the usual human auditory range, or else at very low pitches--and high is much more likely. High-frequency vocalizations decay faster over space, which is why they don't carry well. Because of this, and because the pattern of reverberation and decay of the sound is what you're using to construct the idea of shape with echolocation, all known echolocating species use very high-frequency, very loud vocalizations to create pulses of sound that will decay in ways that are sensitive to the shape of whatever they're bouncing off of.
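That "high frequencies decay faster" point can be made concrete with a toy model (entirely my own illustration, not the poster's numbers: I'm assuming absorption scales roughly with frequency squared, as in the simplest classical model, and the coefficient is made up):

```python
# Toy model of frequency-dependent atmospheric absorption of sound.
# Spreading loss is ignored; only absorption is modeled, with an
# assumed f^2 scaling and an invented reference coefficient.
import math

def remaining_fraction(distance_m: float, freq_hz: float,
                       alpha_per_m_at_1khz: float = 0.0001) -> float:
    """Fraction of sound intensity surviving absorption over a distance."""
    alpha = alpha_per_m_at_1khz * (freq_hz / 1000.0) ** 2  # assumed f^2 scaling
    return math.exp(-alpha * distance_m)

# Over 50 m, a 1 kHz tone keeps nearly all of its intensity to absorption...
print(remaining_fraction(50, 1_000))
# ...while a bat-like 50 kHz call is almost entirely absorbed.
print(remaining_fraction(50, 50_000))
```

Whatever the exact coefficients, the shape of the curve is the point: ultrasonic calls simply don't carry, which is why echolocators shout.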
Personally, I like to imagine Matt squeaking at very high pitches like a real bat might, mostly because I think it's funny. This is particularly amusing because in many social species that rely very heavily on echolocation or electroception, individuals produce a signal that is unique to them within the local group--so it's the equivalent of Matt wandering around yelling MATT MATT MATT MATT whenever he wants to get a good sense of an object's position and shape without having to actually, you know, touch it. (This may or may not be a good way for Stick and Matt to get a sense of where the other is at a distance--if they're managing to make a super high-pitched vocalization, it probably doesn't carry too well. On the other hand, if they're fighting something as a team, as we see both of them doing, the odds are good that each is listening to the information the other is getting if one or both is using whatever this sensory system is.)
If I'm going to take a more realistic tack on the whole thing, I'd guess he's probably vocalizing through his nose, which is pretty common in both human vocalizations (you don't need your mouth to be open to say nnnnn or mmmm, because those sounds are produced via reverberations through the nasal cavity) and also in many ultrasonic vocalizations specifically (for example, the ultrasonic communication that rodents often engage in).
(Humans who say they can use echolocation in real life rely on clicks and taps, which is why I think it's particularly interesting that Matt is consistently shown using his long cane a few inches above the ground. I don't think he ever uses it to tap the ground in the show, and he certainly isn't making a loud click noise with it. Both clicks and taps can work for echolocation because these are wide-frequency noises, so you still have the decay patterns of the higher frequencies to work off of if you can filter through the lower-frequency stuff muddying the waters. It's not very sophisticated and will only give you a comparatively broad sense of where things are, but it's certainly better than nothing. But whatever Matt is using, he's specifically not using that to navigate his world.
A friend who uses a long cane suggested dryly that this might be an attempt to avoid the common peril of getting one's cane stuck in a pothole and winding up taking your cane to the balls or the kidneys, which... given the general lack of maintenance of Hell's Kitchen in other venues of the show, I suspect this is a peril Matt has been negotiating for some time.)
So what is Matt likely to use?
Honestly, I'm pretty sure that Matt's most important sense day-to-day isn't echolocation. It's his proprioception--his sense of where he is in the world, his spatial memory and his sense of balance. I heavily suspect that he has an incredibly good spatial sense and ability to process spatial information, and I notice that his combat style is heavily geared towards blocking his opponents into a space and hitting them until they go down. (Matt spends a lot of time using space to his advantage in combat--when he's not stalking an opponent and bringing them down by surprise, he's either constantly blocking them in and grappling close in or he's using a narrow confine like a hallway or an alley to constrain his opponent's ability to move quickly.) Because the ability to echolocate does require him to produce a sound, and because I'm not aware of any way to produce sufficiently high-pitched sounds that doesn't involve forcing air through the larynx at some level, I would guess that he's actually primarily relying on passive listening to pick up cues about what is going on in his environment in the middle of combat. I'd gamble he's most likely to quickly use his active sense (whatever it is) to make a rough "sketch" of what's going on around him in moments of relative quiet, when he's not moving too quickly to control his breathing.
I like constraint in my headcanons, because it lets me plumb the unexpected boundaries of abilities and perceptions, and it creates avenues for conflict and unexpected humor. If you don't--and the writers of Daredevil in all forms certainly don't seem to be particularly careful about this--then by all means ignore me, or pick out whatever you like and leave the rest. But hey, I had fun putting this fucker together.
This post is crossposted at pillowfort and dreamwidth.
and-then-there-were-n0ne · 3 years ago
« Bats have a problem: how to find their way around in the dark. They hunt at night, and cannot use light to help them find prey and avoid obstacles. You might say that if this is a problem it is a problem of their own making, a problem that they could avoid simply by changing their habits and hunting by day. But the daytime economy is already heavily exploited by other creatures such as birds. Given that there is a living to be made at night, and given that alternative daytime trades are thoroughly occupied, natural selection has favoured bats that make a go of the night-hunting trade. It is probable, by the way, that the nocturnal trades go way back in the ancestry of all us mammals. In the time when the dinosaurs dominated the daytime economy, our mammalian ancestors probably only managed to survive at all because they found ways of scraping a living at night. Only after the mysterious mass extinction of the dinosaurs about 65 million years ago were our ancestors able to emerge into the daylight in any substantial numbers. 
Returning to bats, they have an engineering problem: how to find their way and find their prey in the absence of light. Bats are not the only creatures to face this difficulty today. Obviously the night-flying insects that they prey on must find their way about somehow. Deepsea fish and whales have little or no light by day or by night, because the sun's rays cannot penetrate far below the surface. Fish and dolphins that live in extremely muddy water cannot see because, although there is light, it is obstructed and scattered by the dirt in the water. Plenty of other modern animals make their living in conditions where seeing is difficult or impossible. Given the question of how to manoeuvre in the dark, what solutions might an engineer consider? 
The first one that might occur to him is to manufacture light, to use a lantern or a searchlight. Fireflies and some fish (usually with the help of bacteria) have the power to manufacture their own light, but the process seems to consume a large amount of energy. Fireflies use their light for attracting mates. This doesn't require prohibitively much energy: a male's tiny pinprick can be seen by a female from some distance on a dark night, since her eyes are exposed directly to the light source itself. Using light to find one's own way around requires vastly more energy, since the eyes have to detect the tiny fraction of the light that bounces off each part of the scene. The light source must therefore be immensely brighter if it is to be used as a headlight to illuminate the path, than if it is to be used as a signal to others. Anyway, whether or not the reason is the energy expense, it seems to be the case that, with the possible exception of some weird deep-sea fish, no animal apart from man uses manufactured light to find its way about. What else might the engineer think of? 
Well, blind humans sometimes seem to have an uncanny sense of obstacles in their path. It has been given the name 'facial vision', because blind people have reported that it feels a bit like the sense of touch, on the face. One report tells of a totally blind boy who could ride his tricycle at a good speed round the block near his home, using 'facial vision'. Experiments showed that, in fact, 'facial vision' is nothing to do with touch or the front of the face, although the sensation may be referred to the front of the face, like the referred pain in a phantom (severed) limb. The sensation of 'facial vision', it turns out, really goes in through the ears. The blind people, without even being aware of the fact, are actually using echoes, of their own footsteps and other sounds, to sense the presence of obstacles. Before this was discovered, engineers had already built instruments to exploit the principle, for example to measure the depth of the sea under a ship. After this technique had been invented, it was only a matter of time before weapons designers adapted it for the detection of submarines. Both sides in the Second World War relied heavily on these devices, under such code names as Asdic (British) and Sonar (American), as well as the similar technology of Radar (American) or RDF (British), which uses radio echoes rather than sound echoes.
The Sonar and Radar pioneers didn't know it then, but all the world now knows that bats, or rather natural selection working on bats, had perfected the system tens of millions of years earlier, and their 'radar' achieves feats of detection and navigation that would strike an engineer dumb with admiration. It is technically incorrect to talk about bat 'radar', since they do not use radio waves. It is sonar. But the underlying mathematical theories of radar and sonar are very similar, and much of our scientific understanding of the details of what bats are doing has come from applying radar theory to them. The American zoologist Donald Griffin, who was largely responsible for the discovery of sonar in bats, coined the term 'echolocation' to cover both sonar and radar, whether used by animals or by human instruments. In practice, the word seems to be used mostly to refer to animal sonar. 
It is misleading to speak of bats as though they were all the same. It is as though we were to speak of dogs, lions, weasels, bears, hyenas, pandas and otters all in one breath, just because they are all carnivores. Different groups of bats use sonar in radically different ways, and they seem to have 'invented' it separately and independently, just as the British, Germans and Americans all independently developed radar. Not all bats use echolocation. The Old World tropical fruit bats have good vision, and most of them use only their eyes for finding their way around. One or two species of fruit bats, however, for instance Rousettus, are capable of finding their way around in total darkness where eyes, however good, must be powerless. They are using sonar, but it is a cruder kind of sonar than is used by the smaller bats with which we, in temperate regions, are familiar.
Rousettus clicks its tongue loudly and rhythmically as it flies, and navigates by measuring the time interval between each click and its echo. A good proportion of Rousettus's clicks are clearly audible to us (which by definition makes them sound rather than ultrasound: ultrasound is just the same as sound except that it is too high for humans to hear). In theory, the higher the pitch of a sound, the better it is for accurate sonar. This is because low-pitched sounds have long wavelengths which cannot resolve the difference between closely spaced objects. All other things being equal therefore, a missile that used echoes for its guidance system would ideally produce very high-pitched sounds. Most bats do, indeed, use extremely high-pitched sounds, far too high for humans to hear - ultrasound. Unlike Rousettus, which can see very well and which uses unmodified relatively low-pitched sounds to do a modest amount of echolocation to supplement its good vision, the smaller bats appear to be technically highly advanced echo-machines. They have tiny eyes which, in most cases, probably can't see much. They live in a world of echoes, and probably their brains can use echoes to do something akin to 'seeing' images, although it is next to impossible for us to 'visualize' what those images might be like. The noises that they produce are not just slightly too high for humans to hear, like a kind of super dog whistle. In many cases they are vastly higher than the highest note anybody has heard or can imagine. It is fortunate that we can't hear them, incidentally, for they are immensely powerful and would be deafeningly loud if we could hear them, and impossible to sleep through. These bats are like miniature spy planes, bristling with sophisticated instrumentation. [...]
Myotis, one of the common little brown bats, [emits pulses] at a rate of about 10 per second. This is about the rate of a standard teleprinter, or a Bren machine gun. Presumably the bat's image of the world in which it is cruising is being updated 10 times per second. Our own visual image appears to be continuously updated as long as our eyes are open. We can see what it might be like to have an intermittently updated world image, by using a stroboscope at night. This is sometimes done at discotheques, and it produces some dramatic effects. A dancing person appears as a succession of frozen statuesque attitudes. Obviously, the faster we set the strobe, the more the image corresponds to normal 'continuous' vision. Stroboscopic vision 'sampling' at the bat's cruising rate of about 10 samples per second would be nearly as good as normal 'continuous' vision for some ordinary purposes, though not for catching a ball or an insect. This is just the sampling rate of a bat on a routine cruising flight. When a little brown bat detects an insect and starts to move in on an interception course, its [rate] goes up. Faster than a machine gun, it can reach peak rates of 200 pulses per second as the bat finally closes in on the moving target. [...]
If we may imagine bat brains as building up an image of the world analogous to our visual images, the pulse rate alone seems to suggest that the bat's echo image might be at least as detailed and 'continuous' as our visual image. Of course, there may be other reasons why it is not so detailed as our visual image. If bats are capable of boosting their sampling rates to 200 pulses per second, why don't they keep this up all the time? Since they evidently have a rate control 'knob' on their 'stroboscope', why don't they turn it permanently to maximum, thereby keeping their perception of the world at its most acute, all the time, to meet any emergency? One reason is that these high rates are suitable only for near targets. If a pulse follows too hard on the heels of its predecessor it gets mixed up with the echo of its predecessor returning from a distant target. Even if this weren't so, there would probably be good economic reasons for not keeping up the maximum pulse rate all the time. It must be costly producing loud ultrasonic pulses, costly in energy, costly in wear and tear on voice and ears, perhaps costly in computer time. A brain that is processing 200 distinct echoes per second might not find surplus capacity for thinking about anything else. Even the ticking-over rate of about 10 pulses per second is probably quite costly, but much less so than the maximum rate of 200 per second. An individual bat that boosted its tickover rate would pay an additional price in energy, etc., which would not be justified by the increased sonar acuity. [...] When the salient vicinity includes another moving object, particularly a flying insect twisting and turning and diving in a desperate attempt to shake off its pursuer, the extra benefit to the bat of increasing its sample rate more than justifies the increased cost. 
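The first constraint in that passage, where a pulse "gets mixed up with the echo of its predecessor", is the classic range-ambiguity trade-off, and it's easy to sketch (a minimal illustration; the speed of sound is the usual approximation, and the pulse rates are the ones quoted in the excerpt):

```python
# The range-ambiguity constraint: to avoid confusing one pulse's echo with
# the next pulse, the previous echo must return before the next pulse goes
# out. A higher pulse rate therefore shrinks the maximum unambiguous range.

SPEED_OF_SOUND_AIR = 343.0  # m/s, approximate

def max_unambiguous_range_m(pulses_per_second: float) -> float:
    interval = 1.0 / pulses_per_second            # time between pulses
    return SPEED_OF_SOUND_AIR * interval / 2      # out and back within one interval

print(max_unambiguous_range_m(10))   # cruising rate: ~17 m
print(max_unambiguous_range_m(200))  # closing-in rate: under 1 m
```

This matches the passage's logic: the 200-per-second "buzz" only makes sense against a nearby target, which is exactly when bats use it.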
Of course, the considerations of cost and benefit in this paragraph are all surmise, but something like this almost certainly must be going on. The engineer who sets about designing an efficient sonar or radar device soon comes up against a problem resulting from the need to make the pulses extremely loud. They have to be loud because when a sound is broadcast its wavefront advances as an ever-expanding sphere. The intensity of the sound is distributed and, in a sense, 'diluted' over the whole surface of the sphere. The surface area of any sphere is proportional to the radius squared. The intensity of the sound at any particular point on the sphere therefore decreases, not in proportion to the distance (the radius) but in proportion to the square of the distance from the sound source, as the wavefront advances and the sphere swells. This means that the sound gets quieter pretty fast, as it travels away from its source, in this case the bat. When this diluted sound hits an object, say a fly, it bounces off the fly. This reflected sound now, in its turn, radiates away from the fly in an expanding spherical wavefront. For the same reason as in the case of the original sound, it decays as the square of the distance from the fly. By the time the echo reaches the bat again, the decay in its intensity is proportional, not to the distance of the fly from the bat, not even to the square of that distance, but to something more like the square of the square - the fourth power, of the distance. This means that it is very very quiet indeed. The problem can be partially overcome if the bat beams the sound by means of the equivalent of a megaphone, but only if it already knows the direction of the target. In any case, if the bat is to receive any reasonable echo at all from a distant target, the outgoing squeak as it leaves the bat must be very loud indeed, and the instrument that detects the echo, the ear, must be highly sensitive to very quiet sounds - the echoes. 
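The inverse-fourth-power argument above can be checked directly (my own numbers, purely illustrative): the outgoing sound thins as 1/r², the reflection thins as 1/r² again on the way back, so the echo scales as 1/r⁴.

```python
# Echo intensity relative to a target at a reference distance.
# Outbound spreading contributes 1/r^2, the returning echo another 1/r^2,
# giving the 1/r^4 overall scaling described in the text.

def relative_echo_intensity(distance: float, reference: float = 1.0) -> float:
    """Echo intensity relative to a target at `reference` distance."""
    return (reference / distance) ** 4

# Doubling the distance to the fly cuts the echo to 1/16th...
print(relative_echo_intensity(2.0))  # 0.0625
# ...and tripling it cuts the echo to 1/81st.
print(relative_echo_intensity(3.0))
```

Hence the dilemma the excerpt sets up: calls loud enough to produce a usable echo at range are dangerously loud for ears sensitive enough to hear that echo.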
Bat cries, as we have seen, are indeed often very loud, and their ears are very sensitive. Now here is the problem that would strike the engineer trying to design a bat-like machine. If the microphone, or ear, is as sensitive as all that, it is in grave danger of being seriously damaged by its own enormously loud outgoing pulse of sound. It is no good trying to combat the problem by making the sounds quieter, for then the echoes would be too quiet to hear. And it is no good trying to combat that by making the [ear] more sensitive, since this would only make it more vulnerable to being damaged by the, albeit now slightly quieter, outgoing sounds! It is a dilemma inherent in the dramatic difference in intensity between outgoing sound and returning echo, a difference that is inexorably imposed by the laws of physics. What other solution might occur to the engineer? 
When an analogous problem struck the designers of radar in the Second World War, they hit upon a solution which they called 'send/receive' radar. The radar signals were sent out in necessarily very powerful pulses, which might have damaged the highly sensitive aerials (American 'antennas') waiting for the faint returning echoes. The 'send/receive' circuit temporarily disconnected the receiving aerial just before the outgoing pulse was about to be emitted, then switched the aerial on again in time to receive the echo. Bats developed 'send/receive' switching technology long, long ago, probably millions of years before our ancestors came down from the trees. It works as follows. In bat ears, as in ours, sound is transmitted from the eardrum to the microphonic, sound-sensitive cells by means of a bridge of three tiny bones known (in Latin) as the hammer, the anvil and the stirrup, because of their shape. [...] What matters here is that some bats have well-developed muscles attached to the stirrup and to the hammer. When these muscles are contracted the bones don't transmit sound so efficiently - it is as though you muted a microphone by jamming your thumb against the vibrating diaphragm. The bat is able to use these muscles to switch its ears off temporarily. The muscles contract immediately before the bat emits each outgoing pulse, thereby switching the ears off so that they are not damaged by the loud pulse. Then they relax so that the ear returns to maximal sensitivity just in time for the returning echo. This send/receive switching system works only if split-second accuracy in timing is maintained. The bat called Tadarida is capable of alternately contracting and relaxing its switching muscles 50 times per second, keeping in perfect synchrony with the machine gun-like pulses of ultrasound. [...]
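The switching schedule can be sketched as a toy model. Assuming, purely for illustration, Tadarida's 50 pulses per second, a 1 ms pulse and a 0.5 ms muscular lead time (these last two figures are invented, not measured values from the text):

```python
def ear_muted(t_ms, period_ms=20.0, pulse_ms=1.0, lead_ms=0.5):
    """True if the switching muscles have the ear 'off' at time t_ms.
    A pulse is emitted at the start of every period; the muscles
    contract lead_ms before each pulse and relax as the pulse ends.
    All parameter values are illustrative."""
    phase = t_ms % period_ms
    return phase >= period_ms - lead_ms or phase < pulse_ms

# 50 pulses per second means one pulse every 20 ms:
print(ear_muted(0.2))  # True  -- ear off during the loud outgoing pulse
print(ear_muted(6.0))  # False -- ear back on when an echo from ~1 m returns
```

The point of the model is the timing margin: the ear must be deaf during the shout but fully sensitive a few milliseconds later.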
The next problem that might occur to our engineer is the following. If the sonar device is measuring the distance of targets by measuring the duration of silence between the emission of a sound and its returning echo - the method which Rousettus, indeed, seems to be using - the sounds would seem to have to be very brief, staccato pulses. A long drawn-out sound would still be going on when the echo returned, and, even if partially muffled by send/receive muscles, would get in the way of detecting the echo. Ideally, it would seem, bat pulses should be very brief indeed. But the briefer a sound is, the more difficult it is to make it energetic enough to produce a decent echo. We seem to have another unfortunate trade-off imposed by the laws of physics. Two solutions might occur to ingenious engineers, indeed did occur to them when they encountered the same problem, again in the analogous case of radar. Which of the two solutions is preferable depends on whether it is more important to measure range (how far away an object is from the instrument) or velocity (how fast the object is moving relative to the instrument). 
The first solution is that known to radar engineers as chirp radar. We can think of radar signals as a series of pulses, but each pulse has a so-called carrier frequency. [...] The special feature of chirp radar is that it does not have a fixed carrier frequency during each shriek. Rather, the carrier frequency swoops up or down about an octave. [...] The advantage of chirp radar, as opposed to the fixed pitch pulse, is the following. It doesn't matter if the original chirp is still going on when the echo returns. They won't be confused with each other. This is because the echo being detected at any given moment will be a reflection of an earlier part of the chirp, and will therefore have a different pitch. Human radar designers have made good use of this ingenious technique. Is there any evidence that bats have 'discovered' it too, just as they did the send/receive system? Well, as a matter of fact, numerous species of bats do produce cries that sweep down, usually through about an octave, during each cry. These wolf-whistle cries are known as frequency modulated (FM). They appear to be just what would be required to exploit the 'chirp radar' technique. However, the evidence so far suggests that bats are using the technique, not to distinguish an echo from the original sound that produced it, but for the more subtle task of distinguishing echoes from other echoes. A bat lives in a world of echoes from near objects, distant objects and objects at all intermediate distances. It has to sort these echoes out from each other. If it gives downward-swooping, wolf-whistle chirps, the sorting is neatly done by pitch. When an echo from a distant object finally arrives back at the bat, it will be an 'older' echo than an echo that is simultaneously arriving back from a near object. It will therefore be of higher pitch. When the bat is faced with clashing echoes from several objects, it can apply the rule of thumb: higher pitch means farther away. 
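The 'higher pitch means farther away' rule falls straight out of the geometry: the echo arriving now is a reflection of an earlier, higher slice of the downward chirp. A Python sketch, with made-up sweep parameters (an 80 to 40 kHz downward chirp over 5 ms; real bat calls vary widely):

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def echo_frequency(range_m, t_s, f_start=80e3, f_end=40e3, sweep_s=0.005):
    """Instantaneous frequency of the echo heard at time t_s from an
    object at range_m, for a linear downward chirp.  The echo carries
    the frequency that was emitted one round trip earlier, so a more
    distant object returns an older, higher-pitched slice of the chirp."""
    delay_s = 2.0 * range_m / SPEED_OF_SOUND
    emitted_at = t_s - delay_s
    if not 0.0 <= emitted_at <= sweep_s:
        raise ValueError("echo does not overlap the chirp")
    return f_start + (f_end - f_start) * (emitted_at / sweep_s)

# Listening at the same instant, the farther object sounds higher:
near = echo_frequency(0.3, 0.004)
far = echo_frequency(0.6, 0.004)
print(far > near)  # True
```

Pitch thus becomes a tag that sorts simultaneous echoes by distance.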
The second clever idea that might occur to the engineer, especially one interested in measuring the speed of a moving target, is to exploit what physicists call the Doppler Shift. [...] The Doppler Shift occurs whenever a source of sound (or light or any other kind of wave) and a receiver of that sound move relative to one another. [...I]f we ride fast on a motorbike past a wailing factory siren, when we are approaching the factory the pitch will be raised: our ears are, in effect, gobbling up the [sound] waves at a faster rate than they would if we just sat still. By the same kind of argument, when our motorbike has passed the factory and is moving away from it, the pitch will be lowered. If we stop moving we shall hear the pitch of the siren as it actually is, intermediate between the two Doppler-shifted pitches. It follows that if we know the exact pitch of the siren, it is theoretically possible to work out how fast we are moving towards or away from it simply by listening to the apparent pitch and comparing it with the known 'true' pitch. The same principle works when the sound source is moving and the listener is still. [...] It is relative motion that matters, and as far as the Doppler Effect is concerned it doesn't matter whether we consider the sound source to be moving past the ear, or the ear moving past the sound source. [...] The Doppler Effect is used in police radar speed-traps for motorists. A static instrument beams radar signals down a road. The radar waves bounce back off the cars that approach, and are registered by the receiving apparatus. The faster a car is moving, the higher is the Doppler shift in frequency. By comparing the outgoing frequency with the frequency of the returning echo the police, or rather their automatic instrument, can calculate the speed of each car. If the police can exploit the technique for measuring the speed of road hogs, dare we hope to find that bats use it for measuring the speed of insect prey? The answer is yes.
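The speed-trap arithmetic is a one-liner. A sketch using the standard small-velocity approximation for a reflected (double-shifted) signal; the 24 GHz carrier and the shift value are illustrative figures, not taken from the text:

```python
def speed_from_doppler(f_emitted_hz, f_echo_hz, wave_speed=3.0e8):
    """Target speed from the double Doppler shift of a reflected signal,
    using the approximation f_echo ~= f_emitted * (1 + 2v/c) for an
    approaching target with v much smaller than the wave speed."""
    return wave_speed * (f_echo_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

# An illustrative 24 GHz traffic radar measuring a +4.8 kHz echo shift:
print(speed_from_doppler(24e9, 24e9 + 4800.0))  # 30.0 (m/s), about 108 km/h
```

The same comparison of outgoing and returning frequency, with the speed of sound in place of the speed of light, is what a bat's brain would need to perform.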
The small bats known as horseshoe bats have long been known to emit long, fixed-pitch hoots rather than staccato clicks or descending wolf-whistles. When I say long, I mean long by bat standards. The 'hoots' are still less than a tenth of a second long. And there is often a 'wolf-whistle' tacked onto the end of each hoot, as we shall see. Imagine, first, a horseshoe bat giving out a continuous hum of ultrasound as it flies fast towards a still object, like a tree. The wavefronts will hit the tree at an accelerated rate because of the movement of the bat towards the tree. If a microphone were concealed in the tree, it would 'hear' the sound Doppler-shifted upwards in pitch because of the movement of the bat. There isn't a microphone in the tree, but the echo reflected back from the tree will be Doppler-shifted upwards in pitch in this way. Now, as the echo wavefronts stream back from the tree towards the approaching bat, the bat is still moving fast towards them. Therefore there is a further Doppler shift upwards in the bat's perception of the pitch of the echo. The movement of the bat leads to a kind of double Doppler shift, whose magnitude is a precise indication of the velocity of the bat relative to the tree. By comparing the pitch of its cry with the pitch of the returning echo, therefore, the bat (or rather its on-board computer in the brain) could, in theory, calculate how fast it was moving towards the tree. This wouldn't tell the bat how far away the tree was, but it might still be very useful information, nevertheless. If the object reflecting the echoes were not a static tree but a moving insect, the Doppler consequences would be more complicated, but the bat could still calculate the velocity of relative motion between itself and its target, obviously just the kind of information a sophisticated guided missile like a hunting bat needs. 
Actually some bats play a trick that is more interesting than simply emitting hoots of constant pitch and measuring the pitch of the returning echoes. They carefully adjust the pitch of the outgoing hoots, in such a way as to keep the pitch of the echo constant after it has been Doppler-shifted. As they speed towards a moving insect, the pitch of their cries is constantly changing, continuously hunting for just the pitch needed to keep the returning echoes at a fixed pitch. This ingenious trick keeps the echo at the pitch to which their ears are maximally sensitive - important since the echoes are so faint. They can then obtain the necessary information for their Doppler calculations, by monitoring the pitch at which they are obliged to hoot in order to achieve the fixed-pitch echo. I don't know whether man-made devices, either sonar or radar, use this subtle trick. But on the principle that most clever ideas in this field seem to have been developed first by bats, I don't mind betting that the answer is yes. It is only to be expected that these two rather different techniques, the Doppler shift technique and the 'chirp radar' technique, would be useful for different special purposes. Some groups of bats specialize in one of them, some in the other. Some groups seem to try to get the best of both worlds, tacking an FM 'wolf-whistle' onto the end (or sometimes the beginning) of a long, constant-frequency 'hoot'. [...]
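The compensation trick can be modelled in a few lines. This sketch assumes the simple two-step Doppler factor for sound, (c + v)/(c - v), and an 83 kHz reference frequency; the reference value is hypothetical, chosen only because horseshoe bats favour frequencies in that general region:

```python
SPEED_OF_SOUND = 343.0  # m/s

def emitted_for_fixed_echo(f_ref_hz, closing_speed_ms):
    """Pitch to emit so that the echo, double-Doppler-shifted by
    (c + v)/(c - v), returns at the preferred frequency f_ref."""
    c = SPEED_OF_SOUND
    return f_ref_hz * (c - closing_speed_ms) / (c + closing_speed_ms)

def speed_from_emitted(f_ref_hz, f_emitted_hz):
    """Invert the compensation: the pitch the bat was forced to use
    reveals its closing speed on the target."""
    c = SPEED_OF_SOUND
    ratio = f_emitted_hz / f_ref_hz
    return c * (1.0 - ratio) / (1.0 + ratio)

# Closing on an insect at 5 m/s with the hypothetical 83 kHz reference:
f_out = emitted_for_fixed_echo(83_000.0, 5.0)
print(round(f_out))                                   # a bit below 83 kHz
print(round(speed_from_emitted(83_000.0, f_out), 3))  # 5.0
```

By hooting slightly flat, the bat keeps the echo parked at its ear's sweet spot, and the amount of flattening it needed is itself the velocity measurement.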
Human experimenters have found it surprisingly difficult to put bats off their stride by playing loud artificial ultrasound at them. With hindsight one might have predicted this. Bats must have come to terms with the jamming-avoidance problem long ago. Many species of bats roost in enormous aggregations, in caves that must be a deafening babel of ultrasound and echoes, yet the bats can still fly rapidly about the cave, avoiding the walls and each other in total darkness. How does a bat keep track of its own echoes, and avoid being misled by the echoes of others? The first solution that might occur to an engineer is some sort of frequency coding: each bat might have its own private frequency, just like separate radio stations. To some extent this may happen, but it is by no means the whole story. How bats avoid being jammed by other bats is not well understood, but an interesting clue comes from experiments on trying to put bats off. It turns out that you can actively deceive some bats if you play back to them their own cries with an artificial delay. Give them, in other words, false echoes of their own cries. It is even possible, by carefully controlling the electronic apparatus delaying the false echo, to make the bats attempt to land on a 'phantom' ledge. I suppose it is the bat equivalent of looking at the world through a lens. It seems that bats may be using something that we could call a 'strangeness filter'. Each successive echo from a bat's own cries produces a picture of the world that makes sense in terms of the previous picture of the world built up with earlier echoes. If the bat's brain hears an echo from another bat's cry, and attempts to incorporate it into the picture of the world that it has previously built up, it will make no sense. It will appear as though objects in the world have suddenly jumped in various random directions. 
Objects in the real world do not behave in such a crazy way, so the brain can safely filter out the apparent echo as background noise. If a human experimenter feeds the bat artificially delayed or accelerated 'echoes' of its own cries, the false echoes will make sense in terms of the world picture that the bat has previously built up. The false echoes are accepted by the strangeness filter because they are plausible in the context of the previous echoes. They cause objects to seem to shift in position by only a small amount, which is what objects plausibly can be expected to do in the real world. The bat's brain relies upon the assumption that the world portrayed by any one echo pulse will be either the same as the world portrayed by previous pulses, or only slightly different: the insect being tracked may have moved a little, for instance. [...]
If you want to share a bat's experience, it is almost certainly grossly misleading to go into a cave, shout or bang two spoons together, consciously time the delay before you hear the echo, and calculate from this how far the wall must be. That is no more what it is like to be a bat than the following is a good picture of what it is like to see colour: use an instrument to measure the wavelength of the light that is entering your eye: if it is long, you are seeing red, if it is short you are seeing violet or blue. It happens to be a physical fact that the light that we call red has a longer wavelength than the light that we call blue. Different wavelengths switch on the red-sensitive and the blue-sensitive photocells in our retinas. But there is no trace of the concept of wavelength in our subjective sensation of the colours. Nothing about 'what it is like' to see blue or red tells us which light has the longer wavelength. If it matters (it usually doesn't), we just have to remember it, or (what I always do) look it up in a book. Similarly, a bat perceives the position of an insect using what we call echoes. But the bat surely no more thinks in terms of delays of echoes when it perceives an insect, than we think in terms of wavelengths when we perceive blue or red. Indeed, if I were forced to try the impossible, to imagine what it is like to be a bat, I would guess that echolocating, for them, might be rather like seeing for us. We are such thoroughly visual animals that we hardly realize what a complicated business seeing is. Objects are 'out there'; and we think that we 'see' them out there. But I suspect that really our percept is an elaborate computer model in the brain, constructed on the basis of information coming from out there, but transformed in the head into a form in which that information can be used. Wavelength differences in the light out there become coded as 'colour' differences in the computer model in the head. 
Shape and other attributes are encoded in the same kind of way, encoded into a form that is convenient to handle. The sensation of seeing is, for us, very different from the sensation of hearing, but this cannot be directly due to the physical differences between light and sound. Both light and sound are, after all, translated by the respective sense organs into the same kind of nerve impulses. It is impossible to tell, from the physical attributes of a nerve impulse, whether it is conveying information about light, about sound or about smell. The reason the sensation of seeing is so different from the sensation of hearing and the sensation of smelling is that the brain finds it convenient to use different kinds of internal model of the visual world, the world of sound and the world of smell. It is because we internally use our visual information and our sound information in different ways and for different purposes that the sensations of seeing and hearing are so different. It is not directly because of the physical differences between light and sound. But a bat uses its sound information for very much the same kind of purpose as we use our visual information. It uses sound to perceive, and continuously update its perception of, the position of objects in three-dimensional space, just as we use light. The type of internal computer model that it needs, therefore, is one suitable for the internal representation of the changing positions of objects in three-dimensional space. My point is that the form that an animal's subjective experience takes will be a property of the internal computer model. That model will be designed, in evolution, for its suitability for useful internal representation, irrespective of the physical stimuli that come to it from outside. Bats and we need the same kind of internal model for representing the position of objects in three-dimensional space.
The fact that bats construct their internal model with the aid of echoes, while we construct ours with the aid of light, is irrelevant. That outside information is, in any case, translated into the same kind of nerve impulses on its way to the brain.
— The Blind Watchmaker: Why the Evidence of Evolution Reveals a Universe without Design, Richard Dawkins
Alain Van Ryckegham, a professor at the School of Natural Resources at Sir Sandford Fleming College in Lindsay, Ontario, Canada, offers this explanation:
Bats are a fascinating group of animals. They are one of the few mammals that can use sound to navigate--a trick called echolocation. Of the some 900 species of bats, more than half rely on echolocation to detect obstacles in flight, find their way into roosts and forage for food.
Echolocation--the active use of sonar (SOund Navigation And Ranging) along with special morphological (physical features) and physiological adaptations--allows bats to "see" with sound. Most bats produce echolocation sounds by contracting their larynx (voice box). A few species, though, click their tongues. These sounds are generally emitted through the mouth, but Horseshoe bats (Rhinolophidae) and Old World leaf-nosed bats (Hipposideridae) emit their echolocation calls through their nostrils: there they have basal fleshy horseshoe or leaf-like structures that are well-adapted to function as megaphones.
Echolocation calls are usually ultrasonic--ranging in frequency from 20 to 200 kilohertz (kHz), whereas human hearing normally tops out at around 20 kHz. Even so, we can hear echolocation clicks from some bats, such as the Spotted bat (Euderma maculatum). These noises resemble the sounds made by hitting two round pebbles together. In general, echolocation calls are characterized by their frequency; their intensity in decibels (dB); and their duration in milliseconds (ms).
Some bats have specialized structures for emitting echolocation calls.
In terms of pitch, bats produce echolocation calls with both constant frequencies (CF calls) and frequency-modulated, varying frequencies (FM calls). Most bats produce a complicated sequence of calls, combining CF and FM components. Although low-frequency sound travels further than high-frequency sound, calls at higher frequencies give the bats more detailed information--such as size, range, position, speed and direction of a prey's flight. Thus, these sounds are used more often.
In terms of loudness, bats emit calls as low as 50 dB and as high as 120 dB, which is louder than a smoke detector 10 centimeters from your ear. That's not just loud, but damaging to human hearing. The Little brown bat (Myotis lucifugus) can emit such an intense sound. The good news is that because this call has an ultrasonic frequency, we are unable to hear it.
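Because decibels are logarithmic, the 50 to 120 dB range quoted above spans an enormous ratio of sound intensities. A quick check (not part of the original answer):

```python
def intensity_ratio(db_a, db_b):
    """Power ratio between two sound levels; every 10 dB step is a
    tenfold step in intensity."""
    return 10.0 ** ((db_a - db_b) / 10.0)

# The loudest bat calls (120 dB) versus the quietest (50 dB):
print(intensity_ratio(120, 50))  # 10000000.0 -- a ten-million-fold spread
```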
Bat calls are categorized according to frequency, intensity and duration. Most sounds bats emit fall beyond the range of human hearing.
The ears and brain cells in bats are especially tuned to the frequencies of the sounds they emit and the echoes that result. A concentration of receptor cells in their inner ear makes bats extremely sensitive to frequency changes: Some Horseshoe bats can detect differences as slight as 0.0001 kHz. For bats to listen to the echoes of their original emissions and not be temporarily deafened by the intensity of their own calls, the middle ear muscle (called the stapedius) contracts to separate the three bones there--the malleus, incus and stapes, or hammer, anvil and stirrup--and reduce the hearing sensitivity. This contraction occurs about 6 ms before the larynx muscles (called the cricothyroid) begin to contract. The middle ear muscle relaxes 2 to 8 ms later. At this point, the ear is ready to receive the echo of an insect one meter away, which takes only 6 ms.
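The "echo of an insect one meter away takes only 6 ms" figure follows directly from the speed of sound. A quick verification (illustrative, not part of the original answer):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_delay_ms(range_m):
    """Round-trip travel time for a call to reach a target and return."""
    return 2.0 * range_m / SPEED_OF_SOUND * 1000.0

print(round(echo_delay_ms(1.0), 1))  # 5.8 -- consistent with the "only 6 ms" above
```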
The external structure of bats' ears also plays an important role in receiving echoes. The large variation in sizes, shapes, folds and wrinkles are thought to aid in the reception and funneling of echoes and sounds emitted from prey. Echolocation is a highly technical and interesting tactic. To truly understand the concepts and complexity of this subject is to begin to understand the amazing nature of these animals. For interested readers, an excellent resource is M. Brock Fenton's book Bats.
Words for Book
Story for book
1. We live in a noisy world, and much of that noise is made by humans. Traffic, machinery, electronics—it’s a constant barrage of sound. How does our noise affect the animals around us? Unlike us, they can’t put in some earplugs, close a window or turn off the stereo. Recent studies are showing that our increasingly loud world is having negative effects on a range of animals, across a variety of habitats.
2. ’Wait!’ I hear you say. ‘Natural landscapes aren’t always quiet either.’ That’s true, nature can be noisy—from bird calls to wild winds, thunder and animal migrations, the natural environment creates and uses noise in a complex information network. Most animals, however, have specially adapted to the natural noises in their environment—they are aware of them, understand them and know how to use and interpret them.
The population size and diversity of certain bird species have been shown to decline or change when exposed to continuous noise generated by urban environments, such as roads, cities and industrial sites.
Several species have begun to adjust their vocal calls in an attempt to be heard above the din. Male great tits (Parus major), for example, have been noted to change the frequency of their call in order to be heard over anthropogenic noise. Female great tits prefer lower-frequency calls when selecting a mate, but these frequencies are harder to hear over urban noise. Males who sing at higher frequencies are less attractive to females, but females may still mate with them if there are no lower-frequency singers available. Males are therefore placed in a difficult position—sing at a lower frequency and not be heard, or sing at a higher frequency and potentially be dismissed!
3. There are other effects too. A 2013 study by researchers at Boise State University created a ‘phantom road’ using a series of electronic speakers placed in the woods which played the sounds of a busy highway at regular intervals. The phantom road was situated near an important stop for migratory birds, where they would traditionally rest and fatten up before undertaking the journey ahead. For four days the team turned on the speakers playing the faux traffic noise. The results showed that during the periods of noise, birds stopping to rest in the area declined by more than one-quarter. When the speakers were off, the numbers bounced back. The researchers concluded that noise can change an animal’s most basic stay-or-go assessments of habitat, and ‘prompt more than the usual number of birds on thousand-mile marathons to skip a chance to rest and refuel’.
4. Birds are not the only animals affected by noise. A study published in 2010 found that noise pollution—specifically traffic noise—decreased the foraging efficiency of an acoustic predator, the greater mouse-eared bat (Myotis myotis). Successful foraging bouts decreased and search times increased dramatically with proximity to the highway. As the animals being hunted by the bats are themselves predators, the study noted that ‘the noise impact on the bats’ foraging performance will have complex effects on the food web and ultimately on the ecosystem stability’. Noise pollution could potentially interfere with other acoustic predators, such as owls, in a similar fashion.
5. Noise pollution can also kill off your sex life—at least if you’re a frog. A study conducted in Melbourne, Australia, by Dr Kirsten Parris and colleagues found that, for some highly vocal frog species, noise pollution is correlated with an increase in the frequency of their calls. This increase partially compensates for the loss of communication distance experienced by these frogs in areas of traffic noise. The mating call of male pobblebonk frogs could historically be heard up to 800 metres away by interested females. At very noisy sites, this is reduced to just 14 metres. If male frogs alter their call to a higher frequency to be heard, the females may not like what they hear. Female frogs of some species prefer lower-pitched calls, which often indicate larger and/or more experienced males. Once again for the male frogs, it’s a tough call—to not be heard, or to be heard and rejected!
The researchers concluded that ‘road noise can alter key survival behaviours’ and that ‘these findings highlight that the presence of animals in a location is no guarantee of population and ecological integrity’. So while noise pollution may not necessarily drive animals away from a site, it may alter their established behaviours and be having a less-obvious negative effect on their physical wellbeing.
6. Many cetaceans—marine mammals such as whales, dolphins and porpoises—live in a world largely defined by acoustic information. They use sound to communicate, and to navigate and monitor their surroundings, creating a picture of the world around them with 3D clarity. By emitting pulses of active sonar clicks they can echolocate food sources and pinpoint features in the environment around them down to millimetres. Their ‘songs’ and clicks can communicate with animals hundreds of kilometres away.
In the open ocean some species are even able to hear sounds thousands of kilometres away, from waves breaking on the shore to cracking ice. They use this information to help navigate and guide their migrations.
With the ability to hear being so important to these mammals, it is important to take note of the increasing studies showing that acoustic pollution from the human world can harm these animals in several ways.
7. Other incidents of beached whales show signs of physical trauma such as bleeding around the ears, brain and other tissues, as well as air bubbles in their organs. Known as barotrauma, this can occur from the sudden change in pressure caused by a sound. These symptoms are akin to ‘the bends’, an illness which can also affect human divers when they surface too quickly from deep water. Some scientists also speculate that mid-frequency sonar blasts may prompt certain species to quickly alter their dive patterns, resulting in debilitating or even fatal injuries.
While strandings are an immediately obvious sign of some sort of distress or confusion, there are other, more subtle, ways that noise pollution can affect these mammals. A range of cetaceans have displayed changes in behaviour. For example, noise has been shown to reduce humpback whale communication, with less ‘song’ during periods of noise, even when the origin of the noise is 200 kilometres away. Both right and blue whales have been found to increase the level of vocalisations when exposed to sound sources in their vocal range. In effect, they need to ‘shout’ to allow themselves to be heard. Chronic stress in baleen whales has been associated with low-frequency shipping noise, while other whale species have been shown to avoid important habitats (key breeding and/or feeding grounds) as they purposely evade areas of high noise. They can also experience lower respiration rates resulting in shorter dive periods. Whales off the coast of Western Australia have been recorded changing course and speed to avoid close contact with active seismic surveys.
8. It’s not only the larger animals that are being affected. Squid and other cephalopods have also shown negative responses to noise pollution. Even short exposure to low-frequency, low-intensity sounds―such as those produced by offshore oil drilling and commercial fishing―can disturb the balance systems of squid, octopuses and cuttlefish. A study conducted in 2011 collected 87 wild cephalopods across four species and exposed them to short bursts of low-intensity, low-frequency sound for a period of two hours. The animals were then dissected to examine their statocysts (the organ responsible for maintaining their balance in the water). The results were disconcerting: every octopus, squid and cuttlefish had damage to its statocyst, including ruptured and missing hair cells, swollen nerve cells and even lesions and holes in the statocyst’s sensory surface.
9. Even fish larvae are being affected. Recent studies have shown that larval fish and invertebrates are moving away from their traditional habitats. Interestingly, many are settling instead in places that have (low-frequency) noise caused by shipping. This movement has flow-on effects for the ecosystems that depend on these larvae.
Other species, such as hermit crabs, have been shown to be less responsive to visual predators when in high-noise environments. The authors of one study proposed a ‘distracted prey hypothesis’ to explain the finding, and noted that it demonstrates the potential for noise pollution to affect behaviours that are stimulated by non-auditory information.
So … what are we doing about it?
In response to the growing evidence regarding noise pollution and its effect on animals, some changes are being made.
10.
ON LAND
Some of the ways in which noise pollution from traffic can be reduced are by developing quieter roads and cars; installing noise-reduction barriers around major traffic areas; lowering speed limits; educating drivers; and implementing relevant legislation to progressively reduce noise. Other strategies include the use of better materials; improved site planning; and the undertaking of detailed environmental assessments before construction on houses and industrial sites is permitted, with follow-up assessments after construction is completed.
IN THE SEA
Scientists are working to better understand where and how noise pollution is generated, and where it is causing the most problems. They are working with industry and government on solutions which currently include establishing ‘quiet areas’ for marine species and reducing noise levels in critical habitats; developing greener technology, including quieter ships, hull shapes and machinery; and advocating for national regulation relating to the amount of noise (by all sources) that can be released into the ocean.
https://www.science.org.au/curious/earth-environment/noise-pollution-and-environment