Does echolocation produce an actual 'visible' image like in the movies?



I recently watched a documentary involving an interstellar mission to a fictional alien planet on which the majority of lifeforms perceived the world around them using a form of echolocation rather than eyes. Echolocation has always interested me, and one thing I notice whenever it is depicted in movies such as Pitch Black and on TV is that, from the point of view of the animals/aliens using it, it seems to produce an actual 'visible' image comparable to the way we see using our eyes, albeit grainy and without colour. Is this biologically accurate, or is it simply done for the viewers' benefit? Is it even possible for us to know?


Well, using echolocation you certainly wouldn't be able to learn the colors of objects. Other than that, I don't know much about how echolocation is perceived within the brain.
Argh, where is a good neuroscientist when you need them :D


We see using light; bats, dolphins, and other animals "see" using sound. Bats routinely catch small insects (mosquitoes) that we have difficulty seeing except in swarms. Why wouldn't sound appear as an image to those animals? Doctors make ultrasound images of a fetus inside the womb.


We see using light; bats, dolphins, and other animals "see" using sound. Bats routinely catch small insects (mosquitoes) that we have difficulty seeing except in swarms. Why wouldn't sound appear as an image to those animals? Doctors make ultrasound images of a fetus inside the womb.

 

But would this be the case for humans, who do not all exhibit such severe synesthesia?

 

I do not think so; in fact I'm quite sure the answer is no. But I'm wondering whether individuals who suffer from blindness can develop some form of induced audio-visual synesthesia after using echolocation for some number of years.


Okay, just to clarify, I am not talking about echolocation, such as it is, in humans. I am talking about echolocation in an animal where that is its sole means of 'seeing', like the Bioraptors in the movie 'Pitch Black', or even animals like bats on Earth whose primary means of navigation is echolocation. Would they perceive the world via their echolocation in a way comparable to how we perceive the world using our eyes?


  • 4 weeks later...

It would depend entirely on how their brain was wired and what happened to the electrical signals the brain captured in response.

 

"Seeing" is just a convenient term used to describe one possible method by which the brain processes external spatial stimuli. There's no reason to say that bats cannot "see" using echolocation, and indeed, judging by the size of the objects they routinely hunt, their "vision" has a much finer resolution than ours, at least at close range.

 

 


[Jim Simmons] found that bats could distinguish jitter as small as 10–12 ns! This corresponds to distances of about 2 μm. This suggests that bats can perceive extremely fine-grained distance differences, which could allow them to recognize the "acoustic texture" of a winged insect in as much detail as we could see under a microscope.1

 

However, given that bats operate in a three-dimensional environment as a matter of course (being flying mammals), their brains certainly have the capability to represent their environment topographically to keep them from flying into things. Is that good enough to be called "seeing"? That's probably for the philosophers to decide.

 


1: http://www.neuro.uoregon.edu/wehr/lecturenotes/echolocation%20lecture%20notes.pdf, p 4
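As a sanity check on those numbers, here is the arithmetic as a minimal Python sketch. It assumes the standard speed of sound in air (about 343 m/s at room temperature); the factor of 2 accounts for the echo's round trip:

```python
# Convert an echo-delay jitter threshold into a distance difference.
# Assumes sound in air at ~343 m/s; the factor of 2 accounts for the
# sound travelling out to the target and back.
SPEED_OF_SOUND_AIR = 343.0  # m/s

def jitter_to_distance(jitter_s):
    """Distance difference (m) corresponding to an echo-delay jitter (s)."""
    return SPEED_OF_SOUND_AIR * jitter_s / 2.0

print(jitter_to_distance(10e-9) * 1e6)  # ~1.7 micrometres
```

A 10 ns jitter works out to roughly 1.7 μm, consistent with the "about 2 μm" figure in the quote.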


Maybe this question is decidable, thanks to functional MRI (fMRI) imaging.

 

When a human sees a shape, fMRI shows activity in the occipital cortex, and, more surprisingly, this activity reproduces the pattern observed by the subject.

 

Now, you imagine the difficulty of the protocol...

 

It would need an animal with both eyes and echolocation; put it in an MRI machine and convince the animal that this is normal life...

 

Then show it an optical image only (light on a screen) and observe its brain activity.

Give it a sound pattern only and observe its brain activity. Compare the two.

 

Not obvious, is it? And since echolocation is naturally 3D, sight is in that respect the less rich sense.

 

-----

 

Sonars on submarines, surface ships, and helicopters do get the data for a complete 3D image from a single ping. You might use colour to display the distance if you wish. It's just that... they don't ping any more! Except at the very last minute, to fire a torpedo or grenade precisely at the target, because right after the ping's echo comes the submarine's own torpedo, at some 500 m/s. So sonars now work passively: they only listen.
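For the single-ping case, the range calculation itself is simple. Here is a minimal Python sketch, assuming a typical speed of sound in seawater of about 1500 m/s:

```python
# Active-sonar ranging: time the echo of one ping and halve the round trip.
# Assumes ~1500 m/s, a typical speed of sound in seawater.
SPEED_OF_SOUND_WATER = 1500.0  # m/s

def range_from_ping(echo_delay_s):
    """Target range (m) from the round-trip delay (s) of a single ping."""
    return SPEED_OF_SOUND_WATER * echo_delay_s / 2.0

print(range_from_ping(2.0))  # a 2 s round trip puts the target at 1500 m
```

Combine the range from each echo's delay with that echo's bearing and you have the 3D data from a single ping.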

 

Can an image be built just from listening? I've read claims of that. With enough signal processing, echoes from natural noise sources, especially waves at the surface, are said to give a true image of everything in the sea: the sea floor, the submarines, the big animals. Is it true? No idea.

 

-----

 

Maybe sighted humans link sight and touch in some part of the brain. Toddlers certainly learn to correlate the two senses - you know, at the age of grasping at the flowers on the wallpaper. We can build a mental image of a shape, or of our environment, from touch much as from sight. This is something sighted humans learn only weakly with sound.

 

That's one reason why I suggest building a laser-ranging cane for the blind with a moving tactile actuator instead of an audio signal:

http://www.scienceforums.net/topic/74606-blinds-stick-with-laser-tactile/


A ping is merely a sound with high-frequency harmonics, which improve resolution, and its source position and time are known, which may simplify the math needed to make an image. However, our eyes form images using ambient light, not a strobe on our head. Similarly, sonar can make an image from ambient sound.


That's the idea behind it. But how difficult is the signal processing? The source is essentially random and diffuse, and the whole background reflects the noise, in addition to the target, which tries to be acoustically unremarkable. Discriminating by the echoes' direction alone would presumably show nothing.

 

I suppose that in this mode the passive sonar also discriminates the target from the background by distance, but that demands correlating the echoes with the source noise, which is spread over all distances. Less than obvious!
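The correlation step can be sketched in toy form. This is purely an illustration, assuming we can record the source noise cleanly (which a real passive sonar cannot; it must estimate it), using NumPy:

```python
import numpy as np

# Toy passive ranging: cross-correlate a recorded noise source with the
# hydrophone signal to find the delay of a weak echo. Pure illustration;
# a real passive sonar cannot record the source noise cleanly like this.
rng = np.random.default_rng(0)
fs = 8000
noise = rng.standard_normal(fs)                       # 1 s of broadband "surface" noise
true_delay = 400                                      # echo delay in samples
received = np.zeros(fs + true_delay)
received[:fs] += noise                                # direct path
received[true_delay:true_delay + fs] += 0.3 * noise   # weak target echo

corr = np.correlate(received, noise, mode="valid")    # correlation at each lag
est = int(np.argmax(corr[1:])) + 1                    # skip lag 0: direct path dominates there
print(est)  # recovers the 400-sample echo delay
```

The echo's correlation peak stands far above the noise floor even at 0.3x amplitude; the hard part in practice is exactly what the post says, namely that the reference noise is unknown and spread over all distances.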

 

In addition, submarines tend to conceal themselves near an irregular seabed.


Humans can form images of a space by processing sound, including reflections and echoes - in my case the image, or mental model, is similar to the one formed by touch, by handling or touching things in the dark. It seems to be three-dimensional from the start, in contrast to the visual construction of the third dimension from two-dimensional images.

 

I automatically imagined that a refined, focused, and detailed version of that would be what bats experience. But on thinking it over, there is the fact that in blind people the sense of touch expands in the brain, and in particular co-opts the visual centers - so the matter becomes complex.


It's a different animal with a sense we don't have; you'd be anthropomorphising if you tried to compare it. No, they don't "see", but obviously they "sense" somehow.


They use those images in movies because how else would you express, through human senses, something incomprehensible to a human?


In the real world, there is no color, only frequency. We've all seen "false color" images, right? What we need to keep in mind is that our human vision sees in "false color". For example, the two vote buttons on every post here are colored green and red. But what we perceive as green and red are not green and red — these colors (and the millions of other "colors" we can "see") are synthetic "visual" characteristics that our brains create for particular ranges of the electromagnetic spectrum.

 

And, if I remember correctly, we probably don't "see" (perceive) the actual light intensity, but instead the log of the intensity. So, when the OP asks whether echolocation produces a "visual" image (of any kind), they're really asking whether it produces the kind of images we've come to know as our very much falsified/distorted visual perception of reality.
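That log response is the Weber–Fechner idea: equal ratios of physical intensity feel like roughly equal steps of brightness. A minimal Python sketch (the constant k and reference intensity i0 are arbitrary illustrative choices, not measured values):

```python
import math

# Weber-Fechner sketch: perceived magnitude grows with the log of intensity,
# so every 100x increase in physical intensity gives the same perceived step.
def perceived(intensity, k=1.0, i0=1.0):
    """Toy perceived brightness; k and i0 are arbitrary illustrative constants."""
    return k * math.log10(intensity / i0)

print(perceived(100.0) - perceived(1.0))      # 2.0
print(perceived(10000.0) - perceived(100.0))  # 2.0 again: equal ratio, equal step
```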


 

 

It's a different animal with a sense we don't have,
Humans have very similar ears, similar brain structures devoted to sound processing, and an ability to form models - not images so much as 3D models - based on sound.

 

It seems a reasonable starting point for considering what kind of impression or internal picture an echolocating mammal creates. We know, for example, that bats do not form pictures in direct simulacrum of our visual ones, because part of their model is the density or internal structure of the object - information easily and inevitably available from sound that is not so available from reflected light.


I ran across an article on Daniel Kish, (National Geographic, July 2013, p. 104) who taught himself echolocation after losing both eyes to retinal cancer when he was 13 months old. According to Daniel, "Each click is like a dim camera flash. I construct a three-dimensional image of my surroundings for hundreds of feet in every direction. Up close, I can detect a pole an inch thick. At 15 feet, I recognize cars and bushes. Houses come into focus at 150 feet."

 

I found this Wikipedia article: http://en.wikipedia.org/wiki/Daniel_Kish

 

And this YouTube video of him riding a bike.

 


  • 4 weeks later...

As the visual processing areas of the brain already contain the machinery for constructing 3D models of the environment, animals that evolved echolocation probably experience a form of synesthesia, like the humans mentioned above who use echolocation or other acoustic aids for navigation.


  • 3 weeks later...

Bats are being used a lot as an example in here, so I thought it worth mentioning that bats actually have fairly good eyesight. They are entirely capable of navigating using only their standard vision if the area is light enough and not too full of obstacles to avoid.

Since their brains have their normal visual sense to deal with, I doubt the echolocation data is translated into a picture for them, but that is just my personal guess.


Bats are being used a lot as an example in here, so I thought it worth mentioning that bats actually have fairly good eyesight. They are entirely capable of navigating using only their standard vision if the area is light enough and not too full of obstacles to avoid.

 

Since their brains have their normal visual sense to deal with, I doubt the echolocation data is translated into a picture for them, but that is just my personal guess.

To me it seems the brain would share the part of it that forms images between input from sight and input from sound, thus being efficient in its use of neurons.

Edited by EdEarl
