In a scientific first, researchers have just reproduced what a dolphin saw as it encountered a male diver.
This “what the dolphin saw” image of the submerged man reveals that dolphin echolocation results in fairly detailed images. What’s more, it’s now thought that dolphins may share such images with each other as part of a previously unknown marine mammal language.
Research team leader Jack Kassewitz of SpeakDolphin.com said in a press release that “our recent success has left us all speechless. We now think it is safe to speculate that dolphins may employ a ‘sono-pictorial’ form of language, a language of pictures that they share with each other. If that proves to be true, an exciting future lies ahead for interspecies communications.”
For the research, which took place at the Dolphin Discovery Center in Puerto Aventuras, Mexico, Kassewitz had colleague Jim McDonough submerge himself in front of the female dolphin Amaya in a research pool at the center. To avoid bubbles from a breathing apparatus (which might have interfered with the later re-creation of the image), McDonough wore a weight belt and exhaled most of the air in his lungs to overcome his natural buoyancy before positioning himself against a shelf in the pool.
As Amaya directed her echolocation beam at McDonough, high-specification audio equipment was used to record the signal. Team members Alex Green and Toni Saul handled that part of the project.
Green and Saul then sent the recording to the CymaScope laboratory in the U.K., where yet another colleague, acoustic physics researcher John Stuart Reid, imprinted the signal onto a water membrane and then computer-enhanced the resulting image.
“The ability of the CymaScope to capture what-the-dolphin-saw images relates to the quasi-holographic properties of sound and its relationship with water, which will be described in a forthcoming science paper on this subject,” Reid explained.
His fellow teammates thought they had captured an echolocation image of McDonough’s face, so that was what Reid was expecting to see. Instead, as he told Kassewitz in a note at the time, the signal translated to “what appears to be the fuzzy silhouette of almost a full man. No face.”
As it turns out, Amaya had been echolocating on McDonough from several feet away before she came in closer, so the researchers captured one of those more distant signals.
Kassewitz said, “Having demonstrated that the CymaScope can capture what-the-dolphin-saw images, our research infers that dolphins can at least see the full silhouette of an object with their echolocation sound sense, but the fact that we can just make out the weight belt worn by Jim in our what-the-dolphin-saw image suggests that dolphins can see surface features too.”
It could be that dolphin echolocation signals produce much clearer, more detailed mental images, and that it’s our technology that isn’t yet fully attuned to precisely what the marine mammals are seeing.
As Kassewitz said, “The dolphin has had around fifty million years to evolve its echolocation sense, whereas marine biologists have studied the physiology of cetaceans for only around five decades, and I have worked with John Stuart Reid for barely five years.”