
'Acoustic Daylight' Gives Oceans New Transparency

Scientists harness ambient noise to bring undersea objects into focus


From the songs of whales to the crash and hiss of waves, the ocean is a sea of sound. For years, scientists trying to improve traditional underwater-detection techniques, such as sonar, have viewed this background noise as a nuisance.

To Michael Buckingham, however, that noise can shed a whole new "light" on what lies or moves beneath the surface. The professor of ocean acoustics at Scripps Institution of Oceanography in La Jolla, Calif., is refining a technique he pioneered for using the ocean's ready supply of background noise to yield images of underwater objects.

The approach, dubbed "acoustic daylight," could find applications ranging from giving submerged submarines forward vision to surveying the sea floor and monitoring efforts to lay undersea cable, he says.

The ocean, he explains, is much more transparent to sound than to light. Like light, sound has a characteristic set of frequencies, or wavelengths, which an object can absorb, reflect, and scatter. While working on conventional sonar technologies for the British government in the mid-1980s, Dr. Buckingham says, "it occurred to me that one might be able to use the ambient noise in the ocean like using the natural-light field to take a photograph."

In 1991, he and colleagues at Scripps fielded their first acoustic "lens," which looked like a small satellite dish. The dish-shaped reflector gathered incoming sound and focused it on an underwater microphone, or hydrophone.

The lens was connected to a computer, which turned the signal into an image consisting of a single large rectangle, or pixel, on a screen. The idea, he says, was to see if the noise level changed when divers placed an object in front of the lens. Their target consisted of a plywood sheet 1 meter (about 39 inches) square, covered with neoprene and placed at distances ranging from about 7 to 12 meters (23 to 40 feet) away.
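The single-pixel experiment amounts to comparing received acoustic power with and without a reflector in view. A minimal sketch of that comparison, assuming for illustration that a reflecting target roughly doubles the power reaching the hydrophone (consistent with the doubling the team reported):

```python
import numpy as np

def noise_power(samples):
    """Mean squared amplitude of a hydrophone recording."""
    samples = np.asarray(samples, dtype=float)
    return float(np.mean(samples ** 2))

rng = np.random.default_rng(0)
ambient = rng.normal(0.0, 1.0, 48000)   # one second of background noise alone

# Hypothetical model: the target doubles the received acoustic power,
# i.e. scales the pressure amplitude by sqrt(2).
with_target = ambient * np.sqrt(2.0)

p0 = noise_power(ambient)
p1 = noise_power(with_target)
ratio = p1 / p0   # the single "pixel" brightens when the target appears
```

The real system, of course, measured this change in the focused output of the dish rather than in a simulated signal; the sketch only shows the power comparison that drives the pixel's brightness.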

When they aimed the lens at the target, the sound's volume doubled, and the target reflected some frequencies better than others. This gave the team hope that by having the computer assign colors to specific frequencies, they could generate "false-color" images. In principle, he says, "false-color images could allow us to make inferences about the nature of an object."
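Assigning colors to frequencies can be sketched as splitting the received spectrum into bands and letting each band's energy drive one color channel. The band edges below are purely illustrative, not the Scripps team's values:

```python
import numpy as np

def false_color(samples, rate):
    """Map energy in three frequency bands to a normalized (R, G, B) triple.

    Band edges are hypothetical; the real choice would depend on which
    frequencies the target reflects best.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    bands = [(100, 1000), (1000, 5000), (5000, 15000)]  # Hz
    energy = np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                       for lo, hi in bands])
    return tuple(energy / energy.max())  # brightest band scales to 1.0

rate = 48000
t = np.arange(rate) / rate
# A strong 3 kHz reflection plus broadband noise: the middle ("green")
# band should dominate the pixel's color.
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 3000 * t) + 0.1 * rng.normal(size=rate)
r, g, b = false_color(signal, rate)
```

An object that reflects low frequencies well would tint its pixel toward one channel and a high-frequency reflector toward another, which is the sense in which the false-color image lets one "make inferences about the nature of an object."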


