|77.4 - Summer 2004|
Getting Back to Nature: Biomimetic Sensors
By Brian Wayda
(Credit: Dolphin Research Center)
Roman Kuc, Professor and Director of Educational Affairs in the Department of Electrical Engineering, is taking a page out of a biology book to create the next generation of intelligent robots. Inspired by how bats and dolphins “see,” Kuc has gone a step further than biologists by modeling natural binaural sonar systems in the department’s Intelligent Sensors lab.

Sensing With Sound
The central problem facing robotics, as Kuc describes it, is compiling the abundance of data gained by a sensor and generalizing it into information about specific “objects.” Camera vision mimicking the human eye is popular, but it has severe limitations. Overlap two objects in a camera eye’s field of vision, for instance, and the computer reads them as a single object. This is known as the “occlusion problem” (see p. 20, “Out of Sight, Not Out of Mind”). Changes in lighting, smoky conditions, or the mild yet predictable skew of a glass window can also incapacitate a camera eye. Humans can solve the occlusion problem because their knowledge of the environment is based on expectation. Robotics, lacking this “common sense,” can only approach this kind of complexity through intricate mathematical algorithms.
Kuc’s lab and a few other groups in the field are now exploring the benefits of substituting sonar for traditional camera vision. Built with inexpensive equipment that can be easily purchased at a local electronics store, the most basic device emits a probing acoustic pulse and waits for “echoes” from objects. The time between the transmitted pulse and the echo detected by the sensor gives information about the distance of an object from the sensor.
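The ranging principle described above can be sketched in a few lines. This is a minimal illustration, not Kuc’s implementation; the function name and the speed-of-sound constant are illustrative assumptions.

```python
# Basic sonar ranging: the round-trip delay between the emitted pulse
# and its echo gives the one-way range to the reflecting object.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C (assumed value)

def echo_range(delay_s: float) -> float:
    """Convert a round-trip echo delay (seconds) to a one-way range (meters)."""
    return SPEED_OF_SOUND * delay_s / 2.0

# An echo arriving 10 ms after the pulse implies an object about 1.7 m away.
print(round(echo_range(0.010), 3))  # 1.715
```

The division by two accounts for the pulse traveling out to the object and back.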
Figure 1. The pride of Professor Kuc’s Intelligent Sensors Lab is its multi-sensing jeep, a “practical” application for its robotics research. Its sensory systems include two sonars, one affixed to each side, a camera vision system, an odometer, and even a GPS locating system. The lab has entered and won numerous competitions against other robotics labs, in which the jeep must navigate through a field of obstacles, among other tasks. The lab is brainstorming to give the jeep a name; currently the unofficial consensus is “Yalien.” (Credit: Intelligent Sensors Lab)
Having this distance information limits the possible positions of an object to an arc of fixed radius about the detector. To pinpoint the object in space, Kuc employed a pair of sonars in a binaural configuration. The waveforms received by traditional stationary sensors tend to vary significantly depending on where the object lies in the field of the acoustic beam. In Kuc’s study, the apparatus, consisting of a transmitter flanked by two adjustable receivers, was attached to a robot arm to allow full rotational and translational mobility.
This idea mimics biology’s solution to the same problem in nature. For instance, a key component of bat echolocation is the ability to rotate and focus the ears on an echo source to capture the largest possible bandwidth. Likewise, dolphins depend on the ability to fix an object in a stationary, repeatable position. If bats and dolphins can employ these simple strategies to get a detailed picture of their surroundings, Kuc reasoned, so could his binaural sonar. To filter out unwanted “noise” (spurious echoes and reverberation), he uses a biologically inspired threshold mechanism that reacts only to strong echoes.
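A threshold mechanism of the kind described can be sketched as follows. This is a toy illustration, not the lab’s actual detector; the function name, the blanking parameter, and the sample values are assumptions.

```python
def detect_echo(signal, threshold, blanking=0):
    """Return the index of the first sample whose magnitude exceeds threshold,
    skipping the first `blanking` samples (transmit ringing); None if no echo.
    """
    for i, sample in enumerate(signal):
        if i >= blanking and abs(sample) > threshold:
            return i
    return None

# Weak reverberation (0.1, 0.05, ...) is ignored; only the strong
# echo at index 6 crosses the threshold.
trace = [0.0, 0.9, 0.1, 0.05, 0.1, 0.02, 0.8, 0.3]
print(detect_echo(trace, threshold=0.5, blanking=2))  # 6
```

The blanking window models the common practice of ignoring the transducer’s own ringing immediately after transmission.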
Animals that rely on sight gauge distances through simple trigonometry. Given the angles at which each eye sees an object and the spacing between the eyes, one can triangulate to find the distance of the object. Kuc’s binaural sonar employs an inverted form of this algorithm. Given the distances at which each sensor “views” an object and the distance between the sensors, one can solve the triangle and pinpoint the vertex where that object lies.
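The “inverted triangulation” above amounts to intersecting two circles: each sensor’s range constrains the object to a circle, and the object sits where the circles meet. A minimal sketch, assuming the two receivers sit on the x-axis separated by a known baseline and the object lies in front of them (all names are illustrative):

```python
import math

def triangulate(r1: float, r2: float, baseline: float):
    """Locate an object from two sonar ranges.

    Receivers sit at (-baseline/2, 0) and (+baseline/2, 0); the object
    is assumed to lie in the half-plane y > 0 in front of the pair.
    """
    # Subtracting the two circle equations eliminates y and gives x directly.
    x = (r1**2 - r2**2) / (2.0 * baseline)
    y_sq = r1**2 - (x + baseline / 2.0) ** 2
    if y_sq < 0:
        raise ValueError("ranges inconsistent with baseline")
    return x, math.sqrt(y_sq)

# Object at (0.3, 0.4) with a 0.2 m baseline:
r1 = math.hypot(0.3 + 0.1, 0.4)  # range to receiver at (-0.1, 0)
r2 = math.hypot(0.3 - 0.1, 0.4)  # range to receiver at (+0.1, 0)
print(triangulate(r1, r2, 0.2))  # approximately (0.3, 0.4)
```

This mirrors stereo vision’s triangulation, but with measured distances in place of measured angles.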
Kuc demonstrated that his binaural sensor is sensitive enough to distinguish between two sides of a coin just by their relief images. A penny was placed flat on a table, and the sonar was positioned at the end of a robot arm at a 45-degree angle above the coin. Echoes helped move the sonar to first detect the coin and then position it at a known location. Two components of the echoes were used to classify the coin faces. First-order echoes are caused by a change in the acoustic impedance (the product of density and acoustic velocity) of the wave medium. In the case of the penny, first-order echoes result from reflections off the corner defined by the near edge of the coin and the table surface. Second-order echoes are produced by a discontinuity in the first derivative of acoustic impedance with respect to range. These echoes occurred primarily at the sharp discontinuities along the trailing edge of the coin and its relief surface.
While these echoes do not provide sufficient resolution to create a visual, user-friendly picture of the coin, Kuc thought about how a bat might interpret this data, devoid of visual cues. Just as humans maintain a database of common visual objects for ready recognition, Kuc produced a database of echo waveform templates corresponding to each side of the coin, measured at slightly different, known sonar inclinations since the robot arm was not perfectly accurate. Given a coin with an unknown side, the sonar generated a new waveform and compared it to templates in the database.
Not only were the two sides of the coin distinguishable, but a comparison of a heads echo waveform with those in the tails database produced a significant disparity. The two relief patterns, though hardly distinguishable from a bird’s-eye view, showed great difference when analyzed by sonar.
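The template-matching idea can be sketched as a nearest-template classifier using normalized correlation. This is a toy stand-in for the lab’s database of measured echo waveforms; the function names and the waveform values are invented for illustration.

```python
import math

def norm_corr(a, b):
    """Normalized correlation between two equal-length waveforms (range -1..1)."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0 or nb == 0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def classify(echo, templates):
    """Return the label of the stored template best matching the echo."""
    return max(templates, key=lambda label: norm_corr(echo, templates[label]))

# Toy templates standing in for measured heads/tails echo waveforms.
templates = {
    "heads": [0.0, 1.0, 0.5, -0.2, 0.0],
    "tails": [0.0, 0.3, -0.8, 0.6, 0.1],
}
echo = [0.0, 0.9, 0.55, -0.15, 0.05]  # a noisy "heads"-like return
print(classify(echo, templates))  # heads
```

In practice a database would hold several templates per class, measured at slightly different sonar inclinations, exactly as the article describes.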
Beyond providing insights on how bats and dolphins see with sound, Kuc’s methods have been implemented in a Massachusetts Institute of Technology (MIT) autonomous underwater vehicle for the task of localizing and classifying suspicious objects.

The Interference Problem
Biologists have long speculated about how bats and dolphins are able to “see” amid a pack of animals that are emitting waves similar to theirs and are thus causing interference. Some have reasoned that these interference signals actually contain useful information, but there has been no way of testing this hypothesis. The Kuc lab set out to test if this cooperative sonar theory was physically possible, and if there was enough information contained in a sonar signal from a different organism to be useful.
Figure 2. Professor Kuc’s robotic “dolphin” utilizes full translational and rotational mobility to pinpoint objects. (Credit: Intelligent Sensors Lab)
In the study, he set up a microphone on a residential college squash court and struck a dampened aluminum rod to produce an impulsive acoustic probing signal. The microphone, connected to a laptop computer, recorded four separate signals. First, it received the signal directly from the source with no echoes. It then received three subsequent signals caused by the reflection of the pulse off one wall, the floor, and two walls, respectively. The time delay between the direct signal and each echo could be converted into the absolute distance the sound had traveled. When the distance to each surrounding wall is known, which is easy to find from self-echolocation, one can complete the triangle to find the distance to the points of reflection, denoted O1, O2, and O3 (see Figure 3).
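The geometry of a single-wall bounce can be worked out with the classic image-source construction: mirror the source across the wall, and the reflection point is where the straight line from the mirrored source to the microphone crosses the wall. This is a sketch of that construction, not the study’s code; the function name and coordinates are assumptions, with the wall modeled as the vertical plane x = wall_x.

```python
def wall_reflection_point(src, mic, wall_x):
    """Where a pulse from src reflects off the wall x = wall_x en route to mic.

    Mirrors the source across the wall and intersects the straight
    line image -> mic with the wall plane.
    """
    sx, sy = src
    mx, my = mic
    ix = 2 * wall_x - sx           # image source, mirrored across the wall
    t = (wall_x - ix) / (mx - ix)  # parameter where the line crosses the wall
    return wall_x, sy + t * (my - sy)

# Source and mic both 2 m from a wall at x = 4: the bounce lands midway.
print(wall_reflection_point((2.0, 0.0), (2.0, 2.0), 4.0))  # (4.0, 1.0)
```

The echo’s path length equals the straight-line distance from the image source to the microphone, which is how a measured time delay pins down the reflection point.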
Figure 3. Using simple trigonometry, Kuc showed how bats might gain information about their environments from other bats’ signals. (Credit: Intelligent Sensors Lab)
These points can be visualized as obstacles in an aquatic world, and it is easy to see how dolphins and bats might utilize pulses emitted by their neighbors in flight. Of course, this experiment was executed with only one wave source, while bats and dolphins live in an environment of several simultaneously emitting sonars. Biologists have long speculated that each bat has an individualized foraging call to help it navigate. The only way for a bat to know where and when a received wave originated would be for each bat to emit a unique wavelength. Indeed, Kuc’s study showed that for a bat, or any receiver, to derive information from another sonar, it must know the location of the wave source; otherwise the plethora of waves is just noise. Consequently, his experiment may explain why dolphins travel in formation with one another. By keeping the relative positions and distances between individuals constant, each dolphin has access not only to the information it gains from its surroundings but also to all of the information contained in the sound signals from other members of the group.

Cognitive Implications
One day, as Kuc sat across from a student in his office, he decided to issue a thought experiment. “Describe this room to me,” he requested. The student went on to accurately describe the proximity of four walls, an open door, how far down the hall he had traveled, and Kuc himself — all without the benefit of sight. The student was blind yet had “adopted an acoustic sense of the world through the abundance of sound data available to him,” according to Kuc.
Even those who are capable of seeing are not oblivious to acoustical information. “We know that concert halls sound reverberant, and that if you add a rug to a room, it will make the stereo sound different,” said Kuc. He thinks that in time, cognitive scientists will discover that a single processing scheme governs sight and sound in the brain, which in blind people becomes dominated by the latter function. There are various computational analogs used for both sight and hearing, including the trigonometric pinpointing of objects in a field and the isolation of waves according to wavelength, or what Kuc describes as “acoustic color.” He notes that hearing is just a super-sensitized version of touch, performed by tiny hairs in the inner ear that detect discrete frequencies. While the senses have ostensibly separate regions in the brain, Kuc makes the case that a single computational scheme applies to both sight and sound.

Reciprocal Relationship with Biology
Kuc calls sonar systems “the next level” in terms of robotic sensing. What his biomimetic sonars lose by not providing color information and human familiarity, they make up for by using simpler mathematical algorithms to derive a detailed picture of their surroundings. The outcomes of Kuc’s research are twofold: not only is he using biology’s innovations to build better robots, but his work is confirming biology’s long-held speculations about how bats and dolphins are able to “see.”
Copyright 2013 Yale Scientific Publications, Inc. - Disclaimer