News Feature | November 18, 2014

Bat Echolocation Inspires Sonar-Assisted Human Navigation

By Chuck Seegert, Ph.D.


Using sonar distance sensors and two cellphone vibration motors, an undergraduate project has produced a prototype to help the visually impaired. The device senses nearby objects and cues the user with vibrations that indicate how close those objects are.

Looking to nature for successful designs and then trying to capture their essence in a man-made device is often called biomimetics. Broken down, the word literally means the mimicry of biology, and it was the basis for a new project from Wake Forest University, according to a recent press release from the university. The idea came about in a brainstorming session at the Science, Technology, Engineering, and Mathematics (STEM) incubator program, which is designed to increase interaction between upperclassmen and underclassmen, along with faculty.

“The vision of the STEM incubator is to pair upperclassmen from possibly different science fields together with undeclared freshman and sophomores,” Paul Pauca, an associate professor of computer science, said in the press release. “One of the goals is to foster horizontal relationships between students, meaning that they are learning from each other, but also vertical relationships with their faculty mentors.”                                                                                                     

In this case, senior computer science major Jack Janes and senior biology major Dominic Prado were paired with Ran Chang, a sophomore computer science major, according to the press release. The original idea came from professor William Conner, the David and Lelia Farr Professor of Innovation, Creativity, and Entrepreneurship, who has been studying echolocation for some time. He teaches a course called “Bio-inspiration and Biomimetics.”

“The sonar device for assisting the visually impaired is a perfect example of how my class works,” Conner said in the press release. “We were inspired by bat sonar, we learned about it, and then we used it to develop a new product. The students took the idea and made it happen.”

The device is similar in size to a wristwatch and was designed to augment other resources that a visually impaired person might have, like a dog or cane, according to the press release. It’s called the Human Echo Location Partner, or HELP, and was based on a watch because it is small and unobtrusive.

The HELP device is built around an Arduino LilyPad microcontroller and runs Java-like code, according to the press release. When it senses a nearby object, it gives the user feedback through vibrations, with the frequency of the vibrations proportional to the distance from the object. The team tested the device with the help of a blind classmate, who used it to navigate doors and other objects without her guide dog.

“I’ve never heard of any of this kind of technology before, and I was eager to come in and test it,” said Kathryn Webster, the student who tested the device, according to the press release. “Of course, I was hesitant, but it was really cool.”

Other teams have been developing similar technologies in an age when computer software and hardware are abundant and easily assembled. For example, Google recently awarded funding to a team at the Karlsruhe Institute of Technology (KIT) to develop a mobility and navigational aid for the visually impaired that uses cameras and smartphone technology, among other things.

Image Credit: Wake Forest University