Sufferers of brain and spinal diseases could get a boost from an emerging technology that may allow them to regain the ability to communicate with computers or wheelchairs, or with other people through a synthesized voice.
Known as The Audeo, the new technology uses an electromyographic-type sensor to detect electrical signals on the throats of people who are attempting to speak, and then processes those signals into text, synthesized words, or commands for an electrically powered wheelchair. Ambient Corp., developer of the new system, said it hopes to use the technology to help individuals disabled by such conditions as ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig’s Disease), cerebral palsy, and traumatic brain or spinal cord injury.
“Our mission is to give back communication to those who have lost it through disease or disability,” said Thomas Coleman, chief technical officer of Ambient Corp. Coleman and Ambient CEO Michael Callahan demonstrated the technology to an audience of about 2,000 engineers during a keynote speech at NIWeek here yesterday. During the demonstration, Coleman controlled a motor-powered wheelchair by giving it silent commands on stage.
“When you speak, your brain sends a signal to the muscles in your throat,” Coleman explained. “We detect the electrical activity at the throat, convert it, and then use it for communication.”
Ambient’s electromyographic-type sensor, which fits around the user’s neck like a tiny scarf, picks up the electrical signals from the nerves near the surface of the skin. An A/D converter and an on-board 16-bit Texas Instruments microcontroller digitize the signals, and software algorithms running on a separate PC-based controller interpret them and route the result to an output.
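Ambient’s actual algorithms are proprietary and are not described in detail here, but the general shape of the processing the article describes can be sketched. The following minimal example (all function names, window sizes, and thresholds are invented for illustration) digitizes a signal, smooths it into an amplitude envelope, and flags muscle activity when the envelope crosses a threshold:

```python
# Illustrative sketch only; not Ambient's implementation. Shows the generic
# EMG-style pipeline the article describes: take digitized samples, compute
# a smoothed amplitude (RMS) envelope, and flag muscle activity where the
# envelope exceeds a threshold. Window size and threshold are assumptions.
import math

def rms_envelope(samples, window=50):
    """Root-mean-square envelope over a sliding window of samples."""
    env = []
    for i in range(len(samples)):
        w = samples[max(0, i - window + 1): i + 1]
        env.append(math.sqrt(sum(x * x for x in w) / len(w)))
    return env

def detect_activity(samples, threshold=0.3, window=50):
    """Return a per-sample flag: True where the envelope exceeds threshold."""
    return [e > threshold for e in rms_envelope(samples, window)]

# Synthetic "signal": 200 low-amplitude (quiet) samples, then a 200-sample
# high-amplitude burst standing in for a silent-speech muscle activation.
quiet = [0.05 * math.sin(0.3 * i) for i in range(200)]
burst = [0.8 * math.sin(1.1 * i) for i in range(200)]
flags = detect_activity(quiet + burst)
print(flags[100], flags[350])  # quiet region → False, burst region → True
```

A real system would follow this detection stage with classification of the activity pattern into words or commands, which is where the hard engineering lives.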
“Once you convert the signals to words, you could do a transcription, or create a synthesized voice, or send commands to a wheelchair,” Coleman said.
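The fan-out Coleman describes can be pictured as a simple router: once a signal has been decoded into a word, the same word can feed a transcript, a speech synthesizer, or a wheelchair controller. The sketch below is hypothetical; the vocabulary, modes, and command mapping are invented for illustration and are not Ambient’s interface:

```python
# Hypothetical routing layer, assumed for illustration: a decoded word is
# dispatched to one of three output sinks. The command table is invented.
WHEELCHAIR_COMMANDS = {
    "forward": (1, 0), "back": (-1, 0),
    "left": (0, -1), "right": (0, 1), "stop": (0, 0),
}

def route(word, mode):
    """Route a decoded word to a transcription, speech, or wheelchair sink."""
    if mode == "transcribe":
        return {"text": word}
    if mode == "speak":
        return {"speech": word}  # a real system would hand off to a TTS engine
    if mode == "wheelchair":
        if word not in WHEELCHAIR_COMMANDS:
            return {"error": "unknown command: " + word}
        speed, turn = WHEELCHAIR_COMMANDS[word]
        return {"speed": speed, "turn": turn}
    raise ValueError("unknown mode: " + mode)

print(route("forward", "wheelchair"))  # {'speed': 1, 'turn': 0}
print(route("hello", "transcribe"))    # {'text': 'hello'}
```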
Coleman, who co-founded the company while studying engineering at the University of Illinois, said that he was aided in his product development effort by National Instruments LabVIEW software. Although he started school as a computer science major, he was initially overwhelmed by the task of implementing the control algorithms in hardware, and relied on LabVIEW’s graphical programming to work through that process.
“Without LabVIEW, I probably wouldn’t have finished this,” he said. “It would have taken too long.” Even with LabVIEW, he said, the product development took approximately three years.
Coleman noted that ALS and cerebral palsy sufferers, in particular, could benefit from the new technology. Many such patients can still use their throat muscles but cannot squeeze enough air out of their lungs to generate audible speech. As a result, some ALS patients are ultimately forced to communicate by blinking their eyes.
By reading the signals from the appropriate throat muscles, however, The Audeo could enable such patients to communicate in an audible fashion.
“We’ve worked with patients from a number of different medical categories, and this helps them,” Coleman said.
View our NIWeek photo gallery to see The Audeo system in action.