UIUC Innovators Develop Mind-Computer Interface with NI LabVIEW

Publish Date: Jul 09, 2013

Overview

Engineering students at the University of Illinois at Urbana-Champaign (UIUC) used NI LabVIEW to develop a computer-assisted communication device that translates thought into speech or into commands for controlling wheelchairs and other devices.
Their application showcases how NI LabVIEW enables innovative development.

1. NI LabVIEW Enables Innovative Development

Michael Callahan, a recent graduate of the systems and entrepreneurial engineering program at the University of Illinois at Urbana-Champaign (UIUC), received the prestigious Lemelson-Illinois Student Prize. The award recognizes Callahan for developing a device that will profoundly benefit people with amyotrophic lateral sclerosis (ALS, or Lou Gehrig's disease), cerebral palsy, spinal cord injury, or other neurological disorders. The device, dubbed "The Audeo," acquires and translates neurological signals so that subjects who cannot speak or move can communicate. Callahan has teamed up with Thomas Coleman and other UIUC colleagues to start their own company, Ambient Corporation, to develop and commercialize the technology. National Instruments LabVIEW has played a key role in that effort.

The Audeo device allows thought-based control of a wheelchair. A LabVIEW application is at the heart of this laptop-based prototype, which acquires nerve impulse signals from a sensor worn around a subject's neck. A custom LabVIEW-based algorithm processes and interprets those signals, generating the appropriate control signals for the wheelchair.

The Audeo is a sensory device placed around the neck of an afflicted individual. In operation, it intercepts the signals from the brain that control the vocal cords and the vocal tract. These signals are then sent to a computer that filters out noise from background sources, processes them using a complex signal processing algorithm, and interprets the individual's intended meaning to produce speech. The speech can then either be output directly or used to control external devices, such as a motorized wheelchair.
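
The article does not disclose Ambient's actual signal processing chain, so the following is only a rough sketch of that front end, written in Python with NumPy and SciPy rather than the LabVIEW the team uses. It bandpass-filters a raw electrode trace to a band where, as an assumption here, subvocal muscle-nerve activity would be expected, then computes a smoothed activity envelope. The sample rate, filter band, and smoothing window are invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(trace, fs, lo=20.0, hi=450.0, order=4):
    """Keep only the band assumed to carry nerve/muscle activity."""
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, trace)

def envelope(trace, fs, win_ms=50):
    """Rectify and smooth the filtered trace into an activity envelope."""
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win
    return np.convolve(np.abs(trace), kernel, mode="same")

# Synthetic stand-in for a neck-sensor recording: noise plus one burst.
fs = 2000                                   # assumed sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
raw = 0.05 * np.random.randn(t.size)
raw[2000:3000] += np.sin(2 * np.pi * 120 * t[2000:3000])

activity = envelope(bandpass(raw, fs), fs)
print("peak activity:", activity.max())     # burst stands out from noise
```

In a real pipeline, this cleaned activity signal would feed the interpretation stage described above rather than a simple print statement.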

“People with ALS may be able to move their mouth a little, but they can’t exhale enough air from their lungs to produce audible speech. But since the speech signals are produced by the brain even if they cannot speak, we can intercept those signals and create the speech for them,” said Coleman.

The team has implemented prototypes that rely on a LabVIEW-built application to control the acquisition and apply the signal processing that translates the nerve impulse samples into wheelchair commands or spoken output.

To control a wheelchair, the system needs to identify directional commands such as forward, right, left, and stop. To do so, the software identifies these discrete commands from the speech patterns rather than processing the data to produce continuous speech.
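
The article does not describe how those commands are extracted, so the sketch below is purely hypothetical: it frames command recognition as classifying short signal windows with simple amplitude features and a nearest-centroid rule. The command set, feature choices, and classifier are illustrative assumptions in Python, not Ambient's method.

```python
import numpy as np

COMMANDS = ["forward", "left", "right", "stop"]   # hypothetical set

def features(window):
    """Summarize a window with RMS, mean absolute value, and
    zero-crossing rate - simple placeholder features."""
    zcr = np.mean(np.abs(np.diff(np.sign(window)))) / 2
    return np.array([np.sqrt(np.mean(window ** 2)),
                     np.mean(np.abs(window)),
                     zcr])

def train_centroids(labeled_windows):
    """labeled_windows maps each command to a list of training windows."""
    return {cmd: np.mean([features(w) for w in ws], axis=0)
            for cmd, ws in labeled_windows.items()}

def classify(window, centroids):
    """Assign the command whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda cmd: np.linalg.norm(f - centroids[cmd]))

# Toy data: each command's windows get a distinct amplitude so the
# centroids separate cleanly.
rng = np.random.default_rng(0)
train = {cmd: [amp * rng.standard_normal(400) for _ in range(20)]
         for amp, cmd in enumerate(COMMANDS, start=1)}
centroids = train_centroids(train)
print(classify(3.0 * rng.standard_normal(400), centroids))  # likely "right"
```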

The continuous speech processing strategy, which would allow users to hear what they are producing as they produce it, is still in development. “One of the challenges,” explained Callahan, “is to develop a universal mathematical transformation of the data that would work well for everyone, not just for a specific individual.”

The job requires sophisticated signal processing algorithms that can adapt to nerve impulse patterns, which vary between individuals and over time. Native LabVIEW graphical programming has aided the development process, allowing Callahan and his team to perform rapid trials on live signals with varying algorithms and parameters.

"LabVIEW simplifies development and encourages innovation by offering an intuitive graphical programming approach that allows you to focus on innovation rather than programming details," Callahan said. He and his colleagues implemented the software with LabVIEW to build an initial prototype in 2005 and have been making great improvements ever since.

The developers continue to use LabVIEW throughout the development process. Callahan and Coleman have collected a large amount of data from numerous individuals and have built a LabVIEW-based system that can generate candidate transformation algorithms and evaluate their effectiveness on the fly. Once they discover the optimum processing algorithms and fix the specific design parameters of the system, they intend to commit the entire acquisition and processing chain to hardware self-contained within The Audeo device itself, eliminating the need for an external computer.
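
The article does not say how these candidate transformations are generated or scored. Purely as an illustration of the selection idea, the Python sketch below sweeps a grid of hypothetical processing parameters, scores each setting per subject, and keeps the setting with the best worst-case accuracy across subjects, matching the stated goal of a transformation that works well for everyone. The parameter grid and the scoring function are stand-ins.

```python
import itertools
import numpy as np

# Hypothetical design space for the processing chain.
BANDS = [(20, 250), (20, 450), (50, 450)]   # assumed filter bands, Hz
WINDOWS_MS = [100, 200, 400]                # assumed analysis windows, ms
N_SUBJECTS = 3

def score(band, window_ms, subject):
    """Placeholder returning a deterministic pseudo-accuracy. A real
    system would replay the subject's recorded nerve-signal data
    through the full pipeline and measure command accuracy."""
    x = band[0] * 0.11 + band[1] * 0.013 + window_ms * 0.0031 + subject
    return 0.5 + 0.5 * abs(np.sin(x))

def worst_case(candidate):
    """Rate a (band, window) setting by its weakest subject, so the
    chosen setting is 'universal' rather than tuned to one person."""
    band, window_ms = candidate
    return min(score(band, window_ms, s) for s in range(N_SUBJECTS))

best = max(itertools.product(BANDS, WINDOWS_MS), key=worst_case)
print("best universal setting:", best,
      "worst-case accuracy:", round(worst_case(best), 3))
```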

The result, they say, will be not just one device but a range of devices targeting the needs of different users. One such device, for example, might be dedicated simply to speech production, while another might enable a disabled person to control a variety of devices such as a wheelchair, a computer, or even a mobile phone.

For More Information

www.ni.com/academic/signalprocessing.htm

Find out more about how the intuitive graphical programming, extensive signal processing capabilities, and other features of NI LabVIEW can enable innovative research and development.

www.iese.uiuc.edu

Learn about the Industrial & Enterprise Systems Engineering program at the University of Illinois at Urbana-Champaign.

www.theaudeo.com

Callahan and his colleagues have founded a biotechnology start-up company, Ambient Corp., to develop their technology.
