ALS News & Research For postings of news or research links and articles related to ALS


Old 12-09-2008, 08:20 PM #1
BobbyB
In Remembrance

Join Date: Aug 2006
Location: North Carolina
Posts: 4,609
15 yr Member
Cranial Computing: Practical Brain-to-Cyber Interfaces Closer to Reality
New portable, inexpensive brain-to-computer interfaces promise to help brain-injured patients communicate, but they are not yet ready for prime time
By Larry Greenemeier



I KNOW WHAT YOU'RE THINKING: University of Portsmouth researcher Paul Gnanayutham is working to create an inexpensive, easy-to-use interface that allows a computer to read, interpret and display thoughts and feelings based on eye movement, the use of face muscles and/or brainwaves.
Image courtesy of the University of Portsmouth

People suffering from physically debilitating illnesses such as amyotrophic lateral sclerosis (aka Lou Gehrig's Disease) and traumatic brain injuries often find themselves trapped inside their own bodies, unable to speak, gesture or otherwise communicate with the outside world. Scientists have shown they can create computer interfaces that sense, interpret and display a locked-in person's brain waves, eye movements or facial expressions, but the challenge has been to find cost-effective ways of harnessing this technology for consumer use.

Paul Gnanayutham, a computer scientist at England's University of Portsmouth School of Computing, is attempting to do this by taking a generic, relatively inexpensive interface (software and a sensor-laden headband), loading its software into a laptop, and writing additional code to customize the device to meet the specific needs of different patients with different capabilities.

Gnanayutham chose Brain Actuated Technologies, Inc.'s Cyberlink Interface, which costs about $2,000 per unit, to serve as the core of his brain-to-computer interface kit. Probes on Cyberlink's headband detect minute electrical signals at the skin's surface, produced by brain and subtle muscle activity; the system interprets them using electrooculography (EOG) to sense eye movement, electromyography (EMG) to sense the twitching of forehead muscles, and electroencephalography (EEG) to sense brain waves.
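
The article does not describe how Cyberlink's software actually separates these three signal types, but a common approach with surface electrodes is to split one recording into rough frequency bands. The sketch below illustrates that idea in Python; the sampling rate and band edges are illustrative assumptions, not Cyberlink specifications.

[code]
# A minimal sketch, NOT Cyberlink's signal chain: separate EOG/EEG/EMG
# components of one surface recording by band-pass filtering into
# approximate (assumed) frequency ranges.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 512  # assumed sampling rate in Hz (not a published specification)

def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [low_hz, high_hz], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def split_modalities(raw):
    """Split one raw surface recording into rough EOG/EEG/EMG bands."""
    return {
        "eog": bandpass(raw, 0.5, 10),   # slow potentials from eye movement
        "eeg": bandpass(raw, 1.0, 40),   # typical scalp EEG rhythms
        "emg": bandpass(raw, 20, 200),   # faster activity from forehead muscles
    }

if __name__ == "__main__":
    t = np.arange(0, 2.0, 1.0 / FS)
    # Synthetic test signal: slow "eye" drift + 10 Hz rhythm + broadband "muscle" noise.
    raw = (50 * np.sin(2 * np.pi * 1.0 * t)
           + 5 * np.sin(2 * np.pi * 10 * t)
           + 2 * np.random.randn(t.size))
    for name, sig in split_modalities(raw).items():
        print(f"{name}: RMS amplitude = {np.sqrt(np.mean(sig ** 2)):.2f}")
[/code]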

Gnanayutham has been searching for ways to improve brain and body computer interfaces since 2001. For the past three years, he has worked with Jennifer George, a doctoral candidate at the University of Sunderland in England, focusing on accessibility for young children with severe motor impairments. As part of their research they have taught patients to control a computer cursor using facial muscles (frowning or relaxing their faces to move the cursor up or down) and eye movement (looking left or right to move the cursor accordingly). EMG and EOG sensors proved to work best in this situation, he says, because the muscle and eye-movement signals are about 1,000 times stronger than those produced by brain waves (measured via EEG).
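
As a rough illustration of how this kind of cursor control can work, the following sketch maps signed EMG and EOG amplitudes to cursor steps using fixed thresholds. The threshold values, step size and channel conventions are hypothetical; they are not taken from Gnanayutham and George's software.

[code]
# Minimal threshold-based cursor control sketch (hypothetical values):
# frown/relax -> up/down via EMG, gaze left/right -> left/right via EOG.
from dataclasses import dataclass

EMG_THRESHOLD = 40.0  # assumed amplitude (arbitrary units) for a deliberate frown
EOG_THRESHOLD = 25.0  # assumed amplitude for a clear left/right gaze shift
STEP = 5              # cursor movement per update, in pixels

@dataclass
class Cursor:
    x: int = 0
    y: int = 0

def update_cursor(cursor, emg_level, eog_level):
    """Move the cursor one step based on current EMG and EOG readings.

    emg_level: signed forehead-muscle amplitude (positive = frown, negative = relax)
    eog_level: signed horizontal eye-movement amplitude (positive = look right)
    """
    if emg_level > EMG_THRESHOLD:
        cursor.y -= STEP   # frown: move up
    elif emg_level < -EMG_THRESHOLD:
        cursor.y += STEP   # relax: move down
    if eog_level > EOG_THRESHOLD:
        cursor.x += STEP   # look right
    elif eog_level < -EOG_THRESHOLD:
        cursor.x -= STEP   # look left
    return cursor

if __name__ == "__main__":
    c = Cursor()
    # Simulated readings: a frown, then a glance to the left.
    for emg, eog in [(55.0, 0.0), (0.0, -30.0)]:
        update_cursor(c, emg, eog)
        print(f"cursor at ({c.x}, {c.y})")
[/code]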

Gnanayutham's interest in helping severely disabled patients stems from a 2000 trip he took with a church group to London's Royal Hospital for Neuro-disability. There he met a 24-year-old man who could communicate only through eye movement, primarily blinking to his nurse to indicate "yes" or "no" in response to her questions. This eye movement may have saved the man's life. Before his family and the hospital staff realized he could control his eye movement, they believed he was in a vegetative state and had made the heart-wrenching decision to disconnect his feeding tube. The man's nurse stopped the procedure when she noticed the patient was moving his eyes in a way that seemed to be communicating with her, Gnanayutham says, adding, "I thought I could do more for people like him."

Interfaces that connect computers to the brain and body are still in their infancy. "But we believe," Gnanayutham and George wrote in research published last year by the Athens Institute for Education and Research, "that our work could be the basis for their more widespread use in extensively extending the activities of severely impaired individuals."

They estimate that the number of scientific teams conducting brain-to-computer research worldwide has jumped from no more than six in 1995 to more than 30 today.

One of the best-known groups developing such technology is the Laboratory of Neural Injury and Repair at the Wadsworth Center in Albany, N.Y. Wadsworth researchers have developed a cap that covers the scalp and reads EEG signals emitted by the brain (as shown in a video produced by CBS News's 60 Minutes). Like Gnanayutham, the Wadsworth researchers want to get this technology out of the lab and into patients' homes. Their system, however, costs about $5,000 and requires too much technical support for patients to use it outside a research setting.

Gnanayutham hopes to create an affordable and highly usable brain and body computer interface within the next three years, although several challenges remain. Devices such as Cyberlink are available, he says, but they are expensive and still require custom software to be written before they are useful to individual patients, whose ability to use the interface varies widely.

Gaming companies might provide at least part of the solution. San Francisco–based Emotiv Systems, Inc., uses a similar interface technology in its EPOC headset to let players use their own brain activity to interact with the virtual worlds where they play. The $299 headset's 14 strategically placed sensors sit at the ends of what look like stretched plastic fingers; they use EEG to detect patterns in the brain's electrical activity and translate them into in-game actions, much as a joystick would.
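
As a rough sketch of that joystick analogy, the snippet below simply dispatches already-classified EEG "command" labels to game actions. The labels, classification step and actions are hypothetical; this does not use Emotiv's actual SDK.

[code]
# Hypothetical mapping from classified EEG command labels to game actions,
# analogous to binding joystick axes/buttons to controls.
from typing import Callable, Dict

def push_forward() -> None:
    print("avatar moves forward")

def lift_object() -> None:
    print("avatar lifts object")

def idle() -> None:
    pass  # no recognized command in this window

ACTIONS: Dict[str, Callable[[], None]] = {
    "push": push_forward,
    "lift": lift_object,
    "neutral": idle,
}

def dispatch(label: str) -> None:
    """Run the game action bound to a classified EEG command label."""
    ACTIONS.get(label, idle)()

if __name__ == "__main__":
    # Simulated stream of classifier outputs, one per EEG window.
    for label in ["neutral", "push", "push", "lift", "neutral"]:
        dispatch(label)
[/code]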

"What I'd like to do is create portable devices that can be used by anyone," Gnanayutham says, "and make this technology available to the general public so that many will be able to use this as a communication and recreation device on a daily basis."

http://www.sciam.com/article.cfm?id=unlocking-the-brain
__________________

ALS/MND Registry
