Where It’s AT

The ALS Association National Office Assistive Technology Newsletter

April 2011

 

Two Interesting BCI (Brain-Computer Interface) Articles

Now, you can control computer commands by thought

The Economic Times, Jan 8, 2011

A new software platform developed by French scientists, demonstrated at a tech fest at the Indian Institute of Technology (IIT), allows individuals to control a computer by thought alone.

The software, 'OpenViBE', acts as an interface that translates brain activity into computer commands. It is the outcome of a project initiated in 2005 and has a multitude of potential applications.

"The OpenViBE software platform facilitates the design, testing and use of 'brain-computer interfaces' - in other words, systems that process the electrical signals linked with brain activity and translate them into a command that can be understood by machines," computer scientists Yann Renard and Laurent Bonnet said while demonstrating the software at the Department of Computer Sciences, IIT, here yesterday.

Such interfaces allow individuals to communicate with a computer or any automated system without using their hands or other movements to press a button or operate a remote control, they said.

"OpenViBE provides a tool that is aimed at a varied audience, from researchers and clinicians to video game developers," they said.

Asked whether writing by thought was now possible, the scientists replied in the affirmative.

A person wearing an EEG (electroencephalogram) cap focuses his or her attention on the letter to be spelled. When that letter flashes, a particular brain wave is generated, which is detected and interpreted by the machine, they said.
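To make the idea concrete, here is a minimal sketch in C++ (the language OpenViBE itself is written in) of how such a speller could score flashes. Everything here, the function names, the toy data and the fixed detection window, is invented for illustration and is not OpenViBE code; a real system would add signal filtering, artifact rejection and a trained classifier.

// Hypothetical illustration (C++17) of the flash-and-detect spelling idea:
// EEG epochs recorded after each flash are averaged per letter, and the
// letter whose flashes evoke the strongest response in a window around
// ~300 ms is selected. This is NOT OpenViBE code.
#include <cstddef>
#include <iostream>
#include <map>
#include <vector>

using Epoch = std::vector<double>;  // samples recorded after one flash

// Average all epochs recorded for one letter, sample by sample.
Epoch averageEpochs(const std::vector<Epoch>& epochs) {
    Epoch avg(epochs.front().size(), 0.0);
    for (const Epoch& e : epochs)
        for (std::size_t i = 0; i < e.size(); ++i)
            avg[i] += e[i] / epochs.size();
    return avg;
}

// Score an averaged epoch by its mean amplitude inside the window where
// the attention-related response is expected to appear.
double windowScore(const Epoch& avg, std::size_t winStart, std::size_t winEnd) {
    double sum = 0.0;
    for (std::size_t i = winStart; i < winEnd && i < avg.size(); ++i)
        sum += avg[i];
    return sum / static_cast<double>(winEnd - winStart);
}

// Pick the letter whose flashes evoked the strongest averaged response.
char detectLetter(const std::map<char, std::vector<Epoch>>& epochsByLetter,
                  std::size_t winStart, std::size_t winEnd) {
    char best = '?';
    double bestScore = -1e300;
    for (const auto& [letter, epochs] : epochsByLetter) {
        double s = windowScore(averageEpochs(epochs), winStart, winEnd);
        if (s > bestScore) { bestScore = s; best = letter; }
    }
    return best;
}

int main() {
    // Toy data: flashes of 'A' evoke a stronger deflection around sample 3.
    std::map<char, std::vector<Epoch>> data = {
        {'A', {{0, 0, 1, 5, 4, 1}, {0, 1, 2, 6, 3, 0}}},
        {'B', {{0, 0, 1, 1, 1, 0}, {0, 1, 0, 1, 0, 1}}}};
    std::cout << "Detected letter: " << detectLetter(data, 2, 5) << '\n';  // A
}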

Explaining further in technical terms, the scientists said, "OpenViBE is a series of software libraries and modules written in C++ that can be simply and effectively integrated in order to design real-time applications. Programmer users can develop their own code, while non-programmers can use the graphical interface."
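The description of modules that "can be simply and effectively integrated in order to design real-time applications" suggests a pipeline of reusable processing stages. The sketch below illustrates that general pattern only; the class and method names are invented for this newsletter and are not OpenViBE's actual API.

// Hypothetical illustration of a modular real-time processing chain: each
// "box" transforms a block of samples, and boxes are wired in sequence.
// The names are invented; this is not OpenViBE's API.
#include <cstddef>
#include <iostream>
#include <memory>
#include <vector>

// A processing box consumes a block of samples and emits a transformed block.
struct Box {
    virtual ~Box() = default;
    virtual std::vector<double> process(const std::vector<double>& in) = 0;
};

// Example box: a crude moving-average smoother standing in for a filter stage.
struct SmoothingBox : Box {
    std::vector<double> process(const std::vector<double>& in) override {
        std::vector<double> out(in.size(), 0.0);
        for (std::size_t i = 1; i + 1 < in.size(); ++i)
            out[i] = (in[i - 1] + in[i] + in[i + 1]) / 3.0;
        return out;
    }
};

// A pipeline runs each acquired block through its boxes in order.
struct Pipeline {
    std::vector<std::unique_ptr<Box>> boxes;
    std::vector<double> run(std::vector<double> block) {
        for (auto& b : boxes) block = b->process(block);
        return block;
    }
};

int main() {
    Pipeline p;
    p.boxes.push_back(std::make_unique<SmoothingBox>());
    for (double v : p.run({0, 4, 0, 4, 0, 4}))
        std::cout << v << ' ';
    std::cout << '\n';
}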

Brain-computer interfaces such as OpenViBE can also be used to assist people with motor disabilities, particularly entirely paralyzed persons with locked-in syndrome; in multimedia, such as video games and virtual reality; and, more generally, to facilitate any interaction with an automated system, such as robotics or home automation.

"They also open up possibilities to treat certain neurological problems, attention disorders, motor recovery after a stroke for example, through rehabilitation processes such as neuro feedback," the scientists said.

 

Music is all in the mind

 

Nature News, March 18, 2011

A brain–computer interface allows paralyzed patients to play music with brainpower alone.

 

 

Image caption: The brain-computer interface allows paralyzed patients to play music just by thinking about it. (ICCMR Research Team, University of Plymouth)

A pianist plays a series of notes, and the woman echoes them on a computerized music system. The woman then goes on to play a simple improvised melody over a looped backing track. It doesn't sound like much of a musical challenge — except that the woman is paralyzed after a stroke, and can make only eye, facial and slight head movements. She is making the music purely by thinking.

This is a trial of a computer-music system that interacts directly with the user's brain, by picking up the tiny electrical impulses of neurons. The device, developed by composer and computer-music specialist Eduardo Miranda of the University of Plymouth, UK, working with computer scientists at the University of Essex, should eventually help people with severe physical disabilities, caused by brain or spinal-cord injuries, for example, to make music for recreational or therapeutic purposes. The findings are published online in the journal Music and Medicine.

"This is an interesting avenue, and might be very useful for patients," says Rainer Goebel, a neuroscientist at Maastricht University in the Netherlands who works on brain-computer interfacing.

Therapeutic use

Evidence suggests that musical participation can be beneficial for people with neurodegenerative diseases such as dementia and Parkinson's disease. But people who have almost no muscle movement have generally been excluded from such benefits, and can enjoy music only through passive listening.

The development of brain–computer interfaces (BCIs) that enable users to control computer functions by mind alone offers new possibilities for such people. In general, these interfaces rely on the user's ability to learn how to self-induce particular mental states that can be detected by brain-scanning technologies.

Miranda and his colleagues have used one of the oldest of these systems: electroencephalography (EEG), in which electrodes on the scalp pick up faint neural signals. The EEG signal can be processed quickly, allowing fast response times, and the instrument is cheaper and more portable than brain-scanning techniques such as magnetic resonance imaging and positron-emission tomography.

Previous efforts using BCIs have focused on moving computer screen icons such as cursors, but Miranda's team sought to achieve the much more complex task of enabling users to play and compose music. Miranda says that he first became aware of the then-emerging field of BCIs more than a decade ago while researching how to make music using brainwaves. "When I realized the potential of a musical BCI for the wellbeing of severely disabled people," he says, "I couldn't leave the idea alone. Now I can't separate this work from my activities as a composer."

The trick is to teach the user how to associate particular brain signals with specific tasks by presenting a repeating stimulus — auditory, visual or tactile — and getting the user to focus on it. This elicits a distinctive, detectable pattern in the EEG signal. Miranda and his colleagues show several flashing 'buttons' on a computer screen, which each trigger a musical event. The users push a button just by directing their attention to it.

For example, a button could be used to generate a melody from a pre-selected set of notes. The user can alter the intensity of the control signal – how 'hard' the button is pressed – by varying the intensity of attention, and the result is fed back to them visually as a change in the button's size. In this way, any one of several notes can be selected by mentally altering the intensity of pressing.
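As a rough illustration of that intensity mapping, the hypothetical sketch below divides a normalized control signal into bands, with each band selecting a different note, much like pressing a key softly or firmly. The note choices and thresholds are invented for the example, not taken from the researchers' system.

// Hypothetical sketch: a normalized attention intensity (0.0 to 1.0) is
// split into equal bands, and each band selects a different note.
#include <array>
#include <cstddef>
#include <iostream>

char noteForIntensity(double intensity) {
    const std::array<char, 4> notes = {'C', 'E', 'G', 'B'};
    if (intensity < 0.0) intensity = 0.0;
    if (intensity > 1.0) intensity = 1.0;
    // Harder mental "presses" land in higher bands and pick higher notes.
    auto band = static_cast<std::size_t>(intensity * notes.size());
    if (band == notes.size()) band = notes.size() - 1;  // handle intensity == 1.0
    return notes[band];
}

int main() {
    for (double i : {0.1, 0.4, 0.6, 0.95})
        std::cout << "intensity " << i << " -> note "
                  << noteForIntensity(i) << '\n';
}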

With a little practice, this allows users to create a melody as if they were selecting keys on a piano. And, as with learning an instrument, say the researchers, "the more one practices the better one becomes".

Back in control

The researchers trialed their system on a female patient who has locked-in syndrome, a form of almost total paralysis caused by brain lesions, at the Royal Hospital for Neuro-disability in London. During a two-hour session, she got the hang of the system and was eventually playing along with a backing track. She reported that "it was great to be in control again".

Goebel points out that the patients still need to be able to control their eye movements, which people with total locked-in syndrome cannot. In such partial cases, he says, "one can usually use gaze directly for controlling devices, instead of an EEG system". But Miranda points out that eye-gazing alone does not permit variations in the intensity of the signal. "Eye gazing is comparable to a mouse or joystick," he says. "Our system adds another dimension, which is the intensity of the choice. That's crucial for our musical system."

Miranda says that although increasing the complexity of the musical tasks is not a priority, music therapists have suggested it would be better if the system were more like a musical instrument — for instance, with an interface that looks like a piano keyboard. He admits that it is not easy to raise the number of buttons or keys beyond four, but is confident that "we will get there eventually".

"The flashing thing does not need to be on a computer screen," he says. It could, for example, be a physical electronic keyboard with light-emitting diodes on the keys. "You could play it by staring at the keys," he says. 

 

911 Emergency Services for Individuals with Disabilities Survey Released

 

On March 16, 2011, the FCC's Emergency Access Advisory Committee (EAAC) released a national online survey to determine the most effective and efficient technologies and methods by which persons with disabilities may access Next Generation 9-1-1 emergency services systems. Among other things, the survey asks about accessing emergency services via video, text, and voice. The results of the survey will inform the EAAC as it develops recommendations for the FCC to draft rules to ensure that people with disabilities can access NG 9-1-1 services. The survey is available in English, Spanish, and American Sign Language (ASL) video.

This survey will be available until April 24, 2011. We encourage people with disabilities to complete the survey and to share it with others and with organizations that represent persons with disabilities.

 

Link to the survey in English and ASL:

http://fcc.eaac.sgizmo.com/s3

 

Link to the survey in Spanish:

http://fcc.eaac-es.sgizmo.com/s3

 
Alisa Brownlee, ATP
Clinical Manager, Assistive Technology Services
ALS (Lou Gehrig's Disease) Association National Office
 and Greater Philadelphia Chapter
Direct Phone: 215-631-1877