The power of thought

Eight Brazilians between the ages of 20 and 34 are seated in a room whose walls, floor and ceiling are covered with screens. They are wearing 3D glasses and helmets that record their brain waves.

Since December they have been practising diligently in a virtual immersion room called CAVE (Cave Automatic Virtual Environment, a reference to Plato’s famous cave). They are trying to use their thoughts to control the movements of a virtual avatar. Their brain waves, recorded through electroencephalography (EEG), are analysed by computer algorithms that are learning to recognise their intentions: get up, move, and kick a ball. None of these men and women can do these things in real life; they are paraplegic.
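
The article does not describe the decoding software itself, but the usual recipe behind this kind of EEG intention recognition is to extract frequency-band power from each electrode and train a classifier on labelled practice trials. The Python sketch below is purely illustrative: the class labels, channel count, sampling rate and data are all invented, and it is not the Walk Again Project's actual pipeline.

    # Illustrative only: a generic motor-imagery decoder, not the project's software.
    import numpy as np
    from scipy.signal import welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 250                                        # EEG sampling rate in Hz (assumed)
    CLASSES = ["rest", "stand_up", "walk", "kick"]  # hypothetical intention labels

    def band_power(epoch, fs=FS, band=(8, 30)):
        """Mean power in the 8-30 Hz (mu/beta) band for each EEG channel."""
        freqs, psd = welch(epoch, fs=fs, nperseg=fs)      # epoch: (channels, samples)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[:, mask].mean(axis=1)

    # Synthetic stand-in for recorded practice trials: 200 epochs, 16 channels, 2 s each.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, len(CLASSES), size=200)
    epochs = rng.standard_normal((200, 16, 2 * FS)) + 0.05 * labels[:, None, None]

    features = np.array([band_power(e) for e in epochs])
    decoder = LinearDiscriminantAnalysis().fit(features, labels)

    new_epoch = rng.standard_normal((16, 2 * FS))
    print("decoded intention:", CLASSES[int(decoder.predict(band_power(new_epoch)[None])[0])])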

If all goes as expected, one of them will kick off the Football World Cup in São Paulo on June 12th. With the help of an exoskeleton, he or she will rise out of a wheelchair, walk to the centre of the pitch and kick the football. To do this, he or she will have to visualise the movements; a computer in a backpack will decode the motor signals emitted by the brain and instruct the exoskeleton to carry out the desired action.
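
The article does not say how the backpack computer talks to the exoskeleton; the skeleton below merely sketches the general shape of such a decode-and-command loop. The exoskeleton interface and the update rate are hypothetical.

    # Illustrative skeleton of a decode-and-command loop; the exoskeleton methods
    # (hold_position, stand, step_forward, kick) are entirely hypothetical.
    import time

    COMMANDS = {
        "rest":     lambda exo: exo.hold_position(),
        "stand_up": lambda exo: exo.stand(),
        "walk":     lambda exo: exo.step_forward(),
        "kick":     lambda exo: exo.kick(),
    }

    def control_loop(read_eeg_window, decode, exo, period_s=0.2):
        """Repeatedly decode the latest EEG window and forward a command."""
        while True:
            window = read_eeg_window()                       # most recent EEG samples
            intention = decode(window)                       # e.g. a classifier's output label
            COMMANDS.get(intention, COMMANDS["rest"])(exo)   # unknown output -> hold still
            time.sleep(period_s)                             # fixed update rate (assumed)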

This is the ambition of the Walk Again Project, an international consortium made up of six university research centres. “He’ll feel like he’s moving his own legs again,” says project leader Miguel Nicolelis, a pioneer in human-machine interfaces who is a professor at both Duke University and EPFL, as well as a co-founder of a research institute in the northeastern Brazilian city of Natal. It will be the first time such an experiment has been conducted outside the controlled environment of the laboratory, and it will put the technology to the test. “The stadium will be full of screaming fans and flashing cameras,” says Nicolelis. “We’ve simulated this environment in the virtual reality chamber, but it remains a challenge.” An additional concern is electromagnetic noise from all the TV cameras.

Feel the machine

Gordon Cheng, a member of the consortium who is affiliated with the Technische Universität München, has developed an “artificial skin” made up of interconnected sensors that are dispersed over the exoskeleton’s surface. “The sensors will send the information to a tactile display located on the shoulder or neck of the subject,” he explains. “He’ll be able to feel impact with the ground and sense vibrations or temperature changes.” The resolution of the sensors is such that “the subject will know when the exoskeleton’s foot strikes the ball,” adds Nicolelis.
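
Cheng does not spell out how the skin's readings are turned into vibrations. One plausible scheme, sketched below with invented sensor and tactor counts, is to pool many pressure cells onto a handful of vibration motors and scale the result to motor drive levels; this is an illustration, not the consortium's design.

    # Illustrative sketch: pooling artificial-skin pressure cells onto a small
    # vibrotactile display worn on the shoulder or neck. All numbers are assumptions.
    import numpy as np

    N_SKIN_SENSORS = 64   # pressure cells on the exoskeleton's foot (assumed)
    N_TACTORS = 8         # vibration motors in the wearable display (assumed)

    def skin_to_tactors(pressures):
        """Pool groups of skin cells onto the coarser tactor array and scale
        the result to PWM duty cycles (0-255) for the vibration motors."""
        pooled = pressures.reshape(N_TACTORS, -1).mean(axis=1)   # spatial down-sampling
        peak = pooled.max()
        normalised = pooled / peak if peak > 0 else pooled
        return (np.clip(normalised, 0.0, 1.0) * 255).astype(np.uint8)

    # Example: a heel strike registers mostly on the first few cells.
    reading = np.zeros(N_SKIN_SENSORS)
    reading[:8] = [5.0, 4.2, 3.9, 2.0, 1.1, 0.5, 0.2, 0.1]      # arbitrary pressures
    print(skin_to_tactors(reading))                             # strongest vibration on tactor 0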

This feeling of touch is crucial for providing balance and allows the subject to control the exoskeleton. “Having feedback makes it easier to ‘drive’ the exoskeleton. Otherwise, the brain lacks sensory information and a sense of where it is in space. This makes it difficult to will it to move and could even lead to motion sickness for the subject,” explains Cheng.

The plastic brain

One of the difficulties with brain-computer interfaces (BCIs) is that the user must stay focused. The EEG technique demands sustained concentration from the subject: if he or she tires, the gestures become less precise and an involuntary movement becomes more likely.

Fortunately, the more the interface is used, the more efficient it becomes. “The plasticity of the brain is huge. It adapts to the machine,” says Cheng. “We’ve seen it with healthy subjects, who were able to learn to control a robot within a few weeks. It becomes almost an extension of their own bodies. They no longer think about it.” The Walk Again Project has made it possible to demonstrate this plasticity. “We were able to establish that the brain underwent substantial reorganisation during the trial,” says Nicolelis.

Meanwhile, the machine too is learning. “Computer algorithms also adapt to brain activity. This produces more fluid man-machine communication,” notes Ziv Williams, a BCI expert from Harvard University. “The exoskeleton at the World Cup will even be able to detect if the human inside it is stressed, uncomfortable or needs a break,” says Cheng.
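
Neither Williams nor Cheng names the algorithms involved. One common way for a decoder to keep adapting to its user is incremental (online) learning, sketched here with scikit-learn's SGDClassifier on synthetic features; everything in the sketch is an assumption made for illustration.

    # Illustrative only: a decoder that keeps adapting to the user through online updates.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(1)
    classes = np.array([0, 1, 2, 3])          # rest / stand / walk / kick (hypothetical codes)

    decoder = SGDClassifier()
    # Initial calibration batch: 50 trials of 16 band-power features each (synthetic).
    X0, y0 = rng.standard_normal((50, 16)), rng.choice(classes, 50)
    decoder.partial_fit(X0, y0, classes=classes)

    # During use, each confirmed trial nudges the decoder toward the user's current
    # brain activity, so human and machine gradually adapt to each other.
    for _ in range(200):
        x_new = rng.standard_normal((1, 16))  # latest trial's features
        y_new = rng.choice(classes, 1)        # label confirmed during supervised practice
        decoder.partial_fit(x_new, y_new)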


An electrode in the brain

The São Paulo demonstration is only the beginning. “The next step will be to equip the exoskeleton with arms and hands and to obtain a better and more detailed reading of brain signals by using implants,” says Nicolelis. An electrode implanted in the motor cortex, the brain region that controls the muscles, will replace the external EEG electrodes. Although this device will require surgery, its proximity to neurons will deliver a much more precise signal. “This will allow it to reconstruct more complex movements and even different movements simultaneously, like grabbing and manipulating an object,” says Nicolelis. “In the future, we hope to develop other applications for patients with severe neurological problems following a stroke or with degenerative disorders that prevent muscle movement, like ALS (amyotrophic lateral sclerosis).”
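
The article does not explain how the implanted signals will be decoded. A standard textbook approach is to regress continuous limb kinematics onto the binned firing rates of the recorded neurons; the sketch below does this with a simple ridge regression on synthetic data, whereas real systems often use Kalman filters or neural networks, and every number here is an assumption.

    # Illustrative sketch: decoding 2-D hand velocity from the firing rates of a
    # hypothetical 96-channel cortical implant. All data are synthetic.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(2)
    n_neurons, n_samples = 96, 5000

    # Synthetic stand-ins: a smooth 2-D velocity trace and noisy, tuned firing rates.
    velocity = np.cumsum(rng.standard_normal((n_samples, 2)), axis=0) * 0.01
    tuning = rng.standard_normal((2, n_neurons))
    rates = np.clip(velocity @ tuning + rng.standard_normal((n_samples, n_neurons)), 0, None)

    decoder = Ridge(alpha=1.0).fit(rates[:4000], velocity[:4000])
    print("held-out R^2:", decoder.score(rates[4000:], velocity[4000:]))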

Such implants have already shown their potential in a series of spectacular experiments conducted on rodents and monkeys (see Ninety years of brain-computer interfaces p. 26). Primates, for example, needed only a few weeks to learn to guide a robotic arm or a walking robot by using their thoughts. A few rare tests were also done using humans, notably in cases of “locked-in” syndrome.

Despite their promise, these new techniques are not yet precise enough. “You still can’t reliably decode the intentions of a subject over the long term,” says Grégoire Courtine, a professor in EPFL’s Centre for Neuroprosthetics who is working on regenerating the spinal cord using electrical stimulation. “Imagine, for example, that a person who wants to use a robotic arm to drink from a glass gets scared, and the arm shoves the glass into his face instead.” One problem, according to University of Chicago scientist Nicho Hatsopoulos, is that “millions of signals travel down the spinal cord from the brain to control a limb, but we are able to record only about 100 now.” Hatsopoulos pioneered cerebral implants in paralysed patients a decade ago.

A metallic superman

Human-machine interfaces are currently being used only for research or to restore lost functions following an accident or illness. But in future they could be used to transcend the limits of the body and “enhance” the human being. “One day, we might be able to control computers, cars, home appliances or ships remotely,” says Nicolelis. Industry is already heading in that direction: Samsung and Intel are testing smartphones and computers that can be controlled with EEG helmets. As early as 2004, Google co-founder Larry Page declared that the company eventually wants to get into your brain: “When you think about something and don’t really know much about it, you’ll automatically get information.”

Even more exciting, brain implants could be used to augment human sensory capacity, as was demonstrated in an experiment Nicolelis conducted in 2012. Thanks to an implant in their somatosensory cortex that was linked to an infrared detector, rats could sense infrared light that is usually invisible to them.

In another head-spinning advance, Nicolelis recently connected the brains of two rats via cortical implants. “We were able to transmit information from one brain to another,” he says. One rat learned to accurately interpret the sensations experienced by the other – seeing light and feeling pressure on its whiskers – a form of telepathy.

“We are in the infancy of brain-to-machine interfaces,” says Nicolelis. If all goes as planned at the World Cup, the paraplegic who gets up from his wheelchair could appropriate the words of astronaut Neil Armstrong when he became the first human to walk on the moon: “One small step for man, one giant leap for mankind.”

