The power of thought


An amazing project may enable paralysed humans to walk again, with the help of an exoskeleton controlled directly by their thoughts. The first public demonstration will take place before hundreds of millions of television viewers when a young paraplegic kicks off the Football World Cup in Brazil.

Ninety years of brain-computer interfaces

1924 – Hans Berger, Friedrich Schiller University, Jena: Invention of electroencephalography (EEG) and the first recordings of electrical signals from the brain.

1963 – José Manuel Rodríguez Delgado, Yale University: An electrode is implanted in the brain of a bull, making it possible to control its movements with a remote control.

1978 – Graeme Clark, University of Melbourne: The first cochlear implant: a microphone worn behind the ear transmits sound to electrodes implanted in the cochlea that stimulate the auditory nerve. The technology was approved in the mid-1980s.

1978 – William Dobelle, Dobelle Institute, Lisbon: A camera affixed to a pair of glasses is connected to 68 electrodes implanted in the visual cortex of a blind patient, giving him the ability to sense light. Retinal implants based on a similar principle have been available since 2002.

1993 – Alim Louis Benabid, Université Joseph Fourier, Grenoble: Cerebral implants use electrical stimulation to reduce Parkinson’s-related trembling. The method is also used for brain-machine interface (BMI) research, in which the same electrodes are used to listen in on neurons.

1996 – Niels Birbaumer, University of Tübingen: Paralysed individuals use a simple EEG to control a cursor on a computer screen.

1998 – Phillip Kennedy, Emory University: A patient suffering from “locked-in syndrome” learns to move a cursor on a computer screen and thus communicate by selecting letters with the help of electrodes placed in his cortex.

1999 – Yang Dan, University of California, Berkeley: Scientists reconstitute images seen by cats from cerebral signals measured by implants.

1999 – Miguel Nicolelis, Duke University: Rats activate a water-delivery system via brain implants.

2000 – Miguel Nicolelis, Duke University: Brain signals of a monkey controlling a joystick, measured by an implant, are used to activate a robotic arm. In 2005, monkeys were able to control the arm without using the joystick.

2003 – Gert Pfurtscheller, Graz University of Technology: Tetraplegic patients move their paralysed arms using electrical stimulation controlled by an EEG.

2005 – John Donoghue, Brown University: A tetraplegic patient controls an artificial hand via his thoughts using a 96-electrode implant.

2008 – Andrew Schwartz, University of Pittsburgh: Monkeys feed themselves fruit using a robotic arm controlled by an implant. In 2012, a paralysed patient was able to feed herself using an improved arm that could reproduce 30 movements.

2008 – Miguel Nicolelis and Gordon Cheng, Duke University and JSTA: A monkey in the U.S. used thoughts, transmitted over the Internet, to control the legs of a robot in Japan.

2009 – Commercial products: The Mindflex game levitates a foam ball with the help of a fan controlled by EEG. The Emotiv company markets an EEG helmet for use in video games.

2012 – José del R. Millán, École Polytechnique Fédérale de Lausanne: A partially tetraplegic patient equipped with an EEG helmet uses his thoughts to move a wheeled robot located in another city.

2012 – Miguel Nicolelis, Duke University: A rat senses infrared light via a camera and implants. One year later, rats exchange sensations via implants in each of their brains.

2014 – Ziv Williams, Harvard University: A monkey makes the hand of another primate move using electrodes implanted in its brain and spinal column.

Eight Brazilians between the ages of 20 and 34 are seated in a room whose walls, floor and ceiling are covered with screens. They are wearing 3D glasses and helmets that record their brain waves.

Since December they have been practising diligently in a virtual immersion room called CAVE (Cave Automatic Virtual Environment, a reference to Plato’s famous cave).

They are trying to use their thoughts to control the movements of a virtual avatar. Their brain waves, recorded through electroencephalography (EEG), are analysed by computer algorithms that are learning to recognise their intentions: get up, move, and kick a ball. None of these men and women can do these things in real life; they are paraplegic.

If all goes as expected, one of them will kick off the Football World Cup in São Paulo on June 12th. With the help of an exoskeleton, he or she will rise out of a wheelchair, walk to the centre of the pitch and kick the football. To do this he or she will have to visualise the movements mentally; a computer in a backpack will decode the motor signals emitted by the brain and instruct the exoskeleton to carry out the desired action.
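The article does not describe the project's actual decoding pipeline, but the general principle of EEG motor-imagery decoding can be sketched. The Python toy below is purely illustrative (the function names, frequency bands and threshold rule are assumptions, not the Walk Again Project's algorithm): motor imagery suppresses the mu rhythm, roughly 8–12 Hz over the motor cortex, so a decoder can compare band power to infer intent.

```python
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Mean power of an EEG signal within a frequency band, via FFT."""
    freqs = np.fft.rfftfreq(eeg.shape[-1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[..., mask].mean(axis=-1)

def decode_intention(eeg_window, fs=250):
    """Toy decoder: imagining movement suppresses the mu rhythm (8-12 Hz),
    so low mu power relative to the beta band is read as 'walk'."""
    mu = band_power(eeg_window, fs, 8, 12)
    beta = band_power(eeg_window, fs, 18, 30)
    return "walk" if mu < beta else "rest"
```

A real system would use many channels, spatial filtering and a trained classifier; this merely shows how a frequency-domain feature can be turned into a discrete command.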

This is the ambition of the Walk Again Project, an international consortium made up of six university research centres.

“He’ll feel like he’s moving his own legs again,” says project leader Miguel Nicolelis, a pioneer in human-machine interfaces who is a professor at both Duke University and EPFL, as well as a co-founder of a research institute in the northeastern Brazilian city of Natal.

It will be the first time such an experiment is conducted outside the controlled environment of the laboratory – and it will be a challenge for the technology. “The stadium will be full of screaming fans and flashing cameras,” says Nicolelis. “We’ve simulated this environment in the virtual reality chamber, but it remains a challenge.” An additional concern is magnetic noise from all the TV cameras.

Feel the machine

Gordon Cheng, a member of the consortium who is affiliated with the Technische Universität München, has developed an “artificial skin” made up of interconnected sensors that are dispersed over the exoskeleton’s surface. “The sensors will send the information to a tactile display located on the shoulder or neck of the subject,” he explains. “He’ll be able to feel impact with the ground and sense vibrations or temperature changes.” The resolution of the sensors is such that “the subject will know when the exoskeleton’s foot strikes the ball,” adds Nicolelis.
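Cheng's description implies a mapping from the exoskeleton's skin sensors to vibration intensities on the wearable tactile display. As a purely illustrative sketch, under assumed units and names not drawn from the actual system:

```python
def tactile_frame(forces, max_force=50.0):
    """Map foot-sensor force readings (hypothetical 0-50 N scale) to
    vibration intensities in [0, 1] for a wearable tactile display."""
    return [min(max(f / max_force, 0.0), 1.0) for f in forces]
```

For example, readings of 0, 25 and 100 N would drive the display's vibrators at 0%, 50% and (saturated) 100% intensity.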

This feeling of touch is crucial for providing balance and allows the subject to control the exoskeleton. “Having feedback makes it easier to ‘drive’ the exoskeleton. Otherwise, the brain lacks sensory information and a sense of where it is in space. This makes it difficult to will it to move and could even lead to motion sickness for the subject,” explains Cheng.

The plastic brain

One of the difficulties with brain-computer interfaces (BCIs) is that the user must stay focussed. The EEG technique demands sustained, intense concentration; if the subject tires, his gestures become less precise and he may make an involuntary movement.

Fortunately, the more the interface is used, the more efficient it becomes. “The plasticity of the brain is huge. It adapts to the machine,” says Cheng. “We’ve seen it with healthy subjects, who were able to learn to control a robot within a few weeks. It becomes almost an extension of their own bodies. They no longer think about it.” The Walk Again Project has made it possible to demonstrate this plasticity. “We were able to establish that the brain underwent substantial reorganisation during the trial,” says Nicolelis.

Meanwhile, the machine too is learning. “Computer algorithms also adapt to brain activity. This produces more fluid man-machine communication,” notes Ziv Williams, a BCI expert from Harvard University. “The exoskeleton at the World Cup will even be able to detect if the human inside it is stressed, uncomfortable or needs a break,” says Cheng.

An electrode in the brain

The Sao Paulo demonstration is only the beginning. “The next step will be to equip the exoskeleton with arms and hands and to obtain a better and more detailed reading of brain signals by using implants,” says Nicolelis.

An electrode implanted in the motor cortex, the brain region that controls the muscles, will replace the exterior EEG electrodes. Although this device will require surgery, its proximity to neurons will deliver a much more precise signal. “This will allow it to reconstruct more complex movements and even different movements simultaneously, like grabbing and manipulating an object,” says Nicolelis.

“In the future, we hope to develop other applications for patients with severe neurological problems following a stroke or with degenerative disorders that prevent muscle movement, like ALS (amyotrophic lateral sclerosis).”

Such implants have already shown their potential in a series of spectacular experiments conducted on rodents and monkeys (see Ninety years of brain-computer interfaces). Primates, for example, needed only a few weeks to learn to guide a robotic arm or a walking robot by using their thoughts. A few rare tests were also done using humans, notably in cases of “locked-in” syndrome.

Despite their promise, these new techniques are not yet precise enough. “You still can’t reliably decode the intentions of a subject over the long term,” says Grégoire Courtine, a professor in EPFL’s Centre for Neuroprosthetics who is working on spinal cord regeneration using electrical stimulation. “Imagine, for example, that a person who wants to use a robotic arm to drink from a glass gets scared, and the arm shoves the glass into his face instead.” One problem, according to University of Chicago scientist Nicho Hatsopoulos, is that “millions of signals travel down the spinal cord from the brain to control a limb, but we are able to record only about 100 now.” Hatsopoulos pioneered cerebral implants in paralysed patients a decade ago.

A metallic superman

Human-machine interfaces are currently being used only for research or to restore lost functions following an accident or illness. But in future they could be used to transcend the limits of the body and “enhance” the human being. “One day, we might be able to control computers, cars, home appliances or ships remotely,” says Nicolelis. Industry is already heading in that direction: Samsung and Intel are testing smartphones and computers that can be controlled with EEG helmets. As early as 2004 Larry Page, co-founder of Google, declared that, “eventually [Google] wants to get into your brain. When you think about something and don’t really know much about it, you’ll automatically get information.”


“The plasticity of the brain is huge. It adapts to the machine.” Gordon Cheng of the Technische Universität München next to a walking robot. He led the Walk Again Project’s robotic research. Photo: Astrid Eckert


Even more exciting, brain implants could be used to augment human sensory capacity, as was demonstrated in an experiment Nicolelis conducted in 2012. Thanks to an implant in their somatosensory cortex that was linked to an infrared detector, rats could sense infrared light that is usually invisible to them.

In another head-spinning advance, Nicolelis recently connected the brains of two rats by cortical implants. “We were able to transmit information from one brain to another,” he says. One rat learned to accurately interpret the sensations experienced by the other – seeing light and feeling pressure on his whiskers – a form of telepathy.

“We are in the infancy of brain-to-machine interfaces,” says Nicolelis. If all goes as planned at the World Cup, the paraplegic who gets up from his wheelchair could appropriate the words of astronaut Neil Armstrong when he became the first human to walk on the moon: “One small step for [a] man, one giant leap for mankind.”

By Julie Zaugg

Technologist 01.023
