Meet a technologist: Gordon Cheng – pioneering mind-controlled prosthetics


When a paraplegic man wearing a mind-controlled robotic suit, known as an exoskeleton, kicked off the 2014 soccer World Cup in Brazil, robotics engineer Gordon Cheng played a key role in this global premiere.

Photo of Professor Gordon Cheng, with various small robots in the background.

Gordon Cheng’s eyes shine with delight when the stadium in São Paulo, Brazil, appears in the huge digital photo frame in his office at Technische Universität München (TUM), Germany. The frame shows over a thousand images from more than four years’ work. But this particular image is something special. It is ‘the’ picture for the professor of cognitive systems.

And June 12, 2014, the day of the opening ceremony of the soccer World Cup in Brazil’s São Paulo, is a day Cheng will never forget. “Watching Juliano Pinto kick that World Cup ball was a fantastic and hugely satisfying feeling,” he explains with a smile.

In itself, kicking a soccer ball is nothing special. However, 29-year-old Pinto has been paralysed for years. The fact that he was able to independently kick the ball despite his paralysis is all down to the exoskeleton that he was wearing and controlling with his mind.

A robotic suit with built-in feedback

Illustration of a person wearing a robotic suit.

Wearing a robotic suit. The exoskeleton, which supports its own weight, is made of titanium, aluminium and steel. It features numerous 3D-printed plastic parts, robotic muscles that can be adjusted via hydraulic systems, an integrated computer and built-in stabilising gyros. It is controlled via brain waves registered non-invasively by an electrode cap. Some body parts are equipped with artificial skin containing various sensors to replicate the feeling of movement. (Illustration: ediundsepp Gestaltungsgesellschaft München.)

The exoskeleton is a kind of robotic suit, partly covered by an artificial skin that transmits tactile signals to the user.

“The skin provides the user with feedback, telling them indirectly, for example, that the robotic leg – and therefore the patient’s own leg – has touched the ground,” adds Cheng.

The mind-controlled exoskeleton is the product of many years of international collaboration between experts in neuroscience and cognitive technology.

Cheng has been a key player in this initiative along with project leader Miguel Nicolelis at Duke University in Durham, North Carolina, US.

Cheng came to Munich to give a lecture just over four years ago. Impressed by his work, TUM offered him a professorship in cognitive systems.

He was very tempted by the offer but would not commit without consulting his wife – who said yes.

The university created a new chair for Cheng, the Institute for Cognitive Systems, which had a core goal of creating an artificial skin for robots.

Patients, preschoolers and prime ministers

Cheng was unfamiliar with the German system, so he had no idea how much work was waiting for him in Munich. He started from scratch with just eight employees.

Later, in the critical phase of the project, he flew back and forth between Munich and São Paulo to tailor the exoskeleton and artificial skin to the patients’ requirements. It was a herculean task to have everything ready by June 12, 2014.

Today his team has more than doubled, with new people joining all the time. Yet Cheng still has time for visitors – whether it’s preschoolers or school children looking to find out more about science, or the French prime minister and his entourage. He can’t help smiling when he talks about that visit, when more than a dozen limousines and two buses full of people brought Karlsstraße in Munich to a standstill.

“We like what we are doing and are always happy to show it to other people. Luckily, we didn’t have to deal with the chaos caused by our guests.”

Humanoid robots and neuroscience

Cheng is a true globetrotter. He was born and spent his first years in Macau, when it was still a Portuguese colony. Later, he went on to study information sciences and complete a PhD in systems engineering in Australia.

From 2003 to 2008, Cheng founded and headed up the Department for Humanoid Robotics and Computational Neuroscience at the Institute for Advanced Telecommunications Research in Kyoto, Japan. During this time, he was also responsible for the neuroscientific “Computational Brain” project.

It was during a symposium in Kyoto that he met Miguel Nicolelis. Both scientists were impressed with each other’s work. Before taking the professorship in Munich in 2010, Cheng worked as a visiting professor in the US and France, and also at the Edmond and Lily Safra International Institute of Neuroscience in Natal, Brazil.

The institute was founded by Nicolelis in 2005. He specifically chose to locate it in one of the poorest regions in Brazil. In addition to the actual research institute, the site also houses a clinic that offers free prenatal check-ups and a science school for 1,500 children. Cheng taught at the school for one month.

The first step towards an exoskeleton

Cheng and Nicolelis started initial trials for the Walk Again Project back in 2008 in North Carolina. The first steps were made by a female rhesus macaque called Idoya. Nicolelis and his team implanted electrodes into her brain. As soon as Idoya was able to walk upright on a treadmill, the team recorded the signals emitted by her brain and mapped them against slow-motion recordings of her movements. The scientists were able to use this data to identify the commands associated with leg movement.

The commands from the monkey’s brain were transmitted in real time to Japan. There, Cheng fed them into a humanoid robot, which started to imitate Idoya’s steps. The robot’s leg movements were then played back live to Idoya, who learned to control and improve them. When Nicolelis’s treadmill stopped, Idoya also stopped moving, but kept her eyes firmly fixed on the monitor. The robot in Japan kept walking for another three minutes – and only Idoya herself could have been driving it. She regarded the robot’s movements as her own and was able to control its steps using her mind alone.

“This was possible only because the robot had become part of her body schema,” Cheng enthuses, still impressed by the rhesus macaque’s intelligence. “When we drive a car, the car becomes an extension of our body schema. When we eat, it’s the knife, fork or chopsticks. It’s the same with the exoskeleton.”
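The long-distance loop in that experiment, with decoded movement commands streamed from the lab in North Carolina to a robot in Japan whose steps were then shown back to Idoya, can be pictured as a simple sender and receiver pair. The sketch below is purely illustrative: the command format, the hostname and port, and the apply_step placeholder are assumptions, not the project’s actual software.

```python
import json
import socket
import time

# Hypothetical decoder output: one walking command per time step,
# e.g. {"t": 0.02, "leg": "left", "phase": "swing", "speed": 0.8}.
def stream_commands(commands, host="robot.example.org", port=9000, rate_hz=50):
    """Send decoded leg-movement commands to a remote robot in near real time."""
    with socket.create_connection((host, port)) as sock:
        for cmd in commands:
            sock.sendall((json.dumps(cmd) + "\n").encode("utf-8"))
            time.sleep(1.0 / rate_hz)  # pace the stream at the decoder's output rate

def receive_commands(conn, apply_step):
    """Robot side: parse one command per line and hand it to the walking
    controller (apply_step is a placeholder for that controller)."""
    buffer = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            apply_step(json.loads(line))
```

A newline-delimited JSON stream is used here only because it keeps the example self-contained; any low-latency transport would serve the same purpose.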

Mind and body working together

Eight paralysed Brazilians aged between 20 and 40 made it through to the final selection for the Walk Again Project in São Paulo. For these eight people, the exoskeleton became a part of their body. But it took a long time to learn how to deal with the new technology.

Photo of Gordon Cheng and research assistant Philipp Mittendorfer watching a video recording from Brazil, where eight paralysed people were trained in moving using an exoskeleton.

Fantastic footwork by an exoskeleton. Since January 2014, the Walk Again Project has trained eight paralysed people in Brazil to move with the exoskeleton. One of them used the suit to take the World Cup’s first kick, powered by nothing but thought and supported by innovative technology. Lead robotics engineer Gordon Cheng and research assistant Philipp Mittendorfer watched the training sessions and worked continuously to improve the system. (Photo: A. Eckert/TUM.)

The learning process started on a treadmill. The participants were placed in a harness and their legs supported by a frame. Their brain activity was measured non-invasively at the scalp using electrode caps and electroencephalography (EEG). The mechanical frame moved the patients’ legs through the walking motion. While this was happening, the participants were asked to concentrate hard on wanting to walk. The resulting brain-wave pattern was similar in all eight patients.

In the next phase, the patients had to wear the electrode cap and start training with a virtual computer simulation. Instead of their own legs, the participants saw animated legs below their waists. Whenever they imagined walking, standing or kicking, their virtual legs performed exactly those movements.

“The patients were more or less learning a new language. We can do this only because our brains are unbelievably adaptable,” Cheng explains.

From January 2014, the participants started training using an exoskeleton with an integrated computer. The signals from the brain are translated into specific commands that the computer sends to the exoskeleton. “An incredible amount of data was collected, decoded and translated into movement commands for the exoskeleton. This all had to be categorised and the computer programmed correspondingly,” Cheng continues.
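At its core, this translation step is a classification problem: short windows of multi-channel EEG are reduced to a handful of features and mapped to a small vocabulary of commands. The sketch below illustrates that idea with scikit-learn; the sampling rate, band-power features and command labels are assumptions made for the example, not the Walk Again Project’s actual decoder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def band_power_features(window, fs=250):
    """Rough per-channel band-power features for one EEG window
    of shape (n_channels, n_samples), sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
    bands = [(8, 13), (13, 30)]  # mu and beta bands, commonly used in motor imagery
    feats = [spectrum[:, (freqs >= lo) & (freqs < hi)].mean(axis=1) for lo, hi in bands]
    return np.concatenate(feats)

def train_decoder(windows, labels):
    """windows: list of EEG windows; labels: the command imagined in each
    window, e.g. "stand", "walk" or "kick" (illustrative vocabulary)."""
    X = np.array([band_power_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X, labels)
    return clf

def decode(clf, window):
    """Map one new EEG window to a command string for the exoskeleton."""
    return clf.predict([band_power_features(window)])[0]
```

In practice a decoder must also handle artefacts and per-patient calibration, which is part of why so much data had to be collected, categorised and programmed into the computer.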

Artificial skin stabilises exoskeleton

Photo of a human hand touching the artificial skin of a robot part

CellulARSkin, the artificial skin designed to cover robot parts, consists of several hexagonally shaped unit cells. Each hexagonal cell comprises four sensors that measure pressure, proximity, temperature, and vibration as indicators of touch. The interconnected cells pass electrical signals via varied pathways through the entire skin to and from the central computer. (Photo: A. Eckert/TUM.)

As far back as 2008, however, the researchers realised that something was missing.

The participants needed continuous feedback on the current position of the robotic legs – and, during treadmill training, of the virtual legs. They had to know what movement the robot was making at all times; otherwise the exoskeleton could fall over.

What they needed were sensory receptors. And this is where CellulARSkin comes in. The artificial skin was developed by scientists in Munich to provide tactile sensations.

It comprises a large number of hexagonal printed circuit boards, each approximately the size of a two-euro coin. Each one features an energy-saving microprocessor and sensors that detect changes in speed, temperature and touch – and that sense proximity to other objects in three-dimensional space, for example when the skin is close to or moving away from the ground.
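That description maps naturally onto a simple data model: each hexagonal cell reports a few sensor values, and the controller aggregates them, for example to decide whether a foot has touched the ground. The sketch below is only an illustration; the field names, thresholds and the foot_has_ground_contact rule are assumptions, not CellulARSkin’s real interface.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SkinCellReading:
    """One sample from a single hexagonal skin cell (illustrative fields)."""
    cell_id: int
    pressure: float      # normal force on the cell, arbitrary units
    proximity: float     # nearness of an external object, 0.0 (far) to 1.0 (touching)
    temperature: float   # degrees Celsius
    vibration: float     # high-frequency acceleration magnitude

def foot_has_ground_contact(readings: List[SkinCellReading],
                            pressure_threshold: float = 0.2,
                            min_cells: int = 3) -> bool:
    """Hypothetical feedback rule: the foot counts as 'on the ground'
    once enough sole cells report pressure above a threshold."""
    loaded = [r for r in readings if r.pressure > pressure_threshold]
    return len(loaded) >= min_cells
```

The caption above notes that the cells relay their signals to the central computer via varied pathways through the skin; the example leaves that networking out for brevity.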

The artificial skin enabled the participants’ brains to learn more effectively while moving on the treadmill and in the exoskeleton.

Affordable technology

For Cheng, June 12 was not the end of his work, but just the beginning. “We still have a long way to go,” he adds. And he has many new ideas to try out. “We will be improving the exoskeleton even more and harnessing new technologies to bring down production costs.”

Cheng would like to see the exoskeleton used by as many paralysed people as possible. He also believes it can help patients with other movement disorders. “Science can really give something back to society here.”

It’s hard to see how Cheng finds the time to watch football with his eight-year-old daughter, an ardent fan of FC Bayern and goalkeeper Manuel Neuer. But he is a pioneer in the truest sense of the word: “It’s something completely new for me,” he says. And new is something he is always willing to try.

Adapted from the article ‘Exoskeleton enables paraplegic man to walk’ by Gerlinde Felix in TUM’s magazine Faszination Forschung

Related reading: The power of thought
