Robots vs Jobs: A new social contract


The looming confrontation


Editorial

A universal basic income would mitigate the negative effects of automation, but it might be more effective if combined with apprenticeships.


Invasion of the job snatchers

Make no mistake, the intelligent machines of the Fourth Industrial Revolution will lay waste to human employment – unless governments act.

I recently spent a few hours in the future. At least, that’s what it felt like. I got to follow one of the first robots I’ve ever seen operating in the wild, outside a lab, as it trundled around London, mapping the streets of Greenwich where it will soon work for its living. Designed by Starship Technologies of Tallinn, Estonia, the robot is a decidedly cute, black-and-white coolbox riding on six wheels, and it will soon be autonomously collecting takeaway food from fast-food restaurants and delivering it to homes. Once it arrives, you simply unlock its lid with an authenticating phone app – and remove your pizza, curry or sushi.

It was as intriguing to watch people’s reactions to the robot as it was to watch the droid itself stopping obediently at pedestrian crossings, allowing cars to pass, and giving way politely to people on the sidewalk. Adults who came across the whirring robot seemed wary of it, but children and younger people, who are likely to grow up with this technology as the norm, didn’t bat an eyelid and greeted the droid with a cheery “hello!” as it passed them – as if WALL-E had turned up with their dinner.

On top of ongoing pre-service tests in London and Tallinn, Starship is also testing its robots with food delivery partners in Bern, Düsseldorf and Hamburg. But, fascinating as the little droid is, it is surely time to consider the impact on all the food-delivery people these robots will soon be putting out of work. With its wage-free, 24/7 availability a compelling attraction to fast-food and last-mile grocery delivery entrepreneurs alike, it’s the kind of technology whose impact could be profound.

That’s because Starship is far from alone in what it is doing. Similar indoor delivery robots are also currently being unleashed to bring room-service orders to hotel guests in Asia, spelling the end for many hospitality jobs. At the same time, retailer Ocado Technology of Krakow, Poland, and the École Polytechnique Fédérale de Lausanne (EPFL) are co-developing – with EU funding – a humanoid robot called SecondHands that can act as a smart, dexterous and of course tireless maintenance worker.

Millions of jobs

But at what cost to jobs? Non-governmental organisations like the World Economic Forum (WEF), business consultancies like PricewaterhouseCoopers (PwC) and financial institutions like the Bank of England have all issued deeply concerning reports predicting that AI-fuelled automation could take away millions of jobs in coming decades.

PwC estimates that in Germany 35% of jobs will be susceptible to replacement by AI algorithms, robots and automated systems by the early 2030s, with the US at 38%, the UK at 30% and Japan at 21%. But PwC cautions that such jobs may change rather than disappear altogether, probably because it is hard to be certain where AI tech is going. No one knows what machine intelligence will be capable of in six months, let alone in a decade.

In regions of intense, low-wage manufacturing like Asia, AI-powered automation may have an even greater impact, with some 50 million jobs said to be at risk over the next two decades, according to a study by UBS. China will bear the brunt, taking 15 million of those losses alone. UBS says the trick will be to construct a strong services sector in which human jobs can flourish. It notes that Singapore, Hong Kong and India have already done just that. “Governments should start to focus more on occupations that require a high level of personalisation, creativity and craftsmanship”, says UBS analyst Sundeep Gantori.

Industrial revolutions are, of course, nothing new. The first, the mechanical revolution fuelled by James Watt’s steam engines, was followed by that of electricity and the electrical machines of Nikola Tesla and Thomas Edison. That in turn was followed by the digital revolution, the computing machinery of Konrad Zuse, John von Neumann and Alan Turing.

Each of these technological insurgencies has had its own impact on employment, but they had one thing in common: the new machinery tended to create new jobs. In 2015, the consultancy Deloitte looked back at 144 years of employment data and discovered that over time more people ended up in new kinds of jobs, even if many older roles were replaced by machines. Personal computers ousted typewriters, for instance, getting rid of the need for office typists – but PCs also generated new jobs in software writing, in IT troubleshooting departments and at computer-security companies.

Sum of the parts

Yet the brain-like capabilities of the Fourth Industrial Revolution – now becoming known as Industry 4.0 – make it quite different. Industry 4.0 encompasses many technologies: AI, machine learning, robotics, global connectivity, cloud computing and the Internet of Things. It’s the sum of these parts, with AI at its heart, that means machines can now use learning to continually improve themselves.

To use the earlier analogy, if PCs had been able to learn on their own how to troubleshoot themselves and fight malware, the IT departments and antivirus industries might not have evolved, and so the jobs they created would not exist. Self-learning systems could therefore mean not only far fewer jobs, but also a severe reduction in the role of humans as inventors and improvers of technology. In his book The Fourth Industrial Revolution, WEF founder Klaus Schwab warns that governments cannot simply do nothing about the upcoming effects of Industry 4.0. They must regulate the new technologies to capture their benefits and prevent gross inequality and societal fragmentation.

Gunter Bombaerts, assistant professor in philosophy and ethics of technology at the Eindhoven University of Technology, says that companies, workers, customers, shareholders and regulators now have a strong chance to boost everyone’s quality of life, if only they would take it. Automation, he says, could cut the amount of “deadening” work no one wants to do. He adds that it will be important to find new ways, for all people, to organise labour in a meaningful way and, eventually, to distribute wealth too.

Gunter Bombaerts, Philosophy & Ethics, Industrial Engineering & Innovation Sciences, TU Eindhoven. According to Bombaerts, automation urges us to think about new ways to organise labour and the distribution of wealth.

Governments flying blind

Right now, however, nothing along the lines of Schwab and Bombaerts’ suggestions is happening. Rather than preparing for the societal shock of machine-intelligence-induced mass unemployment, governments are fuelling the problem by investing ever more heavily in research and development on AI, fearing their economies will lose out if other nations develop key technologies first. Many developed nations, for instance, are piling cash into driverless-vehicle research.

But what of the driving professionals – taxi and truck drivers – this will put out of work? How many will there be? Can the welfare system cope? As the peer-reviewed science journal Nature put it, the lack of data on how AI will affect jobs in the future means policymakers the world over are “flying blind into the next industrial revolution”. While our political and corporate leaders have their fingers in their ears, waiting for the bang, Industry 4.0 technology is proliferating apace.

At Airbus Industrie, engineers are developing an astonishing array of ideas for the aircraft factories of the future. Walk into one, wearing an augmented reality headset they have developed, and you will be able to “see”, overlaid on your visual field, which of 100 computers in a rack has been struck by a software fault or computer virus – letting you deactivate that computer immediately with a simple gesture.

At the same time, says the firm’s robotics coordinator Adolfo Suarez Roos, Airbus is developing humanoid robots that can adapt from doing extremely repetitive aircraft manufacturing tasks “where humans add no value” – to acting autonomously when needed and coping with unexpected situations. That is not easy. That’s why Swiss-based engineering giant ABB has invested in Vicarious, an AI start-up that is developing ways to give its own industrial robots “human level vision, language and motor control”. To help humans, your robot has to have at least some of their capabilities.

So much for the high end. What is more worrying on the employment front is that robots are now also being aimed at very low-level, temporary jobs – including what was thought to have been a big hope for human employment: the on-demand or “gig” economy. In this, people work flexibly, being booked by employment agencies to do temporary work such as packing boxes, sorting products or running quality-control checks on production lines.

But now Smart Robotics of Eindhoven has become the first company to create an employment agency for robots – with the express aim of replacing humans in the gig economy. Smart Robotics will rent modular, reconfigurable robots to companies on an ad-hoc basis to allow them to perform such picking, packing and product inspection work – and then take the robots back again when the job’s done.

On the train leaving Greenwich – after my street outing with the Starship delivery robot – it struck me that if such gig-economy jobs are up for robotisation, and those of the pizza guys are too, there will be no escape from Industry 4.0 at any level. The irony that this thought occurred to me on the Docklands Light Railway, which is driverless, was not lost on me.

By Paul Marks @PaulMarks12


Cobots: Our new partners at work

Collaborative robots are boosting productivity, but they will also require us to rethink how we approach our jobs.

ABB’s YuMi, a collaborative robot that slows down or stops moving when a human worker gets close.

 

Industrial robots have been a familiar sight on factory floors for several decades, but now they are being joined by a new breed known as cobots. Short for “collaborative robots”, cobots are designed to work alongside people, performing tasks that assist them – and vice versa. At BMW’s Dingolfing plant, for example, a ceiling-mounted cobot from German manufacturer Kuka takes on the repetitive strain of mounting gearboxes while its human counterparts safely add finishing touches with finesse and flexibility.

The first cobots were patented in 1999, designed as intelligent hoisting assistants for General Motors’ automotive plants, to help minimise injuries from ergonomically difficult tasks. Although small and nimble cobots accounted for less than 5% of global robot sales in 2015, Barclays Equity Research estimates that this $120 million market could jump to $3.1 billion by 2020 and $12 billion by 2025. That would mean 150,000 cobots sold in 2020 and 700,000 in 2025. One person tracking this trend is Sebastian Pfotenhauer, an innovation research professor at the Technical University of Munich (TUM). “Cobots have the potential not only to fundamentally change the lives of workers but also to change public spaces and many public service sectors”, he says. One question at the heart of this transformation is this: if robots are increasingly being trained to work with and alongside humans, should we be trained to work with robots?

Pfotenhauer says it’s vital to understand the use of cobots as “a social interaction” that shapes our roles and identities, instead of purely an engineering endeavour. “It’s not just that the robot adapts to what humans do”, he argues. “It’s a mutually co-shaping relationship.” Pfotenhauer believes this could be as simple as understanding the requirements of looking after cobots in their intended setting, or as complex as adjusting to fundamentally altered social dynamics at workplaces or in daily life.

Maarten Steinbuch (@M_Steinbuch), chair of Control Systems Technology at the Eindhoven University of Technology, also predicts a leap in workplace robot interaction, although he doubts that new advances in industry will lead to many people working elbow-to-elbow with cobots. “There’s the image of a physical robot sitting next to me performing tasks which are programmable and require accuracy, while I’m doing another task which may require more flexibility – but this might only happen 5% of the time in a manufacturing context. I think most of the time we will have optimised factory lines where robotic machines do their work and humans are used for many other tasks around a factory, including training robots.”

Cooperation in hospitals

Nevertheless, Steinbuch believes that we need to continually re-evaluate our technology interactions. “Dutch start-up Smart Robotics (@SmartRoboticsNL) has taken the initiative to re-educate people who are working on the shop floor so that they can work together with robots”, he explains. The company plans to educate about 30,000 people over the next few years, teaching them how to program and interact with robots instead of being pushed aside by them. Some of the most popular cobots currently employed in factories include Swiss firm ABB Robotics’ YuMi system, designed for consumer-electronics assembly lines; Rethink Robotics’ touch-pad-topped Sawyer and Baxter humanoids, used for anything from logistics to inspection; and the UR robotic arms of Denmark’s Universal Robots, which the company claims can automate virtually any manufacturing task.

But it’s outside factories that Steinbuch foresees the biggest future of work alongside robots, particularly in social settings such as hospitals, caring and frontline customer service. Medicine is a field in which surgeons are already using sophisticated machinery to perform complex operations they would otherwise be incapable of performing. Although smart, these so-called master-slave surgical technologies are often considered to be tools rather than robots. A case in point is the da Vinci system, which enables precise keyhole surgery by translating a surgeon’s hand gestures into smaller, steadier and more accurate movements. But this distinction may soon no longer ring true. “My research group is also working on a robot for cochlear implants near the ear”, says Steinbuch. “You can envision that by using CT scan images, you no longer need the master-slave. You can also detach the robot from the surgeon and give it the autonomous task of performing the surgery itself while supervising, making it a true robot.”

In terms of the concrete skills that future workplaces may require, Steinbuch sees ever-increasing demand for engineers, as well as “basic robot maintenance skills”. Pfotenhauer adds coding to the technical skill sets of future co-workers, as well as the capacity for lifelong learning and “a greater attention to social and political education to enable workers to navigate and shape the robot transformation”.

Make children familiar with robots

With the creation of more collaborative jobs in the coming years, Pfotenhauer says we will have to assume the responsibility of governing cobot growth in a socially responsible way. He says two important moves will be for universities to emphasise social consequences in technical subjects like engineering, and for countries with economic reliance on highly automated industries to emphasise highly skilled or creative education. “If some jobs are going to be either reconfigured or replaced, then a lot of the value that humans bring to the economy will be either in creativity or social tasks requiring human interaction.”

Kids and robots learn to write together

Both professors agree that discussion of technology’s risks and promises has to be introduced at even earlier ages, to engage the youngest minds with the issues they’ll inherit. One potential candidate for this job is Roboy (@RoboyJunior), a child-sized humanoid robot with synthetic muscles and tendons instead of motorised joints, safe for humans to interact with directly. Born in a Zurich artificial-intelligence lab, the pint-sized pioneer is evolving in a cross-disciplinary collaboration at TUM and has travelled with a robotics expert to visit universities, schools and events around the world to challenge people’s fears and preconceptions.

Meanwhile, researchers from MIT and Boston University are testing a system allowing people to correct robot mistakes using only brain signals. The system uses sensors to monitor a person’s brain activity as they watch a Baxter cobot perform sorting tasks, and identifies error-recognition signals to learn from mistakes. Researchers hope this could one day allow us to wordlessly, instantaneously direct our cobots.

By Joe Dodgshun @JDJourno


The European robotics industry fights back

Asia’s acquisition of two of the continent’s crown jewels came as a wake-up call. To stay competitive, Europe must innovate.

A crucial step towards a factory without borders: the lightweight robot LBR iiwa made by the German company Kuka.

Europe is the second-largest manufacturer of robots, after Asia and ahead of North America. In 2015, it sold more than 50,000 industrial robots, a 10% increase from the previous year. The continent benefits from a long tradition of manufacturing automated systems, represented by such venerable companies as Germany’s Kuka, France’s Aldebaran and Switzerland’s ABB and Stäubli.

This expertise, however, is increasingly coveted by foreign companies, as evidenced by recent acquisitions. In 2015, Japan’s SoftBank acquired a 95% stake in Aldebaran. In 2016, Chinese company Midea, which makes household appliances and air conditioners, bought a majority stake in Kuka. “The acquisition of Kuka significantly changes the structure of the robotics market, in that Asia benefits considerably at the expense of Europe”, says Morten Paulsen, a Danish financial analyst at the brokerage firm CLSA Japan. “Since the development of the first robots in the 1960s, the market had been dominated by two European companies (ABB and Kuka) and two Japanese companies (Fanuc and Yaskawa).”

“To be competitive on the international market, Europe must rely on its innovation potential”, says Nils Axel Andersen, associate professor at the Technical University of Denmark (DTU). Andersen believes that the upcoming robotics “revolution” will not occur among the traditional companies. “Their business is profitable, but they don’t want to risk lowering the price of their products.” As a counterexample, Andersen suggests Universal Robots, a Danish company that produces collaborative robots called cobots. “These machines are less imposing, more flexible and a lot less expensive”, he explains. “They work with humans, not for them. Setting them up is also less complicated.” Universal Robots (@Universal_Robot) claims to have a 60% share of the cobot market.

An impenetrable Chinese market

According to Andersen, the robots of the future won’t look anything like today’s. “We need to move beyond the stage of machines designed to perform the same movement over and over. We can now craft robots that adapt to their environment.” At a recent international competition in Abu Dhabi, a team from DTU presented a robot programmed to adapt to its environment, as well as communicate and coordinate its work with other devices.

Another way Europe can remain competitive is to increase cooperation among various organisations. “Collaboration between universities, companies and financial institutions is very developed in Europe”, says Paulsen. “We have a real advantage, particularly compared with Japan.” An example is the Partnership for Robotics in Europe, launched in 2014 by the European Commission, together with the private sector and academia. It aims to help European companies increase their market share in the global robotics market, notably through joint investments of €2.8 billion over seven years. In 2016, 17 new projects received funding from this programme.

Despite the acquisitions of Kuka and Aldebaran, Andersen remains optimistic: “Europe has a long history in developing precision mechanics. The prospects look good.” The industry agrees. “There is no indication of a technological brain drain”, says Germany’s Mechanical Engineering Industry Association. “Foreign investment is beneficial. It creates new opportunities for growth and can facilitate access to the Chinese market.” The Association calls for a “robust” agreement on investment between China and Europe, as “it is incredibly difficult for foreign companies to access the Chinese market”.

By Julien Calligaro @juliencal


Team players

Machines are getting much better at learning from humans and interacting with them. The next challenge: getting robots to talk to each other.

Drones in swarms only need to maintain contact with their immediate neighbour in the group. But, just like birds, they also develop an idea of the overall size of the group they are part of.

When the world’s first hotel managed entirely by robots opened in Japan two years ago, the owners of the group behind the project explained that it was all about communication. The robots must not only understand the human guests, but also be aware of the other robot “colleagues”. This development is possible thanks to recent improvements in the analytical capabilities of machines, not least the increasing use of artificial intelligence.

A prime example is the cobot YuMi, launched in 2014 by the Swedish-Swiss industrial group ABB. This two-armed industrial robot uses a camera system to monitor the movements of its human co-workers; as soon as any of them gets too close, YuMi slows or interrupts whatever it is doing in order to protect its co-workers from potential injury. Haptic interaction is also possible: by simply grasping the cobot’s arm and guiding it through certain hand movements, a human colleague can teach it a procedure, which it can then repeat. “These interactions between humans and robots are becoming increasingly reliable”, said Dirk Wollherr of the Department of Automatic and Control Engineering at the Technical University of Munich. This requires the machines to be fitted with multiple sensors. “Depending on what’s required, the robot can use its visual, haptic or auditory sensors to communicate with its environment.”
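As a rough illustration of the slow-down-and-stop behaviour described above, here is a minimal Python sketch of proximity-based speed scaling. The distance thresholds and the read_min_distance_m / set_speed_scale helpers are hypothetical placeholders for a vision sensor and a robot controller, not ABB’s actual interface.

```python
# Illustrative sketch of proximity-based speed scaling for a cobot.
# The sensor and controller functions are hypothetical placeholders.

STOP_DISTANCE_M = 0.5   # closer than this: halt motion entirely
SLOW_DISTANCE_M = 1.5   # closer than this: scale speed down linearly

def speed_scale(min_distance_m: float) -> float:
    """Return a speed factor in [0, 1] based on the nearest detected person."""
    if min_distance_m <= STOP_DISTANCE_M:
        return 0.0
    if min_distance_m >= SLOW_DISTANCE_M:
        return 1.0
    # Linear ramp between the stop and slow thresholds.
    return (min_distance_m - STOP_DISTANCE_M) / (SLOW_DISTANCE_M - STOP_DISTANCE_M)

def control_loop(read_min_distance_m, set_speed_scale):
    """One pass of the safety loop: read the camera system, adjust the robot."""
    distance = read_min_distance_m()        # e.g. from a vision or depth sensor
    set_speed_scale(speed_scale(distance))  # 0.0 pauses the task, 1.0 is full speed
```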

Google for robots

This environment will soon include robots which can exchange information amongst themselves. A spectacular step in this direction occurred in late 2014 with the launch of the RoboBrain platform developed by Stanford University. RoboBrain is a kind of search engine: a robot that encounters a problem can use this network to get information on how another robot reacted in a similar situation. Two years ago a robot was able, for the first time, to learn from another robot hundreds of kilometres away how to place mugs on bowls that had been turned upside down.

Wollherr foresees a significant challenge before such a system can enjoy widespread use. “Each robot model is different”, he says. “Unlike computer code, whose language is always the same regardless of the operating system, there is no common programming language for robots. A robot must therefore be able to replicate the action of another robot in its own model-specific setup.” Much more promising is a team consisting of the same type of robot; to this end, the two leading robot manufacturers, Japan’s Fanuc and ABB, are working together to develop a digital network-connected system for specific manufacturing processes. This could have huge benefits for industry, as it can often take several days to re-program an industrial robot to adapt to a change in the production process.
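Wollherr’s point about model-specific setups is essentially a translation problem: the same learned action has to be mapped onto each robot model’s own commands. A minimal sketch of that idea, using entirely hypothetical robot classes rather than any real vendor API, might look like this:

```python
# Illustrative sketch: one abstract action, translated per robot model.
# The robot classes and their behaviour are hypothetical, not real vendor APIs.

from abc import ABC, abstractmethod

class PickAndPlace(ABC):
    """Shared description of an action that different robots must replicate."""
    @abstractmethod
    def execute(self, obj: str, target: str) -> None: ...

class ArmModelA(PickAndPlace):
    def execute(self, obj: str, target: str) -> None:
        # Model A expects joint-space waypoints (placeholder logic).
        print(f"[A] planning joint trajectory to grasp {obj}, release at {target}")

class ArmModelB(PickAndPlace):
    def execute(self, obj: str, target: str) -> None:
        # Model B is driven by Cartesian poses instead (placeholder logic).
        print(f"[B] moving gripper in Cartesian space: {obj} -> {target}")

def replay_shared_knowledge(robot: PickAndPlace) -> None:
    """A recipe learned by one robot, replayed on any model via its adapter."""
    robot.execute(obj="mug", target="upturned bowl")

replay_shared_knowledge(ArmModelA())
replay_shared_knowledge(ArmModelB())
```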

This also has great potential for robot-based communication off the factory floor. Dario Floreano, professor of Intelligent Systems at the École Polytechnique Fédérale de Lausanne (EPFL), and his colleagues are working on ways of simplifying the guidance systems for drones. “It is very difficult to operate a drone by remote control – drones for professional use require several days of training before they can be properly steered”, he said. With this in mind, the EPFL team is developing a soft exoskeleton for flying drones. The person wearing the jacket controls the drone’s flight by moving her upper body while she sees the world from the drone’s camera. “It only takes a few minutes to learn the procedure”, explained Floreano. At the same time, the drone acquires a greater degree of autonomy: it could, for example, recognise and correct pilot error. It could also assume full control of a flight path as soon as it detects an increase in a pilot’s stress levels. Such human-drone interaction could also help rescue the victims of natural disasters more quickly, or locate flight recorders after a crash.

Researchers are also working on interlinked systems. A potential application here would be the surveillance of fields, where a swarm of drones would analyse the condition of the crops and decide which parts of the field need to be watered or have fertiliser applied. The inspiration for this came from the animal world, as Dario Floreano explains: “A bird gets its bearings from the movements of other birds flying around it. It forms a group with them which in turn overlaps with other groups, thus forming a network. This makes the work of the developers easier: the drones only need to maintain contact with their immediate neighbour in the group but, just like a bird, they also develop an idea of the overall size of the swarm.” The same principle works underwater, as demonstrated by the CoCoRo project at the University of Graz in Austria: more than 40 small robots communicate using LED signals, moving around like a school of fish. The first applications include research in marine biology.
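The neighbour-only rule Floreano describes is the same local principle used in classic flocking models. The following is a simplified, purely illustrative Python sketch in 2D, with made-up parameters: each drone adjusts its velocity using only its nearest neighbour rather than the whole swarm.

```python
# Illustrative 2D flocking sketch: each drone reacts only to its nearest neighbour.
# Parameters and the update rule are simplified for illustration.

import math

def nearest_neighbour(i, positions):
    """Index of the closest other drone to drone i."""
    return min((j for j in range(len(positions)) if j != i),
               key=lambda j: math.dist(positions[i], positions[j]))

def step(positions, velocities, spacing=5.0, gain=0.1):
    """One update: keep a preferred distance to the nearest neighbour only."""
    new_velocities = []
    for i, (x, y) in enumerate(positions):
        nx, ny = positions[nearest_neighbour(i, positions)]
        d = math.dist((x, y), (nx, ny)) or 1e-9
        # Attract if too far from the neighbour, repel if too close.
        pull = gain * (d - spacing) / d
        vx, vy = velocities[i]
        new_velocities.append((vx + pull * (nx - x), vy + pull * (ny - y)))
    positions[:] = [(x + vx, y + vy)
                    for (x, y), (vx, vy) in zip(positions, new_velocities)]
    velocities[:] = new_velocities
```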

By Robert Gloy
