5 Robotics and AI Takeaways from EmTechNext 2018

Manuela Veloso from Carnegie Mellon at the EmTechNext 2018 conference. Credit: Keith Shaw

June 07, 2018      

At this week’s EmTechNext 2018 show at MIT, speakers examined the impact of robotics and artificial intelligence and the possible implications for the future of work. Across the two-day event, topics included human-robot collaboration; the implications for workforce training and higher education; and the broader economic impacts at the local, national, and global levels.

Produced by MIT Technology Review, the EmTechNext conference attracted attendees from the financial, higher education, government, and corporate enterprise sectors. Here are five themes that emerged from multiple speakers at the event:

1. The end goal is humans and robots working together, not robots replacing humans

Speakers and the audience were well aware of the angst around automation, as well as the research and theories suggesting that robots, AI, and other forms of automation will eliminate jobs. Several speakers emphasized making robots that work better with humans, whether through human-robot collaboration or human-robot interaction, rather than robots that simply replace lower-skilled employees.

“When robots succeed, they’re never alone,” said David Mindell, the CEO of Humatics and co-chair of the MIT Task Force on the Future of Work, during his presentation. Mindell said the goal for robotics should not be to strive for full autonomy, but rather “situated autonomy,” in which robots provide “the right level of autonomy at the right time for the right task.”

On the AI side, EmTechNext speakers noted that while artificial intelligence can outperform humans on some tasks, results improve further when humans and AI work together. Daniela Rus, director of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), discussed a study in which cancer detection error rates were greatly reduced when AI and humans worked together (0.5%), compared with humans alone (3.5%) or AI alone (7.5%).

Daniela Rus from MIT discussing how AI and humans working together can get better results. Credit: Keith Shaw
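Those numbers hint at why pairing humans with AI helps: if human and AI mistakes are largely independent, the cases where the two disagree can be flagged for extra scrutiny. The short Python simulation below is only a hypothetical back-of-the-envelope illustration of that effect (the independence assumption and the escalate-on-disagreement policy are mine, not the study’s method); it lands in the same ballpark as the combined figure Rus cited.

```python
import random

random.seed(0)
HUMAN_ERROR, AI_ERROR = 0.035, 0.075  # error rates cited in the talk
N = 100_000                           # number of simulated cases

combined_errors = 0
for _ in range(N):
    truth = random.random() < 0.5                                   # ground-truth label
    human = truth if random.random() > HUMAN_ERROR else not truth   # human read
    ai = truth if random.random() > AI_ERROR else not truth         # AI read
    if human == ai:
        verdict = human                                              # agreement: accept as-is
    else:
        # Disagreement: assume the case is escalated for a second human review.
        verdict = truth if random.random() > HUMAN_ERROR else not truth
    combined_errors += (verdict != truth)

print(f"combined error rate ~ {combined_errors / N:.4f}")  # comes out near 0.006
```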

Julie Shah, an associate professor at MIT, said that at many companies, robots and humans merely co-exist rather than collaborate. Much of her research examines how robots can better plan their actions, refine those plans when conditions change, and then execute on the changes.

For example, a better robot teammate on an automotive assembly line would be able to anticipate which tools the human worker would need for manual assembly and be able to grab the right tool at the right time.

In another example, Shah said adding intelligence to robots operating behind a light curtain would let a robot better anticipate where a worker wanted to go, rather than stopping too close to the worker or blocking their path.

“We don’t have to trade safety for efficiency,” Shah said. “With just the right intelligence on the robot, we can get the best of both worlds.”

Shah is also speaking on the topic of smart manufacturing and robotics at the upcoming Robotics and AI Summit at PTC LiveWorx on June 18, produced by Robotics Business Review.

Manuela Veloso, the head of the machine learning department at Carnegie Mellon University, showed a video of a mobile robot that could be taught to retrieve a cup of coffee from an office kitchen and deliver it to a second office.

By allowing the robot to ask a human for help in figuring out where the kitchen was located, as well as to ask a human in the kitchen to place the coffee cup in its basket, the task could be completed far more easily than by trying to program full autonomy into the robot.
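A minimal Python sketch of this ask-for-help pattern is below. Everything in it (the class, the helpers, the prompts) is a hypothetical illustration of the idea, not CMU’s actual robot software: the robot falls back to a nearby human whenever it lacks knowledge (where the kitchen is) or a capability (picking up the cup).

```python
# Hypothetical sketch of the ask-for-help pattern described above; not CMU's code.
class ErrandRobot:
    def __init__(self, known_locations):
        self.known_locations = known_locations      # e.g. {"office 231": (4, 7)}

    def locate(self, place):
        """Return a location, asking a passing human if the robot doesn't know it."""
        if place not in self.known_locations:
            answer = input(f"Excuse me, where is the {place}? ")
            self.known_locations[place] = answer    # remember the answer for next time
        return self.known_locations[place]

    def navigate_to(self, waypoint):
        print(f"Navigating to {waypoint}...")       # stand-in for real path planning

    def fetch_coffee(self, destination):
        self.navigate_to(self.locate("kitchen"))
        # The robot has no arm, so it asks a human to handle the manipulation step.
        input("Could you please place a cup of coffee in my basket? (press Enter when done) ")
        self.navigate_to(self.locate(destination))
        print(f"Coffee delivered to {destination}.")

robot = ErrandRobot({"office 231": (4, 7)})
robot.fetch_coffee("office 231")
```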

2. Robots are needed in some markets to solve labor shortages

While headlines and pessimists alike make noise about robots replacing workers, in at least one industry robots are addressing a labor shortage and rising customer demand. In the warehousing and e-commerce fulfillment space, several EmTechNext speakers talked about needing robots not only to help fulfill orders, but also to take on picking tasks for which companies can’t find enough human workers.

Tye Brady, the chief technologist at Amazon Robotics, said increasing customer demand, including expectations of 2-day delivery via Amazon Prime, has driven the need for more robots to fill the orders along with humans in the company’s 150 fulfillment centers. “We could not meet customer demands if it wasn’t for robotics,” said Brady.

Melonee Wise, CEO of Fetch Robotics, cited a shortage of approximately 600,000 jobs in the warehousing space, which represents 10% of the total number of unemployed people in the U.S. “There’s all this concern about robots taking jobs, but I hate to break it to you, but there’s not enough people for the jobs in the United States,” she said.

This isn’t just a problem in the U.S.; it’s also a global issue, Wise added.

“We have created dreamers who don’t want to do warehouse jobs,” said Wise. “But there’s a problem with that – we still want to get our things – we want to get them tomorrow, in an hour, or now.”

Tom Galluzzo from IAM Robotics discusses the need for autonomous robots in the warehouse space. Credit: Keith Shaw

Tom Galluzzo, CEO of IAM Robotics, said a generation of students who participated in robotics competitions like FIRST, along with younger people who can make more money driving for Uber than working in a warehouse, are less likely to be interested in physical labor jobs.

“It makes more sense to go drive for Uber than it does to pick products on a third shift in a warehouse,” said Galluzzo. “It’s not that the new generation is lazy — they’re not lazy; they’re smart. They’re getting paid more for the skills that they have that they’re more comfortable with. That’s because we’ve systematically educated them in STEM for the past generation.”

3. Education will need to transform as humans learn new skills continuously

In areas where automation and robotics might replace human labor, the need for retraining and new skills will grow, many speakers said. In several presentations, EmTechNext speakers contrasted the professions and skills that humans are good at (cognitive, emotional, creative) with the ones that robots are good at (physical, repetitive, manual).

Joseph Aoun, Northeastern University. Credit: Matthew Modoono/Northeastern University

Joseph Aoun, president of Northeastern University, said the goal of higher education should be to make people “robot-proof,” giving them the training, education, and skills they can use to find new jobs if automation is introduced.

Several EmTechNext presenters spoke of the lifelong, or continuous, learning that the workforce of the future will require. The old model of education ending at age 21 or 22 may soon disappear if people are to remain competitive in an automated economy.

Who provides, and pays for, that education remained up for debate. Aoun suggested that higher education will need to stop treating adult education and continuous learning programs as “second-class” offerings relegated to community colleges or professional development departments. Other speakers said that employers and government also bear part of the responsibility for educating the workforce.

When discussing how companies needed to step up to educate their workforce, Sanjay Sarma, the vice president for Open Learning at MIT, said more responsibility needed to go to corporate human resources departments.

“HR needs to be weaponized,” Sarma said. “Learning has to be the new rocket science.”

4. Lower-skilled workers will face difficulty in retraining without help

When asked whether they felt like their jobs would be replaced by robots or automation, a large majority of the highly educated and corporate professional audience kept their hands down. Speakers agreed that the lower-skilled jobs or tasks would be the ones that robots replace.

Iyad Rahwan, an associate professor at MIT, gave a sneak peek at research he conducted indicating how difficult it is for displaced workers to find automation-proof jobs. Analyzing a large set of skills and their likelihood of being replaced by automation, Rahwan found two large clusters in which similar skills could translate to different jobs with the right training. However, there was a large gap between unrelated skills. In other words, a welder replaced by a welding robot would find it difficult to become a management analyst.
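One way to picture that finding: if each occupation is described by a set of skills, the overlap between two occupations is a rough proxy for how much retraining a transition would require. The toy Python sketch below uses made-up skill lists and a simple Jaccard overlap; it is only an illustration of the idea, not Rahwan’s data or method.

```python
def jaccard(a, b):
    """Skill overlap between two occupations: 1.0 means identical skill sets."""
    return len(a & b) / len(a | b)

# Made-up skill sets for illustration only.
occupations = {
    "welder":             {"metalwork", "equipment operation", "precision", "stamina"},
    "machinist":          {"metalwork", "equipment operation", "precision", "blueprints"},
    "management analyst": {"data analysis", "communication", "report writing", "judgment"},
}

welder = occupations["welder"]
for name, skills in occupations.items():
    if name != "welder":
        print(f"welder -> {name}: overlap = {jaccard(welder, skills):.2f}")
# welder -> machinist: overlap = 0.60  (a short retraining distance)
# welder -> management analyst: overlap = 0.00  (the kind of gap Rahwan described)
```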

Rahwan said that giving workers skills in computer science and programming might not be the answer, as many math- and science-based skills are also likely to be threatened by AI and automation. Instead, he argued, people should learn the skills that humans are better suited for: creativity, cognition, and working with teams, a.k.a. “soft skills.”

“We need to teach people how to be less like computers and more with people,” he said.

5. Teaching robots and AI to learn is still very difficult

While many speakers agreed that the industry has made great strides in the past few years, they also admitted that the field is still in its very early stages. In several cases, EmTechNext speakers highlighted how difficult it is for robots to accomplish certain tasks that humans take for granted.

In discussing her coffee-retrieving robot, CMU’s Veloso said the robot can complete only that specific task; it would ignore other situations it encountered along the way.

For example, if someone were injured along the robot’s path, or if the area were flooded, the robot would continue trying to complete its task rather than help the person or notify someone about the flooding. The next goal in robotics is to make robots aware of multiple situations and able to react accordingly.

In her EmTechNext presentation, Wise showed several examples of training robots beyond direct programming, including kinesthetic teaching and simulation. One approach involved software reviewing YouTube videos to get a sense of how people move or complete tasks. The audience was particularly amused by a video of a robot arm attempting to learn how to flip pancakes, in which the robot failed more often than it succeeded.

Veloso and MIT’s Rus mentioned the need for better “explainability” – getting robots and AI to translate all of the data they’re generating, or the decisions they’re making, into more natural-language explanations that humans can better understand.