Factory robots are about to get even smarter. Japanese robot makers, which account for 52% of global supply, are pushing further into artificial intelligence and deep learning so that smart machines can work faster and with greater flexibility amid a shrinking workforce.
Deep learning systems can recognize objects in messy, real-world situations, and they’re expected to enhance the perception abilities of industrial robots. At iREX 2017 in Tokyo, FANUC, Yaskawa Electric and other manufacturers put on demos showcasing how deep learning can optimize factory automation.
Deep learning for random bin picking
FANUC’s booth at the biennial show had its usual offerings of industrial robot arms, collaborative robots, a titanic car-hoisting robot, and its Field System, a networking platform for the industrial internet of things (IIoT) that can provide breakdown predictions. But staff were keen to point out demos developed in collaboration with Preferred Networks that exploited deep learning.
In one demo, a bin-picking LR Mate 200iD robot used a deep learning algorithm and a 3D sensor to analyze a bin with a random assortment of cylindrical work pieces. FANUC’s system ranked the cylinders according to which ones were unobstructed and easiest to grasp. A screen overlaid grasping probability scores on an image of the cylinders, with the unobstructed ones scoring near 100.
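The ranking step can be sketched in a few lines. This is only an illustration of the idea, not FANUC's implementation: the real system learns the scoring function with a deep network from grasp attempts, whereas the toy `grasp_score` below maps two made-up geometric features (visible fraction, pieces resting on top) to a 0–100 score through a logistic function.

```python
import math

def grasp_score(visible_fraction, neighbors_on_top):
    """Toy stand-in for the learned scorer: maps simple geometric
    features to a 0-100 grasp-success score via a logistic function.
    FANUC's system learns this mapping with a deep network instead."""
    z = 6.0 * visible_fraction - 2.0 * neighbors_on_top - 2.0
    return 100.0 / (1.0 + math.exp(-z))

def rank_candidates(pieces):
    """Return (score, id) pairs sorted best grasp first."""
    scored = [(grasp_score(p["visible"], p["on_top"]), p["id"]) for p in pieces]
    return sorted(scored, reverse=True)

bin_contents = [
    {"id": "cyl-1", "visible": 1.00, "on_top": 0},  # fully unobstructed
    {"id": "cyl-2", "visible": 0.35, "on_top": 2},  # buried under others
    {"id": "cyl-3", "visible": 0.80, "on_top": 1},
]
ranking = rank_candidates(bin_contents)  # unobstructed piece scores near 100
```

The robot then attempts the top-ranked candidate, much as the demo screen highlighted the unobstructed cylinders with scores near 100.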
The system achieves a higher rate of successful grasps than robots without deep learning functions, according to FANUC. It was trained for about eight hours to learn which pieces were within easy reach, but that can be reduced to about two hours when four networked robots pool their training data. The system took about two years to develop, and FANUC plans to release the picking application in March 2018.
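The speedup from networking robots amounts to simple arithmetic: if grasp attempts from several robots feed one training set, the wall-clock time to gather a fixed amount of experience drops roughly in proportion to the number of robots. The numbers below mirror FANUC's figures (eight hours alone, about two with four robots); the strictly linear model is an assumption for illustration.

```python
def training_hours(hours_single_robot, num_robots):
    """Ideal linear speedup from pooling experience across robots.
    Assumes data collection, not computation, is the bottleneck."""
    return hours_single_robot / num_robots

alone = training_hours(8.0, 1)   # one robot collecting alone
pooled = training_hours(8.0, 4)  # four networked robots sharing data
```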
Identifying defects invisible to the human eye
In another demo, a CR-7iA arm picked up smartphone casings (pictured atop this page) and held them up to a camera, which scanned for defects that were small enough to be nearly invisible to the naked eye. The approach differs from past techniques that required programming rules for the system to produce a pass/fail judgment on the quality of a component. With the deep learning approach, the system is trained using many images of components with and without flaws. It could be useful in Chinese electronics plants, which can have assembly lines with dozens of human inspectors.
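The shift from rules to examples can be illustrated with a deliberately tiny classifier. FANUC's system uses a deep network on images; the nearest-neighbor classifier over made-up two-number feature vectors below is only a stand-in to show the workflow the article describes: label examples of good and flawed parts, rather than hand-coding pass/fail rules.

```python
def classify(sample, labeled_examples):
    """Pass/fail by nearest labeled example (squared distance).
    A real inspection system would run a deep network on raw images."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    label, _ = min(
        ((lab, dist(sample, feat)) for feat, lab in labeled_examples),
        key=lambda pair: pair[1],
    )
    return label

# Toy training set: feature vectors extracted from labeled casing images.
training_set = [
    ((0.02, 0.01), "pass"),  # flawless casing
    ((0.03, 0.02), "pass"),
    ((0.40, 0.35), "fail"),  # scratched casing
    ((0.55, 0.30), "fail"),
]

verdict = classify((0.05, 0.03), training_set)  # lands near the "pass" cluster
```

Adding coverage for a new defect type means adding labeled examples, not rewriting rules, which is the practical appeal of the approach for high-mix electronics inspection.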
“If you can combine networking and artificial intelligence, we can have a synergy effect,” says Kiyonori Inaba, general manager of FANUC’s Robot Business Division. “The knowledge of one robot can be digitized and transferred to other robots through networks.”
Yaskawa Electric, known for its Motoman line of industrial robots, showed off a bin-picking application that uses deep learning. An arm equipped with a camera and a deep learning system grasped objects of various shapes without changing grippers.
The arm moved T-, Y- and ring-shaped work pieces from bin to bin during the demo while a screen showed a live feed from the camera. Teaching isn’t necessary with the system because it uses deep learning to generate its own motions, according to Yaskawa; it also works with a 2D camera instead of a 3D camera, which lowers its cost.
Enabling robots to mimic employees
Kawasaki Heavy Industries, which announced a cobot partnership with ABB Robotics, introduced an AI platform with a different focus. Its Successor robot system has robot arms mimic the work of veteran technicians. These experts remotely control the arms to perform jobs that require fine adjustments and the skilled judgment to know when they’re being done right.
Through deep learning algorithms, the platform learns the motions of an expert working via remote control and eventually reproduces them on its own. In a demo, Kawasaki invited iREX attendees to try out a newly developed video game-style controller, called a Communicator, to move an arm holding a seat intended for a large all-terrain vehicle. The controller relays the haptic feedback of seat installation work, so operators can teach the robot what it feels like when the seat is aligned properly in the vehicle.
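The record-then-reproduce loop can be sketched as follows. This is a hedged illustration, not Kawasaki's implementation: the Successor platform maps states to actions with deep learning, while the stand-in below simply replays the operator command logged for the most similar past state. All names and the toy (x, y) seat-offset states are illustrative.

```python
demonstrations = []  # (state, action) pairs logged during teleoperation

def record(state, operator_action):
    """Log one moment of the expert's remote-control session."""
    demonstrations.append((state, operator_action))

def reproduce(state):
    """Choose the demonstrated action for the most similar past state.
    A learned policy would generalize; nearest-neighbor lookup stands in."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, action = min(demonstrations, key=lambda pair: dist(pair[0], state))
    return action

# Operator jiggles the seat into place; states are toy (x, y) offsets.
record((5.0, 2.0), "nudge_left")
record((0.5, 3.0), "push_down")
record((0.1, 0.1), "release")

action = reproduce((4.2, 1.8))  # close to the first demonstrated state
```

Once enough demonstrations are logged, the robot can run autonomously, and the recorded sessions double as training material for human workers, as the Kawasaki staffer notes below.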
“The robot can get the hang of it as it learns the way an operator can jiggle the seat into place,” a Kawasaki staffer said. “Once the robot learns, it can pass on the knowledge to human workers as well. This can also be useful for passing on industrial skills in Japan as the workforce is getting smaller.”
The system can automate assembly of made-to-order or small-lot products that have been difficult to robotize; it will be sold in limited quantities starting in spring 2018, with a general release starting in 2019, according to Kawasaki.