5 Machine Vision and Cobot Takeaways From The Vision Show

Machine vision can augment other cobot safety measures, said speakers at The Vision Show.

April 12, 2018      

BOSTON — At the AIA Vision Show here this week, the Collaborative Robotics Track included sessions examining how machine vision is enabling robots, and cobots in particular, to be safer, more flexible, and more useful to small and midsize enterprises. The show, which has run from April 10 through today, also featured numerous exhibitors showing sensors and robotics, as well as opportunities for industry players to network.

The Vision Show is put on by the AIA — Advancing Vision + Imaging, which is part of the Ann Arbor, Mich.-based Association for Advancing Automation, or A3. (Robotics Business Review is a media partner.) The conference and expo have grown significantly in the past few years, as more affordable sensors and collaborative robots draw vendors and attendees from around the world.

1. Cobots are ‘safer,’ but not ‘safe’

Collaborative robots are often touted as safer than bigger industrial robots, allowing robots to work alongside people without safety cages. One recurring theme across several sessions was that collaborative robots, like any tool, require risk assessments by prospective users. Carl Vause, CEO of gripper maker Soft Robotics Inc., noted that if a company moves a cobot or puts it on a different process, another risk assessment is necessary.

Yes, collaborative robots are designed with padding, slower motion, and sensors to avoid harming humans, said Vision Show speakers. However, their end effectors and payloads could still be dangerous and might require a caged workcell. A box cutter, glue gun, or drill — or even a held object — could still be deadly in a confined space.

Customers should carefully select applications based on the required levels of human interaction, said Rick Maxwell, director of engineering for the Materials Handling Segment at FANUC Robotics. He added that further rounds of risk mitigation, assessment, and application review may be necessary. He and Carole Franklin, director of standards development for the Robotic Industries Association (part of A3), described the following measures:

  • Safety-rated monitored stop
  • Speed and separation monitoring
  • Power and force limiting
  • Hand guiding

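The speed-and-separation-monitoring measure above can be illustrated in a few lines of code. This is only a conceptual sketch: the distance thresholds and linear ramp below are hypothetical, not values from any vendor's safety controller, which would derive them from the protective-separation-distance calculations in ISO/TS 15066.

```python
def scaled_speed(distance_m: float, max_speed: float = 1.0) -> float:
    """Scale robot speed by the measured distance to the nearest human.

    Illustrative thresholds only; certified systems compute these
    from standards-based protective separation distances.
    """
    STOP_ZONE = 0.5   # within 0.5 m: safety-rated monitored stop
    SLOW_ZONE = 1.5   # within 1.5 m: reduced speed

    if distance_m < STOP_ZONE:
        return 0.0    # monitored stop: robot halts but stays powered
    if distance_m < SLOW_ZONE:
        # Ramp linearly from zero at the stop zone up to full speed
        return max_speed * (distance_m - STOP_ZONE) / (SLOW_ZONE - STOP_ZONE)
    return max_speed  # no one nearby: operate at full speed
```

In this model, machine vision supplies the `distance_m` input, which is what lets a cobot keep running at full speed when no workers are in range rather than being slowed at all times.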
International standards and sensor-level networks can also provide guidance for safe cobot use, said Tom Knauer, safety champion at Balluff. In such a network, wireless sensors can provide diagnostics and enable robots to operate in a “force field” effect of “anticipatory control.” Machine vision can help cobots avoid contact with human workers while still operating at speed in areas without people.

The use of video cameras to monitor workcells could also help improve safety, said Case Segraves, account manager for vision and traceability at the McNaughton-McKay Electric Co.

2. Industry 4.0 and IIoT: Beyond the buzzwords

Other benefits of using machine vision include the ability to send data to a PLC (programmable logic controller), a WMS (warehouse management system), or an ERP (enterprise resource planning) system for views into production-line operations, product quality, and interactions between robots and people, said Knauer.

In the second keynote of the Collaborative Robotics Track, Diego Prilusky, general manager of Intel Studios, showed how volumetric video is transforming sports coverage for a new generation of fans. The FreeD feature requires more than 40 cameras, on-site servers, and a team of curators to produce 3D panning shots from football, basketball, and soccer games. The results are similar to the ability to rotate perspectives that video gamers have had for years, Prilusky said.

Intel has also built the “Home for the Art of Immersive Filmmaking,” a 3D studio in Los Angeles for “advancing the art and craft” of more interactive entertainment, including virtual reality. Prilusky shared a video recorded in the new facility.

It’s not a big leap to see how multiple cameras and machine vision could provide greater insights and control over swarms of robots on a shop floor. The big data storage and processing requirements may be out of reach of smaller organizations or the cloud for now, but the ability to visualize and manipulate data from, say, a field through drones for precision agriculture can’t be ignored.

On the Internet of Things (IoT) side, exhibitors such as Robotic Vision Technologies Inc., SICK, and Teledyne DALSA showed wares including Internet-connected smart cameras. McNaughton-McKay Electric helped an automotive parts supplier prove that it was placing headrests in boxes rather than omitting them, said Segraves. The headrests were going missing at some other stage in delivery.

On the other hand, Industry 4.0 is a concept mainly promoted in Europe, and not many companies are using industrial IoT yet, said David Dechow, staff engineer in the Intelligent Robotics/Machine Vision division at FANUC America Corp. Adding machine vision to industrial and collaborative robots will make them more flexible, he said. Connecting them and analyzing the data is an outstanding software and AI challenge.

3. Industrial automation and cobot makers still rivals

Cobot market leader Universal Robots A/S was among the exhibitors at this week’s Vision Show, but ABB and FANUC also showed their collaborative models.

As many as 2 million manufacturing jobs will go unfilled in the next decade, and it’s 10 times more difficult to find workers with robotics skills, said Zach Tomkinson, sales development manager at Universal Robots. He pointed to cobots’ ease of use and relatively low cost, as well as the UR+ ecosystem of 250 developer companies.

2D and 3D vision can be paired with different robots for different cobot functions.

When paired with 2D cameras, cobot arms such as those of Universal Robots are suitable for pick-and-place operations, Tomkinson said. With 3D cameras, these affordable robots become useful for bin picking and inspection, but they require calibration.

Between 2015 and 2019, 54,000 collaborative robot units will be installed, at a compound annual growth rate (CAGR) of 67%, said Scott Denenberg, senior director of hardware at Veo Robotics Inc. By contrast, 25 million standard industrial robots will ship in that period, with a CAGR of 13%. He said that Veo’s products add perception and intelligence to industrial robots, making them collaborative.
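For readers who want to sanity-check growth figures like these, the CAGR formula itself is simple. The numbers in the example below are illustrative only, not Veo's underlying data.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between a starting and ending value."""
    return (end / start) ** (1 / years) - 1

# Example: annual shipments growing from 3,000 to 23,000 units over
# four years corresponds to a CAGR of roughly 66%.
```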

With the right sensors and programming, any robot can be made safer and collaborative, said the industrial automation providers. The majority of factories require speed and accuracy over general-purpose automation, they said.

Rethink’s Baxter and Sawyer and the UR line are intended for multiple uses, but the more traditional robotics suppliers have a bigger installed base and more specialized models. KUKA expects its medical robots to catch up to its industrial line in sales next year, according to Corey Ryan, manager of medical robotics for North America at KUKA Robotics Corp.

ABB’s YuMi two-armed cobot is designed specifically for small-parts assembly rather than food production or other purposes, said company representatives.

4. Ease of use is important, but so are integration, backups

Despite claims that cobots are “plug and play,” end users should carefully evaluate their processes and find the best places to apply the technology. More complex tasks or coordination with other robots and enterprise systems often require integrator assistance.

Simulation can help organizations overcome “islands of automation,” said Tony Melanson, vice president of marketing at Aethon, which supplies autonomous mobile robots to hospitals and manufacturers. “Movement is waste, according to lean principles,” he said.

Many ERP and inventory systems “don’t speak robot,” Melanson said. “We’re working on APIs [application programming interfaces] for that.”

Making sure that sensors are properly calibrated, that cobots have the best grippers, and that mobile robots are responsive to the level of randomness in their environments is a systems question, not just a machine-vision one, said FANUC’s Dechow.

Integrators are important for helping companies identify their pain points and combine solutions such as UR arms, Cognex cameras, and the software to tie them together, said Veo Robotics’ Scott Denenberg.

CyPhy Works CTO Helen Greiner describes principles of robot design at The Vision Show.

In her opening keynote, Helen Greiner, chief technology officer of CyPhy Works Inc., noted that modular robot design is important for servicing and maintenance. She learned that at iRobot when designing robotic vacuum cleaners, as well as the PackBot mobile manipulator for IED (improvised explosive device) removal. Endeavor Robotics was spun out of iRobot for the defense robots.

CyPhy Works provides tethered drones and other systems that have provided security for events such as the Boston Marathon. CyPhy’s Persistent Aerial Reconnaissance and Communications (PARC) platform has removable struts, and the declining cost of lidar has helped more robots use localization technology, Greiner said.

“All sensors suck,” joked Greiner. She explained that for sensor fusion and safety, all mission-critical systems must have redundancies and backups. Several conference attendees speculated that the recent accidents involving self-driving cars may have been the result of systems not working properly.

“Humans make incredible primary systems, but terrible backup ones,” said Greiner. “Human-robot cooperation is essential for corner cases, but there can be overreliance on technology.”

5. RaaS will help SMEs

In the U.S., 6 million SMEs generate 70% of the manufacturing output, and 84% of such companies report labor shortages, said Zac Bogard, president of Productive Robotics Inc. Jeff Burnstein, president of A3, has explained to Congress that the challenge of automation isn’t how it will replace jobs but how it can help U.S. industry.

Robotics suppliers and end users must weigh both capabilities and costs in developing and evaluating systems.

“Take costs seriously,” said Greiner, who added that iRobot learned about more affordable processors and infrared scanners from its work with NASA and Hasbro.

The robotics-as-a-service (RaaS) model can also help small and midsize enterprises (SMEs) justify investments in automation. Examples of providers at The Vision Show included CyPhy Works and Aethon, whose TUGs are in hospitals and warehouses.

“Small companies have said, ‘Either we automate, or we evaporate,’” said Howie Choset, chief technology officer at the Advanced Robotics for Manufacturing Institute (ARM), a public-private partnership based in Pittsburgh. ARM has $80 million in federal funding and $173 million from industry and is working on both the technical innovation and workforce sides of robotics and industry.

“There are no American producers of industrial robotics — that’s a strategic concern,” Choset said. “We’re working with regional collaboratives to help create the ecosystem.”

Editor’s Note: Keith Shaw contributed to this article.