Robots are finding their place alongside people in many work environments, including factories, warehouses, retail stores and even hospitals. As they do, we need to make it easy for people to trust robots, and for robots to take the right cues from humans.
This is especially true when autonomous mobile robots (AMRs) are in the mix, as these dynamic robots are just as the name states – autonomous. There is no one standing off to the side or in a backroom controlling the AMR with a remote control or traditional human-machine interface (HMI). The AMR is programmed to do a certain job, and then sent off to complete it – much like a human goes through a standard onboarding process and is then trusted to work independently.
Given the freedom that both AMRs and humans have to go about their business without much oversight, we must ensure they know how to interact with one another. We must teach them both proper social behaviors.
A Different Point of View
Teaching robots how to act more like humans might make some people nervous. But it is necessary given how they must interact with one another to keep workflows running smoothly in fast-paced, high-stakes supply chain operations. If you are to deliver to customers what they need, when they need it, everyone must be able to get along. We must show workers that AMRs know when to engage and when to keep their distance, which means we need to teach AMRs what’s okay and what’s not.
But before we can do that, we must ensure AMRs can see, understand, and react to what is happening around them. They cannot be in their own world, like many robots are today – seeing only “myself,” “obstacle” or “free space.” Likewise, human workers must be able to look at what is happening around them from a different perspective. Their world cannot be composed of just “myself,” “automation equipment,” “fixed infrastructure,” “other workers” and “many unknowns.” They must be able to understand the depth and breadth of the world around them, especially the extent of AMRs’ intelligence and capabilities.
If we don’t change how robots and people think and react to one another, we can fully expect people to feel intimidated, hesitant and unsure of themselves when AMRs enter their world. Their minds might trick them into believing a robot is too aggressive or too close – or that it is ignoring them. Alternatively, they could humanize the robot too much and make assumptions that could be harmful to all. Your AMRs will end up underutilized, and it will take you longer to get a return on investment (ROI) – if you get one at all.
Your workers and customers will also miss out on the benefits AMRs offer them. And what are those benefits? Consider the results of a double-blind Warehousing Vision Study recently commissioned by Zebra Technologies. In that study, 83% of warehouse associates who work alongside AMRs today claim the autonomous robots have helped increase their productivity and reduce walking/travel time – a win-win for you and your front-line teams. What’s more, three-quarters of associates say AMRs have helped reduce errors, which is good for you and your customers, while nearly two-thirds (65%) credit AMRs with career advancement opportunities, which helps with employee retention.
So, it is critical we eliminate the biases that result from a “me, myself and I” mindset or preconceived notions. We must ensure neither AMRs nor human workers fall victim to “sole agent syndrome.” The best way to do that is to put our heads (together) in the cloud.
New Techniques for Teaching Trustworthiness
For as long as I can remember, robotics automation innovation has been driven by three things – repeatability, scalability and increased throughput. That’s why many robotic arms, automated guided vehicles (AGVs) and static robots have been built to complete tasks within enclosed work cells, along conveyor lines or in travel lanes.
It is also why most robots are programmed to complete tasks using pre-defined motions, with behaviors fully controlled by a person. Most robots do not need to “figure things out.” They just need to do what a person tells them to do.
AMRs are different, however. While they collaborate and interact with people, they are not reliant on a person guiding their every move – telling them when to stop, start or move in a different direction. They must be able to make the right decisions, and the right moves, on their own, without a person intervening.
Behaving Like People
At Zebra Technologies, we use customer scenarios, simulation, and the cloud to understand current AMR behaviors, as well as the changes needed to achieve desired behaviors. We then develop navigation behaviors for the robots, which are based on heuristics/biases that we encode into their navigation and planning code.
These heuristics/biases help AMRs behave more like people socially. For example, robots will drive down the right side of hallways in the United States and the left side of hallways in Great Britain because those are the social norms in those countries. By encoding these behaviors into AMRs’ navigation and planning, associates have a better understanding of how the robots will behave as they drive around the facility, which results in trust, better collaboration and improved robot performance.
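To make the idea concrete, here is a minimal sketch of how a side-of-corridor preference could be expressed as a cost term in a path planner. The function names, the simple lateral-offset model, and the weights are illustrative assumptions, not Zebra’s actual navigation code:

```python
# Hypothetical sketch: encoding a "keep right" (or "keep left") social
# norm as a cost on candidate lateral positions in a corridor.

def lateral_bias_cost(offset_from_center: float, keep_right: bool,
                      weight: float = 1.0) -> float:
    """Score a candidate position's social cost.

    offset_from_center: signed lateral offset in meters, + = right of
    the corridor centerline relative to the direction of travel.
    keep_right: True for right-hand traffic norms (e.g. US),
    False for left-hand norms (e.g. Great Britain).
    """
    preferred = offset_from_center if keep_right else -offset_from_center
    # Penalize being on the socially "wrong" side in proportion to how
    # far over the robot is; mildly reward the correct side to break ties.
    wrong_side = max(0.0, -preferred)
    return weight * wrong_side - 0.05 * preferred

def best_candidate(offsets, keep_right: bool):
    """Pick the candidate lateral offset with the lowest social cost."""
    return min(offsets, key=lambda o: lateral_bias_cost(o, keep_right))

# The same planner drifts right under US norms and left under UK norms:
print(best_candidate([-0.5, 0.0, 0.5], keep_right=True))   # 0.5
print(best_candidate([-0.5, 0.0, 0.5], keep_right=False))  # -0.5
```

In a real planner this term would be one of several costs (obstacle clearance, path length, smoothness) summed per candidate, but the point stands: a one-line bias in the cost function is enough to make the robot’s motion legible to people who share its hallways.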
Because our AMRs are managed via the cloud, it is also easy to record data that helps us understand each robot’s performance in the facility. We use velocity and path conformance for low-frequency and high-frequency interactions as a baseline to understand how changes to the navigation code improve performance. This allows us to build a detailed picture of how each robot performed in different facilities and then make refinements. Using these techniques, we have measured as much as a 54% improvement in robot velocity through the facility after improving the robots’ social behaviors.
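The metrics described above can be sketched in a few lines. This is a generic illustration of velocity and path-conformance measurement from logged runs, assuming simple 2D pose logs; the function names and the deviation metric are my assumptions, not the fleet software’s actual schema:

```python
# Hypothetical sketch: fleet metrics for comparing robot runs before
# and after a navigation-code change.
import math

def path_conformance(planned, actual):
    """Mean distance (m) between logged poses and their planned waypoints.

    planned, actual: equal-length sequences of (x, y) points.
    Lower is better: the robot stayed close to its intended path.
    """
    return sum(math.dist(p, a) for p, a in zip(planned, actual)) / len(planned)

def mean_velocity(total_distance_m: float, duration_s: float) -> float:
    """Average speed over a recorded run."""
    return total_distance_m / duration_s

def velocity_improvement(baseline_mps: float, updated_mps: float) -> float:
    """Percent change in average velocity after a navigation update."""
    return 100.0 * (updated_mps - baseline_mps) / baseline_mps
```

With baselines recorded in the cloud for every robot and facility, a change to the navigation stack can be evaluated the same way everywhere: compute the same conformance and velocity numbers on the new logs and compare against the stored baseline.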
The Cloud Is a (Training) Force Multiplier
Traditionally, when teaching robots their jobs, we would tell them what they need to do, give them operating parameters, execute the motion, then work to adapt their capabilities as needed. Now, thanks to machine learning, convolutional neural nets and other cloud-based technologies, we are providing AMRs with the ability to adapt to the world around them. They can detect and distinguish between different semantic objects – like people, forklifts and pallets – to make the right decisions about how to behave based on encoded behaviors as well as current sensory inputs.
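One way to picture how encoded behaviors combine with live perception is a lookup from detected semantic class to a reaction. Everything here – the class names, the clearance distances, and the speed scaling – is an illustrative assumption, not a description of any shipping product:

```python
# Hypothetical sketch: mapping detected semantic classes to a
# navigation reaction (speed and whether to yield).

CLEARANCE_M = {"person": 1.5, "forklift": 2.0, "pallet": 0.5}
SLOWDOWN = {"person": 0.5, "forklift": 0.3, "pallet": 1.0}  # speed scale

def plan_reaction(detection_class: str, distance_m: float,
                  cruise_speed_mps: float = 1.5):
    """Return (commanded_speed, give_way) for one detected object.

    Unknown classes get the most conservative treatment: stop and yield.
    """
    if detection_class not in CLEARANCE_M:
        return 0.0, True
    if distance_m < CLEARANCE_M[detection_class]:
        # Inside the object's social/safety bubble: slow down, and yield
        # to people and moving equipment but not to static goods.
        give_way = detection_class != "pallet"
        return cruise_speed_mps * SLOWDOWN[detection_class], give_way
    return cruise_speed_mps, False

print(plan_reaction("person", 1.0))   # (0.75, True)
print(plan_reaction("pallet", 0.3))   # (1.5, False)
```

The design choice worth noting is the default: anything the perception stack cannot classify is treated as the most vulnerable thing it could be. That conservatism is what lets the encoded behaviors err on the side of the humans in the building.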
These AMRs are not working exclusively from inferred guidance; they are operating in reality. In other words, the cloud is enabling us to encode AMRs with the social behaviors required to help humans feel comfortable working alongside them. In turn, it becomes easier to teach people the social behaviors required when working around robots. People will be able to see how AMRs successfully navigate around – or away from – a person who is in their bubble and should not be. They will also see how AMRs can enter their space safely to support them when assistance is needed.
And once robots can be trusted not to go rogue – after they are equipped with the right social behaviors – then human behavior toward those robots will change. The hesitancy to engage an AMR will fade as confidence in the robot’s “demeanor” grows. People will begin to see and appreciate how AMRs can help them and adoption rates will rise. As a result, companies will be able to increase their use of robotics automation without resistance.
So, the next time someone tells you the cloud is not doing much for robotics automation, remind them that if it were not for the cloud, AMRs would not be able to work autonomously – or collaboratively with people – the way they do today. The cloud is driving robotics progress. It is truly a force multiplier, at least when it comes to instructing intelligent robots how to behave socially and teaching people that AMRs are friendly.
About the Author
Melonee Wise is the Vice President of Robotics Automation at Zebra Technologies. She joined Zebra through the acquisition of Fetch Robotics where she was the CEO. Wise was the second employee at Willow Garage where she led a team of engineers developing next-generation robot hardware and software, including ROS, the PR2, and TurtleBot. She serves as the Chair of the IFR Service Robot Group, as a robotics board member for A3, and on the MHI Roundtable Advisory Committee. Wise has received the MIT Technology Review’s TR35 and has been named to the Silicon Valley Business Journal’s Women of Influence and 40 Under 40, the Robotics Business Review RBR50, and as one of eight CEOs changing the way we work by Business Insider.