In the past 30 years, industrial robots have enormously increased productivity and quality in mass production while simultaneously reducing production costs. According to the International Federation of Robotics, the number of industrial robots installed annually has more than doubled since 2005, to well above 250,000 in 2015.
The next evolution – or perhaps even revolution – in robotics is service robots, which support us outside of protected shop floors, for example at our workplace or at home. But service robots that work with us in our everyday environment need a far broader set of abilities.
In contrast to industrial robots, the focus is not on absolute precision of movement, but on understanding environments and situations, localizing within our very complex surroundings, and interacting flexibly with objects and people.
That means service robots have to be more “human” than industrial robots, without necessarily looking like humans. It therefore makes sense to look to nature and use the concepts found in humans and animals as a source of inspiration for robot design.
ETH looks to biology for SEA motion
Our motion apparatus consists of joints, muscles, and tendons. Unlike the joints customarily used in today’s robots, which are rigidly coupled to the motor via a gearhead, we humans have an elastic coupling in the form of tendons. It enables us to store energy and protects our skeleton against hard impacts.
In addition, it makes force-regulated interaction with the surroundings possible. Inspired by this natural motion apparatus, “Series Elastic Actuators” (SEAs) can be built that place a spring element downstream of the motor and gearhead.
The spring protects the gearhead against impacts, permits very accurate measurement of interaction forces, and enables a highly efficient gait with up to 70% energy savings. Such SEA drives can be found, for example, in the ETH robot ANYmal, sold by ANYbotics.
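The principle behind this force measurement can be sketched in a few lines: because the spring obeys Hooke’s law, the torque at the joint can be read directly from the spring’s deflection, and the motor can then be commanded to drive that torque toward a target. The following is a minimal, hypothetical illustration — the stiffness value, function names, and proportional gain are assumptions for the sketch, not ANYmal’s actual control code.

```python
# Illustrative series-elastic actuator (SEA) sketch: an ideal torsional
# spring sits between the gearhead output and the joint. All constants
# and names here are hypothetical, chosen only to show the principle.

SPRING_STIFFNESS = 120.0  # assumed spring constant, N*m/rad


def sea_joint_torque(motor_angle_rad: float, joint_angle_rad: float) -> float:
    """Estimate joint torque from spring deflection via Hooke's law."""
    deflection = motor_angle_rad - joint_angle_rad
    return SPRING_STIFFNESS * deflection


def force_control_step(desired_torque: float, motor_angle: float,
                       joint_angle: float, gain: float = 0.5) -> float:
    """One proportional control step: return a corrected motor angle
    that drives the measured spring torque toward the desired torque."""
    measured = sea_joint_torque(motor_angle, joint_angle)
    torque_error = desired_torque - measured
    # The spring converts an angle correction into a torque correction,
    # so divide the torque error by the stiffness to get the angle step.
    return motor_angle + gain * torque_error / SPRING_STIFFNESS
```

The same deflection measurement that enables force control is what protects the gearhead: an impact winds up the spring instead of loading the gear teeth directly.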
Natural perception guides robot prototype
Where perception is concerned, artificial systems are also moving ever closer to nature. The rapid development of cameras, inertial measurement units (IMUs), and microprocessors small enough to fit in smartphones has made today’s robots capable of navigating in ways similar to humans.
To this end, image data is combined with the IMU’s acceleration and rotational-rate measurements.
Analogous to the human organ of equilibrium in the inner ear, the IMU enables a very fast motion estimate, which, however, slowly drifts because it lacks any reference to the environment.
The drift can be compensated through data fusion with the image data; this makes exact localization and 3D reconstruction of the environment possible (see image).
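The fusion described above can be sketched with a simple complementary filter: the IMU is integrated at high rate for a fast but drifting estimate, and each slower camera-based position fix pulls the estimate back toward a drift-free reference. This is a deliberately simplified one-dimensional sketch — the function names and blending gain are assumptions, and real visual-inertial pipelines estimate full 6-DoF pose with far more sophisticated filters.

```python
# Hypothetical 1-D complementary-filter sketch of visual-inertial
# fusion: fast IMU dead reckoning corrected by slower camera fixes.


def integrate_imu(position: float, velocity: float,
                  accel: float, dt: float) -> tuple[float, float]:
    """Dead-reckon one IMU step; accurate short-term, drifts long-term."""
    velocity = velocity + accel * dt
    position = position + velocity * dt
    return position, velocity


def fuse_camera_fix(position: float, camera_position: float,
                    alpha: float = 0.1) -> float:
    """Blend in the camera's absolute position estimate to cancel
    the accumulated IMU drift (alpha = trust in the camera fix)."""
    return (1.0 - alpha) * position + alpha * camera_position
```

For example, integrating a constant 1.0 m/s² acceleration for one 0.1 s step from rest yields a velocity of 0.1 m/s, and each camera fix then nudges the integrated position a fraction `alpha` of the way toward the camera’s drift-free estimate.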
Nature also frequently serves as inspiration for robots’ learning skills. Neural networks, a crude reproduction of the human brain, were first hyped in the 1980s and 1990s.
Today, the enormous increase in available computing power enables neural networks to deliver promising results in the segmentation and classification of data (e.g. images) and in learning motion sequences and characteristics.
These approaches, which have gained new popularity under the name “deep learning,” can help robots analyze complex situations and learn sequences on their own. Nevertheless, research still lags far behind the skills of animals and humans.
All these new technologies have brought much progress to service robots. Time and again, however, they also reveal the limits of what is possible.
As a robotics researcher, one gains great respect for the fascinating abilities of humans and animals, which, from today’s vantage point, seem unlikely ever to be equaled by artificial systems.
About the author: Roland Siegwart is the director of the Autonomous Systems Lab of the Institute of Robotics and Intelligent Systems at ETH Zurich.
Editor’s note: This article is a special sponsored submission by maxon motor ag and appears in driven: The maxon motor magazine.