The first collaborative robot (cobot) was manufactured in 1996. The system was designed for basic pick-and-place applications and communicated with operators through motion resistance.
Cobots have come a long way since then in working safely alongside humans. They can detect objects and people in their environment using vision sensors, and can slow down or even stop in the event of unintended contact.
Modern cobots can take corrective actions and minimize risks, but other sensing technologies, combined with sophisticated software, allow them to do much more. Touch sensing technologies, for example, are evolving quickly to increase both the applicability and safety of cobots, especially in demanding fields such as healthcare, where delicate materials must be handled safely and precisely.
Currently, several types of tactile sensors are used in cobots, including piezoelectric, piezoresistive, capacitive and elastoresistive types. Piezoelectric technologies are used to gather data from the cobot’s joints and transmit it to the controller. In contrast, capacitive sensors can act as proximity sensors, allowing the cobot to slow down when it detects an obstacle.
For some applications, detection sensors are placed outside the collaborative robot itself. These devices recognize when human workers enter the workspace, signaling the system to slow down or stop.
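The slow-down-or-stop behavior described above can be sketched as a simple speed-scaling rule keyed to the nearest detected person. This is an illustrative sketch only; the zone thresholds, the linear scaling, and the function name are assumptions, not any vendor's actual safety logic.

```python
# Hypothetical speed-scaling logic for an external presence sensor.
# All thresholds and the linear scaling profile are illustrative only.

SLOW_ZONE_M = 2.0   # human detected within this range: reduce speed
STOP_ZONE_M = 0.5   # human detected within this range: stop motion

def speed_override(distance_to_human_m: float, full_speed: float) -> float:
    """Return the commanded speed given the nearest detected human's distance."""
    if distance_to_human_m <= STOP_ZONE_M:
        return 0.0                      # protective stop
    if distance_to_human_m <= SLOW_ZONE_M:
        # Scale speed linearly between the stop and slow boundaries
        fraction = (distance_to_human_m - STOP_ZONE_M) / (SLOW_ZONE_M - STOP_ZONE_M)
        return full_speed * fraction
    return full_speed                   # workspace clear: full speed
```

A real deployment would implement this in certified safety hardware rather than application code, but the shape of the decision is the same: distance in, commanded speed out.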
Although collisions can still happen, traditional cobot sensing modalities ensure that the impact is minimized. To improve the reliability (and hence the safety) of collaborative robots, tactile sensors empowered with smart software can be embedded at the end of the cobot arm, improving collision avoidance and increasing movement efficiency.
Precise Object Handling and More
Touch sensors are also useful for applications requiring precise object placement, such as loading parts into a fixture for machine tending. The sensing technology can find the exact part location and correct for changes in the position or size of the raw stock material by measuring the insertion force.
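The idea of correcting position from insertion force can be illustrated with a minimal sketch: if the sensor reports a lateral reaction force during insertion, nudge the tool in the direction that relieves it. The function, gains, and thresholds here are hypothetical, not taken from any specific cobot controller.

```python
# Illustrative force-guided insertion correction: when lateral forces
# exceed a threshold, shift the tool against the reaction force.
# Gain and threshold values are invented for the example.

def correct_position(pos_xy, force_xy, threshold_n=2.0, gain_m_per_n=0.001):
    """Return an adjusted XY position based on measured lateral forces."""
    x, y = pos_xy
    fx, fy = force_xy
    if abs(fx) > threshold_n:
        x -= gain_m_per_n * fx   # move against the measured reaction force
    if abs(fy) > threshold_n:
        y -= gain_m_per_n * fy
    return (x, y)
```

Run in a loop during insertion, this behaves like a crude compliance controller: the part "finds" the fixture even when the raw stock is slightly out of position.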
Modern touch sensing systems use tactile sensors to capture information about an object in real time, such as its shape, size, and texture. The resulting data can then be used to produce a highly accurate description of objects, as well as to recognize defects and changes. For example, early research from the USC Viterbi School of Engineering used embedded tactile sensors with conductive fluid to simulate human touch, enabling a robot to differentiate between the texture of wool and that of cotton.
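A toy version of texture discrimination can be built on a single feature: rougher surfaces tend to produce higher-variance micro-vibration signals as a tactile sensor slides across them. The threshold, labels, and heuristic below are invented for illustration and are far simpler than the actual USC research.

```python
# Toy texture discrimination from a tactile vibration trace.
# The variance threshold and labels are assumptions for the sketch.
import statistics

def classify_texture(vibration_samples, variance_threshold=0.5):
    """Label a tactile vibration trace as 'rough' or 'smooth' (toy heuristic)."""
    variance = statistics.pvariance(vibration_samples)
    return "rough" if variance > variance_threshold else "smooth"
```

Real systems extract many such features (vibration spectra, contact area, shear forces) and feed them to a trained classifier rather than a single threshold.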
With a more effective sense of touch, cobots can also be used in applications where they interact with fragile or deformable objects. For example, tactile technology in surgical cobots can be used to enhance precision and accuracy. For this to be successful, multiple tactile sensors would have to be integrated using AI and machine learning.
With state-of-the-art sensors, actuators and software, cobots are now capable of experiencing physical sensations, allowing systems to ‘feel’ and identify many classes of objects – hard, soft, rigid, flexible, etc. – in the process. Continuing tactile sensor advancements will allow cobots and humans to perform increasingly complex tasks while working in a collaborative manner. The result: increased productivity and efficiency across an expanding range of applications in an equally expanding number of markets.
About the Author
Claudia Jarrett is the Country Manager for industrial automation components supplier EU Automation. In that role she oversees the operations of the company’s affiliate in the United States, while helping to develop new business and deliver growth via a multi-channel approach that has a significant positive impact on the business.