Robotics Business Review


How Robots Perceive the World Around Them

Different methods help robots understand objects, their shapes, and where they are located.

By Geet Amin | June 24, 2019

Robots can do amazing things, such as work collaboratively with humans in factories, deliver packages quickly within warehouses, and explore the surface of Mars. But despite those feats, we’re only beginning to see robots that can make a decent cup of coffee. For robots, being able to perceive and understand the world around them is critical to streamlined integration. Habitual practices, such as turning on the coffee machine, dispensing the beans, and finding the milk and sugar, require perceptual abilities that are still fantasies for many machines.

However, this is changing. Several technologies are being used to help robots better perceive the environments in which they work, including understanding the objects around them and measuring distance. Below is a sampling of these technologies.

LiDAR: light & laser-based distance sensors

Several companies develop LiDAR (light and laser-based distance measuring and object detection) technologies to help robots and autonomous vehicles perceive surrounding objects. The principle behind LiDAR is simple: shine light at a surface and measure the time it takes for the light to return to its source.
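The time-of-flight principle can be sketched in a few lines: distance is half the round-trip time multiplied by the speed of light. The function below is an illustration of the idea, not any vendor's sensor API.

```python
# Speed of light in a vacuum, in metres per second.
SPEED_OF_LIGHT_M_S = 299_792_458

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a surface, from a light pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

# A pulse returning after roughly 66.7 nanoseconds indicates a surface
# about 10 metres away.
print(round(tof_distance_m(66.7e-9), 2))
```

Because light travels so fast, the timing electronics must resolve nanoseconds: a 1 ns error in the round-trip measurement corresponds to about 15 cm of range error.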

[Image: VIAVI optical filters for lidar. Source: VIAVI]

By firing rapid pulses of laser light at a surface in quick succession, the sensor can build a complex “map” of the surface it’s measuring. There are currently three primary types of sensors: single-beam sensors, multi-beam sensors, and rotational sensors.

Single-beam sensors produce one beam of light and are typically used to measure distances to large objects, such as walls, floors, and ceilings. Their beams fall into two categories: highly collimated beams, similar to those used in laser pointers (the beam remains small throughout the entire range), and LED or pulsed-diode beams, similar to flashlights (the beam diverges over large distances).

Multi-beam sensors simultaneously produce multiple detection beams and are ideal for object and collision avoidance. Finally, rotational sensors produce a single beam while the device rotates, and are often used for object detection and avoidance.
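A rotational sensor's output is a list of (angle, range) readings; to build the kind of map described above, those readings are converted to Cartesian points. The snippet below is a minimal sketch of that conversion, not a specific sensor's driver interface.

```python
import math

def scan_to_points(scan):
    """Convert (angle_rad, range_m) readings from a rotating sensor
    into 2-D Cartesian (x, y) points around the sensor's origin."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]

# Four readings taken 90 degrees apart, each reporting a surface 2 m away,
# e.g. the sensor sitting in the middle of a small square room.
scan = [(0.0, 2.0), (math.pi / 2, 2.0), (math.pi, 2.0), (3 * math.pi / 2, 2.0)]
for x, y in scan_to_points(scan):
    print(f"({x:+.2f}, {y:+.2f})")
```

Accumulating these points over many rotations (and many robot poses) is the basis of the occupancy maps mobile robots use for navigation and obstacle avoidance.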

Part detection sensors

An important task often assigned to robots, especially within the manufacturing industry, is to pick up objects. More specifically, a robot needs to know where an object is located, and if it’s ready to be picked up. This requires the work of various sensors to help the machine detect the object’s position and orientation. A robot may already have part detection sensors built in, which may be an adequate solution if you only need to detect whether an object is present.

Part detection sensors are commonly used in industrial robots, and can detect whether a part has arrived at a particular location. There are a number of different types of these sensors, each with unique capabilities, including detecting the presence, shape, distance, color, and orientation of an object.

Robot vision sensors offer several high-tech benefits to collaborative robots across industries. Both 2D and 3D vision allow robots to manipulate different parts without reprogramming, pick up objects of unknown position and orientation, and correct for inaccuracies.
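Correcting for an unknown position and orientation typically amounts to a rigid transform: the vision system reports where the part actually sits, and the nominal grasp point is rotated and translated to match. The function below is a hypothetical 2-D sketch of that workflow; the names and the simple (x, y, angle) pose format are assumptions for illustration, not a particular robot controller's API.

```python
import math

def correct_grasp(grasp_xy, part_xy, part_theta_rad):
    """Adapt a nominal grasp offset (defined relative to the part) to the
    part's detected pose: rotate the offset by the detected orientation,
    then translate it to the detected position."""
    gx, gy = grasp_xy
    c, s = math.cos(part_theta_rad), math.sin(part_theta_rad)
    # Standard 2-D rotation followed by translation.
    return (part_xy[0] + c * gx - s * gy,
            part_xy[1] + s * gx + c * gy)

# Vision finds the part at (0.5, 0.2) m, rotated 90 degrees; the nominal
# grasp point is 0.1 m along the part's local x-axis.
x, y = correct_grasp((0.1, 0.0), (0.5, 0.2), math.pi / 2)
print(f"grasp at ({x:.2f}, {y:.2f})")
```

This is why vision-guided robots need no reprogramming when parts arrive in different poses: the program stores one grasp relative to the part, and the transform does the rest.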

3D vision & the future of robot “senses”

The introduction of robots into more intimate aspects of our lives (such as in our homes) requires a deeper and more nuanced understanding of three-dimensional objects. While robots can certainly “see” objects through cameras and sensors, interpreting what they see from a single glimpse is more difficult. A robot perception algorithm developed by a Duke University graduate student and his thesis supervisor can guess what an object is, how it’s oriented, and “imagine” any parts of the object that may be out of view.

The algorithm was developed using 4,000 complete 3D scans of common household objects: an assortment of beds, chairs, desks, monitors, dressers, nightstands, tables, bathtubs, and sofas. Each scan was then broken down into tens of thousands of voxels, stacked atop one another, to make processing easier. Using probabilistic principal component analysis, the algorithm learned categories of objects along with their similarities and differences, enabling it to understand what a new object is without having to sift through its entire catalog for a match.
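The voxelisation step described above can be illustrated simply: each point in a 3-D scan is assigned to the grid cell that contains it, turning an arbitrary point cloud into a fixed, compact set of occupied cells that later learning stages can consume. This is a generic sketch of the technique, not the Duke team's actual code.

```python
def voxelize(points, voxel_size):
    """Quantise a point cloud into a voxel grid: map each (x, y, z) point
    to the integer index of its containing cell, and return the set of
    occupied cells."""
    occupied = set()
    for x, y, z in points:
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied

# Two nearby points fall into the same 10 cm voxel; the distant third
# point occupies a voxel of its own.
cloud = [(0.01, 0.02, 0.03), (0.04, 0.01, 0.02), (0.95, 0.10, 0.40)]
print(sorted(voxelize(cloud, voxel_size=0.1)))
```

The choice of voxel size trades detail against cost: smaller voxels preserve shape but multiply the number of cells the learning algorithm must process.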

While still in its infancy, the implementation of this algorithm (or those of a similar nature) pushes robotics one step further towards working in tandem with humans in settings far less structured and predictable than a lab, factory or manufacturing plant.

The ability to perceive and interact with surrounding objects and environments is critical to robotic functionality, particularly in applications where robots work alongside humans. As the technology advances, there will undoubtedly be a growing need for robotics education and literacy, as well as for robotics technicians.

About the author: Geet Amin is a senior technical and tutorial consultant at George Brown College. He has been with the college since 2010, and is very knowledgeable about the online Technical Training Certificate programs offered at the college. He provides IT and course content related support to the students enrolled in Robotics, Automation, PLC, Electronics and Electromechanical Technician certificate programs. He can be followed on Twitter and LinkedIn.

Copyright © 2022 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media