July 19, 2016      

In my last article, we looked at how Canadian robotics has evolved from university research and startups to commercial success in logistics and healthcare. A prime source of both the next generation of Canadian robotics entrepreneurs and of assistive robots themselves is the Institute for Robotics and Mechatronics at the University of Toronto.

The Institute for Robotics and Mechatronics (IRM) was created in 2010 to strengthen interdisciplinary research, enhance the student learning experience, and establish a central hub for mechatronics innovation. The institute currently involves more than 45 faculty members spread across six departments and institutes.

The IRM offers an undergraduate major or minor and a graduate emphasis in robotics and mechatronics, according to the University of Toronto. It has also established numerous industry partnerships to maximize the impact of the transformative technologies developed by its faculty and students.

Researchers are currently working on a variety of sensing, actuation, and computational capabilities for the next generation of mechatronic and robotic systems. A key research area is assistive robots, which perform daily tasks on behalf of people with disabilities and their caregivers. The person with the disability remains in control of the robot.

Goldie Nejat, IRM’s director and an associate professor in the University of Toronto’s Mechanical and Industrial Engineering Department, is developing healthcare robots to perform daily activities ranging from preparing meals to reminding patients to take medication.

Assistive robots sensitive to human needs

The robots created by Nejat’s team can respond to verbal instructions and even pick up on cues of the user’s emotions and intent. The assistive robots are also being designed to be aware of their environment and able to react and adapt to new situations.

Nejat takes much of her inspiration from the human brain, basing the robots’ decision-making on the latest research in psychology and the human social-behavior sciences. Her interdisciplinary approach to robot design includes collaborators in healthcare, occupational therapy, and nursing, as well as residents in long-term care facilities. This valuable feedback allows Nejat to tailor her designs to the wish lists of actual users, which extend beyond assistance with daily activities.

Statistics Canada estimates that people aged 65 and older will account for almost a quarter of the nation’s population by 2051. According to Nejat, healthcare robotics attracts the largest share of her lab’s government and industry funding.

Robots to watch where they’re going

Timothy Barfoot, a University of Toronto associate professor, and his research team are pursuing another important branch of robotics research — developing systems that allow robots to move safely through the world with the assistance of visual sensors.

Some University of Toronto research is focusing on “visual teach and repeat.”

Timothy Barfoot is working on route-planning for robots.

The team is investigating the potential of using regular video cameras in robot navigation, such as the kind found in smartphones. It is also testing and evaluating more sophisticated systems, including lidar.

“Lidar is basically a laser that spins around on top of the robot and measures the distance between it and objects around it, collecting what we call ‘point-cloud data,’” Barfoot explained. “If we stitch all of that together, we can build a 3D model of the environment that the robot is driving through.”
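The stitching Barfoot describes can be sketched in miniature: each lidar return is a range and bearing measured in the robot’s own frame, and knowing the robot’s pose at scan time lets you map every return into a shared world frame. The following is a simplified 2D sketch under those assumptions; the function names are illustrative and not the lab’s actual pipeline.

```python
import math

def scan_to_points(ranges, angles, pose):
    """Convert one 2D lidar scan into world-frame points.

    ranges/angles: per-beam distance and bearing, in the robot's frame.
    pose: (x, y, heading) of the robot when the scan was taken.
    """
    x0, y0, theta = pose
    points = []
    for r, a in zip(ranges, angles):
        # Rotate each beam by the robot's heading, then translate
        # by the robot's position to get a world-frame point.
        points.append((x0 + r * math.cos(theta + a),
                       y0 + r * math.sin(theta + a)))
    return points

def stitch(scans):
    """Merge several (ranges, angles, pose) scans into one point cloud."""
    cloud = []
    for ranges, angles, pose in scans:
        cloud.extend(scan_to_points(ranges, angles, pose))
    return cloud
```

A real system works in 3D and must estimate the poses themselves (the hard part), but the core idea is the same: transform every scan into a common frame and accumulate the points.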

Barfoot and his co-researchers recently developed a technique called “visual teach and repeat” that addresses the challenge of planning a safe route from one location to another and then reliably issuing the necessary commands to the robot’s motors and wheels.

The approach calls for a human operator to drive a camera-equipped robot along a route that is already known to be safe. Along the way, the robot logs the visual information associated with the route. By comparing what its camera currently sees with the views logged along the taught route, the robot can adjust its actions to stay on the correct path.

“You can think of it as the human programming the robot in a really intuitive way without ever having to type code,” Barfoot said.
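The teach-then-repeat structure can be illustrated with a deliberately tiny sketch. Here the “visual information” at each waypoint is reduced to a single number (say, the lateral offset of a tracked feature in the camera image), and the function names are hypothetical, not Barfoot’s actual system.

```python
def teach(logged_route):
    """Teach pass: a human drives the robot, and at each waypoint the
    robot stores the visual measurement its camera logged there.

    logged_route: iterable of (waypoint_index, feature_offset) pairs.
    """
    return dict(logged_route)

def repeat_step(taught_map, waypoint, current_offset, gain=0.5):
    """Repeat pass: steer to cancel the difference between what the
    camera sees now and what it saw at this waypoint during teaching.

    Returns a steering correction (sign convention is arbitrary here).
    """
    error = current_offset - taught_map[waypoint]
    return -gain * error
```

In a real system the comparison spans thousands of image features and the correction feeds a full path-tracking controller, but the structure is the same: record during teaching, then compare and correct during repetition.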
