
Thought-controlled Robot Heralds Avatar Future
Bruce Willis in the film Surrogates (2009)
Understands the user's commands, but also infers the particulars of the task
By RBR Staff



Researchers at the CNRS-AIST Joint Robotics Laboratory (a collaboration between France’s Centre National de la Recherche Scientifique and Japan’s National Institute of Advanced Industrial Science and Technology) are developing software that allows a person to drive a robot with thoughts alone, reports Gizmag.

The researchers have achieved what they call “robotic re-embodiment”, which they accomplished by using an EEG-controlled interface (an electrode cap) and then mapping and translating brain signals into commands the robot could understand.

This marks the first time an actual robot has been controlled in such a manner.

While it’s primarily intended for severely paralyzed people, there’s no doubt that this technology will eventually be used by pretty much everyone—whether it be to travel remotely to faraway or inaccessible places, or just to have it clean out your garage.

The system also utilizes artificial intelligence (AI) so that it not only understands the user’s intentions, but also infers the particulars of the task at hand. As a result, the robot won’t have to be micromanaged when performing simple tasks like walking to the end of a hall or picking up a dropped object.

Essentially, once the user focuses on a target, the robot’s AI takes over and knows what to do with it.
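
The article describes this shared control only at a high level, but the division of labor can be sketched roughly as follows: the brain-computer interface contributes a single goal selection, and the robot’s AI expands it into the low-level steps the user would otherwise have to micromanage. Everything in this Python sketch (goal names, step names) is a hypothetical illustration, not the CNRS-AIST software’s actual interface.

# Rough sketch of the "AI takes over" step: the BCI supplies one high-level
# goal, and the robot expands it into the low-level actions the user would
# otherwise have to micromanage. All names here are hypothetical.

def plan_for(goal: str) -> list[str]:
    """Expand a single user-selected goal into a sequence of robot actions."""
    if goal == "walk_to_end_of_hall":
        return ["scan_surroundings", "plan_path", "walk", "stop_at_wall"]
    if goal == "pick_up_dropped_object":
        return ["locate_object", "approach", "grasp", "stand_up"]
    return []  # unknown goal: do nothing rather than guess


# One thought-selected goal, many automatic steps:
print(plan_for("walk_to_end_of_hall"))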

The Japan Daily Press reports that the basic way the system works is that a person stares intently at a computer screen with flashing arrows. The processor recognizes which of the arrows is being stared at and sends a command to the robot, causing it to react with a pre-programmed movement (e.g. moving to the left if the left arrow is stared at).
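
The report doesn’t specify how the system tells which arrow is being watched, but flashing-stimulus interfaces of this kind commonly rely on SSVEP frequency tagging: each arrow flickers at its own rate, and the strongest matching frequency in the EEG spectrum reveals the user’s focus. The following Python sketch illustrates that general idea under assumed sampling rates, flicker frequencies, and command names; it is not the lab’s code.

# SSVEP-style sketch: find which arrow's flicker frequency dominates the EEG
# window and translate it into a pre-programmed movement command.
import numpy as np

SAMPLE_RATE = 256  # Hz, assumed EEG sampling rate
ARROW_FREQS = {"left": 8.0, "right": 10.0, "forward": 12.0, "backward": 15.0}
MOVES = {"left": "turn_left", "right": "turn_right",
         "forward": "step_forward", "backward": "step_back"}


def decode_move(eeg_window: np.ndarray) -> str:
    """Pick the arrow whose flicker frequency dominates the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg_window))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / SAMPLE_RATE)
    # Spectral power at the bin nearest each arrow's flicker frequency.
    power = {arrow: spectrum[np.argmin(np.abs(freqs - f))]
             for arrow, f in ARROW_FREQS.items()}
    return MOVES[max(power, key=power.get)]  # pre-programmed movement


# Example with two seconds of synthetic 10 Hz data (a real system streams EEG):
window = np.sin(2 * np.pi * 10.0 * np.arange(2 * SAMPLE_RATE) / SAMPLE_RATE)
print(decode_move(window))  # -> "turn_right"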

The robot can also be instructed to perform basic actions when the user looks at pre-determined objects; staring at an object, for example, instructs the robot to pick it up and bring it to you.
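
That object-triggered behavior amounts to a lookup from the stared-at object to a canned pick-up-and-deliver routine. A hypothetical Python sketch (object and action names are illustrative only):

# Hypothetical sketch of the object-triggered behavior: the object the user
# stares at selects a pre-programmed fetch routine; the user never steers the
# arm or the legs directly.

FETCH_ROUTINES = {
    "bottle": ["walk_to(bottle)", "grasp(bottle)", "carry_to_user(bottle)"],
    "book": ["walk_to(book)", "grasp(book)", "carry_to_user(book)"],
}


def on_object_selected(name: str) -> None:
    """Run the canned routine for the object the user focused on, if known."""
    for action in FETCH_ROUTINES.get(name, []):
        print("robot:", action)  # stand-in for sending real motion commands


on_object_selected("bottle")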


See related article: Mind-controlled bionic limbs bring giant strides in prosthetics



