Augmented reality and virtual reality are combining with robotics in a number of industries

Virtual reality is being used as a training tool across a number of industries (Credit: ESA/Corvaja)

March 12, 2018      

As the robotics industry continues to grow year after year, robotic technology is being paired with a number of other innovative technologies causing a buzz in the world today. The convergence of robotics with other industries is perhaps nowhere better seen than with augmented reality and virtual reality.

Augmented reality, or AR, is the inclusion of computer graphics in real-world environments. An example would be looking at a building and seeing superimposed computer-generated graphics showing you its history. In contrast, virtual reality, or VR, is the computer simulation of 3D environments that allow a user to function within them, such as roaming around and building objects in virtual worlds.

Augmented and virtual reality grew 17% as an industry in 2017, and that growth rate is projected to increase over the coming five years as practical uses for the technology mature and new ones are developed. AR is already improving accuracy and productivity in pick-by-vision workflows, where information overlays help workers identify the correct items while keeping their hands free to perform the tasks required of them.

Opportunities abound for AR, VR in manufacturing

As AR and VR converge with robotics, one key area where this hybrid-tech can be applied is the manufacturing industry.

Researchers from the University of California, Berkeley have developed a platform that “trains” the artificial intelligence behind a physical robot, such as a robot arm. The training works by “exposing” the AI to up to a thousand objects in a virtual world so it learns their properties, including how to grasp and operate them.
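The core idea of training in simulation can be illustrated with a toy sketch. Everything below (the pretend physics, the grasp parameter, the function names) is an illustrative stand-in for a real physics simulator and learning algorithm, not the Berkeley team's actual code:

```python
import random

random.seed(0)  # deterministic for repeatable results

def simulate_grasp(object_width, gripper_opening):
    """Pretend physics: a grasp succeeds if the gripper opens wider than
    the object, but not so wide that the object slips out (here, 2 cm)."""
    return object_width < gripper_opening < object_width + 0.02

def train_on_virtual_objects(n_objects=1000, trials_per_object=10):
    """Expose the learner to many virtual objects, try grasps on each,
    and remember a grasp parameter that worked -- a crude stand-in for
    the policy learning a real system would do."""
    learned = {}
    for obj_id in range(n_objects):
        width = random.uniform(0.01, 0.10)  # object width in meters
        best_opening = 0.0                  # 0.0 means no success yet
        for _ in range(trials_per_object):
            opening = width + random.uniform(-0.01, 0.03)
            if simulate_grasp(width, opening):
                best_opening = max(best_opening, opening)
        learned[obj_id] = best_opening
    return learned

policy = train_on_virtual_objects(n_objects=5)
print(len(policy))  # one learned entry per virtual object
```

Because the trials happen entirely in simulation, thousands of objects can be "handled" without risking a physical robot, which is what makes this kind of virtual training attractive.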

Such a capability could substantially improve worker safety and factory productivity. It also raises the question of whether AI will begin deciding how best to manufacture goods without any human input, including making decisions about replacing humans.

A team from an engineering firm in Silicon Valley demonstrated a robot arm controlled through virtual reality in 2016. A person wearing a VR headset grips a “virtual handle” in the virtual world, which then drives the real-world robot arm.

Could this help people manually control such arms, building objects in the virtual world that are replicated in the real one? Could it unlock new possibilities for surgeons looking to train with robot arms?

Augmented reality can be used to control robot motion

NYU engineers have created a platform to control robot arms through augmented reality (Image: IEEE Spectrum)

While VR appears to tackle the manufacturing process itself, augmented reality is tackling the peripheral challenges. RealWear has created a head-mounted AR system that lets aviation technicians view information such as instructions while keeping their hands free to perform the task at hand.

Another example comes from a platform developed at New York University that overlays a live view of an environment with virtual representations of the robots actually operating in it. The goal is to let operators monitor and control swarm robots. And, unlike conventional systems that require hefty infrastructure and investment, this one can run on an iPad.
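The basic geometry behind such an overlay is standard: project each robot's 3D position into the tablet camera's 2D view and draw the virtual marker there. Here is a minimal sketch using a pinhole camera model; the function name and parameter values are illustrative assumptions, not NYU's actual implementation:

```python
def project_to_screen(robot_xyz, focal_px=800.0, cx=512.0, cy=384.0):
    """Project a camera-frame point (x, y, z) to pixel coordinates.

    Assumes the robot's position is already expressed in the camera's
    coordinate frame, with z pointing forward (depth). Returns None if
    the point is behind the camera and so cannot be drawn.
    """
    x, y, z = robot_xyz
    if z <= 0:
        return None  # behind the camera; nothing to overlay
    u = focal_px * x / z + cx  # horizontal pixel coordinate
    v = focal_px * y / z + cy  # vertical pixel coordinate
    return (u, v)

# A robot 2 m ahead and 0.5 m to the right appears right of screen center:
print(project_to_screen((0.5, 0.0, 2.0)))  # (712.0, 384.0)
```

A real system would also need to track the tablet's own pose so robot positions can be transformed into the camera frame, but the projection step itself is this simple, which is part of why the approach can run on consumer hardware.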

Such frugal innovations will take swarm robotics from being a feature of giants like Amazon to being a technology accessible to the “manufacturing masses.”

Augmented reality healthcare

Graduates of the Monterrey Institute of Technology (ITESM), a university in Mexico, have developed an exoskeleton that uses both artificial intelligence and augmented reality. AI crunches the data the exoskeleton receives from its sensors, while augmented reality superimposes “routes” that guide users to work out a specific body part.

Meanwhile, a charity group in the U.K. has put together a humanoid robot that provides telepresence capabilities to kids at Great Ormond Street Hospital in London. The robot allows these kids to explore the zoo. But it goes one step further: children use VR, linked to the robot, for a first-person view as they navigate the park.

Russian military to use VR training

Virtual reality and augmented reality are being utilized by the Russian military to improve training

Russian soldiers will use VR in training simulations (Credit: Kronstadt Group)

When it comes to defense, Kronstadt Group, a defense contractor in Russia, has unveiled a “virtual battlefield” intended for use by the Russian army. The objective of the environment isn’t just to help soldiers train on their own, but to test the use of robots and drones by rendering “virtual models” of them.

In addition, the Russian military has begun testing a VR headset that allows soldiers to control drones. The soldiers simply turn their heads to turn the drones and move their heads up and down to change a drone’s altitude. With these two advances, Russian soldiers may have more familiarity and capability when it comes to military robots than soldiers from other nations.

Future of augmented reality, virtual reality, and robotics

While the convergence of VR, AR, and robotics is already taking place in manufacturing, healthcare, and defense, the potential for it to emerge in other sectors is vast. One idea could see future juries in criminal trials using “virtual reality robots” to visit crime scenes, providing more accuracy and clarity about what happened, what the environment looked like, and who is responsible, potentially affecting the outcome of the trial.

When it comes to AR, VR, and robotics, the focus shouldn’t just be on how these technologies are mixing together, but on what challenges the combination poses.

For instance, DJI, in partnership with Epson and Edgybees (makers of an AR game), last year launched a new capability that allows DJI users to compete with their real-world drones in AR environments. Using smartglasses, users navigating a drone over a beach could see AR game maps superimposed, then they could move the drone over virtual obstacles, through virtual race lanes and more.

But here’s the challenge: players wearing smartglasses are focused on the game, not on the real world around them. What if a drone hits a person or an aircraft? The onus will fall on the drone itself to monitor its surroundings while its user plays games in augmented or virtual worlds.

The advantages of this new hybrid technology appear to outweigh the risks. And that leads to a final question: will the convergence of AR, VR, and robotics simply elevate what robots can already do, or will it unlock new capabilities and applications nobody has thought of yet?