March 14, 2013      

They call it Rapyuta

Researchers from five European universities have developed a cloud-computing platform for robots. The platform allows robots connected to the Internet to directly access the powerful computational, storage, and communications infrastructure of modern data centers (the giant server farms behind the likes of Google, Facebook, and Amazon) for robotics tasks and robot learning.


Human world too nuanced, until now

Until recently, robots have not been capable of understanding and coping with unstructured environments (like the ones humans work in) because their systems have relied on knowing in advance the specifics of every possible situation they might encounter.

Each response to a contingency has had to be programmed in advance, and systems have had to rebuild their world model from sensor data each time they had to perform a new task.

This is one of the main reasons why, to date, robots have been mostly relegated to highly controlled and predictable environments like manufacturing plants, but have made few significant inroads into the human sphere.

The human world is just too nuanced and too complicated to be summarized within a limited set of specifications.

But what if robots could learn from their past experiences? And what if they could share their new-found knowledge instantaneously with their peers?

The rapid progress of wireless technology and the growing availability of data centers accessible through the World Wide Web are adding a new dimension to robotics.

Today robots can use the web not only as a source of shared information but also as a powerful computational resource. Examples include online services for map building and online grasp planning. Such cloud-based services hold the potential to make robots lighter, cheaper, and more powerful.

RoboEarth pioneered the idea of a World Wide Web for robots: a large database that allows robots to upload, store, and share their knowledge about objects, environments, and actions at different levels of abstraction. This database enables robots with potentially different hardware and software to download, use, and improve shared knowledge.
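The sharing model can be illustrated with a small sketch. The class and method names below (`SharedKnowledgeBase`, `upload`, `download`) are invented for illustration only; RoboEarth's actual interface is a web service backed by a distributed database, not an in-memory Python object.

```python
# A minimal, hypothetical sketch of a RoboEarth-style shared knowledge base.
# Names are invented for illustration; RoboEarth's real interface is a web
# service, not an in-memory store.

class SharedKnowledgeBase:
    """Stores knowledge records that any robot can upload, query, and refine."""

    def __init__(self):
        self._records = {}  # key: (category, name) -> record dict

    def upload(self, category, name, data, robot_id):
        """Store or update knowledge about an object, environment, or action."""
        key = (category, name)
        record = self._records.setdefault(
            key, {"data": None, "contributors": set(), "version": 0}
        )
        record["data"] = data
        record["contributors"].add(robot_id)
        record["version"] += 1
        return record["version"]

    def download(self, category, name):
        """Retrieve shared knowledge, regardless of which robot uploaded it."""
        record = self._records.get((category, name))
        return None if record is None else record["data"]


# One robot uploads a recognition model; a different robot reuses it.
kb = SharedKnowledgeBase()
kb.upload("object", "coffee_mug", {"model": "point-cloud-v1"}, robot_id="robot_A")
model = kb.download("object", "coffee_mug")  # robot_B benefits from robot_A's work
```

The point of the abstraction is that knowledge is keyed by what it describes (an object, an environment, an action), not by which robot produced it, so heterogeneous robots can build on each other's contributions.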

The recently released RoboEarth Cloud Engine now complements the RoboEarth databases with computational capabilities and enables web services dedicated to robots. Creating such robot cloud services entails new scientific challenges:

  • What existing cloud services for humans can be leveraged for robotics?
  • Can we create services for tasks such as reasoning under uncertainty or semantic knowledge processing?
  • What architectures provide the optimal trade-offs between content aggregation and caching vs. accessibility and scalability vs. response time for robotics applications?
  • To what extent can existing IaaS, PaaS, and SaaS resources be used for robotics?
  • What are useful metrics and optimal trade-offs between on-board computation and the use of cloud services?
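The last question can be made concrete with a back-of-the-envelope model. The function and all numbers below are illustrative assumptions, not measurements or methods from the paper: offloading pays off when the cloud's faster compute outweighs the network round-trip and transfer cost.

```python
# Hypothetical back-of-the-envelope model for the on-board vs. cloud trade-off.
# All figures are illustrative assumptions, not measurements from RoboEarth.

def should_offload(workload_gflop, onboard_gflops, cloud_gflops,
                   rtt_s, payload_mb, bandwidth_mbps):
    """Return True if sending the task to the cloud is expected to be faster.

    Compares on-board compute time against cloud compute time plus the
    network cost: one round trip and the data transfer.
    """
    t_onboard = workload_gflop / onboard_gflops
    t_transfer = rtt_s + (payload_mb * 8) / bandwidth_mbps
    t_cloud = workload_gflop / cloud_gflops + t_transfer
    return t_cloud < t_onboard


# A heavy mapping task on a weak on-board CPU: offloading likely wins.
print(should_offload(workload_gflop=50, onboard_gflops=2, cloud_gflops=200,
                     rtt_s=0.05, payload_mb=4, bandwidth_mbps=54))
```

Even this toy model shows why the trade-off is workload-dependent: for small, latency-sensitive tasks the round-trip dominates and on-board computation wins, while large batch jobs such as map building favor the cloud.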

See paper (pdf): The design and implementation of Rapyuta, the RoboEarth Cloud Engine.
by Dominique Hunziker, Mohanarajah Gajamohan, Markus Waibel, and Raffaello D'Andrea