A new World Wide Web strictly for robots

(Nanowerk News) To date, every robot that is built has to be programmed, or go through its own learning process, from scratch. New robots are like newborn babies: they must either be taught everything by the humans who created them or learn it gradually themselves. However, this could all change. Thanks to ROBOEARTH, robots will soon be able to share knowledge with their peers almost instantly instead of 'living' in a bubble.
Anyone who saw the box-office-topping movie I, Robot in 2004 will find the concept behind the emerging field of 'cloud robotics' somewhat familiar. Just like in the movie, it involves connecting existing robots to an alternative version of the World Wide Web so that they can all share what they have learned and the problems they have solved. Unlike the film's sinister plot, however, this line of research stands to benefit the sector: it should shorten development times and keep all robot knowledge readily available instead of letting it disappear when the robot storing it becomes obsolete.
The ROBOEARTH project is pioneering the field of cloud robotics. Started in 2009, the four-year project aimed to create a giant network and database repository where all robots could store and share information about their behaviour and their environment. Such data can include software components, maps for navigation, task knowledge such as action recipes and manipulation strategies, and object-recognition models. In short, widespread adoption could move the sector away from robots that cannot understand or cope with unpredictable environments and towards robots that work around new problems by instantly accessing the knowledge of their peers.
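To make the idea concrete, the sketch below models such a repository as a minimal Python data structure: one robot publishes a navigation map, and any other robot can later query for it. The record types, field names, and API are illustrative assumptions, not the actual ROBOEARTH interface.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeRecord:
    """One entry in a shared robot-knowledge repository (hypothetical schema)."""
    kind: str        # e.g. "map", "action_recipe", "object_model"
    name: str
    payload: dict    # the actual data; format depends on `kind`
    tags: set = field(default_factory=set)

class Repository:
    """In-memory stand-in for a networked repository like ROBOEARTH."""
    def __init__(self):
        self._records = []

    def publish(self, record: KnowledgeRecord) -> None:
        self._records.append(record)

    def query(self, kind: str, tag: str) -> list:
        return [r for r in self._records if r.kind == kind and tag in r.tags]

# One robot publishes a navigation map of a hospital ward...
repo = Repository()
repo.publish(KnowledgeRecord("map", "ward_b_floor2",
                             {"resolution_m": 0.05, "cells": "..."},
                             tags={"hospital", "indoor"}))

# ...and a different robot retrieves it instead of re-mapping from scratch.
for record in repo.query("map", "hospital"):
    print(record.name)
```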
In an exclusive interview with the research*eu results magazine, Dr Markus Waibel sheds light on the project's outcomes and their importance for the development of robotics.
What are the project's main objectives?
The goals of ROBOEARTH are to prove that connection to a networked information repository greatly speeds up the learning and adaptation process that allows robotic systems to perform complex tasks, and to show that a system connected to such a repository will be capable of autonomously carrying out useful tasks that were not explicitly planned for at design time.
What is new or innovative about the project and how it addresses this topic?
Today's robots do not use the Internet; each robot is an island. ROBOEARTH is pioneering cloud robotics: the idea that robots can tap into the huge benefits of converged infrastructure and shared services, much as personal computers benefitted when they were connected to the Internet. ROBOEARTH is a World Wide Web for robots: a giant network and database repository where robots can share information and learn from each other about their behaviour and their environment. In addition, ROBOEARTH allows robots to outsource computation to the ROBOEARTH cloud engine (aka Rapyuta), taking advantage of rapidly increasing data transfer rates to offload tasks that have no hard real-time requirements. This is of particular interest for mobile robots, where on-board computation entails additional power requirements that may reduce operating duration and constrain mobility, as well as increase costs.
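As a rough illustration of that division of labour, the Python sketch below offloads a compute-heavy, non-real-time job while the robot's local control loop keeps running. A thread pool stands in for the remote cloud engine, and all names are hypothetical; Rapyuta itself runs such jobs in server-side computing environments.

```python
import concurrent.futures
import time

def recognize_objects(image_bytes: bytes) -> list:
    """Stand-in for a compute-heavy job that runs remotely
    (in Rapyuta, a server-side computing environment would do this work)."""
    time.sleep(0.1)  # pretend this takes substantial computation
    return ["cup", "table"]

# Offload the non-real-time job; a thread pool stands in for the cloud.
with concurrent.futures.ThreadPoolExecutor() as cloud:
    future = cloud.submit(recognize_objects, b"<camera frame>")

    # Meanwhile the robot keeps its time-critical work on board.
    while not future.done():
        time.sleep(0.01)  # the local control loop would run here

    print("cloud result:", future.result())
```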
What first drew you to research in this area?
Today, robots are mostly confined to highly controlled and predictable environments like manufacturing plants, and have made few significant inroads into the human sphere. The human world is simply too nuanced and too complicated to be captured in a limited set of specifications. So far, robots have also been operating in isolation from each other: when a robot is decommissioned, everything it has learned is lost. Even more disconcerting to researchers is the question: why are thousands of systems solving the same essential problems over and over again anyway?
What difficulties did you encounter and how did you solve them?
One of the problems we tackle in ROBOEARTH is: how can robots with different hardware and software share knowledge and benefit from each other's learning? To address this challenge we divided the problem into two parts: high-level knowledge in ROBOEARTH is stored in an XML-based language that is independent of any particular hardware or software. This knowledge then connects to each individual robot through a dedicated interface, a so-called Hardware Abstraction Layer. For example, robot actions are stored as high-level, general action recipes that can be translated into low-level, robot-specific motion primitives.
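A minimal Python sketch of that split follows. In ROBOEARTH the recipe itself is expressed in an XML-based language; here a plain list stands in for it, and the step names, HAL class, and motion primitives are all hypothetical, chosen only to show how one generic recipe can drive a specific platform.

```python
# A high-level action recipe is robot-independent; each robot's
# hardware abstraction layer (HAL) maps recipe steps onto its own
# low-level motion primitives. All names here are illustrative.

RECIPE_SERVE_DRINK = [
    ("navigate", {"goal": "bedside"}),
    ("grasp",    {"object": "cup"}),
    ("handover", {"target": "patient"}),
]

class WheeledArmHAL:
    """HAL for one particular robot: translates generic recipe steps
    into this platform's motion primitives."""
    def execute(self, step, params):
        primitive = {
            "navigate": self._drive_to,
            "grasp":    self._close_gripper_on,
            "handover": self._extend_arm_to,
        }[step]
        primitive(params)

    def _drive_to(self, p):         print(f"wheel base -> {p['goal']}")
    def _close_gripper_on(self, p): print(f"gripper    -> {p['object']}")
    def _extend_arm_to(self, p):    print(f"arm        -> {p['target']}")

# The same recipe could run on a legged robot through a different HAL.
hal = WheeledArmHAL()
for step, params in RECIPE_SERVE_DRINK:
    hal.execute(step, params)
```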
What are the concrete results from the research so far?
The project has produced a series of six demonstrators that show how ROBOEARTH can improve robot performance, learning, and autonomy, ranging from serving drinks in a hospital setting to cloud-based mapping with a very low-cost robot (around USD 300) equipped with a camera and a wireless dongle.
What do you expect in terms of main outcomes from this project?
One main outcome has been the birth of a new research field. 'Cloud robotics' was unheard of when we started; it is now a rapidly evolving field of robotics that has attracted large players, from major universities around the world to companies like Google. It allows robots to benefit from the powerful computational, storage, and communications resources of modern data centres. In addition, it removes overheads for maintenance and updates, and reduces dependence on custom middleware.
Source: Cordis