Posted: Apr 19, 2013
Robot learns to collaborate assembling IKEA furniture (w/video)
(Nanowerk News) Research in learning from demonstration has focused on transferring movements from humans to robots. However, a need is arising for robots that do not just replicate a task on their own, but that also interact with humans in a safe and natural way to accomplish tasks cooperatively. Robots with variable impedance capabilities open the door to challenging new applications, where the learning algorithms must be extended to encapsulate force and vision information.
Researchers from the Istituto Italiano di Tecnologia and Universitat Politecnica de Catalunya propose a framework to transfer impedance-based behaviors to a torque-controlled robot by kinesthetic teaching. The proposed model encodes the examples as a task-parameterized statistical dynamical system, where the robot impedance is shaped by estimating virtual stiffness matrices from the set of demonstrations. A collaborative assembly task is used as a testbed. The results show that the model can be used to modify the robot impedance during task execution to facilitate collaboration, triggering stiff and compliant behaviors online to adapt to the user's actions.
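A common way to obtain such virtual stiffness profiles from kinesthetic demonstrations is to make the robot stiff where the demonstrations agree and compliant where they vary. The sketch below illustrates that idea with a simple variance-based heuristic; it is an assumption for illustration, not the authors' exact task-parameterized statistical model, and all gains (`k_min`, `k_max`, damping `d`) are made-up values.

```python
import numpy as np

def estimate_stiffness(demos, k_min=50.0, k_max=500.0):
    """Estimate a per-timestep virtual stiffness profile from demonstrations.

    demos: array of shape (n_demos, T, D) with demonstrated end-effector
    positions. Low variance across demonstrations -> the motion matters,
    so stiffness is high; high variance -> the robot stays compliant.
    (Variance-based heuristic; an assumption, not the paper's model.)
    """
    var = demos.var(axis=0).sum(axis=1)                      # (T,) total positional variance
    v = (var - var.min()) / (var.max() - var.min() + 1e-12)  # normalize to [0, 1]
    return k_max - (k_max - k_min) * v                       # high variance -> low stiffness

def impedance_force(k, x_desired, x_actual, d=10.0, xdot=None):
    """Spring-damper impedance law: f = K (x_d - x) - D * xdot."""
    f = k * (x_desired - x_actual)
    if xdot is not None:
        f -= d * xdot
    return f
```

For example, two demonstrations that coincide at one timestep but diverge at another yield maximum stiffness at the first timestep and minimum at the second.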
This video shows the result of a learning-by-imitation approach that allows two users to demonstrate an assembly skill requiring different levels of compliance. Each furniture item to be assembled has specific characteristics that need to be transferred to the robot, and re-programming the robot for each new item would not be practical. Here, the robot can learn this skill by demonstration.
One user grasps the robot and moves it by hand to demonstrate how it should collaborate with another user (kinesthetic teaching). A force sensor mounted at the wrist of the robot and a marker-based vision tracking system are used to record the position and orientation of the table legs that need to be mounted at four different points on the table top. After demonstration, the robot learns that it should first be compliant to let the user re-orient the table top into a comfortable pose for screwing on the corresponding table leg. Once the user starts to screw in the leg, the robot becomes stiff to facilitate the task. This behavior is not pre-programmed, but is instead learned by the robot by extracting the regularities of the task from multiple demonstrations.
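The compliant-then-stiff behavior described above can be pictured as an online switch driven by the wrist force sensor: a sustained load means the user has started screwing, so the robot stiffens to hold the table top still. The sketch below is a minimal illustration of that idea with hysteresis to avoid mode chattering; the thresholds and gains are illustrative assumptions, and the actual system infers the switch from the learned statistical model rather than a fixed threshold.

```python
class ImpedanceSwitcher:
    """Hysteresis switch between compliant and stiff modes (illustrative sketch).

    Compliant while the user re-orients the table top; stiff once the
    wrist force reading indicates screwing has begun. Thresholds and
    stiffness values are assumptions, not taken from the paper.
    """
    def __init__(self, on_thresh=8.0, off_thresh=3.0,
                 k_compliant=50.0, k_stiff=800.0):
        self.on_thresh = on_thresh      # force [N] that triggers stiff mode
        self.off_thresh = off_thresh    # force [N] below which we relax again
        self.k_compliant = k_compliant
        self.k_stiff = k_stiff
        self.stiff = False

    def update(self, wrench_norm):
        """Return the stiffness gain for the current force-sensor reading."""
        # Hysteresis: different on/off thresholds prevent rapid toggling
        # when the force hovers near a single threshold.
        if not self.stiff and wrench_norm > self.on_thresh:
            self.stiff = True
        elif self.stiff and wrench_norm < self.off_thresh:
            self.stiff = False
        return self.k_stiff if self.stiff else self.k_compliant
```

Called once per control cycle, the switcher keeps the robot compliant under light contact and holds the stiff mode through the brief force dips that occur while the user repositions the screwdriver.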
Source: Istituto Italiano di Tecnologia, Universitat Politecnica de Catalunya