In a leap for robot development, the MIT researchers who built a robotic cheetah have now trained it to see and jump over hurdles as it runs - making this the first four-legged robot to run and jump over obstacles autonomously.
Scientists have created underwater robot swarms that function like schools of fish, exchanging information as they monitor the environment and search, explore, maintain, and harvest resources in underwater habitats.
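The blurb above says the robots "exchange information" like a school of fish. One common way swarms do this is consensus averaging, where each robot repeatedly averages its local sensor estimate with its neighbours' until the whole group agrees. This is a minimal sketch of that idea, not the researchers' actual protocol; the ring topology and readings are invented for illustration.

```python
def consensus_step(estimates, neighbours):
    """One synchronous round: every robot replaces its estimate with
    the mean of its own and its neighbours' current estimates."""
    new = []
    for i, x in enumerate(estimates):
        group = [x] + [estimates[j] for j in neighbours[i]]
        new.append(sum(group) / len(group))
    return new

# Four robots in a ring, each with a noisy reading of the same quantity.
readings = [10.0, 12.0, 14.0, 12.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}

for _ in range(50):
    readings = consensus_step(readings, ring)
# After enough rounds every robot converges to the swarm-wide mean (12.0),
# even though no robot ever saw all the readings.
```

Because each robot only talks to its neighbours, the scheme scales to large swarms with no central coordinator, which is what makes the fish-school analogy apt.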
Drones say goodbye to pilots. With the goal of achieving fully autonomous flight, researchers have developed a vision and learning system that controls and navigates these aerial vehicles without relying on a GPS signal or trained personnel.
Researchers have developed algorithms that enable robots to learn motor tasks through trial and error using a process that more closely approximates the way humans learn, marking a major milestone in the field of artificial intelligence.
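Trial-and-error learning of the kind described above can be sketched with a simple hill-climbing loop: propose a small random change to an action, keep it if the reward improves, discard it otherwise. The reward function and parameters below are illustrative assumptions standing in for a real motor task, not the researchers' algorithm.

```python
import random

def reward(action: float) -> float:
    # Hypothetical "motor task": reward peaks at an unknown optimum
    # (here 0.7). A real system would score a physical rollout instead.
    return -(action - 0.7) ** 2

def trial_and_error(episodes: int = 500, step: float = 0.05, seed: int = 0) -> float:
    """Random-perturbation hill climbing: try a nearby action, keep the
    change only if it earned more reward than the current best."""
    rng = random.Random(seed)
    action = 0.0
    best = reward(action)
    for _ in range(episodes):
        candidate = action + rng.uniform(-step, step)
        r = reward(candidate)
        if r > best:  # keep what worked; discard the rest
            action, best = candidate, r
    return action

learned = trial_and_error()  # converges near the optimum of 0.7
```

Modern systems replace the scalar action with a neural-network policy and the hill climb with gradient-based policy search, but the learn-by-doing loop is the same.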
Researchers are developing tests to take full measure of robotic grasping - specifically, the motion and effort that gripping and manipulating entail. Their immediate goal: to provide useful performance-benchmarking tools that support research and innovation, leading to ever-handier, more capable robot appendages.
There are many unspoken rules of human interaction: whether to make eye contact, how firm a handshake should be, when to smile or offer a word of greeting. Little things like these can lead to big judgements about trustworthiness or social acceptability. What if we could use this kind of behaviour to help humans and robots interact?
To make cars as safe as possible, we crash them into walls to pinpoint weaknesses and better protect the people who use them. That's the idea behind a series of experiments by an engineering team who hacked a next-generation teleoperated surgical robot, testing how easily a malicious attack could hijack remotely controlled operations in the future, with the aim of making those systems more secure.
A new programming approach gives robots more 'cognitive' capabilities: humans specify high-level goals, and the robot performs the decision-making needed to figure out how to achieve them.
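One way to picture "specify the goal, let the robot work out the steps" is classical goal-directed planning: the human states a desired condition, and the robot searches over the actions it knows for a sequence that achieves it. The toy domain below (pick-and-place actions with add-only effects) is an invented illustration, not the system the blurb describes.

```python
from collections import deque

# Each action maps to (preconditions, effects). Effects only add facts
# in this toy; a real planner would also model deleted facts.
ACTIONS = {
    "pick_up":     ({"hand_empty"},          {"holding"}),
    "move_to_bin": ({"holding"},             {"at_bin"}),
    "drop":        ({"holding", "at_bin"},   {"in_bin"}),
}

def plan(start, goal):
    """Breadth-first search over world states for a shortest action plan."""
    queue = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:            # every goal fact holds
            return steps
        for name, (pre, eff) in ACTIONS.items():
            if pre <= state:         # action is applicable
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None

# The human only states the goal; the robot derives the action sequence.
steps = plan({"hand_empty"}, {"in_bin"})
# → ["pick_up", "move_to_bin", "drop"]
```

The key point matches the blurb: the programmer never writes out the sequence of motions, only the goal condition, and the search supplies the rest.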