The R2-D2 robot from Star Wars doesn't communicate in human language but is nevertheless capable of showing its intentions. For human-robot interaction, a robot does not have to be a true humanoid, provided that its signals are designed in the right way.
A panel of academics and industry thinkers has looked ahead to 2030 to forecast how advances in AI might affect life in a typical North American city and spark discussion about how to ensure the safe, fair, and beneficial development of these rapidly developing technologies.
From self-driving cars and IBM's Watson to chess engines and AlphaGo, there is no shortage of news about machine learning, the field of artificial intelligence that studies how to make computers that can learn. Recently, parallel to these advances, scientists have started to ask how quantum devices and techniques might aid machine learning in the future.
Surgeons have performed the world's first operation inside the eye using a robot. They used the remotely controlled robot to lift a membrane one hundredth of a millimetre thick from the retina at the back of the right eye.
When you have too many robots together, they get so focused on not colliding with each other that they eventually just stop moving. New algorithms are different: they allow any number of robots to move within inches of each other, without colliding, to complete their task - swapping locations on the lab floor.
What can software designers and ICT specialists learn from maggots? Quite a lot, it would appear. Through understanding how complex learning processes in simple organisms work, scientists hope to usher in an era of self-learning robots and predictive computing.
Researchers using liquid crystal elastomer technology have demonstrated a bioinspired micro-robot capable of mimicking caterpillar gaits at natural scale. The 15-millimetre-long soft robot harvests energy from green light and is controlled by a spatially modulated laser beam.