Posted: Oct 29, 2013
Bees' perfect landing inspires robot aircraft
(Nanowerk News) Scientists at The University of Queensland (UQ) have discovered how the honeybee can land anywhere with utmost precision and grace – and the knowledge may soon help build incredible robot aircraft ("A universal strategy for visually guided landing").
By sensing how rapidly their destination ‘zooms in' as they fly towards it, honeybees can control their flight speed in time for a perfect touchdown without needing to know how fast they're flying or how far away the destination is.
This discovery may advance the design of cheaper, lighter robot aircraft that only need a video camera to land safely on surfaces of any orientation, says Professor Mandyam Srinivasan at UQ's Queensland Brain Institute.
“Orchestrating a safe landing is one of the greatest challenges for flying animals and airborne vehicles,” says Professor Srinivasan.
“To achieve a smooth landing, it's essential to slow down in time for the speed to be close to zero at the time of touchdown.”
Humans can judge their distance from an object using stereo vision, because their two eyes, separated by about 65 mm, capture slightly different views of it. Insects, however, have such close-set eyes that they can't do the same thing, explains Professor Srinivasan.
“So in order to land on the ground, they use their eyes to sense the speed of the image of the ground beneath them,” he says.
“By keeping the speed of this image constant, they slow down automatically as they approach the ground, stopping just in time for touchdown.
“However, in the natural world, bees would only occasionally land on flat, horizontal surfaces. So it's important to know how they land on rough terrain, ridges, vertical surfaces or flowers with the same delicacy and grace.”
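The ground-landing rule Professor Srinivasan describes lends itself to a short numerical sketch. The simulation below is illustrative only — the starting height, glide angle and image-speed setpoint are assumed values, not figures from the study. The flier holds the apparent image speed of the ground, v/h, at a constant ω, which forces its forward speed to shrink in proportion to its height:

```python
import math

# Toy simulation of the "keep ground-image speed constant" landing rule.
# All numbers (height, glide angle, image-speed setpoint) are illustrative
# assumptions, not values from the study.

def grazing_landing(h0=3.0, omega=1.0, glide_deg=30.0, dt=0.01):
    """Fly toward the ground at a fixed glide angle while holding the
    apparent image speed of the ground, v/h, at the constant omega.
    Forward speed then falls in proportion to height."""
    sin_g = math.sin(math.radians(glide_deg))
    h, t = h0, 0.0
    v = omega * h
    while h > 0.01:               # stop just above touchdown
        v = omega * h             # the rule: speed proportional to height
        h -= v * sin_g * dt       # height lost along the glide path
        t += dt
    return t, v

t, v_final = grazing_landing()
print(f"touchdown after {t:.1f} s at forward speed {v_final:.3f} m/s")
```

Because speed stays proportional to height, braking happens automatically: no altimeter, rangefinder or airspeed sensor is needed, only the visually measured image speed.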
In the study, researchers trained honeybees to land on discs that were placed vertically, and filmed them with high-speed video cameras.
“The discs carried spiral patterns that could be rotated at various speeds by a motor,” says Professor Srinivasan.
“When we spun the spiral to make it appear to expand, the bees ‘hit the brakes' because they thought they were approaching the board much faster than they really were.
“When we spun the spiral the other way to make it appear to contract, the bees sped up, sometimes crashing into the disc. This shows that landing bees keep track of how rapidly the image ‘zooms in', and they adjust their flight speed to keep this ‘zooming rate' constant.
“Imagine you're in space and you don't know how far away you are from a star,” Professor Srinivasan says.
“As you fly towards it, the other stars ‘move away' and it becomes the focus. Then when the star starts to ‘zoom in' faster than the regular rate, you'll slow down to keep the ‘zooming rate' constant.
“It's the same for bees – when they're about to reach a flower, the image of the flower will expand faster than usual. This causes them to slow down more and more as they get closer, eventually stopping when they reach it.”
The researchers also developed a mathematical model for guiding landings, based on the bees' landing strategy.
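The article doesn't reproduce the model itself, but the relation it builds on can be stated simply: if the relative expansion rate of the image, v/d, is held at a constant k, the distance to the surface decays exponentially and the approach speed reaches zero together with it. A quick numerical illustration (k and the starting distance are assumed values, not from the study):

```python
import math

k = 1.5    # constant relative expansion rate, 1/s (assumed value)
d0 = 5.0   # initial distance to the surface, m (assumed value)

def distance(t):
    # Holding v/d = k constant means d'(t) = -k * d(t),
    # whose solution is exponential decay of the distance.
    return d0 * math.exp(-k * t)

def speed(t):
    # Speed is always k times the remaining distance,
    # so it dies away in step with the distance itself.
    return k * distance(t)

for t in (0.0, 1.0, 2.0, 3.0):
    print(f"t={t:.0f} s  distance={distance(t):.3f} m  speed={speed(t):.3f} m/s")
```

Notably, neither distance nor speed ever has to be measured on its own: their ratio, the expansion rate, is exactly what the camera image provides.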
Professor Srinivasan says that, unlike current engineering-based methods, this visually guided technique requires no knowledge of the distance to the surface or the speed at which it is approached.
“The problem with current robot aircraft technology is that they need radar, sonar or laser beams to work out how far away the surface is,” Professor Srinivasan says.
“Not only is the equipment expensive and cumbersome, using active radiation can also give the aircraft away.
“On the other hand, this vision-based system only requires a simple video camera that can be found in smartphones. The camera, by ‘seeing' how rapidly the image expands, allows the aircraft to land smoothly and undetected on a wide range of surfaces with the precision of a honeybee.”