Butterfly-inspired AI technology takes flight

(Nanowerk News) When it comes to mating, two things matter for Heliconius butterflies: the look and the smell of their potential partner. The black and orange butterflies have incredibly small brains, yet they must process both sensory inputs at the same time, which is more than current artificial intelligence (AI) technologies can achieve without significant energy consumption. To make AI as smart as the butterflies, a team of Penn State researchers has created a multi-sensory AI platform that is more advanced than other AI technologies and uses less energy.
Current AI technologies often fall short in mimicking the multi-sensory decision-making processes that humans and animals use, the researchers said. This can limit AI’s potential for use in robotics and in smart sensors that detect dangers like faulty structures or imminent chemical leaks.
“If you think about the AI we have today, we have very good image processors based on vision, or excellent language processors that use audio,” said Saptarshi Das, associate professor of engineering science and mechanics and corresponding author of the study published today in Advanced Materials ("A Butterfly-Inspired Multisensory Neuromorphic Platform for Integration of Visual and Chemical Cues"). “But when you think about most animals and also human beings, decision-making is based on more than one sense. While AI performs quite well with a single sensory input, multi-sensory decision-making is not happening with the current AI.”
Heliconius butterflies choose a mate using two simultaneous cues: a visual cue, confirming that the potential mate’s wing pattern is indeed that of a Heliconius butterfly, and a chemical cue, the pheromones released by the other butterfly. Of note, Das said, the butterfly manages this with a tiny brain that uses minimal energy, in direct contrast to modern computing, which consumes a significant amount of energy.
“Butterflies and many other animal brains are very tiny, and they use low amounts of resources, both in terms of energy used and physical size of the brain,” Das said. “And yet they perform computational tasks that rely on multiple sensory inputs at once.”
To mimic this behavior electronically, the researchers turned to 2D materials, which are one to a few atoms thick. They developed a hardware platform made of two 2D materials, molybdenum disulfide (MoS2) and graphene. The MoS2 portion of the platform is a memtransistor, an electronic device that can both store memory and process information. The researchers chose MoS2 for its light-sensing capabilities, which mimic the visual capabilities of the butterfly. The graphene portion of the device is a chemitransistor that can detect chemical molecules, mimicking the butterfly’s pheromone detection.
"The visual cue and the pheromone chemical cue drive the decision whether that female butterfly will mate with the male butterfly or not,” said co-author Subir Ghosh, second year doctoral student in engineering science and mechanics. “So, we got an idea inspired by that, thinking how we have 2D materials with those capabilities. The photoresponsive MoS2 and the chemically active graphene could be combined to create a visuochemical-integrated platform for AI and neuromorphic computing.”
The researchers tested their device by exposing the dual-material sensor to different colored lights, mimicking the visual cues, and applying solutions with varying chemical compositions, resembling the pheromones released by butterflies. The goal was to see how well the sensor could integrate information from both the photodetector and the chemisensor, similar to how a butterfly's mating success relies on matching wing color and pheromone strength.
By measuring the output response, the researchers determined that their devices could seamlessly integrate visual and chemical cues. This highlights the potential for their sensor to process and interpret diverse types of information simultaneously, they said.
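A characterization like that amounts to sweeping both stimuli and tabulating the combined output. The sketch below mimics that protocol with an assumed response model; the stimulus values and functional form are illustrative, not data from the paper.

```python
# Illustrative stimulus sweep mimicking the reported test protocol:
# vary light level (visual cue) and chemical concentration (pheromone cue)
# and record the combined response. The model is a toy stand-in.
light_levels = [0.0, 0.5, 1.0]  # normalized light intensity (visual cue)
chem_levels = [0.0, 0.5, 1.0]   # normalized concentration (chemical cue)

def toy_response(light: float, chem: float) -> float:
    # Saturating contribution from each channel (assumed functional form).
    return light / (light + 0.4) + chem / (chem + 0.4)

# Prints a small grid; the response peaks only when both cues are strong.
for light in light_levels:
    row = "  ".join(f"{toy_response(light, c):.2f}" for c in chem_levels)
    print(f"light={light:.1f}:  {row}")
```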
“We also introduced adaptability in our sensor's circuits, such that one cue could play a more significant role than the other,” said Yikai Zheng, a fourth-year doctoral student in engineering science and mechanics and co-author of the study. “This adaptability is akin to how a female butterfly adjusts her mating behavior in response to varying scenarios in the wild."
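A minimal sketch of that kind of adaptability is to give each cue an adjustable weight before the combination is thresholded; the weights and threshold below are invented stand-ins for the circuit behavior the researchers describe.

```python
# Hypothetical weighted cue integration: tuning the weights lets one cue
# dominate the decision, loosely analogous to the adaptive circuit behavior
# described by the researchers. All values are illustrative only.
def weighted_decision(visual: float, chemical: float,
                      w_visual: float = 0.7, w_chemical: float = 0.3,
                      threshold: float = 0.5) -> bool:
    """Combine normalized cue strengths (0-1) with adjustable weights."""
    return w_visual * visual + w_chemical * chemical > threshold

# With the visual cue weighted more heavily, a strong wing-pattern match
# outweighs a weak pheromone signal, but not the other way around.
print(weighted_decision(visual=0.9, chemical=0.1))  # True  (0.66 > 0.5)
print(weighted_decision(visual=0.1, chemical=0.9))  # False (0.34 < 0.5)
```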
Dual sensing in a single device is also more energy efficient than the way current AI systems operate, the researchers said: existing systems collect data from separate sensor modules and then shuttle it to a processing module, which can cause delays and excessive energy consumption.
Next, the researchers said they plan to expand their device from integrating two senses to three, mimicking how a crayfish uses visual, tactile and chemical cues to sense prey and predators. The goal is to develop hardware AI devices capable of handling complex decision-making scenarios in diverse environments.
“We could have sensor systems in places such as a power plant that would detect potential issues such as leaks or failing systems based on multiple sensory cues: a chemical odor, a change in vibration, or visually detected weaknesses,” Ghosh said. “This would then better help the system and staff determine what they need to do to fix it quickly, because it would rely not just on one sense, but on multiple ones.”
Source: By Jamie Oberdick, Penn State Materials Research Institute (Note: Content may be edited for style and length)