Drawing on years of research into insect vision, researchers at the University of Adelaide are developing a robot whose visual system is modelled on that of insects, combining principles of neuroscience, mechanical engineering and computer science.
Described in a new paper published in the Journal of The Royal Society Interface, the latest research goes on to explain how insights from both insects and humans can be applied in a virtual reality simulation, enabling an artificial intelligence system to ‘pursue’ an object.
Visual systems in general must detect and track objects against complex backgrounds, whether the observer is moving or stationary. University of Adelaide neuroscientist Dr Steven Wiederman (School of Medical Sciences) has shown that flying insects, such as dragonflies, display remarkable visually guided behaviour, including chasing mates or prey even in the presence of distractions such as swarms of other insects.
According to Mechanical Engineering PhD student Zahra Bagheri, insects perform these tasks despite their low visual acuity and tiny brains.
“The dragonfly chases prey at speeds up to 60 km/h, capturing them with a success rate over 97%,” Ms Bagheri says.
In a bid to imitate an insect’s visual system, the team of engineers and neuroscientists at the university developed an unusual algorithm that emulates visual tracking.
“Instead of just trying to keep the target perfectly centred in its field of view, our system locks on to the background and lets the target move against it,” Ms Bagheri says. “This reduces distractions from the background and gives time for the underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep it roughly frontal.”
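The paper describes the full model; purely as an illustration of the idea in that quote, the Python sketch below shows how a tracker might stabilise its gaze against estimated background motion and make only small, saccade-like corrections toward the target, rather than re-centring it on every frame. It is a minimal sketch, not the authors’ published algorithm: the function names, the phase-correlation background estimate, and parameters such as GAZE_GAIN and SACCADE_THRESHOLD are all assumptions made for illustration.

```python
import numpy as np

GAZE_GAIN = 0.2          # fraction of the residual error corrected per saccade (assumed)
SACCADE_THRESHOLD = 5.0  # error magnitude before a gaze shift fires (assumed)

def estimate_background_shift(prev_frame, frame):
    """Estimate global background motion (dx, dy) by phase correlation.

    'Locking on to the background' this way leaves the target as the main
    element still moving in the stabilised view. Frames are 2-D grayscale.
    """
    cross = np.fft.fft2(prev_frame) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = frame.shape
    # Wrap shifts larger than half the frame into negative offsets.
    dx = dx - w if dx > w // 2 else dx
    dy = dy - h if dy > h // 2 else dy
    return np.array([dx, dy], dtype=float)

def update_gaze(gaze, target_pos, background_shift):
    """Counter the background motion first, then saccade toward the target."""
    gaze = gaze + background_shift               # hold the background still
    error = target_pos - gaze                    # residual target offset
    if np.linalg.norm(error) > SACCADE_THRESHOLD:
        gaze = gaze + GAZE_GAIN * error          # small rotation toward target
    return gaze
```

Decoupling stabilisation from pursuit in this way is what the quote means by reducing background distraction: the motion-processing stage sees a quasi-stationary background against which the moving target stands out.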
The researchers have tested their “active vision” system in virtual reality worlds composed of various natural scenes, and found that it performs just as robustly as state-of-the-art engineering target-tracking algorithms while running up to 20 times faster.
“This type of performance can allow for real-time applications using quite simple processors,” says Dr Wiederman, who leads the project and who developed the original motion-sensing mechanism after recording the responses of neurons in the dragonfly brain.
“We are currently transferring the algorithm to a hardware platform, a bio-inspired, autonomous robot.”