Researchers from the University of South Australia are creating new technology based on the extraordinary vision of insects.
Dragonflies possess enviable visual processing skills. They can hover in the air under very tight control while waiting for potential mates, prey or predators.
Using their nearly 360-degree eyesight, they can discern targets against cluttered backgrounds and then take the appropriate action.
Over the past eight years, the team has replicated the visual functionality of these insects and is using it as a basis to improve detection systems in cameras.
The team’s bio-inspired research, led by the University’s Dr Russell Brinkworth (a neuroscientist, mechatronic engineer and robotics expert) and Professor Anthony Finn (Director of the Defence and Systems Institute), has a range of applications.
These applications include:
- Developing bionic eyes
- Improving the navigation systems of driverless cars
- Spotting drones in complicated environments
- Scanning forests to capture detailed information about individual trees
- Improving facial recognition techniques
- Monitoring wildlife in densely camouflaged areas
By replicating the dragonfly’s visual algorithms in a computer model, the researchers are building sensor systems that can find objects across a wide range of settings, something computers currently do poorly.
The scenes may be very bright or very dark, have high or low contrast, and lie in complex, cluttered landscapes.
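One reason insect vision copes so well with scenes that are very bright or very dark is that photoreceptors continuously adapt their gain to the local light level. The following is a minimal sketch of that idea using a simple first-order running-mean model; it is an illustrative assumption, not the team’s actual algorithm, and the `tau` constant and `adapt` function are hypothetical.

```python
# Photoreceptor-style luminance adaptation (illustrative sketch only):
# each sample is divided by a leaky running estimate of recent brightness,
# so the output encodes relative contrast rather than absolute intensity.

def adapt(signal, tau=10.0, eps=1e-6):
    """Divide each sample by a leaky running mean of past samples."""
    alpha = 1.0 / tau              # adaptation rate (hypothetical constant)
    mean = signal[0] if signal else 0.0
    out = []
    for x in signal:
        mean += alpha * (x - mean)    # track the prevailing light level
        out.append(x / (mean + eps))  # Weber-like contrast signal
    return out

# A bright scene and a dim scene with the same relative contrast
bright = [100.0, 110.0, 100.0, 90.0, 100.0]
dim    = [1.0, 1.1, 1.0, 0.9, 1.0]
print(adapt(bright))
print(adapt(dim))
```

Because the division normalises away the absolute light level, the bright and dim sequences produce nearly identical outputs, which is the property that lets such a front end operate across extreme lighting conditions.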
Fatal accidents involving driverless cars show how much progress still needs to be made in visual processing: current camera systems struggle to cope with transitions between light and dark and to distinguish one object from another.
Like humans, other animals and other insects, dragonflies can adjust to both dark and light surroundings; they also have superior tracking and detection skills. All of these visual processes can be mapped to help build systems that operate in complex environments.
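A toy illustration of why temporal processing helps with detection in clutter: a small moving target that is invisible in any single busy frame stands out once consecutive frames are compared, because static background clutter cancels. This frame-differencing sketch is my own simplification, not the researchers’ model, and `moving_pixels` is a hypothetical helper.

```python
# Frame differencing (illustrative sketch): subtract consecutive frames so
# that static clutter cancels and only changed pixels remain.

def moving_pixels(prev, curr, thresh=5):
    """Return (row, col) coordinates where brightness changed by more than `thresh`."""
    return [
        (r, c)
        for r, row in enumerate(curr)
        for c, v in enumerate(row)
        if abs(v - prev[r][c]) > thresh
    ]

# Cluttered 3x3 background; only the centre pixel changes between frames.
frame1 = [[50, 200, 80], [120, 30, 90], [60, 170, 40]]
frame2 = [[50, 200, 80], [120, 230, 90], [60, 170, 40]]
print(moving_pixels(frame1, frame2))  # → [(1, 1)]
```

Real insect-inspired detectors are far more sophisticated (they combine adaptation, spatial filtering and temporal correlation), but the principle of exploiting change over time is the same.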
The researchers will take the algorithms that insects use and modify them to suit whatever purpose they are needed for, such as improving security camera footage or facial recognition.
The same biologically-inspired algorithms can also be applied to sound, making it easier to listen for objects in noisy environments.
This includes tracking small, quiet, slow-moving targets such as drones by both their visual and acoustic signatures.
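The acoustic side can be pictured with a simple sliding-energy detector: a faint tone buried in noise raises the short-term signal energy above the noise floor. This sliding-RMS sketch is an assumption for illustration only; the article does not describe the team’s actual acoustic processing, and the `sliding_rms` function and its parameters are hypothetical.

```python
# Sliding-RMS energy detection (illustrative sketch): a quiet periodic
# 'drone' tone entering a noisy recording lifts the short-term energy.
import math
import random

def sliding_rms(samples, window=50):
    """Short-term RMS energy over a sliding window."""
    out = []
    for i in range(len(samples) - window):
        chunk = samples[i:i + window]
        out.append(math.sqrt(sum(s * s for s in chunk) / window))
    return out

random.seed(0)
noise = [random.gauss(0, 0.1) for _ in range(400)]
# A faint tone enters halfway through the recording.
signal = [
    n + (0.5 * math.sin(0.3 * i) if i >= 200 else 0.0)
    for i, n in enumerate(noise)
]
rms = sliding_rms(signal)
print(max(rms[:100]), max(rms[250:]))  # energy rises once the tone is present
```

In practice an acoustic detector would work on frequency-domain signatures rather than raw energy, but the same contrast between a learned background and a new source is what makes quiet targets detectable.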
Meanwhile, the same researchers are leading a University project to help combat the growing global threat posed by IED-carrying drones.
Improvised explosive devices are among the deadliest weapons in modern warfare, killing or injuring more than 3000 soldiers in Afghanistan in 2017.
This weaponisation of drones by terror groups has led the Defence Science and Technology (DST) Group to invite researchers and experts from industry and academia to come up with technological solutions.
Using algorithms inspired by insect neurology and physiology, the research team has developed electro-optic, infra-red and acoustic sensor technologies that can detect remotely-piloted aircraft at impressive distances.
They achieved this by taking the hoverfly model beyond biology and simulation and implementing it on embedded computers.
These small, portable systems process images and data at around 100 frames a second, identifying targets in very complex settings in real time.
The drone detection project is expected to be completed by the end of 2020.