Search & rescue

Autonomous drones can help search and rescue after disasters

By Vijayan Asari

Published 4 March 2019


Autonomous drones will choose search locations on their own

When disasters happen – whether a natural disaster like a flood or earthquake, or a human-caused one like a mass shooting or bombing – it can be extremely dangerous to send first responders in, even though there are people who badly need help.

Drones are useful, and are helping in the recovery after the deadly Alabama tornadoes, but most require individual pilots, who fly the unmanned aircraft by remote control. That limits how quickly rescuers can view an entire affected area, and can delay actual aid from reaching victims.

Autonomous drones could cover more ground more quickly, but they would be more effective only if they could identify people in need on their own. At the University of Dayton Vision Lab, we are working on developing systems that can help spot people or animals – especially ones who might be trapped by fallen debris. Our technology mimics the behavior of a human rescuer: it looks briefly at wide areas, then quickly chooses specific regions to examine more closely.
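The wide-scan-then-focus behavior described above can be sketched in a few lines of code. This is a hypothetical illustration, not the Vision Lab's actual method: it tiles an image, scores each tile with a crude variance-based measure of visual clutter, and returns the highest-scoring tiles as candidates for closer inspection. The function name and the variance score are illustrative assumptions.

```python
# Hypothetical coarse-to-fine search: scan a wide image as tiles,
# score each tile, and pick the most "notable" regions to focus on.
import numpy as np

def select_focus_regions(image, tile=64, top_k=3):
    """Split a grayscale image into tiles and return the top_k tiles
    (as (row, col) tile indices) with the highest intensity variance,
    a simple stand-in for regions worth a closer look."""
    h, w = image.shape
    rows, cols = h // tile, w // tile
    scores = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = image[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
            scores[r, c] = patch.var()  # high-contrast debris scores high
    # Indices of the top_k highest-variance tiles, best first
    flat = np.argsort(scores, axis=None)[::-1][:top_k]
    return [tuple(int(v) for v in np.unravel_index(i, scores.shape))
            for i in flat]

# Example: a mostly flat scene with one high-contrast patch
scene = np.zeros((256, 256))
scene[128:160, 64:96] = np.random.default_rng(0).random((32, 32))
print(select_focus_regions(scene, tile=64, top_k=1))  # → [(2, 1)]
```

A real system would replace the variance score with a learned detector, but the control flow – cheap scoring everywhere, expensive analysis only where scores are high – is the same idea.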

Looking for an object in a chaotic scene

Disaster areas are often cluttered with downed trees, collapsed buildings, torn-up roads and other disarray that can make spotting victims in need of rescue very difficult.

My research team has developed an artificial neural network system that can run in a computer onboard a drone. This system can emulate some of the excellent ways human vision works. It analyzes images captured by the drone’s camera and communicates notable findings to human supervisors.

First, our system processes the images to improve their clarity. Just as humans squint to adjust their focus, our technology estimates the darker regions of a scene and computationally lightens the image. When an image is too hazy or foggy, the system recognizes that it is too bright and reduces the whiteness of the image so the actual scene shows through more clearly.