August 7, 2020
A&A's Autonomous Flight Systems Lab builds a drone-based machine learning dataset to find those lost in the wilderness.
Training a drone to save lives
Jason Reinfeld, Chief of Special Operations in the Chelan County Sheriff’s Office, reports that his office led 41 search and rescue missions to locate missing people in the county wilderness last year alone, and that was a relatively “light” year. These searches involve volunteers, some out overnight, deputies earning overtime, and helicopters operated at $525 per hour. While the costs of these missions vary widely, a typical search costs the department about $9,000. These missions strain local law enforcement budgets and consume critical time, sometimes too much time, to locate a person and bring them home safely.
Reinfeld knows that drones can help his office save lives in his county, especially in extremely rugged areas like The Enchantments Wilderness, where rough terrain can easily hide lost or injured hikers. At its most basic, a drone can scan the landscape faster than searchers on foot. Reinfeld reports that their use has been limited, however, by the vast areas that might need to be searched on any particular mission. He cites battery life, stability and the difficulty of scanning large areas as drawbacks. But researchers in A&A’s Autonomous Flight Systems Lab (AFSL) know that their drones can do much better than that.
AFSL’s Chris Hayner, a UW physics student pursuing an A&A minor, explains, “Human eyes have limitations. It’s hard to know what you are looking at from drone footage, especially if the terrain is dynamic, and humans get tired and less effective the longer they are studying footage. We use object detection through machine learning to act as the eyes that never blink or tire.”
Building the machine learning dataset
Object detection is a core area of computer vision concerned with locating and distinguishing objects in digital images. One family of object detection methods uses machine learning, in which a statistical model is trained to detect or classify predefined features, in this case, a human form. The AFSL students, led by Director and A&A Professor Emeritus Juris Vagners, are working with a subdivision of machine learning called Deep Convolutional Neural Networks (DCNN), in which the model learns to identify features from a large set of processed images.
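The building block of any convolutional network is the convolution itself: a small filter slides across the image and responds most strongly where the pattern it encodes appears. As a minimal sketch (not the AFSL system, just the underlying operation), the toy example below scans a tiny synthetic "aerial image" for a bright patch standing in for a person:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core operation of a convolutional layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 8x8 "aerial image": mostly empty terrain, with a bright 2x2
# patch at rows/cols 3-4 standing in for a person seen from above.
image = np.zeros((8, 8))
image[3:5, 3:5] = 1.0

# A 2x2 averaging kernel responds most strongly where the patch sits.
kernel = np.ones((2, 2)) / 4.0
response = conv2d(image, kernel)
peak = tuple(int(v) for v in np.unravel_index(np.argmax(response), response.shape))
print(peak)  # (3, 3): the filter's strongest response is at the patch
```

A real DCNN stacks many such filters, learning their weights from labeled images rather than fixing them by hand, but the sliding-filter idea is the same.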
Hayner says, “One of the challenges of training the DCNN model is that we don’t have a large set of images to train the model on. We are training it from scratch with photos we are taking. There are plenty of photo datasets of humans, but none of humans from a drone’s point of view in wilderness environments.”
Until the coronavirus pandemic shut fieldwork down, these training sessions involved several Saturdays on which about ten UW students headed out to various wilderness sites in Washington State where the lab has permission to fly drones. The students scattered in all directions to be captured in drone footage taken from above. Researchers use the images from this high-tech game of “hide and seek” to train their program to identify humans in the wilderness.
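Before any training run, an image collection like this is typically shuffled and split so the model is evaluated on images it never saw during training. A minimal sketch, with hypothetical filenames standing in for the drone-captured photos:

```python
import random

def split_dataset(filenames, val_fraction=0.2, seed=0):
    """Shuffle image filenames and split them into training and validation sets."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    files = list(filenames)
    rng.shuffle(files)
    n_val = int(len(files) * val_fraction)
    return files[n_val:], files[:n_val]

# Hypothetical filenames; the real dataset is about 29,000 drone images.
images = [f"flight_{i:05d}.jpg" for i in range(29000)]
train, val = split_dataset(images)
print(len(train), len(val))  # 23200 5800
```

Holding out a validation set like this is what makes a reported accuracy figure meaningful: it measures the model on unseen images rather than ones it memorized.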
Right now, AFSL has a dataset of about 29,000 pictures, taken exclusively by UW students from the lab’s drones in these hide-and-seek games. Models trained on this processed dataset currently find humans in the wilderness with about 92 percent accuracy, with even better results when combined with thermal data. With more photos and training by UW students, this accuracy will continue to go up.
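An accuracy figure like this is, at its simplest, the fraction of labeled images on which the detector's person/no-person call matches the ground truth. The toy numbers below are invented purely to illustrate the arithmetic, not taken from the AFSL results:

```python
def detection_accuracy(labels, predictions):
    """Fraction of images where the detector's call matches the ground-truth label."""
    correct = sum(l == p for l, p in zip(labels, predictions))
    return correct / len(labels)

# Toy ground truth and model outputs (1 = person present, 0 = absent).
# 25 images with 2 mistakes: one missed person, one false alarm.
labels      = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0]
predictions = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0]

print(detection_accuracy(labels, predictions))  # 0.92
```

In a search-and-rescue setting the two error types matter very differently: a false alarm costs a few minutes of an operator's attention, while a missed person could cost a life, which is one reason combining camera and thermal data to catch more true positives is valuable.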
Interdisciplinary efforts for better results
Seeding the model for this DCNN method takes an interdisciplinary team. Echo Liu, an AMATH CFRM graduate student bringing expertise in statistical modeling to the project, says, “Images contain very high-dimensional data, and many statistical models suffer from this limitation. As we develop this DCNN, the model is able to learn various features of objects of interest from our image data. Ultimately, with practice, the model will get more accurate.”
Practically speaking, the system is meant to work alongside human operators: the software flags candidate objects in the live feed for operators to review, dramatically reducing the time and resources needed to locate a lost person.
From Chelan County, Reinfeld is looking forward to seeing how this research works out: “Decreasing the time it takes to locate someone is critical. Using an enhanced drone system instead of a helicopter would help us deploy faster, reach people faster and manage our financial resources better.”
This AFSL project is supported by funds from JCATI, the Joint Center for Aerospace Technology Innovation. This initiative stimulates economic development and job creation in Washington State by funding collaborations between the aerospace industry and academic researchers at the State's public 4-year institutions of higher education. For more information, visit jcati.org. Additional partners include Applewhite Aero, Hood Technology, and the Chelan County Sheriff's Office.