Tracking Drones in Urban Settings

While a computer can be trained to visually spot a drone, an optical system would have a very limited range. A telescopic lens could be used, but then its field of view would be greatly constrained. Instead, Krolik is turning to the same technology that turned the tide against aerial enemies in World War II—radar. But the 1940s technology is getting a 2020s upgrade with the help of a type of machine learning called deep neural networks (DNNs).

Teaching Radar Street Smarts
Krolik’s idea is to set up a radar antenna to scan the area of urban landscape under surveillance. Over the course of a few days or weeks, in the absence of drones, the DNN trains itself to differentiate between cars, bicycles, people and other objects by learning their kinematics, seen as “micro-Doppler” in the radar returns, as well as the paths they take moving through the space.

“Most systems are designed in a laboratory to be taken out into the field,” said Krolik. “This one learns from its environment, because most of the time a drone isn’t there.”

For example, cars generally follow paths defined by roadways. And while bicycles and pedestrians have more variable dynamics, their micro-Doppler signatures are very distinctive. Over time, the algorithm learns what radar signals are normal for a given space so that when a drone flies by, with propeller motion and a trajectory very different from what is normally found in the area, it will trip an alarm.
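
The learn-what's-normal-then-flag-deviations idea can be sketched in a few lines. The toy Python example below is purely illustrative — the "Doppler spread" feature, the numbers and the three-sigma threshold are invented for this sketch, not taken from Krolik's system, which uses a DNN rather than a simple statistical model:

```python
import statistics

def fit_normal_model(spreads):
    """Learn the mean and spread of a feature from drone-free observation."""
    return statistics.mean(spreads), statistics.stdev(spreads)

def is_anomalous(spread, mean, std, k=3.0):
    """Flag a return whose feature lies more than k standard deviations
    from what the scene's normal traffic produces."""
    return abs(spread - mean) > k * std

# Days of drone-free returns: cars, bikes, pedestrians (arbitrary units)
normal_spreads = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 1.0, 0.95, 1.15]
mean, std = fit_normal_model(normal_spreads)

print(is_anomalous(1.05, mean, std))  # car-like return -> False
print(is_anomalous(6.0, mean, std))   # propeller-like outlier -> True
```

A real system would extract such features from radar spectrograms and learn a far richer model, but the shape of the logic — fit to the quiet scene, alarm on the outlier — is the same.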

So far, it’s working. On Duke’s campus, the system has been able to successfully classify drones versus cyclists, pedestrians, cars and other objects 98 percent of the time.

To be clear, Krolik and his team aren’t flying drones across campus at all hours of the day and night. Instead, they train the algorithm to learn the normal traffic around the Science Drive parking garage and separately collect data from a drone flying in Duke Forest. They then put the data together computationally and let the DNN go to work on the resulting mashup.

Hardwiring a Neural Network
For help with the drone-spotting DNN algorithm, Krolik turned to Helen Li, the Clare Boothe Luce Professor of Electrical and Computer Engineering at Duke. DNNs essentially work by sliding a window over an image in a grid-like fashion, determining which feature is present in each window, and passing that information on to a new layer of data. The process repeats itself until the image is distilled into its most basic features that allow the program to categorize it.
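
That sliding-window step can be shown in miniature. The sketch below is a toy hand-coded version — in a real DNN the filter values are learned, many filters run per layer, and layers are stacked — but it shows how sweeping a small 2x2 filter across a grid of invented numbers produces a map of where that filter's feature (here, a vertical edge) appears:

```python
def slide_window(image, kernel):
    """Slide the kernel over the image (no padding), taking the
    elementwise product-and-sum at each window position."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            total = sum(image[i + di][j + dj] * kernel[di][dj]
                        for di in range(kh) for dj in range(kw))
            row.append(total)
        out.append(row)
    return out

# A 4x4 "image" with a bright vertical edge down the middle,
# and a filter that responds to left-dark/right-bright transitions
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge_filter = [[-1, 1],
               [-1, 1]]
print(slide_window(image, edge_filter))
# -> [[0, 18, 0], [0, 18, 0], [0, 18, 0]]  (edge found in the middle column)
```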

DNNs are inevitably computationally dense programs that can tie up a traditional CPU for far longer than a real-time drone surveillance system can afford. The algorithm, however, can be sped up by breaking its tasks into pieces that can be processed simultaneously. A common hardware choice for this challenge is the Graphics Processing Unit (GPU), a specialized processor originally designed to accelerate graphics rendering that has also proven useful for machine learning, video editing and gaming.
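
The break-it-into-pieces idea looks like this in miniature. This illustrative sketch splits a workload into chunks and hands them to a pool of workers; the per-chunk math is a made-up stand-in for per-window DNN arithmetic, and a GPU or FPGA does the same thing with thousands of hardware lanes rather than a handful of threads:

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Stand-in for the independent per-piece math (here, squaring)."""
    return [x * x for x in chunk]

def parallel_map(data, n_chunks=4):
    """Split data into chunks, process them concurrently, reassemble.
    Note: Python threads won't actually speed up CPU-bound math (the GIL);
    real gains come from processes or parallel hardware. The structure,
    not the speed, is the point of this sketch."""
    size = (len(data) + n_chunks - 1) // n_chunks
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor() as pool:
        results = pool.map(process_chunk, chunks)
    return [x for chunk in results for x in chunk]

print(parallel_map(list(range(8))))  # -> [0, 1, 4, 9, 16, 25, 36, 49]
```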

But anyone who has ever compiled an hour-long video or lost track of time gaming knows that GPUs produce a lot of heat by consuming a lot of power. To make their drone detection system more efficient, Li instead turned to Field Programmable Gate Arrays (FPGAs).

“While a GPU is super powerful, it’s also wasteful,” said Li. “We can instead make an application-specific design that is just right for radar signal processing.”

As the name implies, FPGAs can be designed and redesigned to process certain tasks more efficiently by hardwiring some of the computation into the device itself. This allows computer scientists to be surgical with how much computational power to provide each aspect of the algorithm.
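
One concrete example of that surgical tailoring is numeric precision: on an FPGA, each part of a network can be wired to use only as many bits as it needs, where a fixed GPU datapath offers a few one-size-fits-all formats. The sketch below is illustrative only — it is not Li's design — showing how weights might be quantized to a chosen bit width:

```python
def quantize(weights, bits):
    """Snap floats in [-1, 1] onto a signed fixed-point grid with the
    given bit width; fewer bits means coarser values but cheaper hardware."""
    levels = 2 ** (bits - 1) - 1  # e.g. 7 steps per side for 4 bits
    return [round(w * levels) / levels for w in weights]

weights = [0.91, -0.42, 0.07, -0.88]
print(quantize(weights, 4))  # coarse 4-bit version
print(quantize(weights, 8))  # finer 8-bit version
```

Layers that tolerate coarse weights get small, fast arithmetic units; layers that need precision get wider ones — the kind of per-layer tailoring the quote below describes.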

“An FPGA can be optimized for a specific neural network model without having to support any other models in different configurations and sizes,” continues Li, who helped start the trend of using FPGAs for machine learning applications. “And where typical codes first have to go through an operating system and compilers before reaching the hardware, our approach essentially implements the DNN algorithm directly on the FPGA boards.”

Setting the Bar High
The result is a system that not only spots drones with 98 percent accuracy, but also consumes 100 times less energy than a comparable GPU-based system, all while maintaining the performance and speed required to work in real time.

Krolik and Li think the results so far are promising, and DARPA thinks so too. After the team completed the first half-million-dollar phase of the project and presented its results, DARPA funded a second half-million-dollar, nine-month phase. Their challenge over that extended period of time?

Birds.

“As it turns out, when you’re only looking at the speed and bearing of a flying object, a bird can look a lot like a drone,” said Krolik. “With the help of staff at the Duke Gardens, we’ve been collecting radar data on a wide variety of birds around the garden’s duck pond. So far, our DNN algorithm has been able to differentiate birds from drones with over 97 percent accuracy. Now we have to put it all together to detect drones versus birds, cars and pedestrians in a truly urban setting. It’s been a lot of fun working with Helen and the rest of the team, and we have the rest of the summer to figure it out.”