UAV update: Bird-like visual sense to help UAVs navigate in urban environments

Published 21 April 2011

The U.S. Office of Naval Research (ONR) has awarded researchers $4.5 million to develop a bird-sized, self-flying plane that could navigate through both forests and urban environments; the plane would be about the size of a crow and, like a bird, would use vision to navigate, but it would use orientable propellers rather than flap its wings; the drone will rely, in part, on Convolutional Networks, a technology that emulates the visual system of animals by mimicking the neural network of the mammalian visual cortex and can be trained quickly to interpret the world around it

Predicted steering and layers of convolutional networks // Source: nyu.edu

New York University’s Courant Institute of Mathematical Sciences has received a grant from the U.S. Office of Naval Research (ONR) to develop a bird-sized, self-flying plane that could navigate through both forests and urban environments.

The Courant Institute shares the $4.5 million, 5-year grant with MIT, Carnegie Mellon University (CMU), and Harvard University.

“The plane would be about the size of a crow, and, like a bird, would use vision to navigate, but it would use orientable propellers and not flap its wings,” explained Yann LeCun, a professor at NYU’s Courant Institute.

The work will rely, in part, on Convolutional Networks, a technology that emulates the visual system of animals: it mimics the neural network of the mammalian visual cortex and can be trained quickly to interpret the world around it. The vision system will run on a new type of computer chip that uses a “dataflow” architecture. Dubbed NeuFlow, the new chip will enable Convolutional Networks and other computer perception algorithms to run on very small and lightweight devices hundreds of times faster than a conventional computer.
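To give a sense of what a Convolutional Network's basic building block does, the sketch below implements a single convolution-ReLU-pooling stage in plain NumPy. This is purely illustrative and is not the project's code: the kernel is a hand-set edge detector, whereas in a trained Convolutional Network the kernel values are learned from data.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: slide the kernel over the image."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

def relu(x):
    """Nonlinearity: keep positive responses, zero out the rest."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the max over non-overlapping size x size blocks."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy input: an 8x8 image, dark on the left half, bright on the right.
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# Hand-set kernel that responds to a left-to-right brightness increase.
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map.shape)  # (3, 3): a smaller map of edge responses
```

Stacking several such stages, each with many learned kernels, yields the hierarchy of feature detectors that mirrors how the visual cortex builds object representations from simple edge responses.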

“The NeuFlow hardware is a key element of this project, as it is the only vision architecture that is powerful enough and compact enough to do the job,” said LeCun, who is collaborating with Yale University researcher Eugenio Culurciello and his team on the NeuFlow project.

The ONR grant brings together researchers from diverse fields that include machine learning, computer vision, planning and control, aerodynamics, computational neuroscience, and the study of bird flight. Besides LeCun, team members include: J. Andrew Bagnell (CMU), Andrew Biewener (Harvard), Emilio Frazzoli (MIT), William Freeman (MIT), Martial Hebert (CMU), David Lentink (Wageningen University), and Russ Tedrake (MIT).

An NYU release notes that under a previously awarded National Science Foundation (NSF) grant, LeCun and his colleagues at Stanford University, MIT, and the University of California, Berkeley are working to develop new computational models of how the visual system learns to recognize objects. The project’s researchers hope to uncover new mechanisms that could explain the learning process in neural circuits.