Multi-touch control search-and-rescue robot swarms

Published 21 September 2010

The new Dream controller for Microsoft Surface could help speed up search-and-rescue operations; when disaster strikes, search-and-rescue teams must quickly gather and assimilate the data needed to find survivors; a team of robots can help scout for people trapped in rubble or map the landscape; first responders, though, need ways to control those robots and process incoming information quickly

Robot swarm ready to roll // Source: onemorepromethean.com

The Dream controller (short for Dynamically Resizing, Ergonomic, and Multi-touch) is a new program for Microsoft’s Surface touch screen that lets one or more people take control of a single robot or a swarm of them. The software offers new ways for users to manage search-and-rescue robots through a touch-screen interface: several virtual robot controllers are automatically integrated with a view of a virtual map.

Kristina Grifantini writes in Technology Review that when disaster strikes, search-and-rescue teams must quickly gather and assimilate the data needed to find survivors. A team of robots can help scout for people trapped in rubble or create new maps of the landscape. First responders, though, need ways to control those robots and process incoming information quickly.

Robots are usually controlled using a physical device, like a joystick or a games-console-type controller. Mark Micire, a researcher at the University of Massachusetts Lowell and a member of the Massachusetts FEMA team, who deployed search-and-rescue robots at the World Trade Center after 9/11, built the Dream system to help first responders. Because responders no longer need a separate device to control each of several robots while referring to a physical map, he says, they can maneuver more quickly.

“Right now, the state of practice is to use paper maps, with everyone gathered around,” says Holly Yanco, professor and head of the Robotics Lab at UMass Lowell. “We’ve designed the multi-touch application to replace these maps with interactive ones.” This allows live data — satellite imagery, sensor data, and video or photography from people, vehicles, or robots — to be used nearly instantaneously.

“With our design, a person can select the robot, then place his or her hands down to form the Dream controller to directly drive the robot and see the robot’s-eye video,” says Yanco. “Once done, the controller disappears.”

Grifantini notes that each controller sizes itself to a person’s hand span and finger size, based on the contact points made with the screen — so very large or very small hands can control it just as easily. “We aren’t aware of any other multi-touch controllers that conform to the placement and size of a person’s hand,” says Yanco. The Dream controller’s hand and finger registration algorithm, which has a patent pending, is faster and more accurate than other approaches, according to Yanco.
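The patent-pending registration algorithm itself has not been published, but the general idea described above — sizing a virtual controller from wherever a hand's fingertips land on the screen — can be sketched in a few lines. The function below is a hypothetical illustration, not the Dream controller's actual method: it centers a controller on the centroid of the touch contacts and sizes it to enclose every fingertip, so it conforms to large and small hands alike.

```python
import math

def fit_controller(contacts, padding=20.0):
    """Size a virtual on-screen controller from a hand's touch contacts.

    contacts: list of (x, y) fingertip positions in screen coordinates.
    padding: extra margin (hypothetical, in pixels) around the fingertips.
    Returns ((cx, cy), radius): a controller centered on the hand's
    centroid, large enough to enclose every contact point.
    """
    if len(contacts) < 2:
        raise ValueError("need at least two contact points to size a controller")
    # Centroid of the fingertips becomes the controller's center.
    cx = sum(x for x, _ in contacts) / len(contacts)
    cy = sum(y for _, y in contacts) / len(contacts)
    # Radius grows with hand span: farthest fingertip plus a margin.
    radius = max(math.hypot(x - cx, y - cy) for x, y in contacts) + padding
    return (cx, cy), radius
```

Because the radius is derived from the spread of the contacts themselves, a child's hand and an adult's hand automatically produce differently sized controllers — the "dynamically resizing" behavior the name refers to.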