The point of using robots in search and rescue missions is to send them where it’s too dangerous for humans to go, something Mark Micire knows very well. Micire has worked with FEMA as a technical specialist and was with the group that used robots to search for people in the rubble left after 9/11.
Micire also holds a PhD in computer science and robotics from the University of Massachusetts Lowell, where he participated in research on using multi-touch surfaces to control robots, technology that could one day prove incredibly useful for the search and rescue missions Micire has dedicated so much of his life to.
As a grad student, Micire "was really excited about multi-touch," says his former adviser Holly Yanco, a computer science professor at Lowell and founder of the university's robotics lab. With a DiamondTouch table donated by Mitsubishi Electric Research Laboratories, they first experimented with multi-touch as a map display before moving on to developing a multi-touch method of robot control.
The first thing Yanco, Micire and the rest of their team tried was to create a multi-touch version of an existing joystick that had been designed with search and rescue robots in mind.
In trial runs with the joystick widget on the multi-touch screen, they found that it wasn't intuitive. Users reacted to it in many different ways, and without the tactile feedback of a physical joystick, they kept having to look down and reorient themselves. "People were using things in a way we hadn't intended, and we started to, at that point, really see the complexities within multi-touch control," Yanco says.
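The article doesn't describe the widget's internals, but the basic mapping is easy to sketch: a fixed on-screen joystick translates a touch point's offset from the widget's center into drive commands. The Python below is purely an illustration of that idea, not the Lowell team's code; every name, limit, and mapping in it is an assumption. It also hints at why the missing tactile feedback hurt: a finger that drifts past the widget's edge gets no physical cue, so the only remedy is to look down.

```python
import math

def joystick_to_velocity(touch, center, radius, max_lin=0.5, max_ang=1.0):
    """Map a touch on a virtual joystick widget to (linear, angular) velocity.

    Illustrative sketch only: the drive mapping, limits, and clamping
    behavior are assumptions, not the published design.
    """
    dx = touch[0] - center[0]
    dy = touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist > radius:
        # A finger that drifts off the widget is silently clamped to full
        # deflection; with no tactile edge, the operator never feels it.
        dx, dy = dx * radius / dist, dy * radius / dist
    # Up/down deflection drives speed; left/right drives turn rate.
    return (-dy / radius * max_lin, -dx / radius * max_ang)
```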
To try to come up with a better controller, Micire and one of his coworkers sat in the lab and drew joystick after joystick: "everything back to the old Atari controllers," he says. Eventually they zeroed in on the circular thumb motion of today's PlayStation and Xbox controllers, and it became the inspiration for what would become the DREAM (Dynamically Resizing Ergonomic and Multi-touch) controller.
To use the DREAM controller, you simply place your hand on the screen; circles appear beneath your fingers and track their movement. And no matter the size of your hand, the controller adjusts itself for an ergonomic fit. Once the controller was ready, the team brought in first responders to test it and found that "it was performing as well and in some cases better than the physical joysticks," Micire says.
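The fitting logic isn't spelled out here, but the core idea (place and size a virtual thumbstick wherever the five fingers land) can be sketched. Everything below is a hypothetical illustration; the heuristics are guesses at the concept, not the DREAM controller's published algorithm.

```python
import math

def fit_dream_controller(touches):
    """Size and place a virtual thumbstick from one hand's five contacts.

    `touches` is a list of five (x, y) points. All heuristics here are
    illustrative assumptions, not the DREAM controller's actual code.
    """
    if len(touches) != 5:
        raise ValueError("expected five contact points (one hand)")

    # Centroid of the contacts anchors the controller under the hand.
    cx = sum(x for x, _ in touches) / 5
    cy = sum(y for _, y in touches) / 5

    # Hand span (largest pairwise distance) scales the controller, so it
    # fits small and large hands alike.
    span = max(
        math.dist(a, b)
        for i, a in enumerate(touches)
        for b in touches[i + 1:]
    )

    # Guess the thumb as the contact farthest from the centroid and draw
    # the joystick ring there, echoing console-style thumb motion.
    thumb = max(touches, key=lambda t: math.dist(t, (cx, cy)))

    return {"anchor": (cx, cy), "radius": span / 2, "thumb_ring": thumb}
```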

The next step was to try to control multiple robots from the multi-touch screen. But since their initial, design-first attempt at single-robot control hadn't performed well in user tests, the team decided to work in reverse this time, deriving the design from user behavior. Micire and his coworkers projected static PDF images onto the multi-touch displays and gave users instructions like "tell the robot to go to area A," without telling them what gestures to use to make that happen.
Across a sample of 30 to 40 people, Micire found huge variation in how users would, unprompted, attempt to control the robots. Some would tap on the robot, then tap on area A; some would circle the robot and draw a path across the screen for it to follow; and so on. From that data, Micire, Yanco and the team identified the most popular gestures and used them to build a user interface that was as intuitive as possible. For first responders, ease of use is particularly important: they may have little time to train on a system before deploying it in the field for a search and rescue mission.
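As an illustration of how elicited gestures like these can be turned into commands, here is a minimal dispatcher for the two patterns described above. The event model, thresholds, and command names are all assumptions made for the sketch, not the Lowell team's interface.

```python
import math
from dataclasses import dataclass

@dataclass
class Stroke:
    points: list  # (x, y) samples from touch-down to touch-up

def is_tap(stroke, max_travel=10.0):
    """Treat a stroke that barely moves as a tap."""
    xs = [p[0] for p in stroke.points]
    ys = [p[1] for p in stroke.points]
    return (max(xs) - min(xs)) < max_travel and (max(ys) - min(ys)) < max_travel

def stroke_center(stroke):
    """Centroid of a stroke; for a circle drawn around a robot, this
    lands near the robot itself, so taps and lassos select alike."""
    xs = [p[0] for p in stroke.points]
    ys = [p[1] for p in stroke.points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def robot_under(point, robots, radius=30.0):
    """Return the id of the robot icon near the point, if any."""
    for rid, pos in robots.items():
        if math.dist(point, pos) <= radius:
            return rid
    return None

def interpret(strokes, robots):
    """Map a two-stroke gesture to a command:
    select a robot, then tap a spot -> ('goto', robot_id, target)
    select a robot, then draw a path -> ('follow_path', robot_id, waypoints)
    """
    if len(strokes) != 2:
        return None
    rid = robot_under(stroke_center(strokes[0]), robots)
    if rid is None:
        return None
    if is_tap(strokes[1]):
        return ("goto", rid, strokes[1].points[0])
    return ("follow_path", rid, strokes[1].points)

# Example: tap robot r1, then tap a goal point.
robots = {"r1": (100.0, 100.0)}
print(interpret([Stroke([(102, 98)]), Stroke([(400, 300)])], robots))
# -> ('goto', 'r1', (400, 300))
```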
Once the gesture catalog for multiple robots was put together, the Lowell team ran a simulation in Microsoft Robotics Developer Studio, in which FEMA first responders commanded the robots in a search and rescue scenario spanning an entire city. Though the feedback from the first responders was incredibly positive, Micire says there are still a lot of barriers to getting this technology into the field, first and foremost packaging. "You've got to be able to take these things outside," he says. "They have to be able to get wet and dusty." According to Micire, currently available touchscreen technology just isn't compatible with Mother Nature.
Yanco disagrees, saying that in search and rescue situations the main multi-touch screens would be out of the elements, in a command tent or trailer. The bigger problem, as she sees it, is that most search and rescue robots in use today were originally built for something else and repurposed as disaster bots later. First, she says, we need bots that can be deployed safely in search and rescue situations.
Though Micire has now left Lowell and is working at NASA, the team's research is ongoing. Yanco says the next steps involve moving the technology from simulations to real-world applications and tackling the obstacles to efficient robot search and rescue missions, whatever they may be.
See video (1:42): DREAM controller in action at UMass Lowell
See video (2:04): Mark Micire with Android controller for space shuttle