VOLUME 19 #2


Robots with a mind of their own

Photo by Kathy F. Atkinson
Jeffrey Heinz, left, and Herbert Tanner watch their robot retrieve a printout.

RESEARCH | Babies learn to talk by listening intently to those around them. They notice which sounds naturally occur together, intuitively building rules about what fits and what does not.

Herbert Tanner believes similar rules exist in robotics when it comes to deciding the sequence of actions a robot can execute.

Tanner, an assistant professor of mechanical engineering, and fellow researcher Jeffrey Heinz are applying formal language theory and linguistics algorithms to design robots that can “think for themselves.”

“Amazingly, we can use insights from how children learn language to design robots which can likewise learn from their experience,” says Heinz, an assistant professor of linguistics and cognitive science.

Their research project centers on a theoretical modeling and analysis framework that uses a combination of logic (the language of computer science) and differential equations (the language of traditional engineering). The new algorithms will give the robot the tools to devise plans on its own, based on its environment, the scientists say.
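To make the learning idea concrete, here is a minimal sketch in Python, assuming a simple bigram-style learner: the robot records which steps it has observed occurring next to each other in demonstrated sequences, then judges a new sequence by whether every adjacent pair has been seen before. The action names and the approach are illustrative assumptions, not the team's published algorithm.

```python
# Toy "learn what fits together" rule induction, echoing how children
# notice which sounds co-occur. Illustrative only.

def learn_bigrams(examples):
    """Collect every adjacent pair of steps seen in the example sequences."""
    allowed = set()
    for seq in examples:
        padded = ["START"] + list(seq) + ["END"]
        for pair in zip(padded, padded[1:]):
            allowed.add(pair)
    return allowed

def is_permissible(seq, allowed):
    """A new sequence fits if all of its adjacent pairs were observed."""
    padded = ["START"] + list(seq) + ["END"]
    return all(pair in allowed for pair in zip(padded, padded[1:]))

# Hypothetical demonstrations of valid action sequences.
demonstrations = [
    ["navigate", "grasp", "deliver"],
    ["navigate", "grasp", "navigate", "deliver"],
]
rules = learn_bigrams(demonstrations)
print(is_permissible(["navigate", "grasp", "deliver"], rules))  # True
print(is_permissible(["grasp", "navigate"], rules))  # False: never starts with grasp
```

The appeal of such learners, in linguistics and here, is that they generalize beyond the exact sequences seen: the second test above is rejected, while novel recombinations of observed pairs are accepted.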

Currently, human designers program robots in advance to complete actions in a certain sequence. This is done because, while planning is easy for humans, it is analytically and computationally challenging for robots.

In one example, a robot is asked to pick up and deliver a computer printout. The robot uses high-frequency wireless communications and a network of eight cameras to triangulate its location and its destination. The cameras offer pinpoint accuracy (within one millimeter), measuring the robot’s location 100 times per second using reflective markers. The sequence of maneuvers involved is currently pre-programmed, but Tanner has a plan to enable the robot to “program” itself.
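For illustration, here is a minimal sketch of the triangulation step, under the assumption that each camera reports a ray from its known position toward a reflective marker; the robot's position is then the point closest to all the rays in a least-squares sense. The function and numbers are hypothetical, not the lab's actual motion-capture software.

```python
import numpy as np

def triangulate(cam_positions, ray_directions):
    """Least-squares intersection of 3-D rays (camera position + direction)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(cam_positions, ray_directions):
        d = d / np.linalg.norm(d)        # ensure unit length
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Two cameras mounted on opposite walls, both sighting a marker at (1, 1, 0.5).
cams = [np.array([0.0, 0.0, 2.0]), np.array([4.0, 0.0, 2.0])]
target = np.array([1.0, 1.0, 0.5])
rays = [target - c for c in cams]
print(triangulate(cams, rays))  # ~[1.0, 1.0, 0.5]
```

With eight cameras instead of two, the extra rays average out measurement noise, which is how a network like the one described can reach millimeter accuracy.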

“We want our robots to learn both from a human instructor and from their environment how to plan their actions,” he says.

In the future, Tanner hopes to include multiple robots, potentially using a combination of ground and aerial machines.

“A team of robots that exploits the strengths of each member is more valuable than adding new capabilities to a single robot,” he says.

If successful, the research will allow robots to observe the behaviors around them and adapt their actions to their surroundings. Tanner believes this could be useful in emergency response, automatically generating an action plan that helps first responders coordinate relief efforts quickly.


Medics, for example, cannot reach victims of a disaster unless engineers have first cleared the rubble; meanwhile, engineers need to wait for firefighters to put out the flames before they can work.

“It becomes a problem of planning, scheduling and resource allocation,” Tanner says. “Having an automated planning tool generated by the robots can eliminate uncertainty and confusion and get things going.”
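The firefighter-engineer-medic chain above is exactly a precedence-constrained schedule. As a minimal sketch, the tasks can be ordered automatically with a topological sort from Python's standard library; the task names follow the article's example, but the code is illustrative, not the project's planner.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Each task maps to the set of tasks that must finish before it can start.
dependencies = {
    "clear rubble": {"extinguish fire"},  # engineers wait on firefighters
    "treat victims": {"clear rubble"},    # medics wait on engineers
}
plan = list(TopologicalSorter(dependencies).static_order())
print(plan)  # ['extinguish fire', 'clear rubble', 'treat victims']
```

A real response plan would also weigh resource limits and task durations, but even this bare ordering shows how an automated planner can remove the guesswork about who moves first.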

The research is funded by the National Science Foundation’s Cyber-Physical Systems program. Tanner is the principal investigator on the grant, which totals $1 million. Heinz and Calin Belta, a mechanical engineer at Boston University, serve as co-principal investigators on the project.

Article by Karen B. Roberts, AS ’90
