MIT computer system generates the right robot for the job


The RoboGrammar computer system, developed at MIT, can generate hundreds of thousands of robot designs to tackle particular terrain (Credit: Courtesy of the researchers)


Choosing the right shape is vital for a robot’s ability to traverse a particular terrain, but it is impossible to build and test every potential form. Now, however, an MIT system could solve that problem by simulating designs and working out which is the best fit.

The computer system, known as RoboGrammar, was developed by a team led by Allan Zhao at the Massachusetts Institute of Technology.

First, the user tells the system what parts are available, such as wheels and joints, before telling it what terrain the robot needs to navigate. RoboGrammar does the rest, generating an optimised structure and control programme for the robot.

The advance could inject computer-aided creativity into the field, a research announcement claimed.

“Robot design is still a very manual process,” said Zhao, the paper’s lead author and a PhD student in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He described RoboGrammar as “a way to come up with new, more inventive robot designs that could potentially be more effective.”

Despite a wide variety of potential applications, robots “tend to be very similar in their overall shape and design,” said Zhao. “When you think of building a robot that needs to cross various terrains, you immediately jump to a quadruped… we were wondering if that’s really the optimal design.”

While inventiveness was the goal for RoboGrammar, Zhao set some ‘ground rules’ to ensure each design works at a rudimentary level. The team developed a ‘graph grammar’, a set of constraints on the arrangement of components. Adjoining leg segments should be connected with a joint, not another leg segment, for example.

The rules were inspired by arthropods, including insects, spiders and lobsters. The constraints still permit flexible designs, including more familiar forms such as quadrupeds, and the system can use wheels instead of legs.
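Such a rule set can be pictured as a small rewriting system. The sketch below is a purely illustrative Python toy, not the team's actual grammar: made-up nonterminal symbols ("BODY", "SEGMENT", "LIMB") expand into parts, and the rules themselves guarantee the kind of constraint described above, in that every leg is attached through a joint.

```python
import random

# Illustrative graph-grammar-style rules (NOT RoboGrammar's real grammar).
# Each nonterminal symbol rewrites into a sequence of symbols or parts.
RULES = {
    "BODY": [["SEGMENT"], ["SEGMENT", "BODY"]],            # chain of body segments
    "SEGMENT": [["link"], ["link", "LIMB", "LIMB"]],       # a link, optionally with a limb pair
    "LIMB": [["joint", "leg"], ["joint", "leg", "LIMB"]],  # every leg starts with a joint
}

def expand(symbol, rng, depth=0, max_depth=4):
    """Recursively expand a nonterminal into a flat list of terminal parts."""
    if symbol not in RULES:            # terminal part: link, joint or leg
        return [symbol]
    options = RULES[symbol]
    if depth >= max_depth:             # force termination: take the shortest option
        choice = min(options, key=len)
    else:
        choice = rng.choice(options)
    parts = []
    for s in choice:
        parts.extend(expand(s, rng, depth + 1, max_depth))
    return parts

rng = random.Random(0)
design = expand("BODY", rng)           # one random, rule-respecting structure
```

Because "LIMB" can only expand starting with a joint, no design produced this way ever attaches a leg directly to another leg segment, which is the spirit of the constraint Zhao describes.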

The system operates in three steps: defining the problem, drawing up possible solutions, then selecting the optimal ones. Problem definition is based on available robotic components and the terrain to be covered, which can include elements such as steps, flat areas and slippery surfaces.
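Those three steps can be sketched end to end in a few lines. Everything below is a placeholder stand-in: the function names are invented, and a toy heuristic takes the place of physics simulation.

```python
import random
from dataclasses import dataclass

@dataclass
class Problem:
    """Step 1: the problem definition — available parts plus target terrain."""
    parts: list      # e.g. ["wheel", "joint", "leg"]
    terrain: str     # e.g. "steps", "flat", "ice"

def generate_candidates(problem, n, rng):
    """Step 2: stand-in for grammar-based enumeration — random part combinations."""
    return [tuple(rng.choices(problem.parts, k=4)) for _ in range(n)]

def simulate_score(design, terrain):
    """Stand-in for physics simulation: a toy heuristic, not real dynamics."""
    return design.count("leg") if terrain == "steps" else design.count("wheel")

def best_design(problem, n=1000, seed=0):
    """Step 3: select the highest-scoring candidate."""
    rng = random.Random(seed)
    candidates = generate_candidates(problem, n, rng)
    return max(candidates, key=lambda d: simulate_score(d, problem.terrain))
```

Under this toy scoring, a "steps" terrain selects leg-heavy designs while a "flat" terrain selects wheel-heavy ones, mirroring the idea that the terrain definition steers which structures win.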

RoboGrammar then uses the graph grammar to design hundreds of thousands of potential robot structures. Some might look like cars, while others could appear similar to spiders.  

“It was pretty inspiring for us to see the variety of designs,” said Zhao. “It definitely shows the expressiveness of the grammar.”

A ‘controller’ is developed for each design using an algorithm that prioritises rapid forward movement, governing the ‘movement sequence’ of the robot’s motors.

“The shape and the controller of the robot are deeply intertwined,” said Zhao, “which is why we have to optimise a controller for every given robot individually.”
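Per-robot controller optimisation can be sketched as a search over motor sequences for one given body, scored by forward travel. The toy below is only a sketch of that idea: it substitutes random search for the system's actual control optimisation, and a made-up "gait bias" vector for physics, to show why each shape needs its own controller.

```python
import random

def rollout(motor_sequence, gait_bias):
    """Toy dynamics stand-in: forward distance depends on how well the
    motor pattern matches this particular robot's (hypothetical) gait."""
    return sum(m * b for m, b in zip(motor_sequence, gait_bias))

def optimise_controller(gait_bias, horizon=8, iters=500, seed=0):
    """Random search over motor sequences, keeping the farthest-travelling one."""
    rng = random.Random(seed)
    best_seq, best_dist = None, float("-inf")
    for _ in range(iters):
        seq = [rng.uniform(-1, 1) for _ in range(horizon)]
        dist = rollout(seq, gait_bias)
        if dist > best_dist:
            best_seq, best_dist = seq, dist
    return best_seq, best_dist

# Two different body shapes would have different gait biases, so the
# optimiser returns a different motor sequence for each:
quad_bias = [1, -1, 1, -1, 1, -1, 1, -1]
```

A controller tuned for `quad_bias` would score poorly on a robot with a different bias vector, which is the intertwining Zhao describes: change the shape and the optimal motor sequence changes with it.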

With each simulated robot free to move about, the researchers search for high-performing designs using a neural network algorithm that iteratively samples and evaluates sets of robots, learning which ones work best for a given task.
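That sample-evaluate-learn loop can be sketched with a cheap surrogate model standing in for the neural network. Every name and scoring function below is an illustrative assumption: a per-part running average plays the role of the learned value estimate, and a toy function replaces the expensive simulation.

```python
import random

PARTS = ["leg", "wheel", "joint", "link"]

def true_score(design):
    """Stand-in for an expensive physics evaluation of one design."""
    return design.count("leg") * 2 + design.count("joint")

def surrogate_guided_search(rounds=20, batch=50, seed=0):
    """Sample designs, evaluate only the few the surrogate rates highest,
    and update the surrogate from those results — repeated each round."""
    rng = random.Random(seed)
    part_value = {p: 0.0 for p in PARTS}   # learned per-part estimate
    part_count = {p: 1 for p in PARTS}
    best, best_score = None, float("-inf")
    for _ in range(rounds):
        proposals = [tuple(rng.choices(PARTS, k=6)) for _ in range(batch)]
        # the surrogate ranks proposals; only the top few are simulated
        proposals.sort(key=lambda d: sum(part_value[p] for p in d), reverse=True)
        for design in proposals[:5]:
            score = true_score(design)
            if score > best_score:
                best, best_score = design, score
            for p in design:               # update the running estimates
                part_count[p] += 1
                part_value[p] += (score - part_value[p]) / part_count[p]
    return best, best_score
```

As rounds pass, the surrogate learns that leg-heavy designs score well and steers evaluation towards them, so far fewer expensive simulations are spent on weak designs than exhaustive testing would require.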

The MIT team plans to build and test some of the optimal designs in the real world. The system could also be adapted to pursue goals other than traversing terrain, Zhao said.

Surprisingly, most designs ended up being four-legged despite the flexibility of the grammar rules, suggesting robot designers were right to gravitate towards quadrupeds.

The research will be presented at this month’s Siggraph Asia conference.



Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.
