Open-source platform simulates wildlife for soft robotics designers

Since the term “soft robotics” was adopted in 2008, engineers in the field have been building a diverse range of flexible machines useful in exploration, locomotion, rehabilitation, and even space. One source of inspiration: the way animals move in the wild.

A team of MIT researchers has taken this a step further, developing SoftZoo, a bio-inspired platform that enables engineers to study soft robot co-design. The framework jointly optimizes a robot’s design, which determines what the machine will look like, and its control, the system that drives its motion, improving how users automatically generate blueprints for potential machines.

Taking a walk on the wild side, the platform features 3-D models of animals such as pandas, fish, sharks, and caterpillars, which serve as designs for simulating soft robotics tasks like locomotion, agile turning, and path following in different environments. Whether in snow, desert, clay, or water, the platform demonstrates the performance trade-offs of various designs across terrains.

“Our framework can help users find the best configuration for a robot’s shape, allowing them to design soft robotics algorithms that can do many different things,” says MIT PhD student Tsun-Hsuan Wang, an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL) who is a lead researcher on the project. “In essence, it helps us understand the best strategies for robots to interact with their environments.”

SoftZoo is more comprehensive than similar platforms, which already simulate design and control, because it models movement that responds to the physical features of various biomes. The framework’s versatility comes from a differentiable multiphysics engine, which can simulate several aspects of a physical system at once, such as a baby seal turning on ice or a caterpillar inching across a wetland. The engine’s differentiability streamlines co-design by reducing the number of often-expensive simulations required to solve computational control and design problems. As a result, users can design and move soft robots with more sophisticated, specialized algorithms.
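To make the co-design idea concrete, here is a minimal, hypothetical sketch, not SoftZoo’s actual API: a toy differentiable “simulator” maps design and control parameters to a locomotion reward, and automatic differentiation supplies gradients for both parameter sets in a single optimization loop. All variable names and the physics are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def simulate(design, control):
    """Toy stand-in for a differentiable multiphysics rollout.

    `design` sets per-segment stiffness and `control` sets actuation phases;
    both are illustrative placeholders, not part of SoftZoo.
    """
    phases = jnp.linspace(0.0, 1.0, design.shape[0])[:, None]   # (segments, 1)
    actuation = jnp.sin(control[None, :] + phases)              # (segments, controls)
    # Reward "distance traveled": actuation helps, overly stiff designs are penalized.
    return jnp.sum(actuation * design[:, None]) - 0.1 * jnp.sum(design ** 2)

# Negative reward is the loss; gradients flow through the whole rollout
# with respect to design (argnum 0) and control (argnum 1) simultaneously.
loss = lambda d, c: -simulate(d, c)
grad_fn = jax.grad(loss, argnums=(0, 1))

design = jnp.ones(8)     # e.g., stiffness per body segment (hypothetical)
control = jnp.zeros(4)   # e.g., actuation phase offsets (hypothetical)
lr = 0.05
for _ in range(200):
    g_design, g_control = grad_fn(design, control)
    design = design - lr * g_design      # update morphology parameters
    control = control - lr * g_control   # update controller parameters

print("final reward:", simulate(design, control))
```

Because the simulator itself is differentiable, a gradient-based loop like this can replace many rounds of trial-and-error simulation, which is the efficiency gain the differentiable engine provides.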

The system’s ability to simulate interactions with different terrain illustrates the importance of morphology, a branch of biology that studies the shapes, sizes, and forms of different organisms. Depending on the environment, some biological structures are more optimal than others, much like comparing blueprints for machines that complete similar tasks. 

These biological outlines can inspire more specialized, terrain-specific artificial life. “A jellyfish’s gently undulating geometry allows it to efficiently travel across large bodies of water, inspiring researchers to develop new breeds of soft robots and opening up unlimited possibilities of what artificial creatures cultivated entirely in silico can be capable of,” says Wang. “Additionally, dragonflies can perform very agile maneuvers that other flying creatures cannot complete because they have special structures on their wings that change their center of mass when they fly. Our platform optimizes locomotion the same way a dragonfly is naturally more adept at working through its surroundings.”

Robots previously struggled to navigate through cluttered environments because their bodies were not compliant with their surroundings. With SoftZoo, though, designers can develop a robot’s brain and body simultaneously, co-optimizing both terrestrial and aquatic machines to be more aware and specialized. With increased behavioral and morphological intelligence, the robots would be more useful for completing rescue missions and conducting exploration. If a person went missing during a flood, for example, a robot could potentially traverse the waters more efficiently because it was optimized using methods demonstrated in the SoftZoo platform.

“SoftZoo provides open-source simulation for soft robot designers, helping them build real-world robots much more easily and flexibly while accelerating the machines’ locomotion capabilities in diverse environments,” adds study co-author Chuang Gan, a research scientist at the MIT-IBM Watson AI Lab who will soon be an assistant professor at the University of Massachusetts at Amherst.

“This computational approach to co-designing the soft robot bodies and their brains (that is, their controllers) opens the door to rapidly creating customized machines that are designed for a specific task,” adds Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor in the MIT Department of Electrical Engineering and Computer Science (EECS), who is another author of the work.

Before any type of robot is constructed, the framework can stand in for field tests in environments that would be impractical to access. For example, assessing how a bear-like robot behaves in a desert may be challenging for a research team working in the urban plains of Boston. Instead, soft robotics engineers could use 3-D models in SoftZoo to simulate different designs and evaluate how effectively the algorithms controlling their robots navigate, saving researchers time and resources.

Still, the limitations of current fabrication techniques stand in the way of bringing these soft robot designs to life. “Transferring from simulation to physical robot remains unsolved and requires further study,” says Wang. “The muscle models, spatially varying stiffness, and sensorization in SoftZoo cannot be straightforwardly realized with current fabrication techniques, so we are working on these challenges.”

In the future, the platform’s designers are eyeing applications in manipulation and other human-like tasks, given its ability to test robotic control. To demonstrate this potential, Wang’s team designed a 3-D arm throwing a snowball forward. By simulating more human-like tasks, soft robotics designers could then use the platform to assess soft robotic arms that grasp, move, and stack objects.

Wang, Gan, and Rus wrote a paper on the work alongside EECS PhD student and CSAIL affiliate Pingchuan Ma, Harvard University postdoc Andrew Spielberg PhD ’21, Carnegie Mellon University PhD student Zhou Xian, UMass Amherst Associate Professor Hao Zhang, and MIT professor of brain and cognitive sciences and CSAIL affiliate Joshua B. Tenenbaum.

Wang completed this work during an internship at the MIT-IBM Watson AI Lab, with the NSF EFRI Program, DARPA MCS Program, MIT-IBM Watson AI Lab, and gift funding from MERL, Cisco, and Amazon all providing support for the project. The team’s research will be presented at the 2023 International Conference on Learning Representations this month.

  
