Interview with Huy Ha and Shuran Song: CoRL 2021 best system paper award winners

Congratulations to Huy Ha and Shuran Song who have won the CoRL 2021 best system paper award!
Their work, FlingBot: the unreasonable effectiveness of dynamic manipulation for cloth unfolding, was highly praised by the judging committee. “To me, this paper constitutes the most impressive account of both simulated and real-world cloth manipulation to date,” commented one of the reviewers.
Below, the authors tell us more about their work, the methodology, and what they are planning next.
What is the topic of the research in your paper?
In my most recent publication with my advisor, Professor Shuran Song, we studied the task of cloth unfolding. The goal of the task is to manipulate a cloth from a crumpled initial state to an unfolded state, which is equivalent to maximizing the coverage of the cloth on the workspace.
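Since the task objective is maximizing cloth coverage on the workspace, it can be scored directly from a top-down view. The following is a minimal sketch of such a metric, not the authors' code: it assumes a hypothetical binary segmentation mask of the cloth and a known flattened area, and computes coverage as their ratio.

```python
import numpy as np

def coverage(cloth_mask: np.ndarray, flattened_area: float) -> float:
    """Fraction of the cloth's fully unfolded area currently visible.

    cloth_mask: boolean top-down segmentation of the cloth (H x W pixels).
    flattened_area: pixel area of the same cloth when fully unfolded.
    Both inputs are assumptions for illustration, not the paper's setup.
    """
    return float(cloth_mask.sum()) / flattened_area

# Example: 64 visible cloth pixels out of a flattened area of 80 pixels.
mask = np.zeros((10, 10), dtype=bool)
mask[1:9, 1:9] = True
print(coverage(mask, 80.0))  # 0.8
```

A perfectly unfolded cloth scores 1.0; a tightly crumpled one scores close to its crumpled footprint divided by its flattened area.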
Could you tell us about the implications of your research and why it is an interesting area for study?
Historically, most robotic manipulation research topics, such as grasp planning, have been concerned with rigid objects, which have only six degrees of freedom since their geometry does not change. This allows one to apply the typical state estimation and task-and-motion-planning pipeline in robotics. In contrast, deformable objects can bend and stretch in arbitrary directions, leading to infinite degrees of freedom; it’s unclear what the state of a cloth should even be. In addition, deformable objects such as clothes can experience severe self-occlusion: given a crumpled piece of cloth, it’s difficult to identify whether it’s a shirt, a jacket, or a pair of pants. Therefore, cloth unfolding is a typical first step of cloth manipulation pipelines, since it reveals key features of the cloth for downstream perception and manipulation.
Despite the abundance of sophisticated methods for cloth unfolding over the years, they typically only address the easy case (where the cloth already starts off mostly unfolded) or take upwards of a hundred steps for challenging cases. These prior works all use single-arm quasi-static actions, such as pick and place, which are slow and limited by the physical reach range of the system.
Could you explain your methodology?
In our daily lives, humans typically use both hands to manipulate cloth, and with as few as one or two high-velocity flings, we can unfold an initially crumpled cloth. Based on this observation, our key idea is simple: use dual-arm dynamic actions for cloth unfolding.
FlingBot is a self-supervised framework for cloth unfolding which uses a pick, stretch, and fling primitive for a dual-arm setup from visual observations. There are three key components to our approach. First is the decision to use a high-velocity dynamic action. By relying on the cloth’s mass combined with a high-velocity throw to do most of the work, a dynamic flinging policy can unfold cloths much more efficiently than a quasi-static policy. Second is a dual-arm grasp parameterization which makes satisfying collision-safety constraints easy. By treating a dual-arm grasp not as two points but as a line with a rotation and length, we can directly constrain the rotation and length of the line to ensure the arms do not cross over each other and do not try to grasp too close to each other. Third is our choice of Spatial Action Maps, which learns translation-, rotation-, and scale-equivariant value maps and allows for sample-efficient learning.
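The line-based grasp parameterization can be sketched in a few lines of code. This is an illustrative reconstruction, not the authors' implementation: the separation limits and coordinate conventions below are assumptions. A grasp line is given by its center, rotation, and length, and the two grasp points fall at its endpoints, so clamping the length directly enforces the collision-safety constraint.

```python
import numpy as np

# Assumed separation limits (metres); the paper's actual bounds may differ.
MIN_WIDTH = 0.05  # keep grippers from grasping too close to each other
MAX_WIDTH = 0.70  # keep both endpoints within the arms' reach

def line_to_grasp_points(cx, cy, theta, width):
    """Convert a (center, rotation, length) grasp line into two grasp points.

    Clamping `width` to [MIN_WIDTH, MAX_WIDTH] guarantees the left/right
    endpoints never coincide or cross, so the arms cannot collide.
    """
    width = float(np.clip(width, MIN_WIDTH, MAX_WIDTH))
    dx = 0.5 * width * np.cos(theta)
    dy = 0.5 * width * np.sin(theta)
    left = (cx - dx, cy - dy)    # grasp point for the left arm
    right = (cx + dx, cy + dy)   # grasp point for the right arm
    return left, right

left, right = line_to_grasp_points(0.0, 0.0, 0.0, 0.4)
# endpoints 0.4 m apart, symmetric about the line's center
```

Because the constraint is expressed on a single scalar (the line length) rather than on two independent points, any sampled action is safe by construction.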
What were your main findings?
We found that dynamic actions have three desirable properties over quasi-static actions for the task of cloth unfolding. First, they are efficient – FlingBot achieves over 80% coverage within three actions on novel cloths. Second, they are generalizable – trained on only square cloths, FlingBot also generalizes to T-shirts. Third, they expand the system’s effective reach range – even when FlingBot can’t fully lift or stretch a cloth larger than the system’s physical reach range, it’s able to use high-velocity flings to unfold the cloth.
After training and evaluating our model in simulation, we deployed and finetuned our model on a real world dual-arm system, which achieves above 80% coverage for all cloth categories. Meanwhile, the quasi-static pick & place baseline was only able to achieve around 40% coverage.
What further work are you planning in this area?
Although we motivated cloth unfolding as a precursor for downstream modules such as cloth state estimation, unfolding could also benefit from state estimation. For instance, if the system is confident it has identified the shoulders of the shirt in its state estimation, the unfolding policy could directly grasp the shoulders and unfold the shirt in one step. Based on this observation, we are currently working on a cloth unfolding and state estimation approach which can learn in a self-supervised manner in the real world.
About the authors
Huy Ha is a Ph.D. student in Computer Science at Columbia University. He is advised by Professor Shuran Song and is a member of the Columbia Artificial Intelligence and Robotics (CAIR) lab.
Shuran Song is an assistant professor in the computer science department at Columbia University, where she directs the Columbia Artificial Intelligence and Robotics (CAIR) Lab. Her research focuses on computer vision and robotics. She’s interested in developing algorithms that enable intelligent systems to learn from their interactions with the physical world, and autonomously acquire the perception and manipulation skills necessary to execute complex tasks and assist people.
Find out more
- Read the paper on arXiv.
- The videos of the real-world experiments and code are available here, as is a video of the authors’ presentation at CoRL.
- Read more about the winning and shortlisted papers for the CoRL awards here.
Pietro Valdastri’s Plenary Talk – Medical capsule robots: a Fantastic Voyage

At the beginning of the new millennium, wireless capsule endoscopy was introduced as a minimally invasive method of inspecting the digestive tract. The possibility of collecting images deep inside the human body just by swallowing a “pill” revolutionized the field of gastrointestinal endoscopy and sparked a brand-new field of research in robotics: medical capsule robots. These are self-contained robots that leverage extreme miniaturization to access and operate in environments that are out of reach for larger devices. In medicine, capsule robots can enter the human body through natural orifices or small incisions, and detect and cure life-threatening diseases in a non-invasive manner. This talk provides a perspective on how this field has evolved in the last ten years. We explore what has been accomplished, what has failed, and what lessons were learned. We also discuss enabling technologies, intelligent control, and possible levels of computer assistance, and highlight future challenges in this ongoing Fantastic Voyage.
Bio: Pietro Valdastri (Senior Member, IEEE) received the master’s degree (Hons.) from the University of Pisa in 2002, and the Ph.D. degree in biomedical engineering from Scuola Superiore Sant’Anna in 2006. He is Professor and Chair of Robotics and Autonomous Systems at the University of Leeds. His research interests include robotic surgery, robotic endoscopy, the design of magnetic mechanisms, and medical capsule robots. He is a recipient of the Wolfson Research Merit Award from the Royal Society.
Meet the Oystamaran

MIT students and researchers from MIT Sea Grant work with local oyster farmers in advancing the aquaculture industry by seeking solutions to some of its biggest challenges. Currently, oyster bags have to be manually flipped every one to two weeks to reduce biofouling. Image: John Freidah, MIT MechE
By Michaela Jarvis | Department of Mechanical Engineering
When Michelle Kornberg was about to graduate from MIT, she wanted to use her knowledge of mechanical and ocean engineering to make the world a better place. Luckily, she found the perfect senior capstone class project: supporting sustainable seafood by helping aquaculture farmers grow oysters.
“It’s our responsibility to use our skills and opportunities to work on problems that really matter,” says Kornberg, who now works for an aquaculture company called Innovasea. “Food sustainability is incredibly important from an environmental standpoint, of course, but it also matters on a social level. The most vulnerable will be hurt worst by the climate crisis, and I think food sustainability and availability really matters on that front.”
The project undertaken by Kornberg’s capstone class, 2.017 (Design of Electromechanical Robotic Systems), came out of conversations between Michael Triantafyllou, who is MIT’s Henry L. and Grace Doherty Professor in Ocean Science and Engineering and director of MIT Sea Grant, and Dan Ward. Ward, a seasoned oyster farmer and marine biologist, owns Ward Aquafarms on Cape Cod and has worked extensively to advance the aquaculture industry by seeking solutions to some of its biggest challenges.
Speaking with Triantafyllou at MIT Sea Grant — part of a network of university-based programs established by the federal government to protect the coastal environment and economy — Ward had explained that each of his thousands of floating mesh oyster bags needs to be turned over about 11 times a year. The flipping allows algae, barnacles, and other “biofouling” organisms that grow on the part of the bag beneath the water’s surface to be exposed to air and light, so they can dry and chip off. If this task is not performed, water flow to the oysters, which is necessary for their growth, is blocked.
The bags are flipped by a farmworker in a kayak, and the task is monotonous, often performed in rough water and bad weather, and ergonomically injurious. “It’s kind of awful, generally speaking,” Ward says, adding that he pays about $3,500 per year to have the bags turned over at each of his two farm sites — and struggles to find workers who want to do the job of flipping bags that can grow to a weight of 60 or 70 pounds just before the oysters are harvested.
Presented with this problem, the capstone class Kornberg was in — composed of six students in mechanical engineering, ocean engineering, and electrical engineering and computer science — brainstormed solutions. Most of the solutions, Kornberg says, involved an autonomous robot that would take over the bag-flipping. It was during that class that the original version of the “Oystamaran,” a catamaran with a flipping mechanism between its two hulls, was born.

A combination of mechanical engineering, ocean engineering, and electrical engineering and computer science students worked together to design a robot to help with flipping oyster bags at Ward Aquafarms on Cape Cod. The “Oystamaran” robot uses a vision system to position and flip the bags. Image: Lauren Futami, MIT MechE
Ward’s involvement in the project has been important to its evolution. He says he has reviewed many projects in his work on advisory boards that propose new technologies for aquaculture. Often, they don’t correspond with the actual challenges faced by the industry.
“It was always ‘I already have this remotely operated vehicle; would it be useful to you as an oyster farmer if I strapped on some kind of sensor?’” Ward says. “They try to fit robotics into aquaculture without any industry collaboration, which leads to a robotic product that doesn’t solve any of the issues we experience out on the farm. Having the opportunity to work with MIT Sea Grant to really start from the ground up has been exciting. Their approach has been, ‘What’s the problem, and what’s the best way to solve the problem?’ We do have a real need for robotics in aquaculture, but you have to come at it from the customer-first, not the technology-first, perspective.”
Triantafyllou says that while the task the robot performs is similar to work done by robots in other industries, the “special difficulty” students faced while designing the Oystamaran was its work environment.
“You have a floating device, which must be self-propelled, and which must find these objects in an environment that is not neat,” Triantafyllou says. “It’s a combination of vision and navigation in an environment that changes, with currents, wind, and waves. Very quickly, it becomes a complicated task.”
Kornberg, who had constructed the original central flipping mechanism and the basic structure of the vessel as a staff member at MIT Sea Grant after graduating in May 2020, worked as a lab instructor for the next capstone class related to the project in spring 2021. Andrew Bennett, education administrator at MIT Sea Grant, co-taught that class, in which students designed an Oystamaran version 2.0, which was tested at Ward Aquafarms and managed to flip several rows of bags while being controlled remotely. Next steps will involve making the vessel more autonomous, so it can be launched, navigate autonomously to the oyster bags, flip them, and return to the launching point. A third capstone class related to the project will take place this spring.

The students operate the “Oystamaran” robot remotely from the boat. Image: John Freidah, MIT MechE
Bennett says an ideal project outcome would be, “We have proven the concept, and now somebody in industry says, ‘You know, there’s money to be made in oysters. I think I’ll take over.’ And then we hand it off to them.”
Meanwhile, he says an unexpected challenge arose with getting the Oystamaran to go between tightly packed rows of oyster bags in the center of an array.
“How does a robot shimmy in between things without wrecking something? It’s got to wiggle in somehow, which is a fascinating controls problem,” Bennett says, adding that the problem is a source of excitement, rather than frustration, to him. “I love a new challenge, and I really love when I find a problem that no one expected. Those are the fun ones.”
Triantafyllou calls the Oystamaran “a first for the industry,” explaining that the project has demonstrated that robots can perform extremely useful tasks in the ocean, and will serve as a model for future innovations in aquaculture.
“Just by showing the way, this may be the first of a number of robots,” he says. “It will attract talent to ocean farming, which is a great challenge, and also a benefit for society to have a reliable means of producing food from the ocean.”