A four-legged robotic system for playing soccer on various terrains
By Rachel Gordon | MIT CSAIL
If you’ve ever played soccer with a robot, it’s a familiar feeling. Sun glistens down on your face as the smell of grass permeates the air. You look around. A four-legged robot is hustling toward you, dribbling with determination.
While the bot doesn’t display a Lionel Messi-like level of ability, it’s an impressive in-the-wild dribbling system nonetheless. Researchers from MIT’s Improbable Artificial Intelligence Lab, part of the Computer Science and Artificial Intelligence Laboratory (CSAIL), have developed a legged robotic system that can dribble a soccer ball under the same conditions as humans. The bot used a mixture of onboard sensing and computing to traverse different natural terrains such as sand, gravel, mud, and snow, and adapt to their varied impact on the ball’s motion. Like every committed athlete, “DribbleBot” could get up and recover the ball after falling.
Programming robots to play soccer has been an active research area for some time. However, the team wanted to automatically learn how to actuate the legs during dribbling, to enable the discovery of hard-to-script skills for responding to diverse terrains like snow, gravel, sand, grass, and pavement. Enter simulation.
A robot, ball, and terrain are inside the simulation — a digital twin of the natural world. You can load in the bot and other assets and set physics parameters, and then it handles the forward simulation of the dynamics from there. Four thousand versions of the robot are simulated in parallel in real time, enabling data collection 4,000 times faster than using just one robot. That’s a lot of data.
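In code, the idea is roughly the following: one batched step advances every simulated robot at once. This is a minimal sketch with made-up state dynamics and sizes; the team's actual simulator, physics, and interfaces are not shown here.

```python
# A minimal sketch of vectorized simulation, assuming a gym-style step()
# interface. All names and dynamics here are illustrative, not the team's API.
import numpy as np

NUM_ENVS = 4000             # number of robot copies simulated in parallel
OBS_DIM, ACT_DIM = 48, 12   # hypothetical observation/action sizes

class VectorizedSim:
    """Steps every robot copy at once with batched array math."""
    def __init__(self, num_envs):
        self.state = np.zeros((num_envs, OBS_DIM))

    def step(self, actions):
        # One batched update advances all 4,000 dynamics together, so each
        # wall-clock second yields 4,000 seconds of experience.
        noise = 0.01 * np.random.randn(*self.state.shape)
        self.state = 0.99 * self.state + noise
        self.state[:, :ACT_DIM] += 0.1 * actions
        rewards = -np.linalg.norm(self.state[:, :2], axis=1)  # placeholder
        return self.state, rewards

sim = VectorizedSim(NUM_ENVS)
obs, rew = sim.step(np.random.uniform(-1, 1, size=(NUM_ENVS, ACT_DIM)))
print(obs.shape, rew.shape)   # (4000, 48) (4000,)
```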
Video: MIT CSAIL
The robot starts without knowing how to dribble the ball — it just receives a reward when it does, or negative reinforcement when it messes up. So, it’s essentially trying to figure out what sequence of forces it should apply with its legs. “One aspect of this reinforcement learning approach is that we must design a good reward to facilitate the robot learning a successful dribbling behavior,” says MIT PhD student Gabe Margolis, who co-led the work along with Yandong Ji, research assistant in the Improbable AI Lab. “Once we’ve designed that reward, then it’s practice time for the robot: In real time, it’s a couple of days, and in the simulator, hundreds of days. Over time it learns to get better and better at manipulating the soccer ball to match the desired velocity.”
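A reward of the kind Margolis describes might look like the sketch below: a term that peaks when the ball's velocity matches the commanded velocity, minus a penalty for falling. The terms and weights are illustrative assumptions, not the paper's actual reward.

```python
# A hedged sketch of a dribbling reward. The exact terms and weights below
# are stand-ins for illustration only.
import numpy as np

def dribbling_reward(ball_vel, cmd_vel, fell_over, w_track=1.0, w_fall=5.0):
    """Positive for matching the commanded ball velocity, penalty for falling."""
    tracking = np.exp(-np.sum((ball_vel - cmd_vel) ** 2))  # 1.0 at perfect tracking
    return w_track * tracking - w_fall * float(fell_over)

# Nearly on-target dribbling earns close to the maximum tracking reward:
print(dribbling_reward(np.array([0.9, 0.0]), np.array([1.0, 0.0]), False))
```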
The bot could also navigate unfamiliar terrains and recover from falls due to a recovery controller the team built into its system. This controller lets the robot get back up after a fall and switch back to its dribbling controller to continue pursuing the ball, helping it handle out-of-distribution disruptions and terrains.
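The switching logic can be pictured as a simple guard around two controllers. The tilt-based trigger and the interfaces below are assumptions for illustration; the article doesn't specify how the real system detects a fall.

```python
# A minimal sketch of the dribble/recover switch described above. The trigger
# condition and controller interfaces are hypothetical.
def select_action(obs, dribble_policy, recovery_policy, tilt_limit=1.0):
    """Fall back to the recovery controller when the body tilts past a limit,
    then hand control back to the dribbling policy once upright."""
    if abs(obs["body_tilt"]) > tilt_limit:      # likely fallen or falling
        return recovery_policy(obs)             # stand back up first
    return dribble_policy(obs)                  # otherwise, keep dribbling

# Toy usage with stand-in policies:
upright = {"body_tilt": 0.1}
fallen = {"body_tilt": 1.6}
print(select_action(upright, lambda o: "dribble", lambda o: "recover"))  # dribble
print(select_action(fallen, lambda o: "dribble", lambda o: "recover"))   # recover
```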
“If you look around today, most robots are wheeled. But imagine that there’s a disaster scenario, flooding, or an earthquake, and we want robots to aid humans in the search-and-rescue process. We need the machines to go over terrains that aren’t flat, and wheeled robots can’t traverse those landscapes,” says Pulkit Agrawal, MIT professor, CSAIL principal investigator, and director of the Improbable AI Lab. “The whole point of studying legged robots is to go to terrains outside the reach of current robotic systems,” he adds. “Our goal in developing algorithms for legged robots is to provide autonomy in challenging and complex terrains that are currently beyond the reach of robotic systems.”
The fascination with robot quadrupeds and soccer runs deep — Canadian professor Alan Mackworth first noted the idea in a paper entitled “On Seeing Robots,” presented at VI-92, 1992. Japanese researchers later organized a workshop on “Grand Challenges in Artificial Intelligence,” which led to discussions about using soccer to promote science and technology. The project was launched as the Robot J-League a year later, and global fervor quickly ensued. Shortly after that, “RoboCup” was born.
Compared to walking alone, dribbling a soccer ball imposes more constraints on DribbleBot’s motion and what terrains it can traverse. The robot must adapt its locomotion to apply forces to the ball to dribble. The interaction between the ball and the landscape could be different than the interaction between the robot and the landscape, such as thick grass or pavement. For example, a soccer ball will experience a drag force on grass that is not present on pavement, and an incline will apply an acceleration force, changing the ball’s typical path. However, the bot’s ability to traverse different terrains is often less affected by these differences in dynamics — as long as it doesn’t slip — so the soccer test can be sensitive to variations in terrain that locomotion alone isn’t.
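A back-of-the-envelope model makes the point concrete: rolling drag that differs by surface, plus a gravity component along any incline. The coefficients below are illustrative guesses, not measured values from the paper.

```python
# Why terrain changes the ball's path: surface-dependent rolling drag plus a
# gravity component along an incline. Coefficients are assumed for illustration.
import math

G = 9.81                                   # gravity, m/s^2
DRAG = {"pavement": 0.05, "grass": 0.8}    # rolling-drag coefficients (assumed)

def ball_accel(vel, terrain, slope_rad=0.0):
    """Acceleration of a rolling ball: drag opposes motion; a slope adds g*sin(theta)."""
    drag = -DRAG[terrain] * vel
    incline = -G * math.sin(slope_rad)
    return drag + incline

# Same kick, two surfaces: grass bleeds off speed far faster than pavement.
print(ball_accel(2.0, "pavement"))    # ~ -0.10 m/s^2
print(ball_accel(2.0, "grass"))       # ~ -1.60 m/s^2
print(ball_accel(2.0, "grass", 0.1))  # a 0.1 rad incline adds ~ -0.98 m/s^2 more
```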
“Past approaches simplify the dribbling problem, making a modeling assumption of flat, hard ground. The motion is also designed to be more static; the robot isn’t trying to run and manipulate the ball simultaneously,” says Ji. “That’s where more difficult dynamics enter the control problem. We tackled this by extending recent advances that have enabled better outdoor locomotion into this compound task which combines aspects of locomotion and dexterous manipulation together.”
On the hardware side, the robot has a set of sensors that let it perceive the environment, allowing it to feel where it is, “understand” its position, and “see” some of its surroundings. It has a set of actuators that lets it apply forces and move itself and objects. In between the sensors and actuators sits the computer, or “brain,” tasked with converting sensor data into actions, which it will apply through the motors. When the robot is running on snow, it doesn’t see the snow but can feel it through its motor sensors. But soccer is a trickier feat than walking — so the team leveraged cameras on the robot’s head and body for a new sensory modality of vision, in addition to the new motor skill. And then — we dribble.
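Abstractly, that pipeline is a sense-compute-act loop. The sketch below uses stand-in sensor and motor interfaces to show its shape; none of these names come from DribbleBot's actual software.

```python
# A schematic sense-compute-act loop matching the description above. Sensor
# names, rates, and the policy call are hypothetical stand-ins.
def control_loop(sensors, policy, motors, dt=0.02):   # e.g., a 50 Hz loop
    """Read sensors, convert observations to actions, drive the motors."""
    while True:
        obs = {
            "joint_state": sensors.read_motors(),   # proprioception ("feel")
            "camera": sensors.read_cameras(),       # vision ("see")
        }
        action = policy(obs)    # the onboard "brain" maps sensing to action
        motors.apply(action)    # leg forces move the robot and the ball
```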
“Our robot can go in the wild because it carries all its sensors, cameras, and compute on board. That required some innovations in terms of getting the whole controller to fit onto this onboard compute,” says Margolis. “That’s one area where learning helps because we can run a lightweight neural network and train it to process noisy sensor data observed by the moving robot. This is in stark contrast with most robots today: Typically a robot arm is mounted on a fixed base and sits on a workbench with a giant computer plugged right into it. Neither the computer nor the sensors are in the robotic arm! So, the whole thing is weighty, hard to move around.”
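For scale, a controller small enough for embedded hardware might be a two-layer network with a few thousand parameters, as in this sketch; the real controller's architecture and sizes are not described in the article.

```python
# A sketch of the kind of lightweight network that fits on embedded compute:
# a small two-layer MLP mapping noisy observations to joint targets. All sizes
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(256, 48)) * 0.05, np.zeros(256)
W2, b2 = rng.normal(size=(12, 256)) * 0.05, np.zeros(12)

def policy(obs):
    """~15k parameters: cheap enough to evaluate every control step on board."""
    h = np.tanh(W1 @ obs + b1)
    return np.tanh(W2 @ h + b2)        # 12 joint targets in [-1, 1]

print(policy(rng.normal(size=48)).shape)   # (12,)
```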
There’s still a long way to go in making these robots as agile as their counterparts in nature, and some terrains were challenging for DribbleBot. Currently, the controller is not trained in simulated environments that include slopes or stairs. The robot isn’t perceiving the geometry of the terrain; it’s only estimating its material contact properties, like friction. If there’s a step up, for example, the robot will get stuck — it won’t be able to lift the ball over the step, an area the team wants to explore in the future. The researchers are also excited to apply lessons learned during the development of DribbleBot to other tasks that involve combined locomotion and object manipulation, such as quickly transporting diverse objects from place to place using the legs or arms.
The research is supported by the DARPA Machine Common Sense Program, the MIT-IBM Watson AI Lab, the National Science Foundation Institute of Artificial Intelligence and Fundamental Interactions, the U.S. Air Force Research Laboratory, and the U.S. Air Force Artificial Intelligence Accelerator. The paper will be presented at the 2023 IEEE International Conference on Robotics and Automation (ICRA).
Robotic system offers hidden window into collective bee behavior
By Celia Luterbacher
Honeybees are famously finicky when it comes to being studied. Research instruments and conditions, and even unfamiliar smells, can disrupt a colony’s behavior. Now, a joint research team from the Mobile Robotic Systems Group in EPFL’s School of Engineering and School of Computer and Communication Sciences and the Hiveopolis project at Austria’s University of Graz has developed a robotic system that can be unobtrusively built into the frame of a standard honeybee hive.
Composed of an array of thermal sensors and actuators, the system measures and modulates honeybee behavior through localized temperature variations.
“Many rules of bee society – from collective and individual interactions to raising a healthy brood – are regulated by temperature, so we leveraged that for this study,” explains EPFL PhD student Rafael Barmak, first author on a paper on the system recently published in Science Robotics. “The thermal sensors create a snapshot of the bees’ collective behavior, while the actuators allow us to influence their movement by modulating thermal fields.”
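Conceptually, the device pairs a grid of temperature readings (the “snapshot”) with settable heaters (the “thermal fields”). The sketch below illustrates that idea with an assumed grid size, occupancy threshold, and setpoint; the actual hardware interface is not described in this article.

```python
# An illustrative sketch of the sensing/actuation idea: a grid of temperature
# readings reveals where the cluster is, and heaters nudge it. The grid size,
# threshold, and setpoint are assumptions, not the real device's parameters.
import numpy as np

SENSOR_GRID = (8, 8)    # hypothetical array of thermal sensors in the frame

def cluster_snapshot(temps, ambient=8.0, delta=3.0):
    """Cells noticeably warmer than ambient are likely occupied by bees."""
    return temps > ambient + delta

def heat_toward(actuators, target_cell, setpoint_c=28.0):
    """Warm one cell so bees, which follow warmth in winter, drift toward it."""
    actuators[target_cell] = setpoint_c
    return actuators

# A synthetic reading with a warm cluster near the center of the grid:
temps = 8.0 + 10.0 * np.exp(-((np.indices(SENSOR_GRID) - 4) ** 2).sum(0) / 4)
print(cluster_snapshot(temps).astype(int))            # 1s mark the cluster
heaters = heat_toward(np.full(SENSOR_GRID, 8.0), target_cell=(2, 6))
```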
“Previous studies on the thermal behavior of honeybees in winter have relied on observing the bees or manipulating the outside temperature,” adds Martin Stefanec of the University of Graz. “Our robotic system enables us to change the temperature from within the cluster, emulating the heating behavior of core bees there, and allowing us to study how the winter cluster actively regulates its temperature.”
A ‘biohybrid superorganism’ to mitigate colony collapse
Bee colonies are challenging to study in winter since they are sensitive to cold, and opening their hives risks harming them in addition to influencing their behavior. But thanks to the researchers’ biocompatible robotic system, they were able to study three experimental hives, located at the Artificial Life Lab at the University of Graz, during winter and to control them remotely from EPFL. Inside the device, a central processor coordinated the sensors, sent commands to the actuators, and transmitted data to the scientists, demonstrating that the system could be used to study bees with no intrusion – or even cameras – required.
Mobile Robotic Systems Group head Francesco Mondada explains that one of the most important aspects of the system – which he calls a ‘biohybrid superorganism’ for its combination of robotics with a colony of individuals acting as a living entity – is its ability to simultaneously observe and influence bee behavior.
“By gathering data on the bees’ position and creating warmer areas in the hive, we were able to encourage them to move around in ways they would never normally do in nature during the winter, when they tend to huddle together to conserve energy. This gives us the possibility to act on behalf of a colony, for example by directing it toward a food source, or discouraging it from dividing into too-small groups, which can threaten its survival.”
The scientists were able to prolong the survival of a colony following the death of its queen by distributing heat energy via the actuators. The system’s ability to mitigate colony collapse could have implications for bee survivability, which has become a growing environmental and food security concern as the pollinators’ global populations have declined.
Never-before-seen behaviors
In addition to its potential to support colonies, the system has shed light on honeybee behaviors that have never been observed, opening new avenues in biological research.
“The local thermal stimuli produced by our system revealed previously unreported dynamics that are generating exciting new questions and hypotheses,” says EPFL postdoctoral researcher and corresponding author Rob Mills. “For example, currently, no model can explain why we were able to encourage the bees to cross some cold temperature ‘valleys’ within the hive.”
The researchers now plan to use the system to study bees in summertime, which is a critical period for raising young. In parallel, the Mobile Robotic Systems Group is exploring systems using vibrational pathways to interact with honeybees.
“The biological acceptance aspect of this work is critical: the fact that the bees accepted the integration of electronics into the hive gives our device great potential for different scientific or agricultural applications,” says Mondada.
This work was supported by the EU H2020 FET project HIVEOPOLIS (no. 824069), coordinated by Thomas Schmickl, and by the Field of Excellence COLIBRI (Complexity of Life in basic Research and Innovation) at the University of Graz.