mimicEducationalRobots teach robotics for the future

The robotics industry is changing. The days of industrial robot arms working behind enclosures, performing identical pre-programmed tasks, are coming to an end. Robots that can interact with each other and with other equipment are becoming standard, and robots are expanding into more aspects of our lives. My name is Brett Pipitone, and I am the founder, CEO, and sole employee of mimicEducationalRobots. I believe that robots will soon become an inescapable part of modern life, and I seek to prepare today’s students to work with these emerging technologies.
The mimicEducationalRobots product line consists of a family of three robots. The largest and most sophisticated is mimicArm. The adorable tinyBot is small and capable. The newest robot, bitsyBot (currently on Kickstarter), is perfect for those taking their first steps into the robotics world. Despite having different features, all three robots are designed to communicate with each other and use special sensors to make decisions about their environment. Interfaces are simple but powerful, allowing users to learn quickly and without frustration.
mimicEducationalRobots believes that every student will encounter robots in their everyday life, no matter their career path. Learning robotics allows students to become comfortable and familiar with technology that is rapidly becoming commonplace in day-to-day life. Through their combinations of features, the mimicEducationalRobots products introduce technology that’s still in its infancy, such as human/robot interaction and cooperative robotics, at a level students can understand. This is why every mimicEducationalRobots robot starts with manual control, allowing students to get a feel for their robot and what it can and can’t do. Once they’ve mastered manual control, programming is a smaller leap. The mimicEducationalRobots programming software simplifies this transition by expressing the same motions students have been making manually as simple commands like “robotMove” and “robotGrab”.
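To give a sense of how small that leap is, here is a minimal sketch of what such a first program might look like. The command names robotMove and robotGrab come from the article; their argument conventions, and the Python stubs standing in for the real mimicEducationalRobots software, are illustrative assumptions.

```python
# A hypothetical first program: mirror the motions a student has already
# practiced manually. The names robotMove/robotGrab come from the article;
# the argument conventions below are illustrative guesses, and the stubs
# stand in for the real mimicEducationalRobots software.

def robotMove(position):
    """Stub standing in for the real command: move the arm to a position."""
    print(f"moving to {position}")

def robotGrab(closed):
    """Stub standing in for the real command: open or close the gripper."""
    print("grabbing" if closed else "releasing")

# Pick up an object at one spot and set it down at another.
robotMove("pickup point")
robotGrab(True)    # close the gripper
robotMove("dropoff point")
robotGrab(False)   # open the gripper
```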
For more complex programs, mimicEducationalRobots believes that their robots should mimic industry as closely as possible. This means doing as much as possible with the simplest possible sensor. Things start small, with a great big tempting pushable button called, incidentally, “greatBigButton”. This is the students’ first introduction to human interaction as they program their robot to react to a button press. From there things get much more exciting without getting very much more complicated. For example, an array of non-contact IR thermometers called Grid-EYE allows mimicArm to detect faces using nothing but body heat. A simple IR proximity sensor allows tinyBot or bitsyBot to react when offered a block before the block touches any part of the robot. There’s even a cable that allows robots to communicate with each other and react to what the other is doing. These simple capabilities allow students to create a wide range of robotic behaviors.
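As a rough illustration of this kind of sensor-driven decision-making, the sketch below waits for a button press and then scans a Grid-EYE-style 8x8 thermal frame for a cluster of warm pixels, a crude stand-in for detecting a person by body heat. Every function here is a hypothetical simulation, not the actual mimicEducationalRobots API.

```python
# Sketch of sensor-driven behavior like the article describes: wait for the
# greatBigButton, then look for body heat with a Grid-EYE-style 8x8 thermal
# array. All functions are hypothetical stand-ins, simulated with random data.

import random
import time

def button_pressed():
    """Stand-in for polling greatBigButton (randomized for this demo)."""
    return random.random() < 0.3

def read_thermal_grid():
    """Stand-in for reading the Grid-EYE: an 8x8 grid of temperatures (C)."""
    return [[random.uniform(20.0, 35.0) for _ in range(8)] for _ in range(8)]

FACE_THRESHOLD_C = 30.0  # pixels warmer than this are treated as body heat

for _ in range(5):  # poll a few times for the demo
    if button_pressed():
        frame = read_thermal_grid()
        warm = sum(t > FACE_THRESHOLD_C for row in frame for t in row)
        if warm >= 4:  # a cluster of warm pixels suggests a person nearby
            print("Body heat detected - someone is there!")
        else:
            print("No one in view.")
    time.sleep(0.2)
```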

mimicEducationalRobots is a homegrown business designed and built by an engineer and dad passionate about teaching people of all ages about robotics. I created the robots’ brains using a bare circuit board, a template, some solder paste, and tweezers. Every component is added by hand, and each board is soldered in a toaster oven I modified. Once cooled, the boards are programmed using a modified Arduino UNO R3, one of the best technological tools for beginners.

Other physical robot parts are designed using 3D modeling software and made either on a 3D printer or a CNC router. I have two 3D printers in my basement running 24 hours a day, producing at least 20 robot kits a week. The CNC router requires a great deal more supervision but is capable of turning out four sets of beautiful neon plastic parts every 30 minutes.
mimicEducationalRobots is a new kind of company, producing a new kind of product, for a new kind of consumer. Their products demonstrate just how fundamentally technology, and in particular open-source technology, has changed our world. I hope students learning on mimicArm, tinyBot, or bitsyBot will help create the next life-changing technological leap.
To learn more about the family of mimicEducationalRobots, visit www.mimicRobots.com.
Sensing Parkinson’s symptoms

MyoExo integrates a series of sensors into a wearable device capable of detecting slight changes in muscle strain and bulging, enabling it to measure and track the symptoms of Parkinson’s disease. Credit: Oluwaseun Araromi
By Matthew Goisman/SEAS Communications
Nearly one million people in the United States live with Parkinson’s disease. The degenerative condition affects the neurons in the brain that produce the neurotransmitter dopamine, which can impact motor function in multiple ways, including muscle tremors, limb rigidity and difficulty walking.
There is currently no cure for Parkinson’s disease, and existing treatments are limited by a lack of quantitative data about the progression of the disease.
MyoExo, a translation-focused research project based on technology developed at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering, aims to provide that data. The team is refining the technology and starting to develop a business plan as part of the Harvard Innovation Labs venture program. The MyoExo wearable device aims not only to serve as a remote monitoring device for patients in an at-home setting but also to be sensitive enough to aid early diagnosis of Parkinson’s disease.
“This is a disease that’s affecting a lot of people and it seems like the main therapeutics that tackle this have not changed significantly in the past several decades,” said Oluwaseun Araromi, Research Associate in Materials Science and Mechanical Engineering at SEAS and the Wyss Institute.
The MyoExo technology consists of a series of wearable sensors, each one capable of detecting slight changes in muscle strain and bulging. When integrated into a wearable device, the data can provide what Araromi described as “muscle-centric physiological signatures.”
“The enabling technology underlying this is a sensor that detects small changes in the shape of an object,” he said. “Parkinson’s disease, especially in its later stages, really expresses itself as a movement disorder, so sensors that can detect shape changes can also detect changes in the shape of muscle as people move.”
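MyoExo’s actual signal processing has not been published, but as a toy illustration of how a shape-change signal could yield a movement-disorder metric, the sketch below simulates a muscle-strain trace and measures its power in the 4-6 Hz band where Parkinsonian rest tremor typically falls.

```python
# Illustrative only: estimate tremor intensity from a simulated muscle-strain
# trace by measuring power in the 4-6 Hz band typical of Parkinsonian rest
# tremor. MyoExo's real pipeline is not public; this just shows the idea.

import numpy as np
from scipy.signal import welch

fs = 100.0                      # sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)    # 10 seconds of data

# Simulated strain trace: slow voluntary motion + 5 Hz tremor + sensor noise.
strain = (0.5 * np.sin(2 * np.pi * 0.3 * t)
          + 0.1 * np.sin(2 * np.pi * 5.0 * t)
          + 0.02 * np.random.randn(t.size))

# Power spectral density via Welch's method.
freqs, psd = welch(strain, fs=fs, nperseg=512)

# Sum PSD bins over the 4-6 Hz tremor band.
band = (freqs >= 4.0) & (freqs <= 6.0)
tremor_power = psd[band].sum() * (freqs[1] - freqs[0])
print(f"4-6 Hz tremor band power: {tremor_power:.5f}")
```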
MyoExo emerged from research done in the Harvard Biodesign Lab of Conor Walsh, the Paul A. Maeder Professor of Engineering and Applied Sciences, and the Microrobotics Lab of Rob Wood, the Charles River Professor of Engineering and Applied Sciences at SEAS. Araromi, Walsh and Wood co-authored a paper on their research into resilient wearable sensors in November 2020, around the same time the team began to focus on medical applications of the technology.
“If we had these hypersensitive sensors in something that a person was wearing, we could detect how their muscles were bulging,” Walsh said. “That was more application-agnostic. We didn’t know exactly where that would be the most important, and I credit Seun and our Wyss collaborators for being the ones to think about identifying Parkinson’s applications.”
Araromi sees the MyoExo technology as having value for three major stakeholders: the pharmaceutical industry, clinicians and physicians, and patients. Pharmaceutical companies could use data from the wearable system to quantify their medications’ effect on Parkinson’s symptoms, while clinicians could determine if one treatment regimen is more effective than another for a specific patient. Patients could use the system to track their own treatment, whether that’s medication, physical therapy, or both.
“Some patients are very incentivized to track their progress,” Araromi said. “They want to know that if they were really good last week and did all of the exercises that they were prescribed, their wearable device would tell them their symptomatology has reduced by 5% compared to the week before. We envision that as something that would really encourage individuals to keep and adhere to their treatment regimens.”

MyoExo’s sensor technology is based on research conducted in the Harvard Biodesign Lab of Conor Walsh and the Microrobotics Lab of Rob Wood at SEAS, and further developed through the Wyss Institute for Biologically Inspired Engineering and Harvard Innovation Labs venture program. Credit: Oluwaseun Araromi
Araromi joined SEAS and the Wyss Institute as a postdoctoral researcher in 2016, having earned a Ph.D. in mechanical engineering from the University of Bristol in England and completed a postdoc at the Swiss Federal Institute of Technology Lausanne.
His interest in sensor technology made him a great fit for research spanning the Biodesign and Microrobotics labs, and his early work included helping develop an exosuit to aid with walking.
“I was initially impressed with Seun’s strong background in materials, transduction and physics,” Walsh said. “He really understood how you’d think about creating novel sensors with soft materials. Seun’s really the translation champion for the project in terms of driving forward the technology, but at the same time trying to think about the need in the market, and how we demonstrate that we can meet that.”
The technology is currently in the human testing phase to demonstrate proof-of-concept detection of clinically relevant metrics, with support from the Wyss Institute Validation Project program. Araromi wants to show that the wearable device can quantify the difference between the muscle movements of someone with Parkinson’s and someone without. From there, the goal is to demonstrate that the device can quantify whether a person has early- or late-stage symptoms of the disease, as well as their response to treatment.
“We are evaluating our technology and validating our technical approach, making sure that as it’s currently constructed, even in this crude form, we can get consistent data and results,” Araromi said. “We’re doing this in a small pilot phase, such that if there are issues, we can fix those issues, and then expand to a larger population where we would test our device on more individuals with Parkinson’s disease. That should really convince ourselves and hopefully the community that we are able to reach a few key technical milestones, and then garner more interest and potentially investment and partnership.”
Interactive Perception at Toyota Research Institute
Dr. Carolyn Matl, Research Scientist at Toyota Research Institute, explains why Interactive Perception and soft tactile sensors are critical for manipulating challenging objects such as liquids, grains, and dough. She also dives into “StRETcH,” a Soft to Resistive Elastic Tactile Hand, a variable-stiffness soft tactile end-effector presented by her research group.
Carolyn Matl

Carolyn Matl is a research scientist at the Toyota Research Institute, where she works on robotic perception and manipulation with the Mobile Manipulation Team. She received her B.S.E. in Electrical Engineering from Princeton University in 2016 and her Ph.D. in Electrical Engineering and Computer Sciences from the University of California, Berkeley, in 2021. At Berkeley, she was awarded the NSF Graduate Research Fellowship and was advised by Ruzena Bajcsy. Her dissertation work focused on developing and leveraging non-traditional sensors for robotic manipulation of complicated objects and substances like liquids and doughs.
Links
- StRETcH: a Soft to Resistive Elastic Tactile Hand
- Carolyn Matl’s personal website
- Download mp3 (48.7 MB)
- Subscribe to Robohub using iTunes, RSS, or Spotify
- Support us on Patreon
#ICRA2022 Science Communication Awards

Number of awards: 2
Amount: travel and accommodation (up to 2k) + free registration
Application deadline: 22 April 2022
We are offering up to 2 Science Communication Awards to motivated roboticists keen on helping us cover ICRA. Your coverage could include videos, interviews, podcasts, blogs, social media, or art. Please apply by sending the following information by April 22 to sabine.hauert@bristol.ac.uk with the subject [ICRA SciCommAward]:
- Name
- Who you are (one paragraph)
- Previous experience in science communication (one paragraph with links to material)
- What you’d like to cover at ICRA and where you will publish material produced (one paragraph)
- Agreement to cross-post your content to the ICRA website or other third-party channels (yes/no).
- Estimated travel budget
Awards will be announced on 25 April.
A flexible way to grab items with feeling

The GelSight Fin Ray gripper holds a glass Mason jar with its tactile sensing. Photo courtesy of MIT CSAIL.
By Rachel Gordon | MIT CSAIL
The notion of a large metallic robot that speaks in monotone and moves in lumbering, deliberate steps is somewhat hard to shake. But practitioners in the field of soft robotics have an entirely different image in mind — autonomous devices composed of compliant parts that are gentle to the touch, more closely resembling human fingers than R2-D2 or Robby the Robot.
That model is now being pursued by Professor Edward Adelson and his Perceptual Science Group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). In a recent project, Adelson and Sandra Liu — a mechanical engineering PhD student at CSAIL — have developed a robotic gripper with novel “GelSight Fin Ray” fingers that, like the human hand, are supple enough to manipulate objects. What sets this work apart from other efforts in the field is that Liu and Adelson have endowed their gripper with touch sensors that can meet or exceed the sensitivity of human skin.
Their work was presented last week at the 2022 IEEE 5th International Conference on Soft Robotics.
The fin ray has become a popular item in soft robotics owing to a discovery made in 1997 by the German biologist Leif Kniese. He noticed that when he pushed against a fish’s tail with his finger, the ray would bend toward the applied force, almost embracing his finger, rather than tilting away. The design has become popular, but it lacks tactile sensitivity. “It’s versatile because it can passively adapt to different shapes and therefore grasp a variety of objects,” Liu explains. “But in order to go beyond what others in the field had already done, we set out to incorporate a rich tactile sensor into our gripper.”
The gripper consists of two flexible fin ray fingers that conform to the shape of the object they come in contact with. The fingers themselves are assembled from flexible plastic materials made on a 3D printer, which is pretty standard in the field. However, the fingers typically used in soft robotic grippers have supportive cross-struts running through the length of their interiors, whereas Liu and Adelson hollowed out the interior region so they could create room for their camera and other sensory components.
The camera is mounted to a semirigid backing on one end of the hollowed-out cavity, which is, itself, illuminated by LEDs. The camera faces a layer of “sensory” pads composed of silicone gel (known as “GelSight”) that is glued to a thin layer of acrylic material. The acrylic sheet, in turn, is attached to the plastic finger piece at the opposite end of the inner cavity. Upon touching an object, the finger will seamlessly fold around it, melding to the object’s contours. By determining exactly how the silicone and acrylic sheets are deformed during this interaction, the camera — along with accompanying computational algorithms — can assess the general shape of the object, its surface roughness, its orientation in space, and the force being applied by (and imparted to) each finger.
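The team’s algorithms are not reproduced here, but a common first step in GelSight-style pipelines is to compare a reference image of the untouched gel with a frame captured during contact. The minimal sketch below, using synthetic arrays in place of camera frames, shows how that difference localizes and sizes a contact patch.

```python
# Minimal sketch of a first step in a GelSight-style pipeline: compare a
# reference image of the untouched gel against a frame taken during contact
# to localize the deformation. Synthetic arrays stand in for camera frames.

import numpy as np

H, W = 240, 320
reference = np.full((H, W), 0.5)   # untouched gel (uniform gray level)

# Simulated contact frame: a Gaussian bump where an object presses into the gel.
y, x = np.mgrid[0:H, 0:W]
bump = 0.3 * np.exp(-(((y - 120) ** 2 + (x - 200) ** 2) / (2 * 15.0 ** 2)))
contact = reference + bump

# The difference image highlights where, and roughly how hard, contact occurred.
diff = np.abs(contact - reference)
mask = diff > 0.05                           # pixels inside the contact region
cy, cx = np.argwhere(mask).mean(axis=0)      # centroid of the contact patch
print(f"contact at ({cy:.0f}, {cx:.0f}), area {mask.sum()} px, "
      f"peak intensity change {diff.max():.2f}")
```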
Liu and Adelson tested out their gripper in an experiment during which just one of the two fingers was “sensorized.” Their device successfully handled such items as a mini-screwdriver, a plastic strawberry, an acrylic paint tube, a Ball Mason jar, and a wine glass. While the gripper was holding the fake strawberry, for instance, the internal sensor was able to detect the “seeds” on its surface. The fingers grabbed the paint tube without squeezing so hard as to breach the container and spill its contents.
The GelSight sensor could even make out the lettering on the Mason jar, and did so in a rather clever way. The overall shape of the jar was ascertained first by seeing how the acrylic sheet was bent when wrapped around it. That pattern was then subtracted, by a computer algorithm, from the deformation of the silicone pad, and what was left was the more subtle deformation due just to the letters.
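One plausible realization of that subtraction, assuming the sensor output has already been converted into a depth map, is a simple low-pass/high-pass split: blur the map to estimate the jar’s overall curvature, then subtract the blurred version so that only fine detail such as raised lettering remains. A sketch with synthetic data:

```python
# Sketch of the coarse/fine separation described above, assuming the sensor
# output is already a depth map. The jar's overall curvature is estimated with
# a Gaussian low-pass filter and subtracted; the residual keeps only fine
# detail such as raised lettering. All data here is synthetic.

import numpy as np
from scipy.ndimage import gaussian_filter

H, W = 200, 300
x = np.linspace(-1, 1, W)
depth = np.tile(1.0 - 0.5 * x ** 2, (H, 1))   # cylinder-like overall shape

# Add shallow "lettering": small raised patches on the surface.
depth[80:90, 100:110] += 0.02
depth[80:90, 130:140] += 0.02

# Estimate the overall shape with a heavy blur, then subtract it out.
overall = gaussian_filter(depth, sigma=10)
residual = depth - overall

print(f"max residual {residual.max():.3f} (at the lettering); "
      f"background residual ~{np.abs(residual[:40]).max():.3f}")
```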
Glass objects are challenging for vision-based robots because of the refraction of the light. Tactile sensors are immune to such optical ambiguity. When the gripper picked up the wine glass, it could feel the orientation of the stem and could make sure the glass was pointing straight up before it was slowly lowered. When the base touched the tabletop, the gel pad sensed the contact. Proper placement occurred in seven out of 10 trials and, thankfully, no glass was harmed during the filming of this experiment.
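That placement behavior reads like a classic guarded move: descend in small increments until the tactile pad reports contact. The sketch below illustrates the loop with simulated stand-ins (lower_by, pad_contact_force) rather than the real robot and sensor interfaces.

```python
# Sketch of a guarded placement move: lower the glass in small steps until
# the gel pad registers contact with the tabletop. lower_by() and
# pad_contact_force() are hypothetical stand-ins, simulated here.

STEP_MM = 1.0              # descend in 1 mm increments
CONTACT_THRESHOLD = 0.2    # force (N) treated as "base touched the table"

height_mm = 50.0           # simulated distance from glass base to tabletop

def lower_by(step_mm):
    """Stand-in: command the arm to descend by step_mm millimeters."""
    global height_mm
    height_mm = max(0.0, height_mm - step_mm)

def pad_contact_force():
    """Stand-in: read contact force from the gel pad (simulated)."""
    return 0.0 if height_mm > 0 else 1.0

while pad_contact_force() < CONTACT_THRESHOLD:
    lower_by(STEP_MM)

print("Contact sensed - releasing the glass.")
```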
Wenzhen Yuan, an assistant professor in the Robotics Institute at Carnegie Mellon University who was not involved with the research, says, “Sensing with soft robots has been a big challenge, because it is difficult to set up sensors, which are traditionally rigid, on soft bodies. This paper provides a neat solution to that problem. The authors used a very smart design to make their vision-based sensor work for the compliant gripper, in this way generating very good results when robots grasp objects or interact with the external environment. The technology has lots of potential to be widely used for robotic grippers in real-world environments.”
Liu and Adelson can foresee many possible applications for the GelSight Fin Ray, but they are first contemplating some improvements. By hollowing out the finger to clear space for their sensory system, they introduced a structural instability, a tendency to twist, that they believe can be counteracted through better design. They want to make GelSight sensors that are compatible with soft robots devised by other research teams. And they also plan to develop a three-fingered gripper that could be useful in such tasks as picking up pieces of fruit and evaluating their ripeness.
Tactile sensing, in their approach, is based on inexpensive components: a camera, some gel, and some LEDs. Liu hopes that with a technology like GelSight, “it may be possible to come up with sensors that are both practical and affordable.” That, at least, is one goal that she and others in the lab are striving toward.
The Toyota Research Institute and the U.S. Office of Naval Research provided funds to support this work.