Page 2 of 2

Robot-ants that communicate and work together


A team of EPFL researchers has developed tiny 10-gram robots that are inspired by ants: they can communicate with each other, assign roles among themselves and complete complex tasks together. These reconfigurable robots are simple in structure, yet they can jump and crawl to explore uneven surfaces. The researchers have just published their work in Nature.

Individually, ants have only so much strength and intelligence. As a colony, however, they can deploy complex strategies to accomplish sophisticated tasks and survive larger predators.
At EPFL, researchers in NCCR Robotics Professor Jamie Paik’s laboratory have reproduced this phenomenon, developing tiny robots that display minimal physical intelligence on an individual level but that are able to communicate and act collectively. Despite being simple in design and weighing only 10 grams, each robot has multiple locomotion modes for navigating any type of surface. Collectively, they can quickly detect obstacles, overcome them and move objects much larger and heavier than themselves. The related research has been published in Nature.

Robots modeled on trap-jaw ants
These three-legged, T-shaped origami robots are called Tribots. They can be assembled in only a few minutes by folding a stack of thin, multi-material sheets, making them suitable for mass production. Completely autonomous and untethered, Tribots are equipped with infrared and proximity sensors for detection and communication purposes. They could accommodate even more sensors depending on the application.

“Their movements are modeled on those of Odontomachus ants. These insects normally crawl, but to escape a predator, they snap their powerful jaws together to jump from leaf to leaf,” says Zhenishbek Zhakypov, the first author. The Tribots replicate this catapult mechanism through an elegant origami robot design that combines multiple shape-memory alloy actuators. As a result, a single robot can produce three distinctive locomotive motions – crawling, rolling and jumping both vertically and horizontally – just like these creatively resilient ants.

Roles: leader, worker and explorer
Despite having the same “anatomy”, each robot is assigned a specific role depending on the situation. “Explorers” detect physical obstacles in their path, such as objects, valleys and mountains. After detecting an obstacle, they inform the rest of the group. Then the “leader” gives the instructions. The “workers”, meanwhile, pool their strength to move objects. “Each Tribot, just like Odontomachus ants, can have different roles. However, they can also take on new roles instantaneously when faced with a new mission or an unknown environment, or even when other members get lost. This goes beyond what the real ants can do,” says Paik.
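The on-the-fly role switching described above can be pictured as a small piece of swarm bookkeeping. The class, role names and promotion rules below are illustrative assumptions made for this sketch, not the controller published by the authors:

```python
from dataclasses import dataclass

# Illustrative sketch of decentralized role reassignment, loosely based on
# the leader/explorer/worker scheme described in the article. All names
# and rules here are invented for illustration.

@dataclass
class Tribot:
    bot_id: int
    role: str          # "leader", "explorer" or "worker"
    alive: bool = True

def reassign_roles(swarm):
    """Ensure the swarm keeps one leader and at least one explorer,
    promoting surviving robots when members are lost."""
    active = [b for b in swarm if b.alive]
    if not active:
        return active
    # Promote the lowest-id survivor to leader if the leader was lost.
    if not any(b.role == "leader" for b in active):
        min(active, key=lambda b: b.bot_id).role = "leader"
    # Keep at least one explorer scouting for obstacles.
    if not any(b.role == "explorer" for b in active):
        for b in active:
            if b.role == "worker":
                b.role = "explorer"
                break
    return active

swarm = [Tribot(0, "leader"), Tribot(1, "explorer"),
         Tribot(2, "worker"), Tribot(3, "worker")]
swarm[0].alive = False            # the leader is lost mid-mission
survivors = reassign_roles(swarm)
print([(b.bot_id, b.role) for b in survivors])
```

In this toy run, losing the leader promotes the surviving explorer to leader, and a worker in turn takes over exploration, mirroring the cascading role changes the article describes.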

Future applications
In practical situations, such as in an emergency search mission, Tribots could be deployed en masse. And thanks to their multi-locomotive and multi-agent communication capabilities, they could locate a target quickly over a large surface without relying on GPS or visual feedback. “Since they can be manufactured and deployed in large numbers, having some ‘casualties’ would not affect the success of the mission,” adds Paik. “With their unique collective intelligence, our tiny robots are better equipped to adapt to unknown environments. Therefore, for certain missions, they would outperform larger, more powerful robots.” The development of robots for search-and-rescue applications and the study of collective robotics are key research areas within the NCCR Robotics consortium, of which Jamie Paik’s lab is part.
In April, Jamie Paik presented her reconfigurable robots at the TED2019 Conference in Vancouver. Her talk is available online.

Literature
Zhenishbek Zhakypov, Kazuaki Mori, Koh Hosoda and Jamie Paik, “Designing minimal and scalable insect-inspired multi-locomotion millirobots”, Nature.
DOI: 10.1038/s41586-019-1388-8

A prosthetic that restores the sense of where your hand is

Researchers have developed a next-generation bionic hand that allows amputees to regain their proprioception. The results of the study, which have been published in Science Robotics, are the culmination of ten years of robotics research.

The next-generation bionic hand, developed by researchers from EPFL, the Sant’Anna School of Advanced Studies in Pisa and the A. Gemelli University Polyclinic in Rome, enables amputees to regain a very subtle, close-to-natural sense of touch. The scientists managed to reproduce the feeling of proprioception, which is our brain’s capacity to instantly and accurately sense the position of our limbs during and after movement – even in the dark or with our eyes closed.

The new device allows patients to reach out for an object on a table and to ascertain an item’s consistency, shape, position and size without having to look at it. The prosthesis has been successfully tested on several patients and works by stimulating the nerves in the amputee’s stump. The nerves can then provide sensory feedback to the patients in real time – almost like they do in a natural hand.

The findings have been published in the journal Science Robotics. They are the result of ten years of scientific research coordinated by NCCR Professor Silvestro Micera, who teaches bioengineering at EPFL and the Sant’Anna School of Advanced Studies, and Paolo Maria Rossini, director of neuroscience at the A. Gemelli University Polyclinic in Rome. NCCR Robotics supported the research, together with the European Commission and the Bertarelli Foundation.

Sensory feedback

Current myoelectric prostheses allow amputees to regain voluntary motor control of their artificial limb by exploiting residual muscle function in the forearm. However, the lack of any sensory feedback means that patients have to rely heavily on visual cues. This can prevent them from feeling that their artificial limb is part of their body and make it more unnatural to use.

Recently, a number of research groups have managed to provide tactile feedback to amputees, leading to improved function and prosthesis embodiment. But this latest study has taken things one step further.

“Our study shows that sensory substitution based on intraneural stimulation can deliver both position feedback and tactile feedback simultaneously and in real time,” explains Micera. “The brain has no problem combining this information, and patients can process both types in real time with excellent results.”

Intraneural stimulation re-establishes the flow of external information using electric pulses sent by electrodes inserted directly into the amputee’s stump. Patients then have to undergo training to gradually learn how to translate those pulses into proprioceptive and tactile sensations.

This technique enabled two amputees to regain high proprioceptive acuity, with results comparable to those obtained in healthy subjects. The simultaneous delivery of position information and tactile feedback allowed the two amputees to determine the size and shape of four objects with a high level of accuracy (75.5%).

“These results show that amputees can effectively process tactile and position information received simultaneously via intraneural stimulation,” says Edoardo D’Anna, EPFL researcher and lead author of the study.

The technologies pioneered by this study will be further explored during the third and final phase of NCCR Robotics, which will run until 2022 with Micera leading the Wearable Robotics research programme. “During the next phase, we plan to expand the use of implanted systems for prosthetics and rehabilitation, and the implants we used in this experiment will be tested in combination with different wearable devices and for different applications,” explains Micera.

Literature
E. D’Anna, G. Valle, A. Mazzoni, I. Strauss, F. Iberite, J. Patton, F. Petrini, S. Raspopovic, G. Granata, R. Di Iorio, M. Controzzi, C. Cipriani, T. Stieglitz, P. M. Rossini, and S. Micera, “A closed-loop hand prosthesis with simultaneous intraneural tactile and position feedback”, Science Robotics.

A robot recreates the walk of a fossilized animal

OroBOT – Credit: Maxime Marendaz

Using the fossil and fossilized footprints of a 300-million-year-old animal, scientists from EPFL and Humboldt-Universität zu Berlin have identified the most likely gaits of extinct animals and designed a robot that can recreate an extinct animal’s walk. This study can help researchers better understand how vertebrate locomotion evolved over time.

How did vertebrates walk 300 million years ago? Could they already stand upright on their legs? Did they move in a balanced, energy-efficient way? Scientists at EPFL’s Biorobotics Laboratory – supported by the National Center of Competence in Research (NCCR) Robotics – and the Interdisciplinary Laboratory Image Knowledge Gestaltung at Humboldt-Universität zu Berlin set out to answer these questions. Using the fossilized skeleton and footprints of Orobates pabsti – a vertebrate that, on the evolutionary tree, comes between amphibians on one hand and reptiles and mammals on the other – the scientists created computer simulations and a robot. Drawing on experimental studies of four living amphibian and reptile species, they used these tools to gauge how plausible different ways of walking were for the fossilized animal.

“Orobates is an ideal candidate for understanding how land vertebrates evolved because it is in the lineage leading to modern amniotes. These animals developed in eggs laid on land and became largely independent of water,” says John Nyakatura, a professor at Humboldt-Universität. What’s more, Orobates is the oldest-known vertebrate for which scientists have been able to link a fossil with its fossilized footprints. “This combination is what enabled us to carry out our unique quantitative study, which paves the way to replicating the walk of other fossilized animals,” says NCCR Robotics Professor Auke Ijspeert. The researchers’ findings appear in Nature.

A motion-based model and then a robotic one

To better understand how Orobates walked and pinpoint just how advanced its locomotion was, the scientists at Humboldt-Universität developed a digital model of its skeleton based on the animal’s fossil and the biomechanics of modern animals with sprawling postures. They used this model to carry out the first kinematic computer simulation of Orobates’ gait as it walks on its digitalized footprints. This simulation focuses on movements (rather than forces) and identifies gaits where the animal’s bones do not collide or come out of their joints.
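The plausibility filtering such a kinematic simulation performs can be illustrated with a minimal check: a candidate gait is kept only if every joint stays within its anatomical range over the whole stride. The joint names and angle limits below are invented placeholders, not values from the study:

```python
# Sketch of a kinematic plausibility filter in the spirit of the
# simulation described above. Joint names and limits are placeholders.

JOINT_LIMITS = {
    "shoulder": (-60.0, 60.0),   # degrees, invented for illustration
    "elbow":    (0.0, 120.0),
    "spine":    (-20.0, 20.0),
}

def gait_is_plausible(trajectory):
    """trajectory: list of {joint_name: angle_deg} frames over one stride.
    Returns True only if every joint stays within its range in every frame."""
    for frame in trajectory:
        for joint, angle in frame.items():
            lo, hi = JOINT_LIMITS[joint]
            if not lo <= angle <= hi:
                return False
    return True

ok_gait  = [{"shoulder": 30.0, "elbow": 45.0,  "spine": 5.0}]
bad_gait = [{"shoulder": 30.0, "elbow": 150.0, "spine": 5.0}]  # elbow out of range
print(gait_is_plausible(ok_gait), gait_is_plausible(bad_gait))  # True False
```

The real simulation additionally rejects gaits in which bones collide, but the principle is the same: prune the space of candidate movements to those the skeleton could physically perform.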

In parallel, two scientists at EPFL’s Biorobotics Laboratory – post-doctoral researcher Kamilo Melo and PhD student Tomislav Horvat, both members of NCCR Robotics at the time of the study – used the fossilized animal anatomy to build a robot called OroBOT. Designed and scaled to match the shape and movements of the extinct animal, OroBOT was used to calculate the physics of how Orobates walked. “We tested our hypotheses about the animal’s locomotion dynamics with our robotic model, which factors in the real-world physics of the animal’s gait,” says Melo.

Tomislav Horvat and Kamilo Melo – Credit: Maxime Marendaz

Testing hundreds of different gaits, based on contemporary animals

The interdisciplinary team of scientists tested hundreds of different gaits with their robot in order to determine which ones Orobates could have used – and those that it clearly did not. The gaits they tested were based on biomechanical principles extracted from similar modern-day animals such as caimans, salamanders, iguanas and skinks, which they analyzed through X-ray videos and force measurements. “We studied the biomechanics of their movements and determined which mechanical principles they all followed,” says Nyakatura. The research team looked at three features in particular: how erect the animal stood on its legs; how its backbone bent; and how much its elbow or shoulder joints bent as it walked. These three features determine what the researchers call the animal’s “sprawling gait space”. They created a powerful interactive website where fellow scientists – and the wider public – can explore the universe of movements that Orobates could have used.

With these results, they came up with the most likely ways that Orobates may have walked. They scored the gaits based on how much energy was required, how stable the movements were, how the leg forces compared with those of other sprawling animals, and how closely the movements aligned with the fossilized footprints. The gaits with good scores appear quite athletic and most closely resemble the movements of caimans. This suggests that Orobates probably already held itself a little upright on its legs – unlike salamanders and skinks. Its locomotion was thus more advanced — more upright, balanced and mechanically power-saving — than had been previously thought.
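The scoring step can be pictured as a weighted combination of normalized criteria, with the best-scoring gait winning. Everything below (the metric names, weights and candidate values) is an invented illustration of that idea, not the study's actual objective functions:

```python
# Illustrative sketch of multi-criteria gait scoring. Lower energy cost is
# better; higher stability, force similarity and footprint match are better.
# All numbers and weights here are invented for the sketch.

def score_gait(gait, weights):
    """Combine the four criteria into a single comparable score."""
    return (
        -weights["energy"] * gait["energy"]
        + weights["stability"] * gait["stability"]
        + weights["force_match"] * gait["force_match"]
        + weights["footprint_match"] * gait["footprint_match"]
    )

weights = {"energy": 1.0, "stability": 1.0,
           "force_match": 1.0, "footprint_match": 1.0}

candidates = {
    "sprawled, salamander-like": {"energy": 0.9, "stability": 0.6,
                                  "force_match": 0.5, "footprint_match": 0.7},
    "semi-erect, caiman-like":   {"energy": 0.5, "stability": 0.8,
                                  "force_match": 0.8, "footprint_match": 0.9},
}

best = max(candidates, key=lambda name: score_gait(candidates[name], weights))
print(best)  # semi-erect, caiman-like
```

With these made-up numbers the more upright, caiman-like gait wins, which is the kind of comparison that led the researchers to their conclusion about Orobates.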

The study concludes that advanced locomotion, as in Orobates, may have evolved before the common ancestor of reptiles and mammals lived. The novel approach developed for this study can be applied by other scientists in their work, and it could be modified to study other evolutionary transitions, such as the origins of flight or galloping gaits in mammals. Last but not least, being able to select the most efficient gait for any given morphology is of fundamental importance for the walking robots that NCCR Robotics researchers are developing, in particular for search-and-rescue applications.

Orobates fossil – Credit: Maxime Marendaz


Literature

John A. Nyakatura, Kamilo Melo, Tomislav Horvat, Kostas Karakasiliotis, Vivian R. Allen, Amir Andikfar, Emanuel Andrada, Patrick Arnold, Jonas Lauströer, John R. Hutchinson, Martin S. Fischer and Auke J. Ijspeert, “Reverse-engineering the locomotion of a stem amniote”, Nature, 17 January 2019.

A new drone can change its shape to fly through a narrow gap

A research team from the University of Zurich and EPFL has developed a new drone that can retract its propeller arms in flight and make itself small to fit through narrow gaps and holes. This is particularly useful when searching for victims of natural disasters.

Inspecting a damaged building after an earthquake or during a fire is exactly the kind of job that human rescuers would like drones to do for them. A flying robot could look for people trapped inside and guide the rescue team towards them. But the drone would often have to enter the building through a crack in a wall, a partially open window, or through bars – something the typical size of a drone does not allow.
To solve this problem, researchers from the Scaramuzza lab at the University of Zurich and the Floreano lab at EPFL created a new kind of drone. Both groups are part of the National Centre of Competence in Research (NCCR) Robotics funded by the Swiss National Science Foundation. Inspired by birds that fold their wings in mid-air to cross narrow passages, the new drone can squeeze itself to pass through gaps and then go back to its previous shape, all the while continuing to fly. And it can even hold and transport objects along the way.

Mobile arms can fold around the main frame
“Our solution is quite simple from a mechanical point of view, but it is very versatile and very autonomous, with onboard perception and control systems,” explains Davide Falanga, researcher at the University of Zurich and the paper’s first author. In comparison to other drones, this morphing drone can maneuver in tight spaces and guarantee a stable flight at all times.
The Zurich and Lausanne teams worked in collaboration and designed a quadrotor with four propellers that rotate independently, mounted on mobile arms that can fold around the main frame thanks to servo-motors. The ace in the hole is a control system that adapts in real time to any new position of the arms, adjusting the thrust of the propellers as the center of gravity shifts.
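The idea of re-balancing thrust around a shifted center of gravity can be illustrated in one dimension. The function below is a simplified sketch under assumed geometry, not the drone's actual controller: for two propellers on the pitch axis, hover requires the lift forces to sum to the drone's weight while their moments about the center of gravity cancel.

```python
# Minimal 1-D sketch (an assumption for illustration, not the published
# controller) of re-balancing thrust when folding the arms moves the
# propellers relative to the center of gravity. For two propellers at
# positions x1 and x2 (meters along the pitch axis, center of gravity
# at the origin), hover requires:
#   f1 + f2 = weight         (total lift supports the drone)
#   f1*x1 + f2*x2 = 0        (pitching moments cancel)

def balance_thrust(x1, x2, weight):
    """Solve the two hover equations for the per-propeller thrusts (N)."""
    if x1 == x2:
        raise ValueError("propellers must sit at distinct positions")
    f1 = weight * x2 / (x2 - x1)
    return f1, weight - f1

# Symmetric configuration: both propellers share the load equally.
print(balance_thrust(-0.15, 0.15, 6.0))

# One arm folded inward: the propeller now closer to the center of
# gravity must produce more thrust to keep the moments balanced.
print(balance_thrust(-0.15, 0.05, 6.0))
```

A real quadrotor solves the analogous problem in two axes with four thrusts, recomputing the mixing continuously as the servo-motors move the arms.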
“The morphing drone can adopt different configurations according to what is needed in the field,” adds Stefano Mintchev, co-author and researcher at EPFL. The standard configuration is X-shaped, with the four arms stretched out and the propellers at the widest possible distance from each other. When faced with a narrow passage, the drone can switch to an “H” shape, with all arms lined up along one axis, or to an “O” shape, with all arms folded as close as possible to the body. A “T” shape can be used to bring the onboard camera mounted on the central frame as close as possible to objects that the drone needs to inspect.

First step to fully autonomous rescue searches
In the future, the researchers hope to further improve the drone structure so that it can fold in all three dimensions. Most importantly, they want to develop algorithms that will make the drone truly autonomous, allowing it to look for passages in a real disaster scenario and automatically choose the best way to pass through them. “The final goal is to give the drone a high-level instruction such as ‘enter that building, inspect every room and come back’ and let it figure out by itself how to do it,” says Falanga.

Literature
Davide Falanga, Kevin Kleber, Stefano Mintchev, Dario Floreano and Davide Scaramuzza, “The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly”, IEEE Robotics and Automation Letters, 10 December 2018. DOI: 10.1109/LRA.2018.2885575

Small flying robots able to pull objects up to 40 times their weight


Researchers from EPFL and Stanford have developed small drones that can land and then move objects that are 40 times their weight, with the help of powerful winches, gecko adhesives and microspines.

A closed door is just one of many obstacles that no longer pose a barrier to the small flying robots developed jointly by Ecole Polytechnique Fédérale de Lausanne (EPFL) and Stanford University. Equipped with advanced gripping technology – inspired by gecko and insect feet – and able to interact with the world around them, these robots can work together to lasso a door handle and tug the door open.
