By Matt Shipman
Researchers who created a soft robot that could navigate simple mazes without human or computer direction have now built on that work, creating a “brainless” soft robot that can navigate more complex and dynamic environments.
“In our earlier work, we demonstrated that our soft robot was able to twist and turn its way through a very simple obstacle course,” says Jie Yin, co-corresponding author of a paper on the work and an associate professor of mechanical and aerospace engineering at North Carolina State University. “However, it was unable to turn unless it encountered an obstacle. In practical terms this meant that the robot could sometimes get stuck, bouncing back and forth between parallel obstacles.
“We’ve developed a new soft robot that is capable of turning on its own, allowing it to make its way through twisty mazes, even negotiating its way around moving obstacles. And it’s all done using physical intelligence, rather than being guided by a computer.”
Physical intelligence refers to dynamic objects – like soft robots – whose behavior is governed by their structural design and the materials they are made of, rather than being directed by a computer or human intervention.
As with the earlier version, the new soft robots are made of ribbon-like liquid crystal elastomers. When the robots are placed on a surface that is at least 55 degrees Celsius (131 degrees Fahrenheit), which is hotter than the ambient air, the portion of the ribbon touching the surface contracts, while the portion of the ribbon exposed to the air does not. This induces a rolling motion; the warmer the surface, the faster the robot rolls.
However, while the previous version of the soft robot had a symmetrical design, the new robot has two distinct halves. One half of the robot is shaped like a twisted ribbon that extends in a straight line, while the other half is shaped like a more tightly twisted ribbon that also twists around itself like a spiral staircase.
This asymmetrical design means that one end of the robot exerts more force on the ground than the other end. Think of a plastic cup that has a mouth wider than its base. If you roll it across the table, it doesn’t roll in a straight line – it makes an arc as it travels across the table. That’s due to its asymmetrical shape.
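The cup analogy can be made quantitative with a little geometry: a tapered cup rolling without slipping pivots around the apex of the cone its wall belongs to, so the sharper the taper, the tighter the arc. A minimal sketch of that relationship (the cup dimensions below are made up for illustration and have nothing to do with the robot's actual geometry):

```python
import math

def turning_radius(r_small, r_large, length):
    """Distance from the cup's narrow rim to the pivot point (the apex
    of the cone the cup's wall belongs to). A tapered cup rolling
    without slipping traces a circle around this point."""
    if r_large <= r_small:
        return math.inf  # a straight cylinder rolls in a straight line
    return length * r_small / (r_large - r_small)

# Hypothetical cup: 3 cm base radius, 4.5 cm mouth radius, 10 cm side
print(turning_radius(0.03, 0.045, 0.10))  # ~0.2 m pivot distance
```

The limiting case makes the point of the asymmetry: as the two radii approach each other the turning radius grows without bound, which is exactly why the earlier symmetrical robot could only travel straight until something deflected it.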
“The concept behind our new robot is fairly simple: because of its asymmetrical design, it turns without having to come into contact with an object,” says Yao Zhao, first author of the paper and a postdoctoral researcher at NC State. “So, while it still changes directions when it does come into contact with an object – allowing it to navigate mazes – it cannot get stuck between parallel objects. Instead, its ability to move in arcs allows it to essentially wiggle its way free.”
The researchers demonstrated the ability of the asymmetrical soft robot design to navigate more complex mazes – including mazes with moving walls – and fit through spaces narrower than its body size. The researchers tested the new robot design on both a metal surface and in sand.
“This work is another step forward in helping us develop innovative approaches to soft robot design – particularly for applications where soft robots would be able to harvest heat energy from their environment,” Yin says.
The paper, “Physically Intelligent Autonomous Soft Robotic Maze Escaper,” appears in the journal Science Advances. First author of the paper is Yao Zhao, a postdoctoral researcher at NC State. Hao Su, an associate professor of mechanical and aerospace engineering at NC State, is co-corresponding author. Additional co-authors include Yaoye Hong, a recent Ph.D. graduate of NC State; Yanbin Li, a postdoctoral researcher at NC State; and Fangjie Qi and Haitao Qing, both Ph.D. students at NC State.
The work was done with support from the National Science Foundation under grants 2005374, 2126072, 1944655 and 2026622.
By Roger Van Scyoc
On a cool afternoon at the heart of the University of Washington’s campus, autumn, for a few fleeting moments, appears to have arrived early. Tiny golden squares resembling leaves flutter then fall, switching from a frenzied tumble to a graceful descent with a snap.
Aptly named “microfliers” and inspired by Miura-fold origami, these small robotic devices can fold closed during their descent after being dropped from a drone. This “snapping” action changes the way they disperse and may, in the future, help change the way scientists study agriculture, meteorology, climate change and more.
“In nature, you see leaves and seeds disperse in just one manner,” said Kyle Johnson, an Allen School Ph.D. student and a first co-author of the paper on the subject published in Science Robotics. “What we were able to achieve was a structure that can actually act in two different ways.”
When open flat, the devices tumble chaotically, mimicking the descent of an elm leaf. When folded closed, they drop in a more stable manner, mirroring how a maple leaf falls from a branch. Through a number of methods — onboard pressure sensor, timer or a Bluetooth signal — the researchers can control when the devices transition from open to closed, and in doing so, manipulate how far they disperse through the air.
How could they achieve this? By reading between the lines.
“The Miura-ori origami fold, inspired by geometric patterns found in leaves, enables the creation of structures that can ‘snap’ between a flat and more folded state,” said co-senior author Vikram Iyer, an Allen School professor and co-director of the Computing for the Environment (CS4Env) initiative. “Because it only takes energy to switch between the states, we began exploring this as an energy efficient way to change surface area in mid-air, with the intuition that opening or closing a parachute will change how fast an object falls.”
That energy efficiency is key to being able to operate without batteries and scale down the fliers’ size and weight. Fitted with a battery-free actuator and a solar power-harvesting circuit, microfliers boast energy-saving features not seen in larger and heavier battery-powered counterparts such as drones. Yet they are robust enough to carry sensors for a number of metrics, including temperature, pressure, humidity and altitude. Beyond measuring atmospheric conditions, the researchers say a network of these devices could help paint a picture of crop growth on farmland or detect gas leaks near population centers.
“This approach opens up a new design space for microfliers by using origami,” said Shyam Gollakota, the Thomas J. Cable Endowed Professor in the Allen School and director of the school’s Mobile Intelligence Lab who was also a co-senior author. “We hope this work is the first step towards a future vision for creating a new class of fliers and flight modalities.”
Weighing less than half a gram, microfliers require less material and cost less than drones. They also offer the ability to go where it’s too dangerous for a human to set foot.
For instance, Johnson said, microfliers could be deployed when tracking forest fires. Currently, firefighting teams sometimes rappel down to where a fire is spreading. Microfliers could assist in mapping where a fire may be heading and where best to drop a payload of water. Furthermore, the team is working on making more components of the device biodegradable in case they can't be recovered after being released.
“There’s a good amount of work toward making these circuits more sustainable,” said Vicente Arroyos, another Allen School Ph.D. student and first co-author on the paper. “We can leverage our work on biodegradable materials to make these more sustainable.”
Besides improving sustainability, the researchers also tackled challenges relating to the structure of the device itself. Early prototypes lacked the carbon fiber veins that provide the rigidity needed to prevent accidental transitions between states.
Collecting maple and elm leaves from outside their lab, the researchers noticed that while their origami structures exhibited the bistability required to change between states, they flexed too easily and didn’t have the venation seen in the found foliage. To gain more fine-grained control, they took another cue from the environment.
“We looked again to nature to make the faces of the origami flat and rigid, adding a vein-like pattern to the structure using carbon fiber,” Johnson said. “After that modification, we no longer saw a lot of the energy that we input dissipate over the origami’s faces.”
In total, the researchers estimate that the development of their design took about two years. There’s still room to grow, they added, noting that the current microfliers can only transition from open to closed. They said newer designs, by offering the ability to switch back and forth between states, may offer more precision and flexibility in where and how they’re used.
During testing, when dropped from an altitude of 40 meters, for instance, the microfliers could disperse up to distances of 98 meters in a light breeze. Further refinements could increase the area of coverage, allowing them to follow more precise trajectories by accounting for variables such as wind and inclement conditions.
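As a rough illustration of why the fold changes dispersal, a toy model (not the paper's) treats each mode as falling at a constant descent speed while drifting sideways with the wind. The descent speeds and wind speed below are illustrative guesses, not measured values:

```python
# Toy dispersal model: horizontal drift = wind speed x time aloft.
# All numbers here are illustrative assumptions, not from the paper.
def drift_m(drop_height_m, descent_speed_mps, wind_speed_mps):
    """Horizontal distance covered while falling at a constant speed."""
    return wind_speed_mps * (drop_height_m / descent_speed_mps)

open_drift = drift_m(40.0, 0.7, 1.5)    # tumbling: slow fall, long drift
closed_drift = drift_m(40.0, 2.5, 1.5)  # folded: stable, faster fall
print(open_drift, closed_drift)  # ~85.7 m vs 24.0 m
```

Even with made-up numbers, the model captures the design's leverage: choosing when to snap closed is effectively choosing how long the flier stays in the wind, and therefore how far it travels.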
Related to their previous work with dandelion-inspired sensors, the origami microfliers build upon the researchers’ larger goal of creating the internet of bio-inspired things. Whereas the dandelion-inspired devices featured passive flight, reflecting the manner in which dandelion seeds disperse through the wind, the origami microfliers function as complete robotic systems that include actuation to change their shape, active and bi-directional wireless transmission via an onboard radio, and onboard computing and sensing to autonomously trigger shape changes upon reaching a target altitude.
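The altitude-triggered shape change described above can be sketched in a few lines, assuming a standard-atmosphere conversion from barometric pressure to height; the function names and thresholds here are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of an altitude trigger: convert a barometric
# pressure reading to altitude, then snap closed below a target height.
P0 = 101325.0  # sea-level reference pressure, Pa

def altitude_m(pressure_pa, p0=P0):
    """Standard-atmosphere approximation of altitude from pressure."""
    return 44330.0 * (1.0 - (pressure_pa / p0) ** (1.0 / 5.255))

def should_fold(pressure_pa, target_altitude_m):
    """Fold (snap closed) once the flier has descended to the target."""
    return altitude_m(pressure_pa) <= target_altitude_m

print(should_fold(101000.0, 40.0))  # near sea level -> True
```

The same decision could equally be driven by the timer or Bluetooth signal mentioned earlier; the pressure-based version is shown because it is the one that lets the flier act autonomously.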
“This design can also accommodate additional sensors and payload due to its size and power harvesting capabilities,” Arroyos said. “It’s exciting to think about the untapped potential for these devices.”
The future, in other words, is quickly taking shape.
“Origami is inspired by nature,” Johnson added, smiling. “These patterns are all around us. We just have to look in the right place.”
The project was an interdisciplinary work by an all-UW team. The paper’s co-authors also included Amélie Ferran, a Ph.D. student in the mechanical engineering department, as well as Raul Villanueva, Dennis Yin and Tilboon Elberier, who contributed as undergraduate students studying electrical and computer engineering, and mechanical engineering professors Alberto Aliseda and Sawyer Fuller.
Johnson and Arroyos, who co-founded and currently lead the educational nonprofit AVELA – A Vision for Engineering Literacy & Access, and their teammates have done outreach efforts in Washington state K-12 schools related to the research, including showing students how to create their own bi-stable leaf-out origami structure using a piece of paper.
By Helen Massy-Beresford
Imagine a single technology that could help a robot perform safety checks at a nuclear plant, cure a person’s arachnophobia and simulate the feeling of a hug from a distant relative.
Welcome to the world of “extended reality”. Researchers funded by the EU have sought to demonstrate its enormous potential.
Their goal was to make augmented reality, in which the real world is digitally enhanced, and virtual reality – a fully computer-generated environment – more immersive for users.
When the project he led began in mid-2019, researcher Erik Hernandez Jimenez never imagined how immediately relevant it would become. Within a year, the Covid-19 pandemic had triggered countless lockdowns that left people working and socialising through video connections from home.
‘We thought about how to apply this technology, how to feel human touch even at a distance, when we were all locked at home and contact with others was through a computer,’ said Hernandez Jimenez.
He coordinated the EU research initiative, which was named TACTILITY and ran from July 2019 until the end of September 2022.
The TACTILITY team developed a glove that simulates the sense of touch. Users have the sensation of touching virtual objects through electrical pulses delivered by electrodes embedded in the glove.
The sensations range from pushing a button and feeling pressure on the finger to handling a solid object and feeling its shape, dimensions and texture.
Glove and suit
‘TACTILITY is about including tactile feedback in a virtual-reality scenario,’ said Hernandez Jimenez, who is a project manager at Spanish research institute TECNALIA.
He said the principle could be extended from the glove to a whole body suit.
Compared with past attempts to simulate touch sensations with motors, the electro-tactile feedback technique produces a more realistic result at a lower cost, according to Hernandez Jimenez.
This opens up the possibility of making the technology more widely accessible.
The research bolsters European Commission efforts to develop the virtual-worlds domain, which could provide 860 000 new jobs in Europe this decade as the worldwide sector grows from €27 billion in 2022.
The EU has around 3 700 companies, research organisations and governmental bodies that operate in this sphere, according to the Commission.
Phobias to factories
The TACTILITY researchers looked at potential healthcare applications.
That’s where spiders come into the picture. They were among the objects in the project’s experiments to mimic touch.
‘One that was quite impressive – although I didn’t like it at all – was feeling a spider or a cockroach crawling over your hand,’ Hernandez Jimenez said.
A potential use for the technology is treating phobias through exposure therapy in which patients are gradually desensitised to the source of their fear. That could start by virtually “touching” cartoon-like creepy crawlies before progressing to more lifelike versions.
The tactile glove can also be used in the manufacturing industry, helping the likes of car manufacturers train their workers to perform tricky manoeuvres on the factory floor.
Furthermore, it can help people collaborate more effectively with remotely controlled robots in hazardous environments. An example is a nuclear power plant, where a person in a control room can virtually “feel” what a robot is touching.
‘They get another sense and another kind of feedback, with more information to perform better checks,’ Hernandez Jimenez said.
Joyful and playful
Wearable technologies for virtual-reality environments are also being inspired by the gaming industry.
Researchers in a second EU-funded project sought to expand the prospects for technologies already widely used for professional purposes. The initiative, called WEARTUAL, ran from May 2019 until late 2021.
‘Our project focused on the more experiential side – joyful and playful activities,’ said Oğuz ‘Oz’ Buruk, who coordinated WEARTUAL and is assistant professor of gameful experience at Tampere University in Finland.
Until recently, experiencing a virtual-reality environment involved a hand-held controller or head-mounted display.
The WEARTUAL researchers looked at ways of incorporating wearables – worn on the wrist or ankle, for example – into virtual reality to give people a sense of greater immersion.
That could mean having their avatar – a representative icon or figure in the virtual world – blush when nervous or excited to enhance their ability to express themselves.
On the cusp
The team developed a prototype that could integrate varying physical sensations into the virtual world by transferring to it real-life data such as heart rate.
Buruk is interested in how games will look in the “posthuman” era, when people and machines increasingly converge through bodily implants, robotics and direct communication between the human brain and computers.
He suggests that it's hard to overestimate the eventual impact of advances in this area on everyday life, albeit over varying timescales: wearables are likely to be much more widely used in virtual reality in the next decade, while widespread use of bodily implants is more likely to take 50 to 100 years.
As technology and human bodies become ever more closely linked, the experience of bringing those bodies into a virtual world will be enhanced, encouraging people to spend increasing amounts of time there, according to Buruk.
Virtual-reality technologies are already being used for practical purposes such as gamifying vital information including fire-safety procedures, making it more interactive and easier to learn. This type of use could expand to many areas.
On a very different front, several fashion houses already sell clothes that can be worn in virtual environments, allowing people to express their identity and creativity.
‘Wearables are fashion items – they’re part of the way we construct our identity,’ Buruk said. ‘Investments in virtual reality, extended reality and augmented reality are increasing every day.’
Research in this article was funded by the EU via the Marie Skłodowska-Curie Actions (MSCA).
This article was originally published in Horizon, the EU Research and Innovation magazine.