Archive 23.09.2023


Soft robotic tool provides new ‘eyes’ in endovascular surgery

Scientists at the Max Planck Institute for Intelligent Systems in Stuttgart have developed a soft robotic tool that promises to one day transform minimally invasive endovascular surgery. The two-part magnetic tool can help to visualise in real time the fine morphological details of partial vascular blockages such as stenoses, even in the narrowest and most curved vessels. It can also find its way through severe blockages such as chronic total occlusions. This tool could one day take the perception of endovascular medical devices a step further.

Intravascular imaging techniques and microcatheter procedures are becoming ever more advanced, revolutionizing the diagnosis and treatment of many diseases. However, current methods often fail to accurately detect the fine features of vascular disease, such as those seen from within occluded vessels, due to limitations such as uneven contrast agent diffusion and difficulty in safely accessing occluded vessels. Such limitations can delay intervention and treatment of a patient.

Scientists at the Max Planck Institute for Intelligent Systems in Stuttgart have taken on this problem. They have leveraged the concepts of soft robotics and microfabrication to develop a miniature soft magnetic tool that looks like a very slim eel. This tool may one day take the perception capabilities of endovascular devices one step further. In a paper and an accompanying video, the team shows how the tool, which is propelled forward by the blood flow, travels through the narrowest artificial vessels – whether there is a sharp bend, curve, or obstacle.

When the tool reaches an occlusion such as a partially blocked artery, an external magnetic field makes it deform into a wave-like shape (more on that below). The deformed soft body then comes gently into contact with the surrounding occluded structures. As the device is retracted, its real-time shape changes ‘visualize’ the morphological details inside the vessel, facilitating drug release at the occlusion as well as the sizing and placement of medical devices such as stents and balloons for subsequent treatment.
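As an illustration of this retraction-based sensing, the sketch below reconstructs a rough lumen profile from device shapes recorded at successive retraction positions. It is a minimal sketch under assumed data formats; the function, the recorded shapes, and the units are illustrative, not the authors' implementation.

```python
import numpy as np

def lumen_profile_from_retraction(shapes, x_bins):
    """Estimate a lumen half-width profile from device shapes recorded
    during retraction (illustrative data format, not the authors' code).

    shapes : list of (N, 2) arrays, each an (x, y) centerline of the tool
             at one retraction position; x runs along the vessel axis,
             y is the lateral deflection (both in mm here).
    x_bins : 1D array of bin edges along the vessel axis.
    """
    half_width = np.zeros(len(x_bins) - 1)
    for pts in shapes:
        idx = np.digitize(pts[:, 0], x_bins) - 1
        for i in range(len(half_width)):
            deflections = np.abs(pts[idx == i, 1])
            if deflections.size:
                half_width[i] = max(half_width[i], deflections.max())
    return half_width

# Illustrative use: three shapes of a 20 mm segment recorded 2 mm apart.
x = np.linspace(0.0, 20.0, 50)
shapes = [np.column_stack([x + shift, 1.5 * np.abs(np.sin(0.5 * x))])
          for shift in (0.0, 2.0, 4.0)]
profile = lumen_profile_from_retraction(shapes, np.linspace(0.0, 24.0, 13))
print(np.round(profile, 2))  # rough envelope of the open lumen per 2 mm bin
```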

When an occlusion is so severe that only tiny microchannels remain for blood to flow through, the tool can use the drag force of the blood to slide through these narrow channels. The path it takes indicates to the surgeon which access route to use for the subsequent procedure.

“The methods of diagnosing and treating endovascular narrowing diseases such as vascular stenosis or chronic total occlusion are still very limited. It is difficult to accurately detect and cross these areas in the very complex network of vessels inside the body”, says Yingbo Yan, who is a guest researcher in the Physical Intelligence Department at MPI-IS. He is the first author of the paper “Magnetically-assisted soft milli-tools for occluded lumen morphology detection”, which was published in Science Advances on August 18, 2023. “We hope that our new soft robotic tool can one day help accurately detect and navigate through the many complex and narrow vessels inside a body, and perform treatments more effectively, reducing potential risks.”

This tiny, soft tool has a 20 mm long magnetic Active Deformation Segment (ADS) and a 5 mm long Fluid Drag-driven Segment (FDS). The magnetization profile of the ADS is pre-programmed using a vibrating-sample magnetometer, which applies a uniform magnetizing field. Under an external magnetic field, this segment can deform into a sinusoidal shape, readily adapting to the surrounding environment and taking on various shapes. Continuously monitoring the shape changes of the ADS while retracting it therefore provides detailed morphological information about the partial occlusions inside a vessel.
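To see why a pre-programmed sinusoidal magnetization yields a wave-like shape under a uniform field, a rough small-deflection beam calculation is enough. The sketch below assumes illustrative material and geometric values (they are not taken from the paper) and uses the standard moment balance for a magnetized elastic rod.

```python
import numpy as np

# Illustrative parameters (not taken from the paper)
L    = 20e-3              # ADS length, m
k    = 2 * np.pi / L      # one magnetization period along the body
m0   = 8e4                # magnetization magnitude, A/m
B0   = 2e-3               # applied field strength, T (field along +y)
E    = 1e6                # Young's modulus of the soft magnetic composite, Pa
w, t = 1e-3, 0.2e-3       # cross-section width and thickness, m
A, I = w * t, w * t**3 / 12

# Small-deflection moment balance EI * theta''(s) = -A * (m x B)_z with
# m(s) = m0 * [cos(ks), sin(ks)] and B = [0, B0] gives a sinusoidal rotation
s     = np.linspace(0.0, L, 400)
theta = (A * m0 * B0) / (E * I * k**2) * np.cos(k * s)

# Integrate the rotation angle into the (x, y) centerline of the bent segment
ds = s[1] - s[0]
x  = np.cumsum(np.cos(theta)) * ds
y  = np.cumsum(np.sin(theta)) * ds
print(f"peak-to-peak lateral deflection ~ {1e3 * (y.max() - y.min()):.1f} mm "
      f"over a {1e3 * L:.0f} mm segment")
```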

The FDS was fabricated from a soft polymer. Small beams on its sides are bent by the fluidic drag of the incoming flow, so the entire tool is carried towards the area with the highest flow velocity. Tracking the location of the FDS while advancing it can therefore reveal the location and route of the microchannel inside a severe occlusion.

“Detection of vascular diseases in the distal and hard-to-reach vascular regions such as the brain can be more challenging clinically, and our tool could work with Stentbot in the untethered mode”, says Tianlu Wang, a postdoc in the Physical Intelligence Department at MPI-IS and another first author of the work. “Stentbot is a wireless robot for locomotion and medical functions in the distal vasculature that we recently developed in our research group. We believe this new soft robotic tool can add new capabilities to wireless robots and contribute new solutions in these challenging regions.”

“Our tool shows potential to greatly improve minimally invasive medicine. This technology can reach and detect areas that were previously difficult to access. We expect that our robot can help make the diagnosis and treatment of, for instance, stenosis or a CTO more precise and timely”, says Metin Sitti, Director of the Physical Intelligence Department at MPI-IS, Professor at Koç University and ETH Zurich.

Using sound waves to propel a microrobot through narrow tubes

A team of robotic and acoustic engineers from the Institute of Robotics and Intelligent Systems, ETH Zurich, and the Institut für Theoretische Physik, Center for Soft Nanoscience, Westfälische Wilhelms-Universität Münster, has developed a microrobot that can be propelled through narrow tubes using sound waves. In their paper published in the journal Science Advances, the group describes how they designed their robots and how well they worked when tested.

‘Brainless’ robot can navigate complex obstacles

By Matt Shipman

Researchers who created a soft robot that could navigate simple mazes without human or computer direction have now built on that work, creating a “brainless” soft robot that can navigate more complex and dynamic environments.

“In our earlier work, we demonstrated that our soft robot was able to twist and turn its way through a very simple obstacle course,” says Jie Yin, co-corresponding author of a paper on the work and an associate professor of mechanical and aerospace engineering at North Carolina State University. “However, it was unable to turn unless it encountered an obstacle. In practical terms this meant that the robot could sometimes get stuck, bouncing back and forth between parallel obstacles.

“We’ve developed a new soft robot that is capable of turning on its own, allowing it to make its way through twisty mazes, even negotiating its way around moving obstacles. And it’s all done using physical intelligence, rather than being guided by a computer.”

Physical intelligence refers to dynamic objects – like soft robots – whose behavior is governed by their structural design and the materials they are made of, rather than being directed by a computer or human intervention.

As with the earlier version, the new soft robots are made of ribbon-like liquid crystal elastomers. When the robots are placed on a surface that is at least 55 degrees Celsius (131 degrees Fahrenheit), which is hotter than the ambient air, the portion of the ribbon touching the surface contracts, while the portion of the ribbon exposed to the air does not. This induces a rolling motion; the warmer the surface, the faster the robot rolls.

However, while the previous version of the soft robot had a symmetrical design, the new robot has two distinct halves. One half of the robot is shaped like a twisted ribbon that extends in a straight line, while the other half is shaped like a more tightly twisted ribbon that also twists around itself like a spiral staircase.

This asymmetrical design means that one end of the robot exerts more force on the ground than the other end. Think of a plastic cup that has a mouth wider than its base. If you roll it across the table, it doesn’t roll in a straight line; it travels in an arc. That’s due to its asymmetrical shape.
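The cup analogy can be made quantitative with simple cone geometry: a rolling truncated cone pivots about the apex of the full cone its sides belong to, so the mismatch between the two end radii sets the turning radius. The sketch below uses made-up cup dimensions for illustration; the robot's actual turning radius depends on its ribbon geometry.

```python
def turning_radius(r_small, r_large, length):
    """Radius of the circle traced by the small end of a truncated cone
    (such as a plastic cup) rolling on its side.

    By similar triangles, the apex of the full cone lies a distance
    length * r_small / (r_large - r_small) beyond the small end, and the
    cup pivots about that point.
    """
    if r_large <= r_small:
        raise ValueError("needs r_large > r_small for the object to turn")
    return length * r_small / (r_large - r_small)

# Hypothetical cup: 4 cm base radius, 5 cm mouth radius, 12 cm slanted side
r = turning_radius(0.04, 0.05, 0.12)
print(f"the base circles the pivot at a radius of {100 * r:.0f} cm")  # 48 cm
```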

“The concept behind our new robot is fairly simple: because of its asymmetrical design, it turns without having to come into contact with an object,” says Yao Zhao, first author of the paper and a postdoctoral researcher at NC State. “So, while it still changes directions when it does come into contact with an object – allowing it to navigate mazes – it cannot get stuck between parallel objects. Instead, its ability to move in arcs allows it to essentially wiggle its way free.”

The researchers demonstrated the ability of the asymmetrical soft robot design to navigate more complex mazes – including mazes with moving walls – and fit through spaces narrower than its body size. The researchers tested the new robot design on both a metal surface and in sand.

“This work is another step forward in helping us develop innovative approaches to soft robot design – particularly for applications where soft robots would be able to harvest heat energy from their environment,” Yin says.

The paper, “Physically Intelligent Autonomous Soft Robotic Maze Escaper,” appears in the journal Science Advances. First author of the paper is Yao Zhao, a postdoctoral researcher at NC State. Hao Su, an associate professor of mechanical and aerospace engineering at NC State, is co-corresponding author. Additional co-authors include Yaoye Hong, a recent Ph.D. graduate of NC State; Yanbin Li, a postdoctoral researcher at NC State; and Fangjie Qi and Haitao Qing, both Ph.D. students at NC State.

The work was done with support from the National Science Foundation under grants 2005374, 2126072, 1944655 and 2026622.

Creation of training data to estimate the states of care robot users

A research team led by Assistant Professor Mizuki Takeda from the Department of Mechanical Engineering, Toyohashi University of Technology, has developed a technique for generating training data for robots that operate based on machine-learning estimates of the user’s state. The research is published in the journal IEEE Access.

Flexible robot that can sneak into small spaces for mapping, inspections

A lizard-like soft robot that can creep into walls, ductwork, and pipes to perform inspections and three-dimensional mapping tasks that could be dangerous or impossible for humans has been developed by WPI researchers partnered with the City of Worcester.

A framework for risk-aware robot navigation in unknown environments

Mobile robots have become increasingly sophisticated and are now being deployed in a growing number of real-world environments, including airports, malls, museums, health care facilities and other settings. So far, however, most of these robots have been introduced in clearly defined indoor environments, as opposed to completing missions that would require them to travel across the city or explore unknown and unmapped spaces.

Battery-free origami microfliers from UW researchers offer a new bio-inspired future of flying machines

Researchers at the University of Washington developed small robotic devices that can change how they move through the air by “snapping” into a folded position during their descent. Shown here is a timelapse photo of the “microflier” falling in its unfolded state, which makes it tumble chaotically and spread outward in the wind. Photo by Mark Stone/University of Washington

By Roger Van Scyoc

On a cool afternoon at the heart of the University of Washington’s campus, autumn, for a few fleeting moments, appears to have arrived early. Tiny golden squares resembling leaves flutter then fall, switching from a frenzied tumble to a graceful descent with a snap.

Aptly named “microfliers” and inspired by Miura-fold origami, these small robotic devices can fold closed during their descent after being dropped from a drone. This “snapping” action changes the way they disperse and may, in the future, help change the way scientists study agriculture, meteorology, climate change and more.

“In nature, you see leaves and seeds disperse in just one manner,” said Kyle Johnson, an Allen School Ph.D. student and a first co-author of the paper on the subject published in Science Robotics. “What we were able to achieve was a structure that can actually act in two different ways.”

When open flat, the devices tumble chaotically, mimicking the descent of an elm leaf. When folded closed, they drop in a more stable manner, mirroring how a maple leaf falls from a branch. Through a number of methods — onboard pressure sensor, timer or a Bluetooth signal — the researchers can control when the devices transition from open to closed, and in doing so, manipulate how far they disperse through the air.
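As a hedged sketch of one of those triggers, the snippet below converts an onboard pressure reading to height using the standard barometric formula and commands the fold once the flier drops below a target height. The function names and thresholds are hypothetical, not the authors' firmware.

```python
SEA_LEVEL_PA = 101_325.0  # standard reference pressure, Pa

def altitude_m(pressure_pa, reference_pa=SEA_LEVEL_PA):
    """Standard-atmosphere conversion from barometric pressure to altitude."""
    return 44_330.0 * (1.0 - (pressure_pa / reference_pa) ** (1.0 / 5.255))

def should_fold(pressure_pa, ground_pressure_pa, fold_below_m=25.0):
    """Hypothetical trigger: command the Miura-ori fold once the flier has
    descended below a target height above the ground."""
    height = altitude_m(pressure_pa) - altitude_m(ground_pressure_pa)
    return height < fold_below_m

# Illustrative readings against a ground-station pressure of 101,325 Pa:
print(should_fold(100_965.0, 101_325.0))  # still ~30 m up -> False
print(should_fold(101_085.0, 101_325.0))  # down to ~20 m  -> True
```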

How could they achieve this? By reading between the lines.

“The Miura-ori origami fold, inspired by geometric patterns found in leaves, enables the creation of structures that can ‘snap’ between a flat and more folded state,” said co-senior author Vikram Iyer, an Allen School professor and co-director of the Computing for the Environment (CS4Env) initiative. “Because it only takes energy to switch between the states, we began exploring this as an energy efficient way to change surface area in mid-air, with the intuition that opening or closing a parachute will change how fast an object falls.”

That energy efficiency is key to being able to operate without batteries and scale down the fliers’ size and weight. Fitted with a battery-free actuator and a solar power-harvesting circuit, microfliers boast energy-saving features not seen in larger and heavier battery-powered counterparts such as drones. Yet they are robust enough to carry sensors for a number of metrics, including temperature, pressure, humidity and altitude. Beyond measuring atmospheric conditions, the researchers say a network of these devices could help paint a picture of crop growth on farmland or detect gas leaks near population centers.

“This approach opens up a new design space for microfliers by using origami,” said Shyam Gollakota, the Thomas J. Cable Endowed Professor in the Allen School and director of the school’s Mobile Intelligence Lab who was also a co-senior author. “We hope this work is the first step towards a future vision for creating a new class of fliers and flight modalities.”

Weighing less than half a gram, microfliers require less material and cost less than drones. They also offer the ability to go where it’s too dangerous for a human to set foot.

For instance, Johnson said, microfliers could be deployed when tracking forest fires. Currently, firefighting teams sometimes rappel down to where a fire is spreading. Microfliers could assist in mapping where a fire may be heading and where best to drop a payload of water. Furthermore, the team is working on making more components of the device biodegradable in the case that they can’t be recovered after being released.

“There’s a good amount of work toward making these circuits more sustainable,” said Vicente Arroyos, another Allen School Ph.D. student and first co-author on the paper. “We can leverage our work on biodegradable materials to make these more sustainable.”

Besides improving sustainability, the researchers also tackled challenges relating to the structure of the device itself. Early prototypes lacked the vein-like carbon fiber reinforcement that provides the rigidity needed to prevent accidental transitions between states.

The research team took inspiration from elm and maple leaves in designing the microfliers. When open flat, the devices tumble chaotically, similar to how an elm leaf falls from a branch. When they are “snapped” into a folded position, as shown here, they descend in a more stable, straight downward manner like a maple leaf. Photo by Mark Stone/University of Washington

Collecting maple and elm leaves from outside their lab, the researchers noticed that while their origami structures exhibited the bistability required to change between states, they flexed too easily and didn’t have the venation seen in the found foliage. To gain more fine-grained control, they took another cue from the environment.

“We looked again to nature to make the faces of the origami flat and rigid, adding a vein-like pattern to the structure using carbon fiber,” Johnson said. “After that modification, we no longer saw a lot of the energy that we input dissipate over the origami’s faces.”

In total, the researchers estimate that the development of their design took about two years. There’s still room to grow, they added, noting that the current microfliers can only transition from open to closed. They said newer designs, by offering the ability to switch back and forth between states, may offer more precision and flexibility in where and how they’re used.

During testing, when dropped from an altitude of 40 meters, for instance, the microfliers could disperse up to distances of 98 meters in a light breeze. Further refinements could increase the area of coverage, allowing them to follow more precise trajectories by accounting for variables such as wind and inclement conditions.
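A back-of-the-envelope model shows how the fold timing shapes dispersal: lateral drift is roughly wind speed multiplied by time aloft, and the time aloft depends on how long the flier tumbles before folding. The descent rates below are invented placeholders for illustration, not measurements from the paper.

```python
def drift_m(drop_height_m, fold_height_m, wind_mps,
            tumble_descent_mps=1.0, folded_descent_mps=3.0):
    """Rough lateral drift: slow chaotic tumbling above the fold height,
    faster stable descent below it, constant horizontal wind throughout.
    The descent rates are illustrative placeholders, not measured values."""
    time_tumbling = (drop_height_m - fold_height_m) / tumble_descent_mps
    time_folded = fold_height_m / folded_descent_mps
    return wind_mps * (time_tumbling + time_folded)

# Dropped from 40 m in a ~3 m/s breeze: folding early vs. late changes
# the drift by tens of metres.
print(round(drift_m(40, 5, 3.0)))   # folds low  -> ~110 m of drift
print(round(drift_m(40, 35, 3.0)))  # folds high -> ~50 m of drift
```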

Related to their previous work with dandelion-inspired sensors, the origami microfliers build upon the researchers’ larger goal of creating the internet of bio-inspired things. Whereas the dandelion-inspired devices featured passive flight, reflecting the manner in which dandelion seeds disperse through the wind, the origami microfliers function as complete robotic systems that include actuation to change their shape, active and bi-directional wireless transmission via an onboard radio, and onboard computing and sensing to autonomously trigger shape changes upon reaching a target altitude.

“This design can also accommodate additional sensors and payload due to its size and power harvesting capabilities,” Arroyos said. “It’s exciting to think about the untapped potential for these devices.”

The future, in other words, is quickly taking shape.

“Origami is inspired by nature,” Johnson added, smiling. “These patterns are all around us. We just have to look in the right place.”

The project was an interdisciplinary work by an all-UW team. The paper’s co-authors also included Amélie Ferran, a Ph.D. student in the mechanical engineering department, as well as Raul Villanueva, Dennis Yin and Tilboon Elberier, who contributed as undergraduate students studying electrical and computer engineering, and mechanical engineering professors Alberto Aliseda and Sawyer Fuller.

Johnson and Arroyos, who co-founded and currently lead the educational nonprofit AVELA – A Vision for Engineering Literacy & Access, and their teammates have done outreach efforts in Washington state K-12 schools related to the research, including showing students how to create their own bi-stable leaf-out origami structure using a piece of paper.

Virtual-reality tech is fast becoming more real

Virtual-reality technology could help cure people of phobias, including a fear of spiders. © Leena Robinson, Shutterstock.com

By Helen Massy-Beresford

Imagine a single technology that could help a robot perform safety checks at a nuclear plant, cure a person’s arachnophobia and simulate the feeling of a hug from a distant relative.

Welcome to the world of “extended reality”. Researchers funded by the EU have sought to demonstrate its enormous potential.

Relevant research

Their goal was to make augmented reality, in which the real world is digitally enhanced, and virtual reality – a fully computer-generated environment – more immersive for users.

One of the researchers, Erik Hernandez Jimenez, never imagined the immediate relevance of a project that he led when it started in mid-2019. Within a year, the Covid-19 pandemic had triggered countless lockdowns that left people working and socialising through video connections from home.   

‘We thought about how to apply this technology, how to feel human touch even at a distance, when we were all locked at home and contact with others was through a computer,’ said Hernandez Jimenez. 

He coordinated the EU research initiative, which was named TACTILITY and ran from July 2019 until the end of September 2022. 

The TACTILITY team developed a glove that simulates the sense of touch. Users have the sensation of touching virtual objects through electrical pulses delivered by electrodes embedded in the glove.

The sensations range from pushing a button and feeling pressure on the finger to handling a solid object and feeling its shape, dimensions and texture. 
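A mapping of this kind has to translate a simulated contact pressure into stimulation parameters within a safe range. The sketch below is a hypothetical illustration of such a mapping; the parameter names and ranges are assumptions, not TACTILITY's actual stimulation scheme.

```python
from dataclasses import dataclass

@dataclass
class Pulse:
    amplitude_ma: float   # pulse amplitude, milliamps
    frequency_hz: float   # pulse repetition rate, hertz
    width_us: float       # pulse width, microseconds

def pressure_to_pulse(pressure_kpa, max_pressure_kpa=50.0):
    """Map a simulated fingertip pressure to electro-tactile pulse parameters.
    Ranges are illustrative placeholders, not the project's settings."""
    level = min(max(pressure_kpa / max_pressure_kpa, 0.0), 1.0)  # normalise to [0, 1]
    return Pulse(
        amplitude_ma=0.5 + 3.5 * level,     # firmer contact, larger amplitude
        frequency_hz=30.0 + 170.0 * level,  # and a higher pulse rate
        width_us=200.0,                     # fixed pulse width in this sketch
    )

# Light touch versus a firm press on a virtual button
print(pressure_to_pulse(5.0))
print(pressure_to_pulse(45.0))
```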

Glove and suit

‘TACTILITY is about including tactile feedback in a virtual-reality scenario,’ said Hernandez Jimenez, who is a project manager at Spanish research institute TECNALIA.

He said the principle could be extended from the glove to a whole body suit. 

Compared with past attempts to simulate touch sensations with motors, the electro-tactile feedback technique produces a more realistic result at a lower cost, according to Hernandez Jimenez. 

This opens up the possibility of making the technology more widely accessible. 

The research bolsters European Commission efforts to develop the virtual-worlds domain, which could provide 860 000 new jobs in Europe this decade as the worldwide sector grows beyond its 2022 value of €27 billion.

The EU has around 3 700 companies, research organisations and governmental bodies that operate in this sphere, according to the Commission.

Phobias to factories

The TACTILITY researchers looked at potential healthcare applications. 

“We thought about how to apply this technology, how to feel human touch even at a distance.”

– Erik Hernandez Jimenez, TACTILITY

That’s where spiders come into the picture. They were among the objects in the project’s experiments to mimic touch.

‘One that was quite impressive – although I didn’t like it at all – was feeling a spider or a cockroach crawling over your hand,’ Hernandez Jimenez said.  

A potential use for the technology is treating phobias through exposure therapy in which patients are gradually desensitised to the source of their fear. That could start by virtually “touching” cartoon-like creepy crawlies before progressing to more lifelike versions.  

The tactile glove can also be used in the manufacturing industry, helping the likes of car manufacturers train their workers to perform tricky manoeuvres on the factory floor.

Furthermore, it can help people collaborate more effectively with remotely controlled robots in hazardous environments. An example is a nuclear power plant, where a person in a control room can virtually “feel” what a robot is touching. 

‘They get another sense and another kind of feedback, with more information to perform better checks,’ Hernandez Jimenez said. 

Joyful and playful

Wearables for virtual reality. © Oğuz ‘Oz’ Buruk, 2021

Wearable technologies for virtual-reality environments are also being inspired by the gaming industry. 

Researchers in a second EU-funded project sought to expand the prospects for technologies already widely used for professional purposes. The initiative, called WEARTUAL, ran from May 2019 until late 2021.

“Wearables are fashion items – they’re part of the way we construct our identity.”

– Oğuz ‘Oz’ Buruk, WEARTUAL

‘Our project focused on the more experiential side – joyful and playful activities,’ said Oğuz ‘Oz’ Buruk, who coordinated WEARTUAL and is assistant professor of gameful experience at Tampere University in Finland. 

Until recently, experiencing a virtual-reality environment involved a hand-held controller or head-mounted display. 

The WEARTUAL researchers looked at ways of incorporating wearables worn, for example, on the wrist or ankle into virtual reality to give people a sense of greater immersion. 

That could mean having their avatar – a representative icon or figure in the virtual world – blush when nervous or excited to enhance their ability to express themselves.

On the cusp

The team developed a prototype that could integrate varying physical sensations into the virtual world by transferring to it real-life data such as heart rate.  
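A minimal sketch of what such a transfer might look like: a live heart-rate reading is normalised against a resting baseline and turned into a blush intensity for the avatar. The names and thresholds are hypothetical, not WEARTUAL's implementation.

```python
def blush_intensity(heart_rate_bpm, resting_bpm=65.0, excited_bpm=120.0):
    """Map a heart-rate reading to a 0..1 blush intensity for an avatar.
    The thresholds are illustrative placeholders."""
    level = (heart_rate_bpm - resting_bpm) / (excited_bpm - resting_bpm)
    return min(max(level, 0.0), 1.0)

# Calm, mildly nervous, and excited readings
for bpm in (62, 90, 125):
    print(bpm, "->", round(blush_intensity(bpm), 2))
```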

Buruk is interested in how games will look in the “posthuman” era, when people and machines increasingly converge through bodily implants, robotics and direct communication between the human brain and computers.  

He says it is hard to overestimate the eventual impact of advances in this area on everyday life, albeit over varying timescales: wearables are likely to be much more widely used in virtual reality in the next decade, while widespread use of bodily implants is more likely to take 50 to 100 years.

As technology and human bodies become ever more closely linked, the experience of carrying the body into a virtual world will be enhanced, encouraging people to spend increasing amounts of time there, according to Buruk.

Virtual-reality technologies are already being used for practical purposes such as gamifying vital information including fire-safety procedures, making it more interactive and easier to learn. This type of use could expand to many areas.

On a very different front, several fashion houses already sell clothes that can be worn in virtual environments, allowing people to express their identity and creativity.  

‘Wearables are fashion items – they’re part of the way we construct our identity,’ Buruk said. ‘Investments in virtual reality, extended reality and augmented reality are increasing every day.’

Research in this article was funded by the EU via the Marie Skłodowska-Curie Actions (MSCA).


This article was originally published in Horizon, the EU Research and Innovation magazine.
