All posts by University of Bristol

Octopus inspires new suction mechanism for robots

Suction cup grasping a stone – Image credit: Tianqi Yue

The team, based at Bristol Robotics Laboratory, studied the structures of biological octopus suckers, which have superb adaptive suction abilities enabling them to anchor to rock.

In their findings, published in the journal PNAS today, the researchers show how they were able to create a multi-layer soft structure and an artificial fluidic system to mimic the musculature and mucus structures of biological suckers.

Suction is a highly evolved biological adhesion strategy for soft-body organisms to achieve strong grasping on various objects. Biological suckers can adaptively attach to dry complex surfaces such as rocks and shells, which are extremely challenging for current artificial suction cups. Although the adaptive suction of biological suckers is believed to be the result of their soft body’s mechanical deformation, some studies imply that in-sucker mucus secretion may be another critical factor in helping attach to complex surfaces, thanks to its high viscosity.

Lead author Tianqi Yue explained: “The most important development is that we successfully demonstrated the effectiveness of combining mechanical conformation (the use of soft materials to conform to surface shape) and liquid seal (the spread of water onto the contacting surface) to improve suction adaptability on complex surfaces. This may also be the secret behind biological organisms’ ability to achieve adaptive suction.”

Their multi-scale suction mechanism is an organic combination of mechanical conformation and regulated water seal. Multi-layer soft materials first generate a rough mechanical conformation to the substrate, reducing leaking apertures to just micrometres. The remaining micron-sized apertures are then sealed by regulated water secretion from an artificial fluidic system based on a physical model, so that the suction cup achieves long-lasting suction on diverse surfaces with minimal water overflow.
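
The sealing strategy can be summarised as a simple on-demand control loop: hold the vacuum by conforming mechanically, and secrete a small volume of water only when leakage through the remaining micro-apertures is detected. Below is a minimal sketch of that idea, assuming a pressure sensor in the suction chamber and a small water pump; the function names, thresholds and volumes are illustrative placeholders, not the authors' actual fluidic system.

```python
# Minimal control-loop sketch of the "regulated water seal" idea: keep vacuum
# by mechanical conformation, and secrete a small volume of water only when
# leakage through the remaining micro-apertures is detected. All interfaces
# (read_vacuum_kpa, pump_water_ul) and numbers are illustrative placeholders,
# not the authors' actual fluidic system.

import time

TARGET_VACUUM_KPA = -40.0   # desired gauge pressure inside the cup (assumed)
LEAK_TOLERANCE_KPA = 2.0    # allowed pressure decay before resealing (assumed)
SECRETION_STEP_UL = 5.0     # microlitres of water released per reseal step (assumed)

def read_vacuum_kpa() -> float:
    """Placeholder for a pressure sensor inside the suction chamber."""
    raise NotImplementedError

def pump_water_ul(volume_ul: float) -> None:
    """Placeholder for the artificial fluidic channel feeding the contact rim."""
    raise NotImplementedError

def maintain_seal(duration_s: float = 60.0, period_s: float = 0.1) -> None:
    """Secrete water only on demand, so suction lasts with minimal overflow."""
    t_end = time.time() + duration_s
    while time.time() < t_end:
        if read_vacuum_kpa() > TARGET_VACUUM_KPA + LEAK_TOLERANCE_KPA:
            pump_water_ul(SECRETION_STEP_UL)  # reseal the leaking apertures
        time.sleep(period_s)
```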

 

Tianqi added: “We believe the presented multi-scale adaptive suction mechanism is a powerful new adaptive suction strategy which may be instrumental in the development of versatile soft adhesion.

“Current industrial solutions use always-on air pumps to actively generate the suction; however, these are noisy and waste energy.

“It is well known that many natural organisms with suckers, including octopuses, some fishes such as suckerfish and remoras, leeches, gastropods and echinoderms, can maintain their superb adaptive suction on complex surfaces by exploiting their soft body structures, with no need for a pump.”

The findings have great potential for industrial applications, such as providing a next-generation robotic gripper for grasping a variety of irregular objects.

The team now plan to build a more intelligent suction cup by embedding sensors into the suction cup to regulate its behaviour.

Paper

‘Bioinspired multiscale adaptive suction on complex dry surfaces enhanced by regulated water secretion’ by Tianqi Yue, Weiyong Si, Alex Keller, Chenguang Yang, Hermes Bloomfield-Gadêlha and Jonathan Rossiter in PNAS.

New dual-arm robot achieves bimanual tasks by learning from simulation

Dual-arm robot holding a crisp. Image credit: Yijiong Lin

The new Bi-Touch system, designed by scientists at the University of Bristol and based at the Bristol Robotics Laboratory, allows robots to carry out manual tasks by sensing what to do from a digital helper.

The findings, published in IEEE Robotics and Automation Letters, show how an AI agent interprets its environment through tactile and proprioceptive feedback, and then controls the robot's behaviours, enabling precise sensing, gentle interaction, and effective object manipulation to accomplish robotic tasks.

This development could revolutionise industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs.

Lead author Yijiong Lin from the Faculty of Engineering, explained: “With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch. And more importantly, we can directly apply these agents from the virtual world to the real world without further training.

“The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”

Bimanual manipulation with tactile feedback will be key to human-level robot dexterity. However, this topic is less explored than single-arm settings, partly due to the limited availability of suitable hardware and the complexity of designing effective controllers for tasks with relatively large state-action spaces. The team were able to develop a tactile dual-arm robotic system using recent advances in AI and robotic tactile sensing.

The researchers built up a virtual world (simulation) containing two robot arms equipped with tactile sensors. They then designed reward functions and a goal-update mechanism that could encourage the robot agents to learn the bimanual tasks, and developed a real-world tactile dual-arm robot system to which they could directly apply the agent.

The robot learns bimanual skills through Deep Reinforcement Learning (Deep-RL), one of the most advanced techniques in the field of robot learning. It is designed to teach robots to do things by letting them learn from trial and error, akin to training a dog with rewards and punishments.

For robotic manipulation, the robot learns to make decisions by attempting various behaviours to achieve designated tasks, for example, lifting up objects without dropping or breaking them. When it succeeds, it gets a reward, and when it fails, it learns what not to do. With time, it figures out the best ways to grab things using these rewards and punishments. The AI agent is visually blind, relying only on proprioceptive feedback (a body's ability to sense movement, action and location) and tactile feedback.
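
To make the trial-and-error idea concrete, the sketch below shows a generic Deep-RL training loop and a reward of the kind described above: reward height gained and gentle contact, punish crushing or dropping. The environment class, observation layout and reward weights are assumptions made for illustration, not the published Bi-Touch code.

```python
# Illustrative Deep-RL trial-and-error loop for a tactile bimanual lift.
# The environment class, its observation layout (tactile + proprioception,
# no vision) and the reward terms are assumptions made for illustration;
# they are not the published Bi-Touch training code.

import numpy as np

class BiTouchLiftEnv:
    """Hypothetical simulated environment: two arms with tactile fingertips."""
    def reset(self) -> np.ndarray: ...
    def step(self, action: np.ndarray):
        """Apply an action; return (observation, reward, done, info).
        The reward could be computed by something like lift_reward() below."""
        ...

def lift_reward(object_height: float, tactile_contact: float, object_broken: bool) -> float:
    """Reward gentle, successful lifting; punish dropping or crushing (assumed weights)."""
    r = 1.0 * object_height       # progress towards the lift goal
    r += 0.1 * tactile_contact    # keep both fingertips in light contact
    if object_broken:
        r -= 10.0                 # punishment for excessive force
    return r

def train(env, policy, episodes: int = 1000) -> None:
    """Generic RL loop: try a behaviour, observe the outcome, learn from it."""
    for _ in range(episodes):
        obs, done = env.reset(), False
        while not done:
            action = policy.act(obs)               # attempt a behaviour
            obs, reward, done, _ = env.step(action)
            policy.update(obs, action, reward)     # learn from reward/punishment
```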

They successfully enabled the dual-arm robot to safely lift items as fragile as a single Pringle crisp.

Co-author Professor Nathan Lepora added: “Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual behaviours with touch in simulation, which can be directly applied to the real world. Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”

Yijiong concluded: “Our Bi-Touch system allows a tactile dual-arm robot to learn solely from simulation, and to achieve various manipulation tasks in a gentle way in the real world.

“And now we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch.”

Machine-learning method used for self-driving cars could improve lives of type-1 diabetes patients

Artificial Pancreas System with Reinforcement Learning. Image credit: Harry Emerson

Scientists at the University of Bristol have shown that reinforcement learning, a type of machine learning in which a computer program learns to make decisions by trying different actions, significantly outperforms commercial blood glucose controllers in terms of safety and effectiveness. By using offline reinforcement learning, where the algorithm learns from patient records, the researchers improve on prior work, showing that good blood glucose control can be achieved by learning from the decisions of the patient rather than by trial and error.

Type 1 diabetes is one of the most prevalent auto-immune conditions in the UK and is characterised by an insufficiency of the hormone insulin, which is responsible for blood glucose regulation.

Many factors affect a person’s blood glucose and therefore it can be a challenging and burdensome task to select the correct insulin dose for a given scenario. Current artificial pancreas devices provide automated insulin dosing but are limited by their simplistic decision-making algorithms.

However, a new study, published in the Journal of Biomedical Informatics, shows offline reinforcement learning could represent an important milestone in care for people living with the condition. The largest improvement was in children, who experienced an additional one-and-a-half hours in the target glucose range per day.

Children represent a particularly important group as they are often unable to manage their diabetes without assistance and an improvement of this size would result in markedly better long-term health outcomes.

Lead author Harry Emerson from Bristol’s Department of Engineering Mathematics, explained: “My research explores whether reinforcement learning could be used to develop safer and more effective insulin dosing strategies.

“These machine learning driven algorithms have demonstrated superhuman performance in playing chess and piloting self-driving cars, and therefore could feasibly learn to perform highly personalised insulin dosing from pre-collected blood glucose data.

“This particular piece of work focuses specifically on offline reinforcement learning, in which the algorithm learns to act by observing examples of good and bad blood glucose control.

“Prior reinforcement learning methods in this area predominantly utilise a process of trial-and-error to identify good actions, which could expose a real-world patient to unsafe insulin doses.”
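
To make the distinction concrete, the sketch below shows offline learning in its simplest form: a dosing policy is fitted by replaying a fixed log of glucose readings and insulin doses, so no untested dose is ever tried on a patient. The record format, discretisation, reward definition and batch Q-learning update are illustrative assumptions, not the study's actual algorithm or data.

```python
# Minimal sketch of the offline (batch) setting described above: learn a dosing
# policy from a fixed log of glucose readings and insulin doses, without trying
# new doses on a patient. The record format, discretisation, reward and the
# simple batch Q-learning update are illustrative assumptions, not the study's
# actual algorithm or dataset.

import numpy as np

def build_transitions(glucose_mgdl: np.ndarray, insulin_doses: np.ndarray,
                      n_glucose_bins: int = 20, n_dose_bins: int = 10):
    """Turn a patient log into (state, action, reward, next_state) tuples."""
    s = np.digitize(glucose_mgdl, np.linspace(40, 400, n_glucose_bins - 1))
    a = np.digitize(insulin_doses, np.linspace(0, 10, n_dose_bins - 1))
    # Reward: the next glucose reading lies inside the 70-180 mg/dL target range.
    r = ((glucose_mgdl[1:] >= 70) & (glucose_mgdl[1:] <= 180)).astype(float)
    return list(zip(s[:-1], a[:-1], r, s[1:]))

def fit_q_table(transitions, n_states: int = 20, n_actions: int = 10,
                gamma: float = 0.95, lr: float = 0.1, epochs: int = 50) -> np.ndarray:
    """Batch Q-learning: replay the fixed log repeatedly; no live exploration."""
    q = np.zeros((n_states, n_actions))
    for _ in range(epochs):
        for s, a, r, s_next in transitions:
            target = r + gamma * q[s_next].max()
            q[s, a] += lr * (target - q[s, a])
    return q  # greedy dose bucket for a glucose bucket: q[state].argmax()
```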

Due to the high risk associated with incorrect insulin dosing, experiments were performed using the FDA-approved UVA/Padova simulator, which creates a suite of virtual patients to test type 1 diabetes control algorithms. State-of-the-art offline reinforcement learning algorithms were evaluated against one of the most widely used artificial pancreas control algorithms. This comparison was conducted across 30 virtual patients (adults, adolescents and children) and considered 7,000 days of data, with performance being evaluated in accordance with current clinical guidelines. The simulator was also extended to consider realistic implementation challenges, such as measurement errors, incorrect patient information and limited quantities of available data.

This work provides a basis for continued reinforcement learning research in glucose control, demonstrating the potential of the approach to improve the health outcomes of people with type 1 diabetes, while highlighting the method's shortcomings and areas of necessary future development.

The researchers’ ultimate goal is to deploy reinforcement learning in real-world artificial pancreas systems. These devices operate with limited patient oversight and consequently will require significant evidence of safety and effectiveness to achieve regulatory approval.

Harry added: “This research demonstrates machine learning's potential to learn effective insulin dosing strategies from pre-collected type 1 diabetes data. The explored method outperforms one of the most widely used commercial artificial pancreas algorithms and demonstrates an ability to leverage a person's habits and schedule to respond more quickly to dangerous events.”

Sponge makes robotic device a soft touch

Robot sponge. Image credit: Tianqi Yue

This easy-to-make sponge-jamming device can help stiff robots handle delicate items carefully by mimicking the nuanced touch, or variable stiffness, of a human.

Robots can skip, jump and do somersaults, but they’re too rigid to hold an egg easily. Variable-stiffness devices are potential solutions for contact compliance on hard robots to reduce damage, or for improving the load capacity of soft robots.

This study, published at the IEEE International Conference on Robotics and Automation (ICRA) 2023, shows that variable stiffness can be achieved by a silicone sponge.

Lead author Tianqi Yue from Bristol’s Department of Engineering Mathematics explained: “Stiffness, also known as softness, is important in contact scenarios.

“Robotic arms are too rigid so they cannot make such a soft human-like grasp on delicate objects, for example, an egg.

“What makes humans different from robotic arms is that we have soft tissues enclosing rigid bones, which act as a natural mitigating mechanism.

“In this paper, we managed to develop a soft device with variable stiffness, to be mounted on the end of a robotic arm to make the robot-object contact safe.”

Robot sponge in action. Video Credit: Tianqi Yue.

Silicone sponge is a cheap and easy-to-fabricate material. It is a porous elastomer just like the cleaning sponge used in everyday tasks.

Squeezing the sponge stiffens it, which is why it can be turned into a variable-stiffness device.
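
As a toy illustration of that jamming principle, the sketch below models stiffness as rising steeply with how far the sponge is compressed. The exponential law and its constants are invented for illustration and are not measurements from the paper.

```python
# Toy illustration of the jamming principle: the more the silicone sponge is
# compressed, the stiffer the contact. The exponential law and its constants
# are purely illustrative assumptions, not measurements from the paper.

import numpy as np

def sponge_stiffness(compression_ratio: float,
                     k0: float = 0.5, alpha: float = 4.0) -> float:
    """Effective stiffness (N/mm) vs. compression ratio in [0, 0.8] (assumed model)."""
    compression_ratio = float(np.clip(compression_ratio, 0.0, 0.8))
    return k0 * np.exp(alpha * compression_ratio)

# Relaxed sponge -> soft contact; squeezed sponge -> stiff contact.
print(sponge_stiffness(0.0))   # ~0.5 N/mm
print(sponge_stiffness(0.6))   # ~5.5 N/mm
```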

This device could be used in industrial robots in scenarios including gripping jellies, eggs and other fragile items. It can also be used in service robots to make human-robot interaction safer.

Mr Yue added: “We managed to use a sponge to make a cheap and nimble but effective device that can help robots achieve soft contact with objects. The great potential comes from its low cost and light weight.

“We believe this silicone-sponge-based variable-stiffness device will provide a novel solution in industry and healthcare, for example, meeting tunable-stiffness requirements in robotic polishing and ultrasound imaging.”

The team will now look at making the device achieve variable stiffness in multiple directions, including rotation.

Paper: “A Silicone-sponge-based Variable-stiffness Device” by Tianqi Yue at the IEEE International Conference on Robotics and Automation (ICRA) 2023.

Robot fish makes splash with motion breakthrough

Robot fish. Image credit: Tsam Lung You

The robot fish was fitted with a twisted and coiled polymer (TCP) to drive it forward: a lightweight, low-cost device that relies on temperature change to generate movement, a dependence that also limits its speed.

A TCP works by contracting like a muscle when heated, converting the energy into mechanical motion. The TCP used in this work is warmed by Joule heating: the passage of current through an electrical conductor produces thermal energy and heats up the conductor. By minimising the distance between the TCP on one side of the robot fish and the spring on the other, the fin at the rear can be driven through a larger motion, enabling the robot fish to reach new speeds. The undulating flapping of its rear fin was measured at a frequency of 2 Hz, or two waves per second, and the frequency of the electric current matches the frequency of the tail flap.
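
A minimal sketch of that drive scheme is shown below: an on/off heating current at 2 Hz, whose frequency sets the tail-beat frequency, with the average Joule heating computed as P = I²R. The current, resistance and duty cycle are assumed values, not the robot's measured parameters.

```python
# Sketch of the drive idea: the TCP is heated by Joule heating, so the tail
# flaps at the frequency of the heating current. Here a 2 Hz on/off drive is
# generated; the resistance, current and duty cycle are assumed values, not
# the robot's actual electrical parameters.

import numpy as np

FLAP_HZ = 2.0         # target tail-beat frequency (as reported)
CURRENT_A = 0.8       # drive current when on (assumed)
RESISTANCE_OHM = 10   # TCP coil resistance (assumed)
DUTY = 0.3            # fraction of each cycle spent heating (assumed)

def drive_waveform(duration_s: float, sample_hz: float = 1000.0):
    """Return (time, current) arrays for an on/off Joule-heating drive at FLAP_HZ."""
    t = np.arange(0.0, duration_s, 1.0 / sample_hz)
    phase = (t * FLAP_HZ) % 1.0
    current = np.where(phase < DUTY, CURRENT_A, 0.0)
    return t, current

t, i = drive_waveform(2.0)
mean_power_w = np.mean(i**2 * RESISTANCE_OHM)   # average Joule heating, P = I^2 R
print(f"mean heating power ~ {mean_power_w:.2f} W at {FLAP_HZ} Hz")
```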

The findings, published at the 6th IEEE-RAS International Conference on Soft Robotics (RoboSoft 2023), provide a new route to raising the actuation frequency of TCPs (actuation being the action of causing a machine or device to operate) through thermomechanical design, and show the possibility of using TCPs at high frequency in aqueous environments.

Lead author Tsam Lung You from Bristol's Department of Engineering Mathematics said: “The twisted and coiled polymer (TCP) actuator is a promising novel actuator, exhibiting attractive properties of light weight, low cost, high energy density and a simple fabrication process.

“They can be made from very easily accessible materials such as fishing line, and they contract and provide linear actuation when heated up. However, the time needed for heat dissipation during the relaxation phase makes them slow.”

Optimising the structural design of the TCP-spring antagonistic muscle pair and bringing their anchor points closer together allowed the posterior fin to swing through a larger angle for the same amount of TCP actuation.

Antagonistic muscles. Image credit: Tsam Lung You

Although this requires greater force, TCP is a strong actuator with high work energy density, and is still able to drive the fin.

Until now, TCPs have been mostly used for applications such as wearable devices and robotic hands. This work opens up more areas of application where TCP can be used, such as marine robots for underwater exploration and monitoring.

Tsam Lung You added: “Our robotic fish swam at the fastest actuation frequency found in a real TCP application and also the highest locomotion speed of a TCP application so far.

“This is really exciting as it opens up more opportunities of TCP application in different areas.”

The team now plan to expand the scale and develop a knifefish-inspired TCP-driven ribbon fin robot that can swim agilely in water.

Sea creatures inspire marine robots which can operate in extra-terrestrial oceans

RoboSalps in action. Credits: Valentina Lo Gatto

These robotic units called RoboSalps, after their animal namesakes, have been engineered to operate in unknown and extreme environments such as extra-terrestrial oceans.

Although salps resemble jellyfish with their semi-transparent barrel-shaped bodies, they belong to the group Tunicata and have a complex life cycle, changing between solitary and aggregate generations in which they connect to form colonies.

RoboSalps have similarly light, tubular bodies and can link to each other to form ‘colonies’ which gives them new capabilities that can only be achieved because they work together.

Researcher Valentina Lo Gatto of Bristol’s Department of Aerospace Engineering is leading the study. She is also a student at the EPSRC Centre of Doctoral Training in Future Autonomous and Robotic Systems (FARSCOPE CDT).

She said: “RoboSalp is the first modular salp-inspired robot. Each module is made of a very light-weight soft tubular structure and a drone propeller which enables them to swim. These simple modules can be combined into ‘colonies’ that are much more robust and have the potential to carry out complex tasks. Because of their low weight and their robustness, they are ideal for extra-terrestrial underwater exploration missions, for example, in the subsurface ocean on the Jupiter moon Europa.”

RoboSalps are unique as each individual module can swim on its own. This is possible because of a small motor with rotor blades – typically used for drones – inserted into the soft tubular structure.

When swimming on their own, RoboSalps modules are difficult to control, but after joining them together to form colonies, they become more stable and show sophisticated movements.

In addition, by having multiple units joined together, scientists automatically obtain a redundant system, which makes it more robust against failure. If one module breaks, the whole colony can still move.

A colony of soft robots is a relatively novel concept with a wide range of interesting applications. RoboSalps are soft, potentially quite energy efficient, and robust due to inherent redundancy. This makes them ideal for autonomous missions where a direct and immediate human control might not be feasible.

Dr Helmut Hauser of Bristol’s Department of Engineering Maths, explained: “These include the exploration of remote submarine environments, sewage tunnels, and industrial cooling systems. Due to the low weight and softness of the RoboSalp modules, they are also ideal for extra-terrestrial missions. They can easily be stored in a reduced volume, ideal for reducing global space mission payloads.”

A compliant body also provides safer interaction with potentially delicate ecosystems, both on earth and extra-terrestrial, reducing the risk of environmental damage. The possibility to detach units or segments, and rearrange them, gives the system adaptability: once the target environment is reached, the colony could be deployed to start its exploration.

At a certain point, it could split into multiple segments, each exploring in a different direction, and afterwards reassemble in a new configuration to achieve a different objective such as manipulation or sample collection.

Prof Jonathan Rossiter added: “We are also developing control approaches that are able to exploit the compliance of the modules with the goal of achieving energy efficient movements close to those observed in biological salps.”

Robot helps reveal how ants pass on knowledge

Ant leading other ant to new nest, known as tandem running. Image credit: Norasmah Basari and Nigel R Franks

The team built the robot to mimic the behaviour of rock ants that use one-to-one tuition, in which an ant that has discovered a much better new nest can teach the route there to another individual.

The findings, published in the Journal of Experimental Biology, confirm that most of the important elements of teaching in these ants are now understood because the teaching ant can be replaced by a machine.

Key to this process of teaching is tandem running where one ant literally leads another ant quite slowly along a route to the new nest. The pupil ant learns the route sufficiently well that it can find its own way back home and then lead a tandem-run with another ant to the new nest, and so on.

Prof Nigel Franks of Bristol’s School of Biological Sciences said: “Teaching is so important in our own lives that we spend a great deal of time either instructing others or being taught ourselves. This should cause us to wonder whether teaching actually occurs among non-human animals. And, in fact, the first case in which teaching was demonstrated rigorously in any other animal was in an ant.” The team wanted to determine what was necessary and sufficient in such teaching. If they could build a robot that successfully replaced the teacher, this should show that they largely understood all the essential elements in this process.

Prof Nigel Franks showing Sir David Attenborough the gantry during the opening of the new Life Sciences Building in 2014. Image credit: University of Bristol

The researchers built a large arena so there was an appreciable distance between the ants’ old nest, which was deliberately made to be of low quality, and a new much better one that ants could be led to by a robot. A gantry was placed atop the arena to move back and forth with a small sliding robot attached to it, so that the scientists could direct the robot to move along either straight or wavy routes. Attractive scent glands, from a worker ant, were attached to the robot to give it the pheromones of an ant teacher.

Prof Franks explained: “We waited for an ant to leave the old nest and put the robot pin, adorned with attractive pheromones, directly ahead of it. The pinhead was programmed to move towards the new nest either on a straight path or on a beautifully sinuous one. We had to allow for the robot to be interrupted in its journey, by us, so that we could wait for the following ant to catch up after it had looked around to learn landmarks.”

Diagram of ant pheromone glands. Image credit: Norasmah Basari

“When the follower ant had been led by the robot to the new nest, we allowed it to examine the new nest and then, in its own time, begin its homeward journey. We then used the gantry automatically to track the path of the returning ant.”

The team found that the robot had indeed taught the route successfully to the apprentice ant. The ants knew their way back to the old nest whether they had taken a winding path or a straight one.

Prof Franks explained: “A straight path might be quicker but a winding path would provide more time in which the following ant could better learn landmarks so that it could find its way home as efficiently as if it had been on a straight path.

“Crucially, we could compare the performance of the ants that the robot had taught with ones that we carried to the site of the new nest and that had not had an opportunity to learn the route. The taught ants found their way home much more quickly and successfully.”

The experiments were conducted by undergraduates Jacob Podesta, who is now a PhD student at York, and Edward Jarvis, who was also a Masters student in Professor Nigel Franks' lab. The gantry programming was accomplished by Dr Alan Worley and all the statistical analyses were driven by Dr Ana Sendova-Franks.

Their approach should make it possible to interrogate further exactly what is involved in successful teaching.

Innovative ‘smart socks’ could help millions living with dementia

Left: The display that carers will see in the Milbotix app. Right: Milbotix founder and CEO Dr Zeke Steer

Inventor Dr Zeke Steer quit his job and took a PhD at Bristol Robotics Laboratory so he could find a way to help people like his great-grandmother, who became anxious and aggressive because of her dementia.

Milbotix’s smart socks track heart rate, sweat levels and motion to give insights on the wearer’s wellbeing – most importantly how anxious the person is feeling.

They look and feel like normal socks, do not need charging, are machine washable and provide a steady stream of data to carers, who can easily see their patient’s metrics on an app.
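
As a rough illustration of how three such signals might be fused into a single carer-facing indicator, the sketch below combines heart rate, skin conductance (a proxy for sweat) and motion into a 0-1 stress score. The feature weights and thresholds are invented for illustration; Milbotix's actual model is not described in this article.

```python
# Illustrative sketch of turning the three sock signals (heart rate, skin
# conductance as a proxy for sweat, and motion) into a coarse stress score
# for a carer-facing app. Feature names, weights and thresholds are invented
# for illustration; Milbotix's actual model is not public in this article.

from dataclasses import dataclass

@dataclass
class SockSample:
    heart_rate_bpm: float
    skin_conductance_us: float   # microsiemens
    motion_rms_g: float          # accelerometer activity

def stress_score(s: SockSample, baseline_hr: float = 70.0) -> float:
    """Return a 0-1 score; higher means more likely agitation (assumed weights)."""
    hr_term = max(0.0, (s.heart_rate_bpm - baseline_hr) / 50.0)
    eda_term = min(1.0, s.skin_conductance_us / 20.0)
    motion_term = min(1.0, s.motion_rms_g / 1.5)
    return min(1.0, 0.5 * hr_term + 0.35 * eda_term + 0.15 * motion_term)

def alert_carer(score: float, threshold: float = 0.7) -> bool:
    """The app could flag rising stress early so carers can intervene."""
    return score >= threshold

print(alert_carer(stress_score(SockSample(120, 18, 0.9))))  # True under these assumptions
```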

Current alternatives to Milbotix’s product are worn on wrist straps, which can stigmatise or even cause more stress.

Dr Steer said: “The foot is actually a great place to collect data about stress, and socks are a familiar piece of clothing that people wear every day.

“Our research shows that the socks can accurately recognise signs of stress – which could really help not just those with dementia and autism, but their carers too.”

Dr Steer was working as a software engineer in the defence industry when his great-grandmother, Kath, began showing the ill effects of dementia.

Once gentle and with a passion for jazz music, Kath became agitated and aggressive, and eventually accused Dr Steer’s grandmother of stealing from her.

Dr Steer decided to investigate how wearable technologies and artificial intelligence could help with his great-grandmother’s symptoms. He studied for a PhD at Bristol Robotics Laboratory, which is jointly run by the University of Bristol and UWE Bristol.

During the research, he volunteered at a dementia care home operated by the St Monica Trust. Garden House Care Home Manager, Fran Ashby said: “Zeke’s passion was clear from his first day with us and he worked closely with staff, relatives and residents to better understand the effects and treatment of dementia.

“We were really impressed at the potential of his assistive technology to predict impending agitation and help alert staff to intervene before it can escalate into distressed behaviours.

“Using modern assistive technology examples like smart socks can help enable people living with dementia to retain their dignity and have better quality outcomes for their day-to-day life.”

While volunteering, Dr Steer hit upon the idea for Milbotix, which he launched as a business in February 2020.

“I came to see that my great-grandmother wasn’t an isolated episode, and that distressed behaviours are very common,” he explained.

Milbotix are currently looking to work with innovative social care organisations to refine and evaluate the smart socks.

The business recently joined SETsquared Bristol, the University’s world-leading incubator for high growth tech businesses.

Dr Steer was awarded one of their Breakthrough Bursaries, which provides heavily subsidised membership to founders from diverse backgrounds. Dr Steer is also currently on the University’s QUEST programme, which supports founders in commercialising their products.

Charity Alzheimer’s Society says there will be 1.6 million people with dementia in the UK by 2040, with one person developing dementia every three minutes. Dementia is thought to cost the UK £34.7 billion a year.

Meanwhile, according to the Government, autism affects 1% of the UK population, or some 700,000 people, 15-30% of whom are non-verbal some or all of the time.

Dr Steer is now growing the business: testing the socks with people living with mid to late-stage dementia and developing the tech before bringing the product to market next year. Milbotix will begin a funding round later this year.

Milbotix is currently a team of three, including Jacqui Arnold, who has been working with people living with dementia for 40 years.

She said: “These socks could make such a difference. Having that early indicator of someone’s stress levels rising could provide the early intervention they need to reduce their distress – be that touch, music, pain relief or simply having someone there with them.”

Milbotix will be supported by Alzheimer’s Society through their Accelerator Programme, which is helping fund the smart socks’ development, providing innovation support and helping test what it described as a “brilliant product”.

Natasha Howard-Murray, Senior Innovator at Alzheimer’s Society, said: “Some people with dementia may present behaviours such as aggression, irritability and resistance to care.

“This innovative wearable tech is a fantastic, accessible way for staff to better monitor residents’ distress and agitation.”

Professor Judith Squires, Deputy Vice-Chancellor at the University of Bristol, said: “It is fantastic to see Zeke using the skills he learnt with us to improve the wellbeing of some of those most in need.

“The innovative research that Zeke has undertaken has the potential to help millions live better lives. We hope to see Milbotix flourish.”

Touchy subject: 3D printed fingertip ‘feels’ like human skin

Robotic hand with a 3D-printed tactile fingertip on the little (pinky) finger. The white rigid back to the fingertip is covered with the black flexible 3D-printed skin.

Machines can beat the world’s best chess player, but they cannot handle a chess piece as well as an infant. This lack of robot dexterity is partly because artificial grippers lack the fine tactile sense of the human fingertip, which is used to guide our hands as we pick up and handle objects.

Two papers published in the Journal of the Royal Society Interface give the first in-depth comparison of an artificial fingertip with neural recordings of the human sense of touch. The research was led by Professor of Robotics & AI (Artificial Intelligence), Nathan Lepora, from the University of Bristol’s Department of Engineering Maths and based at the Bristol Robotics Laboratory.

“Our work helps uncover how the complex internal structure of human skin creates our human sense of touch. This is an exciting development in the field of soft robotics – being able to 3D-print tactile skin could create robots that are more dexterous or significantly improve the performance of prosthetic hands by giving them an in-built sense of touch,” said Professor Lepora.

Cut-through section on the 3D-printed tactile skin. The white plastic is a rigid mount for the flexible black rubber skin. Both parts are made together on an advanced 3D-printer. The ‘pins’ on the inside of the skin replicate dermal papillae that are formed inside human skin.

Professor Lepora and colleagues created the sense of touch in the artificial fingertip using a 3D-printed mesh of pin-like papillae on the underside of the compliant skin, which mimic the dermal papillae found between the outer epidermal and inner dermal layers of human tactile skin. The papillae are made on advanced 3D-printers that can mix together soft and hard materials to create complicated structures like those found in biology.

“We found our 3D-printed tactile fingertip can produce artificial nerve signals that look like recordings from real, tactile neurons. Human tactile nerves transmit signals from various nerve endings called mechanoreceptors, which can signal the pressure and shape of a contact. Classic work by Phillips and Johnson in 1981 first plotted electrical recordings from these nerves to study ‘tactile spatial resolution’ using a set of standard ridged shapes used by psychologists. In our work, we tested our 3D-printed artificial fingertip as it ‘felt’ those same ridged shapes and discovered a startlingly close match to the neural data,” said Professor Lepora.
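
The comparison described here can be illustrated with a simple similarity measure between two response profiles recorded over the same ridged stimulus, one from the artificial fingertip and one from a tactile neuron. The placeholder data and the use of a plain Pearson correlation below are illustrative choices, not necessarily the papers' exact analysis.

```python
# Sketch of the kind of comparison described above: how closely an artificial
# tactile signal from the 3D-printed fingertip tracks a neural recording as
# both are swept over the same ridged stimulus. The data arrays here are
# placeholders; the similarity measure (Pearson correlation) is a generic
# choice, not necessarily the papers' exact analysis.

import numpy as np

def normalised_similarity(artificial: np.ndarray, neural: np.ndarray) -> float:
    """Pearson correlation between two equally sampled response profiles."""
    a = (artificial - artificial.mean()) / artificial.std()
    n = (neural - neural.mean()) / neural.std()
    return float(np.mean(a * n))

# Placeholder profiles over a ridged stimulus (one value per spatial position).
x = np.linspace(0, 4 * np.pi, 200)
neural_profile = np.sin(x) + 0.10 * np.random.randn(200)      # stand-in for 1981 data
artificial_profile = np.sin(x) + 0.15 * np.random.randn(200)  # stand-in for fingertip output

print(f"similarity ~ {normalised_similarity(artificial_profile, neural_profile):.2f}")
```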

“For me, the most exciting moment was when we looked at our artificial nerve recordings from the 3D-printed fingertip and they looked like the real recordings from over 40 years ago! Those recordings are very complex with hills and dips over edges and ridges, and we saw the same pattern in our artificial tactile data,” said Professor Lepora.

While the research found a remarkably close match between the artificial fingertip and human nerve signals, it was not as sensitive to fine detail. Professor Lepora suspects this is because the 3D-printed skin is thicker than real skin and his team is now exploring how to 3D-print structures on the microscopic scale of human skin.

“Our aim is to make artificial skin as good – or even better – than real skin,” said Professor Lepora.


Bristol scientists develop insect-sized flying robots with flapping wings

Front view of the flying robot. Image credit: Dr Tim Helps

This new advance, published in the journal Science Robotics, could pave the way for smaller, lighter and more effective micro flying robots for environmental monitoring, search and rescue, and deployment in hazardous environments.

Until now, typical micro flying robots have used motors, gears and other complex transmission systems to achieve the up-and-down motion of the wings. This has added complexity, weight and undesired dynamic effects.

Taking inspiration from bees and other flying insects, researchers from Bristol’s Faculty of Engineering, led by Professor of Robotics Jonathan Rossiter, have successfully demonstrated a direct-drive artificial muscle system, called the Liquid-amplified Zipping Actuator (LAZA), that achieves wing motion using no rotating parts or gears.

The LAZA system greatly simplifies the flapping mechanism, enabling future miniaturisation of flapping robots down to the size of insects.

In the paper, the team show how a pair of LAZA-powered flapping wings can provide more power compared with insect muscle of the same weight, enough to fly a robot across a room at 18 body lengths per second.
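
For a sense of scale, the relative speed can be converted to an absolute one under an assumed body length (the paper reports speed only in body lengths per second):

```python
# Quick conversion of the reported relative speed to an absolute one.
# The body length used here is an assumption for illustration.
body_length_m = 0.05                # assumed ~5 cm robot body
speed_m_per_s = 18 * body_length_m  # 18 body lengths per second
print(f"{speed_m_per_s:.2f} m/s")   # ~0.90 m/s under this assumption
```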

They also demonstrated how the LAZA can deliver consistent flapping over more than one million cycles, important for making flapping robots that can undertake long-haul flights.

The team expect the LAZA to be adopted as a fundamental building block for a range of autonomous insect-like flying robots.

Dr Tim Helps, lead author and developer of the LAZA system said: “With the LAZA, we apply electrostatic forces directly on the wing, rather than through a complex, inefficient transmission system. This leads to better performance, simpler design, and will unlock a new class of low-cost, lightweight flapping micro-air vehicles for future applications, like autonomous inspection of off-shore wind turbines.”

Professor Rossiter added: “Making smaller and better performing flapping wing micro robots is a huge challenge. LAZA is an important step toward autonomous flying robots that could be as small as insects and perform environmentally critical tasks such as plant pollination and exciting emerging roles such as finding people in collapsed buildings.”

A camera that knows exactly where it is

Overview of the on-sensor mapping. The system moves around and as it does it builds a visual catalogue of what it observes. This is the map that is later used to know if it has been there before.
Image credit: University of Bristol

Knowing where you are on a map is one of the most useful pieces of information when navigating journeys. It allows you to plan where to go next and also tracks where you have been before. This is essential for smart devices from robot vacuum cleaners to delivery drones to wearable sensors keeping an eye on our health.

But one important obstacle is that systems that need to build or use maps are very complex and commonly rely on external signals like GPS that do not work indoors, or require a great deal of energy due to the large number of components involved.

Walterio Mayol-Cuevas, Professor in Robotics, Computer Vision and Mobile Systems at the University of Bristol’s Department of Computer Science, led the team that has been developing this new technology.

He said: “We often take for granted things like our impressive spatial abilities. Take bees or ants as an example. They have been shown to be able to use visual information to move around and achieve highly complex navigation, all without GPS or much energy consumption.

“In great part this is because their visual systems are extremely efficient and well-tuned to making and using maps, and robots can’t compete there yet.”

However, a new breed of sensor-processor device that the team calls a Pixel Processor Array (PPA) allows processing on-sensor. This means that as images are sensed, the device can decide what information to keep, what information to discard, and only use what it needs for the task at hand.

An example of such a PPA device is the SCAMP architecture, developed at the University of Manchester by the team's colleagues led by Piotr Dudek, Professor of Circuits and Systems, and his team. This PPA has one small processor for every pixel, which allows for massively parallel computation on the sensor itself.

The team at the University of Bristol has previously demonstrated how these new systems can recognise objects at thousands of frames per second, but the new research shows how a sensor-processor device can make maps and use them, all at the time of image capture.

This work was part of the MSc dissertation of Hector Castillo-Elizalde, who did his MSc in Robotics at the University of Bristol. He was co-supervised by Yanan Liu who is also doing his PhD on the same topic and Dr Laurie Bose.

Hector Castillo-Elizalde and the team developed a mapping algorithm that runs all on-board the sensor-processor device.

The algorithm is deceptively simple: when a new image arrives, the algorithm decides whether it is sufficiently different from what it has seen before. If it is, it stores some of the image's data; if not, it discards it.
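
A minimal sketch of that catalogue-building step, together with the matching step used later for localisation, is given below. The descriptor (a downsampled, normalised image) and the novelty threshold are generic stand-ins; the real system runs on the pixel-processor array itself rather than in Python.

```python
# Minimal sketch of the catalogue-building idea described above, plus the
# matching step used later for localisation. The descriptor and threshold are
# generic stand-ins; the actual system runs on the SCAMP pixel-processor array.

import numpy as np

class VisualCatalogue:
    def __init__(self, novelty_threshold: float = 0.25):
        self.descriptors = []                 # the "map": one descriptor per kept view
        self.novelty_threshold = novelty_threshold

    @staticmethod
    def describe(image: np.ndarray, size: int = 16) -> np.ndarray:
        """Tiny descriptor: block-average the image to size x size and normalise."""
        h, w = image.shape
        small = image[:h - h % size, :w - w % size]
        small = small.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
        small = small - small.mean()
        return small / (np.linalg.norm(small) + 1e-8)

    def update(self, image: np.ndarray) -> None:
        """Mapping step: keep the descriptor only if the view is sufficiently new."""
        d = self.describe(image)
        if self.descriptors:
            nearest = min(np.linalg.norm(d - k) for k in self.descriptors)
            if nearest <= self.novelty_threshold:
                return                        # familiar view: discard it
        self.descriptors.append(d)            # novel view: add it to the map

    def localise(self, image: np.ndarray) -> int:
        """Localisation step: index of the closest catalogue entry (predicted node)."""
        d = self.describe(image)
        return int(np.argmin([np.linalg.norm(d - k) for k in self.descriptors]))
```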

Right: the system moves around the world. Left: a new image is seen and a decision is made whether to add it to the visual catalogue (top left); this is the pictorial map that can then be used to localise the system later. Image credit: University of Bristol

As the PPA device is moved around, by a person or a robot for example, it collects a visual catalogue of views. This catalogue can then be used to match any new image when the device is in localisation mode.

Importantly, no images go out of the PPA, only the key data that indicates where it is with respect to the visual catalogue. This makes the system more energy efficient and also helps with privacy.

During localisation the incoming image is compared to the visual catalogue (Descriptor database) and if a match is found, the system will tell where it is (Predicted node, small white rectangle at the top) relative to the catalogue. Note how the system is able to match images even if there are changes in illumination or objects like people moving.

The team believes that this type of artificial visual system, developed for visual processing rather than necessarily to record images, is a first step towards making more efficient smart systems that can use visual information to understand and move in the world. Tiny, energy-efficient robots or smart glasses doing useful things for the planet and for people will need spatial understanding, which will come from being able to make and use maps.

The research has been partially funded by the Engineering and Physical Sciences Research Council (EPSRC), by a CONACYT scholarship to Hector Castillo-Elizalde and a CSC scholarship to Yanan Liu.


Lily the barn owl reveals how birds fly in gusty winds

Scientists from the University of Bristol and the Royal Veterinary College have discovered how birds are able to fly in gusty conditions – findings that could inform the development of bio-inspired small-scale aircraft.

Lily the barn owl flying
Lily flies through gusts: Scientists from Bristol and the RVC have discovered how birds fly in gusty conditions – with implications for small-scale aircraft design. Image credit: Cheney et al 2020

“Birds routinely fly in high winds close to buildings and terrain – often in gusts as fast as their flight speed. So the ability to cope with strong and sudden changes in wind is essential for their survival and to be able to do things like land safely and capture prey,” said Dr Shane Windsor from the Department of Aerospace Engineering at the University of Bristol.

“We know birds cope amazingly well in conditions which challenge engineered air vehicles of a similar size but, until now, we didn’t understand the mechanics behind it,” said Dr Windsor.

The study, published in Proceedings of the Royal Society B, reveals how bird wings act as a suspension system to cope with changing wind conditions. The team, which included Bristol PhD student Nicholas Durston and researchers Jialei Song and James Usherwood from Dongguan University of Technology in China and the RVC respectively, used an innovative combination of high-speed, video-based 3D surface reconstruction, computed tomography (CT) scans, and computational fluid dynamics (CFD) to understand how birds ‘reject’ gusts through wing morphing, i.e. by changing the shape and posture of their wings.

In the experiment, conducted in the Structure and Motion Laboratory at the Royal Veterinary College, the team filmed Lily, a barn owl, gliding through a range of fan-generated vertical gusts, the strongest of which was as fast as her flight speed. Lily is a trained falconry bird who is a veteran of many nature documentaries, so wasn’t fazed in the least by all the lights and cameras. “We began with very gentle gusts in case Lily had any difficulties, but soon found that – even at the highest gust speeds we could make – Lily was unperturbed; she flew straight through to get the food reward being held by her trainer, Lloyd Buck,” commented Professor Richard Bomphrey of the Royal Veterinary College.

“Lily flew through the bumpy gusts and consistently kept her head and torso amazingly stable over the trajectory, as if she was flying with a suspension system. When we analysed it, what surprised us was that the suspension-system effect wasn’t just due to aerodynamics, but benefited from the mass in her wings. For reference, each of our upper limbs is about 5% of our body weight; for a bird it’s about double, and they use that mass to effectively absorb the gust,” said joint lead-author Dr Jorn Cheney from the Royal Veterinary College.

“Perhaps most exciting is the discovery that the very fastest part of the suspension effect is built into the mechanics of the wings, so birds don’t actively need to do anything for it to work. The mechanics are very elegant. When you strike a ball at the sweetspot of a bat or racquet, your hand is not jarred because the force there cancels out. Anyone who plays a bat-and-ball sport knows how effortless this feels. A wing has a sweetspot, just like a bat. Our analysis suggests that the force of the gust acts near this sweetspot and this markedly reduces the disturbance to the body during the first fraction of a second. The process is automatic and buys just enough time for other clever stabilising processes to kick in,” added joint lead-author, Dr Jonathan Stevenson from the University of Bristol.
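
The "sweetspot" in the bat analogy is, in rigid-body terms, the centre of percussion. As a rough textbook idealisation (not the paper's model), treat the wing as a uniform rod of mass $m$ and length $L$ hinged at the shoulder; an impulsive gust force applied at distance $b$ from the hinge produces no reaction at the hinge when

$$ b \;=\; \frac{I}{m\,d} \;=\; \frac{\tfrac{1}{3}mL^{2}}{m\cdot\tfrac{L}{2}} \;=\; \frac{2L}{3}, $$

where $I$ is the wing's moment of inertia about the hinge and $d$ is the distance from the hinge to the wing's centre of mass. Under this idealisation a gust acting about two-thirds of the way out along the wing barely jars the body, consistent with the automatic, passive effect described above.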

Dr Windsor said the next step for the research, which was funded by the European Research Council (ERC), Air Force Office of Scientific Research and the Wellcome Trust, is to develop bio-inspired suspension systems for small-scale aircraft.

Robots can now learn to swarm on the go

A new generation of swarming robots which can independently learn and evolve new behaviours in the wild is one step closer, thanks to research from the University of Bristol and the University of the West of England (UWE).

The team used artificial evolution to enable the robots to automatically learn swarm behaviours which are understandable to humans. This new advance, published this Friday in Advanced Intelligent Systems, could create new robotic possibilities for environmental monitoring, disaster recovery, infrastructure maintenance, logistics and agriculture.