Innovative ‘smart socks’ could help millions living with dementia

Left: The display that carers will see in the Milbotix app. Right: Milbotix founder and CEO Dr Zeke Steer

Inventor Dr Zeke Steer quit his job and took a PhD at Bristol Robotics Laboratory so he could find a way to help people like his great-grandmother, who became anxious and aggressive because of her dementia.

Milbotix’s smart socks track heart rate, sweat levels and motion to give insights on the wearer’s wellbeing – most importantly how anxious the person is feeling.

They look and feel like normal socks, do not need charging, are machine washable and provide a steady stream of data to carers, who can easily see their patient’s metrics on an app.
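 
Milbotix has not published how the socks turn raw signals into wellbeing insights, but a toy sketch gives a flavour of what combining the three sensor streams into a single anxiety indicator might look like. Everything below – the field names, weights, scales and alert threshold – is invented for illustration and is not Milbotix’s method:

```python
# Hypothetical sketch only: Milbotix has not published its algorithm.
# Shows one simple way three sock-sensor streams could be combined
# into a single 0-1 stress score for a carer-facing app.
from dataclasses import dataclass

@dataclass
class SensorWindow:
    heart_rate_bpm: float    # mean heart rate over the window
    skin_conductance: float  # sweat-level proxy, in microsiemens
    motion_intensity: float  # accelerometer magnitude, scaled 0-1

def stress_score(w: SensorWindow,
                 resting_hr: float = 70.0,
                 baseline_sc: float = 2.0) -> float:
    """Weighted deviation from the wearer's baseline.
    All weights and scales here are invented for illustration."""
    hr_term = max(0.0, (w.heart_rate_bpm - resting_hr) / 50.0)
    sc_term = max(0.0, (w.skin_conductance - baseline_sc) / 10.0)
    mo_term = min(1.0, max(0.0, w.motion_intensity))
    return min(1.0, 0.5 * hr_term + 0.35 * sc_term + 0.15 * mo_term)

# The app might alert carers when the score crosses a threshold:
window = SensorWindow(heart_rate_bpm=105, skin_conductance=6.5,
                      motion_intensity=0.4)
if stress_score(window) > 0.5:
    print("Rising stress detected - consider early intervention")
```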

Current alternatives to Milbotix’s product are worn on the wrist, which can stigmatise wearers or even cause more stress.

Dr Steer said: “The foot is actually a great place to collect data about stress, and socks are a familiar piece of clothing that people wear every day.

“Our research shows that the socks can accurately recognise signs of stress – which could really help not just those with dementia and autism, but their carers too.”

Dr Steer was working as a software engineer in the defence industry when his great-grandmother, Kath, began showing the ill effects of dementia.

Once gentle and with a passion for jazz music, Kath became agitated and aggressive, and eventually accused Dr Steer’s grandmother of stealing from her.

Dr Steer decided to investigate how wearable technologies and artificial intelligence could help with his great-grandmother’s symptoms. He studied for a PhD at Bristol Robotics Laboratory, which is jointly run by the University of Bristol and UWE Bristol.

During the research, he volunteered at a dementia care home operated by the St Monica Trust. Garden House Care Home Manager Fran Ashby said: “Zeke’s passion was clear from his first day with us and he worked closely with staff, relatives and residents to better understand the effects and treatment of dementia.

“We were really impressed at the potential of his assistive technology to predict impending agitation and help alert staff to intervene before it can escalate into distressed behaviours.

“Using modern assistive technology examples like smart socks can help enable people living with dementia to retain their dignity and have better quality outcomes for their day-to-day life.”

While volunteering, Dr Steer hit upon the idea for Milbotix, which he launched as a business in February 2020.

“I came to see that my great-grandmother wasn’t an isolated case, and that distressed behaviours are very common,” he explained.

Milbotix is currently looking to work with innovative social care organisations to refine and evaluate the smart socks.

The business recently joined SETsquared Bristol, the University’s world-leading incubator for high growth tech businesses.

Dr Steer was awarded one of its Breakthrough Bursaries, which provide heavily subsidised membership to founders from diverse backgrounds. He is also currently on the University’s QUEST programme, which supports founders in commercialising their products.

Charity Alzheimer’s Society says there will be 1.6 million people with dementia in the UK by 2040, with one person developing dementia every three minutes. Dementia is thought to cost the UK £34.7 billion a year.

Meanwhile, according to the Government, autism affects 1% of the UK population – some 700,000 people – 15-30% of whom are non-verbal some or all of the time.

Dr Steer is now growing the business: testing the socks with people living with mid- to late-stage dementia and developing the technology before bringing the product to market next year. Milbotix will begin a funding round later this year.

Milbotix is currently a team of three, including Jacqui Arnold, who has been working with people living with dementia for 40 years.

She said: “These socks could make such a difference. Having that early indicator of someone’s stress levels rising could provide the early intervention they need to reduce their distress – be that touch, music, pain relief or simply having someone there with them.”

Milbotix will be supported by Alzheimer’s Society through their Accelerator Programme, which is helping fund the smart socks’ development, providing innovation support and helping test what the charity described as a “brilliant product”.

Natasha Howard-Murray, Senior Innovator at Alzheimer’s Society, said: “Some people with dementia may present behaviours such as aggression, irritability and resistance to care.

“This innovative wearable tech is a fantastic, accessible way for staff to better monitor residents’ distress and agitation.”

Professor Judith Squires, Deputy Vice-Chancellor at the University of Bristol, said: “It is fantastic to see Zeke using the skills he learnt with us to improve the wellbeing of some of those most in need.

“The innovative research that Zeke has undertaken has the potential to help millions live better lives. We hope to see Milbotix flourish.”

Touchy subject: 3D printed fingertip ‘feels’ like human skin

Robotic hand with a 3D-printed tactile fingertip on the little (pinky) finger. The white rigid back to the fingertip is covered with the black flexible 3D-printed skin.

Machines can beat the world’s best chess player, but they cannot handle a chess piece as skilfully as an infant can. This lack of robot dexterity is partly because artificial grippers lack the fine tactile sense of the human fingertip, which is used to guide our hands as we pick up and handle objects.

Two papers published in the Journal of the Royal Society Interface give the first in-depth comparison of an artificial fingertip with neural recordings of the human sense of touch. The research was led by Nathan Lepora, Professor of Robotics & AI (Artificial Intelligence) in the University of Bristol’s Department of Engineering Mathematics, based at the Bristol Robotics Laboratory.

“Our work helps uncover how the complex internal structure of human skin creates our human sense of touch. This is an exciting development in the field of soft robotics – being able to 3D-print tactile skin could create robots that are more dexterous or significantly improve the performance of prosthetic hands by giving them an in-built sense of touch,” said Professor Lepora.

Cut-through section on the 3D-printed tactile skin. The white plastic is a rigid mount for the flexible black rubber skin. Both parts are made together on an advanced 3D-printer. The ‘pins’ on the inside of the skin replicate dermal papillae that are formed inside human skin.

Professor Lepora and colleagues created the sense of touch in the artificial fingertip using a 3D-printed mesh of pin-like papillae on the underside of the compliant skin, which mimic the dermal papillae found between the outer epidermal and inner dermal layers of human tactile skin. The papillae are made on advanced 3D-printers that can mix together soft and hard materials to create complicated structures like those found in biology.

“We found our 3D-printed tactile fingertip can produce artificial nerve signals that look like recordings from real, tactile neurons. Human tactile nerves transmit signals from various nerve endings called mechanoreceptors, which can signal the pressure and shape of a contact. Classic work by Phillips and Johnson in 1981 first plotted electrical recordings from these nerves to study ‘tactile spatial resolution’ using a set of standard ridged shapes used by psychologists. In our work, we tested our 3D-printed artificial fingertip as it ‘felt’ those same ridged shapes and discovered a startlingly close match to the neural data,” said Professor Lepora.
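 
The published comparison is far more detailed, but a small sketch illustrates the kind of analysis involved: lining up an artificial response profile against a biological one and measuring how closely their shapes agree. The numbers below are invented; only the comparison method is representative:

```python
# Illustrative sketch only: these arrays are invented, not the
# published recordings. They show the kind of profile comparison used
# to judge how closely artificial tactile signals match neural ones.
import numpy as np

# Firing-rate-like responses as a fingertip moves across a ridged
# stimulus (one value per position), with "hills and dips" over edges.
human_neural = np.array([0.20, 0.80, 1.00, 0.60, 0.30, 0.70, 1.10, 0.50])
artificial   = np.array([0.25, 0.75, 0.95, 0.65, 0.35, 0.65, 1.05, 0.55])

# Pearson correlation: 1.0 would mean identical profile shapes.
r = np.corrcoef(human_neural, artificial)[0, 1]
print(f"profile-shape correlation: {r:.3f}")
```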

“For me, the most exciting moment was when we looked at our artificial nerve recordings from the 3D-printed fingertip and they looked like the real recordings from over 40 years ago! Those recordings are very complex with hills and dips over edges and ridges, and we saw the same pattern in our artificial tactile data,” said Professor Lepora.

While the research found a remarkably close match between the artificial fingertip and human nerve signals, the artificial fingertip was not as sensitive to fine detail. Professor Lepora suspects this is because the 3D-printed skin is thicker than real skin; his team is now exploring how to 3D-print structures at the microscopic scale of human skin.

“Our aim is to make artificial skin as good – or even better – than real skin,” said Professor Lepora.

Bristol scientists develop insect-sized flying robots with flapping wings

Front view of the flying robot. Image credit: Dr Tim Helps

This new advance, published in the journal Science Robotics, could pave the way for smaller, lighter and more effective micro flying robots for environmental monitoring, search and rescue, and deployment in hazardous environments.

Until now, typical micro flying robots have used motors, gears and other complex transmission systems to achieve the up-and-down motion of the wings. This has added complexity, weight and undesired dynamic effects.

Taking inspiration from bees and other flying insects, researchers from Bristol’s Faculty of Engineering, led by Professor of Robotics Jonathan Rossiter, have successfully demonstrated a direct-drive artificial muscle system, called the Liquid-amplified Zipping Actuator (LAZA), that achieves wing motion using no rotating parts or gears.

The LAZA system greatly simplifies the flapping mechanism, enabling future miniaturisation of flapping robots down to the size of insects.

In the paper, the team show how a pair of LAZA-powered flapping wings can provide more power than insect muscle of the same weight – enough to fly a robot across a room at 18 body lengths per second.

They also demonstrated how the LAZA can deliver consistent flapping over more than one million cycles, important for making flapping robots that can undertake long-haul flights.

The team expect the LAZA to be adopted as a fundamental building block for a range of autonomous insect-like flying robots.

Dr Tim Helps, lead author and developer of the LAZA system said: “With the LAZA, we apply electrostatic forces directly on the wing, rather than through a complex, inefficient transmission system. This leads to better performance, simpler design, and will unlock a new class of low-cost, lightweight flapping micro-air vehicles for future applications, like autonomous inspection of off-shore wind turbines.”

Professor Rossiter added: “Making smaller and better performing flapping wing micro robots is a huge challenge. LAZA is an important step toward autonomous flying robots that could be as small as insects and perform environmentally critical tasks such as plant pollination and exciting emerging roles such as finding people in collapsed buildings.”

A camera that knows exactly where it is

Overview of the on-sensor mapping. As the system moves around, it builds a visual catalogue of what it observes; this catalogue is the map later used to recognise whether it has been somewhere before.
Image credit: University of Bristol

Knowing where you are on a map is one of the most useful pieces of information when navigating journeys. It allows you to plan where to go next and also tracks where you have been before. This is essential for smart devices from robot vacuum cleaners to delivery drones to wearable sensors keeping an eye on our health.

But one important obstacle is that systems that need to build or use maps are very complex: they commonly rely on external signals such as GPS, which do not work indoors, or they require a great deal of energy due to the large number of components involved.

Walterio Mayol-Cuevas, Professor in Robotics, Computer Vision and Mobile Systems at the University of Bristol’s Department of Computer Science, led the team that has been developing this new technology.

He said: “We often take for granted things like our impressive spatial abilities. Take bees or ants as an example. They have been shown to be able to use visual information to move around and achieve highly complex navigation, all without GPS or much energy consumption.

“In great part this is because their visual systems are extremely efficient and well-tuned to making and using maps, and robots can’t compete there yet.”

However, a new breed of sensor-processor device, which the team calls a Pixel Processor Array (PPA), allows processing on the sensor itself. This means that as images are sensed, the device can decide what information to keep, what to discard, and use only what it needs for the task at hand.

An example of such a PPA device is the SCAMP architecture, developed by Piotr Dudek, Professor of Circuits and Systems at the University of Manchester, and his team. This PPA has one small processor for every pixel, which allows for massively parallel computation on the sensor itself.

The team at the University of Bristol has previously demonstrated how these new systems can recognise objects at thousands of frames per second, but the new research shows how a sensor-processor device can make maps and use them, all at the time of image capture.

This work was part of the MSc dissertation of Hector Castillo-Elizalde, who did his MSc in Robotics at the University of Bristol. He was co-supervised by Yanan Liu, who is doing his PhD on the same topic, and by Dr Laurie Bose.

Hector Castillo-Elizalde and the team developed a mapping algorithm that runs entirely on board the sensor-processor device.

The algorithm is deceptively simple: when a new image arrives, the algorithm decides whether it is sufficiently different from what it has seen before. If it is, it stores some of the image’s data; if not, it discards it.
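 
The real algorithm runs on the PPA’s per-pixel processors, but the decision rule itself can be sketched in a few lines. In this hypothetical Python sketch, the descriptor function and novelty threshold are stand-ins chosen for illustration:

```python
# Minimal sketch of the catalogue-building rule described above.
# The descriptor and threshold are assumptions for illustration;
# the real algorithm runs on the PPA's per-pixel processors.
import numpy as np

def descriptor(image: np.ndarray) -> np.ndarray:
    """Toy descriptor: a coarse, normalised thumbnail of the image."""
    small = image[::8, ::8].astype(float).ravel()
    return small / (np.linalg.norm(small) + 1e-9)

catalogue = []  # the visual map: one descriptor per stored view

def maybe_add(image: np.ndarray, novelty_threshold: float = 0.3) -> bool:
    """Keep the image's data only if it is sufficiently different
    from every view seen so far; otherwise discard it."""
    d = descriptor(image)
    if all(np.linalg.norm(d - seen) > novelty_threshold
           for seen in catalogue):
        catalogue.append(d)
        return True   # a new place is added to the map
    return False      # too similar to an existing view: discarded
```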

Right: the system moves around the world. Left: a new image is seen and a decision is made whether or not to add it to the visual catalogue (top left); this is the pictorial map that can later be used to localise the system. Image credit: University of Bristol

As the PPA device is moved around – by a person or a robot, for example – it collects a visual catalogue of views. This catalogue can then be used to match any new image when the device is in localisation mode.

Importantly, no images leave the PPA – only the key data that indicates where the device is with respect to the visual catalogue. This makes the system more energy efficient and also helps with privacy.
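 
Continuing the hypothetical sketch above, localisation then reduces to a nearest-neighbour lookup that outputs only an index into the catalogue – the “key data” – rather than any image:

```python
# Continuation of the sketch above. Only an index - the matched
# catalogue entry - leaves the device; no image data is output.
def localise(image, match_threshold: float = 0.3):
    """Return the index of the best-matching catalogue entry,
    or None if no stored view matches closely enough."""
    if not catalogue:
        return None
    d = descriptor(image)
    dists = [float(np.linalg.norm(d - seen)) for seen in catalogue]
    best = int(np.argmin(dists))
    return best if dists[best] < match_threshold else None
```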

During localisation, the incoming image is compared to the visual catalogue (the descriptor database) and, if a match is found, the system reports where it is (the predicted node, the small white rectangle at the top) relative to the catalogue. Note how the system is able to match images even when there are changes in illumination or when objects, such as people, are moving.

The team believes that artificial visual systems of this type – developed for visual processing, and not necessarily to record images – are a first step towards more efficient smart systems that can use visual information to understand and move in the world. Tiny, energy-efficient robots or smart glasses doing useful things for the planet and for people will need spatial understanding, which will come from being able to make and use maps.

The research has been partially funded by the Engineering and Physical Sciences Research Council (EPSRC), by a CONACYT scholarship to Hector Castillo-Elizalde and by a CSC scholarship to Yanan Liu.

Lily the barn owl reveals how birds fly in gusty winds

Scientists from the University of Bristol and the Royal Veterinary College have discovered how birds are able to fly in gusty conditions – findings that could inform the development of bio-inspired small-scale aircraft.

Lily flies through gusts: Scientists from Bristol and the RVC have discovered how birds fly in gusty conditions – with implications for small-scale aircraft design. Image credit: Cheney et al 2020

“Birds routinely fly in high winds close to buildings and terrain – often in gusts as fast as their flight speed. So the ability to cope with strong and sudden changes in wind is essential for their survival and to be able to do things like land safely and capture prey,” said Dr Shane Windsor from the Department of Aerospace Engineering at the University of Bristol.

“We know birds cope amazingly well in conditions which challenge engineered air vehicles of a similar size but, until now, we didn’t understand the mechanics behind it,” said Dr Windsor.

The study, published in Proceedings of the Royal Society B, reveals how bird wings act as a suspension system to cope with changing wind conditions. The team, which included Bristol PhD student Nicholas Durston and researchers Jialei Song of Dongguan University of Technology in China and James Usherwood of the RVC, used an innovative combination of high-speed, video-based 3D surface reconstruction, computed tomography (CT) scans and computational fluid dynamics (CFD) to understand how birds ‘reject’ gusts through wing morphing, i.e. by changing the shape and posture of their wings.

In the experiment, conducted in the Structure and Motion Laboratory at the Royal Veterinary College, the team filmed Lily, a barn owl, gliding through a range of fan-generated vertical gusts, the strongest of which was as fast as her flight speed. Lily is a trained falconry bird who is a veteran of many nature documentaries, so wasn’t fazed in the least by all the lights and cameras. “We began with very gentle gusts in case Lily had any difficulties, but soon found that – even at the highest gust speeds we could make – Lily was unperturbed; she flew straight through to get the food reward being held by her trainer, Lloyd Buck,” commented Professor Richard Bomphrey of the Royal Veterinary College.

“Lily flew through the bumpy gusts and consistently kept her head and torso amazingly stable over the trajectory, as if she was flying with a suspension system. When we analysed it, what surprised us was that the suspension-system effect wasn’t just due to aerodynamics, but benefited from the mass in her wings. For reference, each of our upper limbs is about 5% of our body weight; for a bird it’s about double, and they use that mass to effectively absorb the gust,” said joint lead-author Dr Jorn Cheney from the Royal Veterinary College.

“Perhaps most exciting is the discovery that the very fastest part of the suspension effect is built into the mechanics of the wings, so birds don’t actively need to do anything for it to work. The mechanics are very elegant. When you strike a ball at the sweetspot of a bat or racquet, your hand is not jarred because the force there cancels out. Anyone who plays a bat-and-ball sport knows how effortless this feels. A wing has a sweetspot, just like a bat. Our analysis suggests that the force of the gust acts near this sweetspot and this markedly reduces the disturbance to the body during the first fraction of a second. The process is automatic and buys just enough time for other clever stabilising processes to kick in,” added joint lead-author, Dr Jonathan Stevenson from the University of Bristol.
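 
The paper’s analysis is more involved, but the bat analogy rests on a classical result known as the centre of percussion: an impulse applied there produces no reaction force at the pivot, which in this picture is the hands, or the bird’s body. As a textbook sketch (not taken from the paper):

```latex
% Centre of percussion: an impulse applied at distance q from the
% pivot produces no reaction force at the pivot when
\[
  q = \frac{I}{m\,d}
\]
% where I is the moment of inertia about the pivot, m the mass and
% d the distance from the pivot to the centre of mass.
% For a uniform rod of length L pivoted at one end,
% I = mL^2/3 and d = L/2, so q = 2L/3: the "sweetspot" sits
% two-thirds of the way along the rod.
```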

Dr Windsor said the next step for the research, which was funded by the European Research Council (ERC), Air Force Office of Scientific Research and the Wellcome Trust, is to develop bio-inspired suspension systems for small-scale aircraft.

Robots can now learn to swarm on the go

A new generation of swarming robots which can independently learn and evolve new behaviours in the wild is one step closer, thanks to research from the University of Bristol and the University of the West of England (UWE).

The team used artificial evolution to enable the robots to automatically learn swarm behaviours which are understandable to humans. This new advance, published this Friday in Advanced Intelligent Systems, could create new robotic possibilities for environmental monitoring, disaster recovery, infrastructure maintenance, logistics and agriculture.