
A camera that knows exactly where it is

Overview of on-sensor mapping. As the system moves around, it builds a visual catalogue of what it observes. This catalogue is the map later used to recognise whether it has been somewhere before.
Image credit: University of Bristol

Knowing where you are on a map is one of the most useful pieces of information when navigating a journey. It allows you to plan where to go next and to keep track of where you have been. This is essential for smart devices, from robot vacuum cleaners and delivery drones to wearable sensors keeping an eye on our health.

One important obstacle, however, is that systems which need to build or use maps are very complex: they commonly rely on external signals such as GPS, which do not work indoors, or they demand a great deal of energy because of the large number of components involved.

Walterio Mayol-Cuevas, Professor in Robotics, Computer Vision and Mobile Systems at the University of Bristol’s Department of Computer Science, led the team that has been developing this new technology.

He said: “We often take for granted things like our impressive spatial abilities. Take bees or ants as an example. They have been shown to be able to use visual information to move around and achieve highly complex navigation, all without GPS or much energy consumption.

“In great part this is because their visual systems are extremely efficient and well-tuned to making and using maps, and robots can’t compete there yet.”

However, a new breed of sensor-processor device, which the team calls a Pixel Processor Array (PPA), allows processing on the sensor itself. This means that as images are sensed, the device can decide what information to keep, what to discard, and use only what it needs for the task at hand.

An example of such a PPA device is the SCAMP architecture, developed by the team's colleagues at the University of Manchester, led by Piotr Dudek, Professor of Circuits and Systems, and his team. This PPA has a small processor for every pixel, which allows massively parallel computation on the sensor itself.
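To give a flavour of this data-parallel style, here is a toy emulation in Python: every pixel "processor" executes the same instruction in lockstep, combining its own value with those of its neighbours. This is only an illustration of the programming model; on a real PPA such as SCAMP the computation happens on the sensor itself, not in numpy.

```python
# Toy emulation of PPA-style lockstep computation: every pixel combines
# its own value with its shifted neighbours, here producing a crude edge
# map. On an actual PPA each pixel has its own processor; numpy merely
# mimics the single-instruction, all-pixels-at-once style.
import numpy as np

def ppa_edge_map(image):
    img = image.astype(np.float32)
    left = np.roll(img, 1, axis=1)  # every "processor" reads its left neighbour
    up = np.roll(img, 1, axis=0)    # ...and the neighbour above
    return np.abs(img - left) + np.abs(img - up)
```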

The team at the University of Bristol has previously demonstrated how these new systems can recognise objects at thousands of frames per second, but the new research shows how a sensor-processor device can make maps and use them, all at the time of image capture.

This work was part of the MSc dissertation of Hector Castillo-Elizalde, who did his MSc in Robotics at the University of Bristol. He was co-supervised by Yanan Liu, who is doing his PhD on the same topic, and by Dr Laurie Bose.

Hector Castillo-Elizalde and the team developed a mapping algorithm that runs entirely on board the sensor-processor device.

The algorithm is deceptively simple: when a new image arrives, the algorithm decides whether it is sufficiently different from what it has seen before. If it is, the algorithm stores some of its data; if not, the image is discarded.
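A minimal sketch of this kind of catalogue building is shown below. The thumbnail descriptor and the novelty threshold are illustrative assumptions, not the team's actual on-sensor method:

```python
# Sketch of catalogue building: keep a compact descriptor only when the
# new frame differs enough from everything already stored. The thumbnail
# descriptor and the threshold are assumptions made for illustration.
import numpy as np

def descriptor(image, size=8):
    """Reduce a grayscale image to a tiny thumbnail used as a descriptor."""
    h, w = image.shape
    img = image[:h - h % size, :w - w % size].astype(np.float32) / 255.0
    return img.reshape(size, img.shape[0] // size,
                       size, img.shape[1] // size).mean(axis=(1, 3))

def is_novel(desc, catalogue, threshold=0.15):
    """A frame is novel if it is far from every stored descriptor."""
    return all(np.mean(np.abs(desc - d)) > threshold for d in catalogue)

catalogue = []

def process_frame(image):
    d = descriptor(image)
    if is_novel(d, catalogue):
        catalogue.append(d)  # store only the compact descriptor, never the image
```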

Right: the system moves around the world. Left: a new image is seen and a decision is made whether or not to add it to the visual catalogue (top left); this is the pictorial map that can later be used to localise the system. Image credit: University of Bristol

As the PPA device is moved around, for example by a person or robot, it collects a visual catalogue of views. This catalogue can then be used to match any new image when the system is in localisation mode.

Importantly, no images leave the PPA, only the key data that indicates where the device is with respect to the visual catalogue. This makes the system more energy efficient and also helps with privacy.

During localisation, the incoming image is compared to the visual catalogue (Descriptor database) and, if a match is found, the system reports where it is (Predicted node, small white rectangle at the top) relative to the catalogue. Note how the system is able to match images even when there are changes in illumination or moving objects such as people.
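Continuing the sketch above, localisation then reduces to a nearest-neighbour search over the stored descriptors (again an illustrative assumption about the details; the paper performs the matching on the sensor itself):

```python
# Illustrative localisation step for the sketch above: find the stored
# descriptor closest to the current frame. Only the matched index (the
# "predicted node"), never the image itself, would leave the device.
def localise(image, catalogue, max_distance=0.25):
    d = descriptor(image)
    if not catalogue:
        return None
    distances = [np.mean(np.abs(d - c)) for c in catalogue]
    best = int(np.argmin(distances))
    return best if distances[best] <= max_distance else None  # None: no match
```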

The team believes that this type of artificial visual system, developed for visual processing rather than necessarily for recording images, is a first step towards more efficient smart systems that can use visual information to understand and move through the world. Tiny, energy-efficient robots or smart glasses doing useful things for the planet and for people will need spatial understanding, and that will come from the ability to make and use maps.

The research was partially funded by the Engineering and Physical Sciences Research Council (EPSRC), by a CONACYT scholarship to Hector Castillo-Elizalde, and by a CSC scholarship to Yanan Liu.


Lily the barn owl reveals how birds fly in gusty winds

Scientists from the University of Bristol and the Royal Veterinary College have discovered how birds are able to fly in gusty conditions – findings that could inform the development of bio-inspired small-scale aircraft.

Lily flies through gusts: Scientists from Bristol and the RVC have discovered how birds fly in gusty conditions – with implications for small-scale aircraft design. Image credit: Cheney et al 2020

“Birds routinely fly in high winds close to buildings and terrain – often in gusts as fast as their flight speed. So the ability to cope with strong and sudden changes in wind is essential for their survival and to be able to do things like land safely and capture prey,” said Dr Shane Windsor from the Department of Aerospace Engineering at the University of Bristol.

“We know birds cope amazingly well in conditions which challenge engineered air vehicles of a similar size but, until now, we didn’t understand the mechanics behind it,” said Dr Windsor.

The study, published in Proceedings of the Royal Society B, reveals how bird wings act as a suspension system to cope with changing wind conditions. The team, which included Bristol PhD student Nicholas Durston and researchers Jialei Song (Dongguan University of Technology, China) and James Usherwood (RVC), used an innovative combination of high-speed, video-based 3D surface reconstruction, computed tomography (CT) scans, and computational fluid dynamics (CFD) to understand how birds ‘reject’ gusts through wing morphing, i.e. by changing the shape and posture of their wings.

In the experiment, conducted in the Structure and Motion Laboratory at the Royal Veterinary College, the team filmed Lily, a barn owl, gliding through a range of fan-generated vertical gusts, the strongest of which was as fast as her flight speed. Lily is a trained falconry bird who is a veteran of many nature documentaries, so wasn’t fazed in the least by all the lights and cameras. “We began with very gentle gusts in case Lily had any difficulties, but soon found that – even at the highest gust speeds we could make – Lily was unperturbed; she flew straight through to get the food reward being held by her trainer, Lloyd Buck,” commented Professor Richard Bomphrey of the Royal Veterinary College.

“Lily flew through the bumpy gusts and consistently kept her head and torso amazingly stable over the trajectory, as if she was flying with a suspension system. When we analysed it, what surprised us was that the suspension-system effect wasn’t just due to aerodynamics, but benefited from the mass in her wings. For reference, each of our upper limbs is about 5% of our body weight; for a bird it’s about double, and they use that mass to effectively absorb the gust,” said joint lead-author Dr Jorn Cheney from the Royal Veterinary College.

“Perhaps most exciting is the discovery that the very fastest part of the suspension effect is built into the mechanics of the wings, so birds don’t actively need to do anything for it to work. The mechanics are very elegant. When you strike a ball at the sweetspot of a bat or racquet, your hand is not jarred because the force there cancels out. Anyone who plays a bat-and-ball sport knows how effortless this feels. A wing has a sweetspot, just like a bat. Our analysis suggests that the force of the gust acts near this sweetspot and this markedly reduces the disturbance to the body during the first fraction of a second. The process is automatic and buys just enough time for other clever stabilising processes to kick in,” added joint lead-author, Dr Jonathan Stevenson from the University of Bristol.
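The ‘sweetspot’ in this analogy is what rigid-body mechanics calls the centre of percussion. As a rough illustration (a textbook result, not a formula from the paper itself): for a wing idealised as a rigid body hinged at the shoulder, an impulsive force applied at a distance

\[
d_{\text{sweet}} = \frac{I_{\text{hinge}}}{m \, d_{\text{cm}}}
\]

from the hinge produces no reaction force at the hinge, where \(I_{\text{hinge}}\) is the wing’s moment of inertia about the hinge, \(m\) its mass, and \(d_{\text{cm}}\) the distance from the hinge to the wing’s centre of mass. A gust whose aerodynamic load acts near this point therefore disturbs the body very little, with no active control required.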

Dr Windsor said the next step for the research, which was funded by the European Research Council (ERC), Air Force Office of Scientific Research and the Wellcome Trust, is to develop bio-inspired suspension systems for small-scale aircraft.

Robots can now learn to swarm on the go

A new generation of swarming robots which can independently learn and evolve new behaviours in the wild is one step closer, thanks to research from the University of Bristol and the University of the West of England (UWE).

The team used artificial evolution to enable the robots to automatically learn swarm behaviours which are understandable to humans. This new advance, published this Friday in Advanced Intelligent Systems, could create new robotic possibilities for environmental monitoring, disaster recovery, infrastructure maintenance, logistics and agriculture.
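As a loose illustration of what artificial evolution of swarm behaviour can look like, here is a generic genetic-algorithm sketch; the controller encoding, fitness function, and all constants are assumptions for illustration, not the study's actual setup:

```python
# Generic sketch of evolving a swarm controller's parameters with a
# genetic algorithm: score candidates, keep the fittest, and fill the
# population with mutated copies. Everything here is illustrative.
import random

POP_SIZE, GENOME_LEN, GENERATIONS = 30, 8, 50

def fitness(genome):
    """Placeholder: a real system would score this parameter vector by
    running a swarm simulation (or real robots) and measuring behaviour."""
    return -sum((g - 0.5) ** 2 for g in genome)  # toy objective

def mutate(genome, rate=0.1):
    return [g + random.gauss(0, 0.1) if random.random() < rate else g
            for g in genome]

population = [[random.random() for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]  # keep the fittest half
    children = [mutate(random.choice(parents))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
```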