To the future: Finding the moral common ground in human-robot relations
Magnetic FreeBOT balls make giant leap for robotics
Skills development in Physical AI could give birth to lifelike intelligent robots
Signs Your Warehouse Environment Needs Mobile Robots
Davide Scaramuzza’s seminar – Autonomous, agile micro drones: Perception, learning, and control
A few days ago, Robotics Today hosted an online seminar with Professor Davide Scaramuzza from the University of Zurich. The seminar was recorded, so you can watch it now in case you missed it.
“Robotics Today – A series of technical talks” is a virtual robotics seminar series. The goal of the series is to bring the robotics community together during these challenging times. The seminars are open to the public. Each seminar consists of a technical talk, live captioned and streamed via the web and Twitter, followed by an interactive discussion between the speaker and a panel of faculty, postdocs, and students who moderate audience questions.
Abstract
Autonomous quadrotors will soon play a major role in search-and-rescue, delivery, and inspection missions, where a fast response is crucial. However, their speed and maneuverability are still far from those of birds and human pilots. High speed is particularly important: since drone battery life is usually limited to 20-30 minutes, drones need to fly faster to cover longer distances. However, to do so, they need faster sensors and algorithms. Human pilots take years to learn the skills to navigate drones. What does it take to make drones navigate as well as, or even better than, human pilots? Autonomous, agile navigation through unknown, GPS-denied environments poses several challenges for robotics research in terms of perception, planning, learning, and control. In this talk, I will show how the combination of both model-based and machine learning methods united with the power of new, low-latency sensors, such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing.
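The abstract's point about speed and endurance can be made concrete with a quick back-of-the-envelope calculation. The figures below are illustrative assumptions, not numbers from the talk, and the model deliberately ignores that flying faster also drains the battery faster:

```python
# Back-of-the-envelope range estimate for a battery-limited drone.
# With a fixed battery endurance, the distance coverable at a constant
# average speed scales linearly: range = speed * endurance.

def max_range_km(speed_m_s: float, endurance_min: float) -> float:
    """Distance (km) coverable at a constant speed within the endurance."""
    return speed_m_s * endurance_min * 60 / 1000

# Hypothetical numbers: a quadrotor cruising at 5 m/s on a 25-minute battery
slow = max_range_km(5, 25)    # 7.5 km
# The same battery flown at 20 m/s quadruples the coverage
fast = max_range_km(20, 25)   # 30.0 km
print(slow, fast)
```

In practice the trade-off is less favorable than this linear sketch, since aerodynamic drag and motor power grow with speed, which is exactly why faster flight demands the faster sensing and algorithms the talk discusses.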
Biography
Davide Scaramuzza (Italian) is a Professor of Robotics and Perception in the Departments of Informatics (University of Zurich) and Neuroinformatics (joint between the University of Zurich and ETH Zurich), where he directs the Robotics and Perception Group. His research lies at the intersection of robotics, computer vision, and machine learning, using standard cameras and event cameras, and aims to enable autonomous, agile navigation of micro drones in search-and-rescue applications. After a Ph.D. at ETH Zurich (with Roland Siegwart) and a postdoc at the University of Pennsylvania (with Vijay Kumar and Kostas Daniilidis), from 2009 to 2012 he led the European project sFly, which introduced the PX4 autopilot and pioneered visual-SLAM-based autonomous navigation of micro drones in GPS-denied environments. From 2015 to 2018, he was part of the DARPA FLA (Fast Lightweight Autonomy) program to research autonomous, agile navigation of micro drones in GPS-denied environments. In 2018, his team won the IROS 2018 Autonomous Drone Race, and in 2019 it ranked second in the AlphaPilot Drone Racing world championship. For his research contributions to autonomous, vision-based drone navigation and event cameras, he has won prestigious awards, including a European Research Council (ERC) Consolidator Grant, the IEEE Robotics and Automation Society Early Career Award, an SNSF-ERC Starting Grant, a Google Research Award, the KUKA Innovation Award, two Qualcomm Innovation Fellowships, the European Young Research Award, the Misha Mahowald Neuromorphic Engineering Award, and several paper awards. He co-authored the book “Introduction to Autonomous Mobile Robots” (published by MIT Press; 10,000 copies sold) and more than 100 papers on robotics and perception published in top-ranked journals (Science Robotics, TRO, T-PAMI, IJCV, IJRR) and conferences (RSS, ICRA, CVPR, ICCV, CoRL, NeurIPS).
He has served as a consultant for the United Nations’ International Atomic Energy Agency’s Fukushima Action Plan on Nuclear Safety and for several drone and computer-vision companies, to which he has also transferred research results. In 2015, he cofounded Zurich-Eye, today Facebook Zurich, which developed the visual-inertial SLAM system running in Oculus Quest VR headsets. He was also the strategic advisor of Dacuda, today Magic Leap Zurich. In 2020, he cofounded SUIND, which develops camera-based safety solutions for commercial drones. Many aspects of his research have been prominently featured in mainstream media, such as The New York Times, BBC News, Discovery Channel, La Repubblica, and Neue Zürcher Zeitung, as well as in technology-focused media, such as IEEE Spectrum, MIT Technology Review, TechCrunch, Wired, and The Verge.
You can also view past seminars on the Robotics Today YouTube Channel.
New ‘robotic snake’ device grips, picks up objects
Using gazes for effective tutoring with social robots
Salamanders provide a model for spinal-cord regeneration
Robots in the factory: bosses or slaves?
Global impact of Covid-19 on Autonomous Last Mile Delivery Market
Drones that patrol forests could monitor environmental and ecological changes

By Caroline Brogan
Imperial researchers have created drones that can attach sensors to trees to monitor environmental and ecological changes in forests.
Sensors for forest monitoring are already used to track changes in temperature, humidity and light, as well as the movements of animals and insects through their habitat. They also help to detect and monitor forest fires and can provide valuable data on how climate change and other human activities are impacting the natural world.
However, placing these sensors can prove difficult in large, tall forests, and climbing trees to place them poses its own risks.
Now, researchers at Imperial College London’s Aerial Robotics Lab have developed drones that can shoot sensor-containing darts onto trees several metres away in cluttered environments like forests. The drones can also place sensors through contact or by perching on tree branches.
[Animation credit: Imperial College London]
The researchers hope the drones will be used in future to create networks of sensors to boost data on forest ecosystems, and to track hard-to-navigate biomes like the Amazon rainforest.
Lead researcher Professor Mirko Kovac, Director of the Aerial Robotics Lab in the Department of Aeronautics at Imperial, said: “Monitoring forest ecosystems can be difficult, but our drones could deploy whole networks of sensors to boost the amount and precision of environmental and ecological data.
“I like to think of them as artificial forest inhabitants who will soon watch over the ecosystem and provide the data we need to protect the environment.”
The drones are equipped with cameras to help identify suitable targets, and a smart material that changes shape when heated to launch the darts, which then stick to the trees. They can also perch on tree branches like birds to collect data themselves, acting as mobile sensors.
The researchers have tested their drones at the Swiss Federal Laboratories for Materials Science and Technology (EMPA) and on trees at Imperial’s Silwood Park Campus.
[Animation credit: Imperial College London]
The drones are currently controlled by people: using control units, the researchers watch through the camera lens to select target trees and shoot the darts. The next step is to make the drones autonomous, so that researchers can test how they fare in denser forest environments without human guidance.
Co-author André Farinha, of the Department of Aeronautics, said: “There are plenty of challenges to be addressed before the drones can be regularly used in forests, like achieving a careful balance between human input and automated tasks so that they can be used safely while remaining adaptable to unpredictable environments.”
Co-author Dr Salua Hamaza, also of the Department of Aeronautics, said: “We aim to introduce new design and control strategies to allow drones to effectively operate in forested environments. Exploiting smart mechanisms and new sensing techniques, we can off-load the on-board computation and create platforms that are energy-efficient and better performing.”
- Hamaza, S., Farinha, A., Nguyen, H.N. and Kovac, M., 2020, November. Sensor Delivery in Forests with Aerial Robots: A New Paradigm for Environmental Monitoring. In IEEE IROS Workshop on Perception, Planning and Mobility in Forestry Robotics.
- Nguyen, H.N., Siddall, R., Stephens, B., Navarro-Rubio, A. and Kovač, M., 2019, April. A Passively Adaptive Microspine Grapple for Robust, Controllable Perching. In 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft) (pp. 80-87). IEEE. Video.
- Farinha, A., Zufferey, R., Zheng, P., Armanini, S.F. and Kovac, M., 2020. Unmanned Aerial Sensor Placement for Cluttered Environments. IEEE Robotics and Automation Letters, 5(4), pp.6623-6630. Video.
These works were funded by the EPSRC, ORCA Robotics Hub, EU Horizon 2020, NERC (QMEE CDT), UKAEA with RACE and the Royal Society.