
CYBATHLON 2020 Global Edition: A competition to break down barriers between the public, people with disabilities and technology developers

Involving potential users of a particular technology in the research and development (R&D) process is a powerful way to maximise success when that technology is deployed in the real world. It can also speed up the R&D process, because the researchers’ perspective on the problem is combined with that of end users. The non-profit project CYBATHLON was created by ETH Zurich as a way to advance R&D of assistive technology through competitions that involve developers, people with disabilities, and the general public.

Image: a competitor getting up from a sofa. Over 50 teams from all over the world will compete against each other in the CYBATHLON 2020 Global Edition. (Credit: Alessandro Della Bella / ETH Zürich)

On 13 and 14 November 2020, the CYBATHLON 2020 Global Edition takes place. The event will be live-streamed and is completely open to the public; you can access it through the CYBATHLON website. Here’s the full programme for the two days:

Friday, 13 November 2020

4pm CET (3pm UTC): Brain-Computer Interface Race

The power of thoughts

  • Welcome to CYBATHLON 2020 Global Edition!
  • Kick-off by the head of competition, Lukas Jaeger
  • Races of all teams
  • Team stories and insights
  • Analysis by the BCI expert Nicole Wenderoth of ETH Zurich
  • Special guest: Joël Mesot, President of ETH Zurich
  • The top 4: Who will win?

5pm CET (4pm UTC): Powered Arm Prosthesis Race

Grasping and feeling

  • Races of all teams
  • Team stories and insights
  • Guest: Roger Gassert, researcher on assistive technologies at ETH Zurich
  • Analysis by the arm prosthesis expert Michel Fornasier
  • The top 4: Who will win?

6pm CET (5pm UTC): Functional Electrical Stimulation Bike Race

Power to the muscles

  • Races of all teams
  • Team stories and insights
  • Guest: Robert Riener, initiator of CYBATHLON
  • Analysis by Claudio Perret, expert in functional electrical stimulation
  • The top 4: Who will win?

7pm CET (6pm UTC): Inside CYBATHLON – Stories, recap and outlook

Insights from the protagonists and organisers – the journey of CYBATHLON

  • Robert Riener, initiator of CYBATHLON
  • Roger Gassert, researcher on assistive technologies at ETH Zurich
  • Florian Hauser, powered wheelchair pilot of team HSR enhanced
  • Roland Sigrist, director of CYBATHLON

The medical checks

  • Who can compete? Insights from the medical examiners Zina-Mary Manjaly and Jiří Dvořák

Focus: Inclusion

Recap and outlook

Saturday, 14 November 2020

1pm CET (12pm UTC): Powered Wheelchair Race

Overcoming stairs and ramps

  • Races of all teams
  • Insights from the head of competition, Lukas Jaeger
  • Team stories and insights
  • Guest: Roger Gassert, researcher on assistive technologies at ETH Zurich
  • Analysis by scientist Sue Bertschy
  • The top 4: Who will win?

2pm CET (1pm UTC): Powered Leg Prosthesis Race

Watch your step

  • Races of all teams
  • Team stories and insights
  • Guest: Robert Riener, initiator of CYBATHLON
  • Analysis by expert Lukas Christen, parathlete and coach
  • The top 4: Who will win?

3pm CET (2pm UTC): Powered Exoskeleton Race

Walking in robotic suits

  • Races of all teams
  • Guest: Roger Gassert, researcher on assistive technologies at ETH Zurich
  • Analysis by the exoskeleton developers Jaime Duarte and Kai Schmidt
  • The top 4: Who will win?

4pm CET (3pm UTC): Inside CYBATHLON – Stories, recap and outlook

Insights from the protagonists and organisers – the future of CYBATHLON and social inclusion

  • Silke Pan, powered exoskeleton pilot of team TWIICE
  • Robert Riener, initiator of CYBATHLON

New systems

  • Maria Fossati, powered arm prosthesis pilot of team SoftHand Pro
  • Max Erick Busse-Grawitz, expert on mechatronics
  • Roger Gassert, researcher on assistive technologies at ETH Zurich

Recap and outlook – the CYBATHLON @school and the next CYBATHLON

  • Special guest: Sarah Springman, rector of ETH Zurich
  • Robert Riener, initiator of CYBATHLON
  • Roland Sigrist, director of CYBATHLON

The next CYBATHLON

Using social robots to improve children’s language skills

As robots share many characteristics with toys, they could prove to be a valuable tool for teaching children in engaging and innovative ways. In recent years, some roboticists and computer scientists have thus been investigating how robotic systems could be introduced into classroom and pre-school environments.

Skills development in Physical AI could give birth to lifelike intelligent robots

The research suggests that teaching materials science, mechanical engineering, computer science, biology and chemistry as a combined discipline could help students develop the skills they need, as researchers, to create lifelike artificially intelligent (AI) robots.

Davide Scaramuzza’s seminar – Autonomous, agile micro drones: Perception, learning, and control


A few days ago, Robotics Today hosted an online seminar with Professor Davide Scaramuzza from the University of Zurich. The seminar was recorded, so you can watch it now in case you missed it.

“Robotics Today – A series of technical talks” is a virtual robotics seminar series whose goal is to bring the robotics community together during these challenging times. The seminars are open to the public. Each seminar consists of a technical talk, live-captioned and streamed via the web and Twitter, followed by an interactive discussion between the speaker and a panel of faculty, postdocs, and students who moderate audience questions.

Abstract

Autonomous quadrotors will soon play a major role in search-and-rescue, delivery, and inspection missions, where a fast response is crucial. However, their speed and maneuverability are still far from those of birds and human pilots. High speed is particularly important: since drone battery life is usually limited to 20-30 minutes, drones need to fly faster to cover longer distances. To do so, however, they need faster sensors and algorithms. Human pilots take years to learn the skills to navigate drones. What does it take to make drones navigate as well as, or even better than, human pilots? Autonomous, agile navigation through unknown, GPS-denied environments poses several challenges for robotics research in terms of perception, planning, learning, and control. In this talk, I will show how the combination of model-based and machine learning methods, united with the power of new, low-latency sensors such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing.
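
To make the speed argument concrete, here is a back-of-the-envelope sketch in Python. The numbers are illustrative only (in practice, endurance itself drops as speed and drag increase), and the 25-minute figure simply splits the 20-30 minute battery life cited in the abstract:

```python
# Back-of-the-envelope flight range for a battery-limited drone.
# Illustrative numbers only: real endurance depends on speed, payload,
# and aerodynamics, so this constant-endurance bound is optimistic.

def flight_range_km(speed_m_s: float, endurance_min: float) -> float:
    """Distance covered flying at a constant speed for the full battery life."""
    return speed_m_s * endurance_min * 60 / 1000

for speed in (5.0, 10.0, 20.0):  # cruise speeds in m/s
    print(f"{speed:4.1f} m/s -> {flight_range_km(speed, 25.0):4.1f} km")
# At 5 m/s the drone covers 7.5 km; at 20 m/s it covers 30 km on the
# same battery, which is why faster flight extends mission reach.
```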

Biography

Davide Scaramuzza (Italian) is a Professor of Robotics and Perception at both the Department of Informatics (University of Zurich) and the Department of Neuroinformatics (joint between the University of Zurich and ETH Zurich), where he directs the Robotics and Perception Group. His research lies at the intersection of robotics, computer vision, and machine learning, using standard cameras and event cameras, and aims to enable autonomous, agile navigation of micro drones in search-and-rescue applications. After a Ph.D. at ETH Zurich (with Roland Siegwart) and a postdoc at the University of Pennsylvania (with Vijay Kumar and Kostas Daniilidis), from 2009 to 2012 he led the European project sFly, which introduced the PX4 autopilot and pioneered visual-SLAM-based autonomous navigation of micro drones in GPS-denied environments. From 2015 to 2018, he was part of the DARPA FLA (Fast Lightweight Autonomy) program, researching autonomous, agile navigation of micro drones in GPS-denied environments. In 2018, his team won the IROS 2018 Autonomous Drone Race, and in 2019 it ranked second in the AlphaPilot Drone Racing world championship.

For his research contributions to autonomous, vision-based drone navigation and event cameras, he has won prestigious awards, such as a European Research Council (ERC) Consolidator Grant, the IEEE Robotics and Automation Society Early Career Award, an SNSF-ERC Starting Grant, a Google Research Award, the KUKA Innovation Award, two Qualcomm Innovation Fellowships, the European Young Research Award, the Misha Mahowald Neuromorphic Engineering Award, and several paper awards. He co-authored the book “Introduction to Autonomous Mobile Robots” (published by MIT Press; 10,000 copies sold) and more than 100 papers on robotics and perception published in top-ranked journals (Science Robotics, TRO, T-PAMI, IJCV, IJRR) and conferences (RSS, ICRA, CVPR, ICCV, CoRL, NeurIPS).

He has served as a consultant for the United Nations’ International Atomic Energy Agency’s Fukushima Action Plan on Nuclear Safety and for several drone and computer-vision companies, to which he has also transferred research results. In 2015, he co-founded Zurich-Eye, today Facebook Zurich, which developed the visual-inertial SLAM system running in Oculus Quest VR headsets. He was also a strategic advisor of Dacuda, today Magic Leap Zurich. In 2020, he co-founded SUIND, which develops camera-based safety solutions for commercial drones. Many aspects of his research have been prominently featured in wider media, such as The New York Times, BBC News, Discovery Channel, La Repubblica, and Neue Zürcher Zeitung, and in technology-focused media, such as IEEE Spectrum, MIT Technology Review, TechCrunch, Wired, and The Verge.

You can also view past seminars on the Robotics Today YouTube Channel.

Using gazes for effective tutoring with social robots

Ph.D. candidate Eunice Njeri Mwangi of the Department of Industrial Design has investigated how intuitive gaze-based interactions between robots and humans can help improve the effectiveness of robot tutors. She successfully defended her Ph.D. thesis on Wednesday 28 October 2020.

Drones that patrol forests could monitor environmental and ecological changes

Image: tree sensors. (Credit: Imperial College London)

By Caroline Brogan

Imperial researchers have created drones that can attach sensors to trees to monitor environmental and ecological changes in forests.

Sensors for forest monitoring are already used to track changes in temperature, humidity and light, as well as the movements of animals and insects through their habitat. They also help to detect and monitor forest fires and can provide valuable data on how climate change and other human activities are impacting the natural world.
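
As a concrete illustration of the kind of measurements involved, here is a minimal Python sketch of a single sensor reading. The field names and values are hypothetical, chosen to mirror the quantities named above (temperature, humidity, light), not the actual payload format of the Imperial system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """One sample from a tree-mounted forest sensor (hypothetical schema)."""
    sensor_id: str        # which sensor/dart reported the sample
    timestamp: datetime   # when the sample was taken (UTC)
    temperature_c: float  # air temperature in degrees Celsius
    humidity_pct: float   # relative humidity, 0-100 %
    light_lux: float      # ambient light level

reading = SensorReading(
    sensor_id="tree-042",
    timestamp=datetime.now(timezone.utc),
    temperature_c=18.4,
    humidity_pct=72.0,
    light_lux=310.0,
)
print(reading)
```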

However, placing these sensors can prove difficult in large, tall forests, and climbing trees to place them poses its own risks.

Now, researchers at Imperial College London’s Aerial Robotics Lab have developed drones that can shoot sensor-containing darts onto trees several metres away in cluttered environments like forests. The drones can also place sensors through contact or by perching on tree branches.



The researchers hope the drones will be used in future to create networks of sensors that boost the amount of data collected on forest ecosystems, and to monitor hard-to-navigate biomes like the Amazon rainforest.

Lead researcher Professor Mirko Kovac, Director of the Aerial Robotics Lab from the Department of Aeronautics at Imperial said: “Monitoring forest ecosystems can be difficult, but our drones could deploy whole networks of sensors to boost the amount and precision of environmental and ecological data.

“I like to think of them as artificial forest inhabitants who will soon watch over the ecosystem and provide the data we need to protect the environment.”

The drones are equipped with cameras to help identify suitable targets, and a smart material that changes shape when heated to launch the darts, which then stick to the trees. They can also perch on tree branches like birds to collect data themselves, acting as mobile sensors.
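
The paragraph above implies a simple decision: fire a dart when a tree is within range, otherwise perch or reposition. The sketch below makes that logic explicit in Python; the function, threshold and action names are hypothetical and greatly simplified, not the Aerial Robotics Lab’s actual control code:

```python
# Hypothetical, simplified deployment logic for a sensor-placing drone.
MAX_DART_RANGE_M = 4.0  # assumed effective dart range ("several metres")

def choose_action(distance_to_tree_m: float, branch_available: bool) -> str:
    """Pick how to deploy a sensor given the camera's range estimate."""
    if distance_to_tree_m <= MAX_DART_RANGE_M:
        # Close enough: heat the shape-memory launcher and fire the dart.
        return "shoot_dart"
    if branch_available:
        # Too far to shoot, but a branch is reachable: perch like a bird
        # and let the drone itself act as a mobile sensor.
        return "perch_and_sense"
    # Otherwise move to a better vantage point and try again.
    return "reposition"

print(choose_action(3.2, branch_available=False))  # -> shoot_dart
print(choose_action(7.5, branch_available=True))   # -> perch_and_sense
```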

The researchers have tested their drones at the Swiss Federal Laboratories for Materials Science and Technology (EMPA) and on trees at Imperial’s Silwood Park Campus.



The drones are currently controlled by people: using control units, the researchers watch through the camera lens to select target trees and shoot the darts. The next step is to make the drones autonomous, so that researchers can test how they fare in denser forest environments without human guidance.

Co-author André Farinha, of the Department of Aeronautics, said: “There are plenty of challenges to be addressed before the drones can be regularly used in forests, like achieving a careful balance between human input and automated tasks so that they can be used safely while remaining adaptable to unpredictable environments.”

Co-author Dr Salua Hamaza, also of the Department of Aeronautics, said: “We aim to introduce new design and control strategies to allow drones to effectively operate in forested environments. Exploiting smart mechanisms and new sensing techniques we can off-load the on-board computation, and create platforms that are energy-efficient and better performing.”

  • Hamaza, S., Farinha, A., Nguyen, H.N. and Kovac, M., 2020, November. Sensor Delivery in Forests with Aerial Robots: A New Paradigm for Environmental Monitoring. In IEEE IROS Workshop on Perception, Planning and Mobility in Forestry Robotics.
  • Nguyen, H.N., Siddall, R., Stephens, B., Navarro-Rubio, A. and Kovač, M., 2019, April. A Passively Adaptive Microspine Grapple for Robust, Controllable Perching. In 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft) (pp. 80-87). IEEE. Video.
  • Farinha, A., Zufferey, R., Zheng, P., Armanini, S.F. and Kovac, M., 2020. Unmanned Aerial Sensor Placement for Cluttered Environments. IEEE Robotics and Automation Letters, 5(4), pp.6623-6630. Video.

These works were funded by the EPSRC, ORCA Robotics Hub, EU Horizon 2020, NERC (QMEE CDT), UKAEA with RACE and the Royal Society.
