
Smooth touchdown: Novel camera-based system for automated landing of drone on a fixed spot

Initially earmarked for covert military operations, unmanned aerial vehicles (UAVs), or drones, have since gained tremendous popularity, which has broadened the scope of their use. In fact, remotely piloted drones have largely been replaced by autonomous drones for applications in various fields. One such application is their use in rescue missions following a natural or man-made disaster. However, this often requires the drones to be able to land safely on uneven terrain, which can be very difficult to execute.

Talking to Machines – New Operating Concepts With Artificial Intelligence

A simple request, a short sentence: for the human brain, it is easy to interpret what is meant, make the connection and initiate an appropriate reaction. For a machine, this is much more complicated. Controlling technical devices with speech requires many individual steps.

Bio-inspired robotics: Learning from dragonflies

It is a high-speed movement: within fractions of a second, the mouthparts of the dragonfly larva spring forward to seize its prey. For decades, researchers had assumed that this action must be driven primarily by hydraulic pressure. Now, for the first time, scientists at Kiel University (CAU) have fully deciphered the biomechanical functional principle of what is known as the labial mask of dragonfly larvae. A vital contribution to this discovery was made by the team led by Dr. Sebastian Büsse of the Zoological Institute, which developed a bio-inspired robot that adapts the operating principle of the complex mouthparts in order to test its hypothesis; the technology used here could lead to a significant enhancement of agile robot systems. The results of the ambitious research project were published on Wednesday 20 January in the renowned specialist journal Science Robotics.

Self-supervised learning of visual appearance solves fundamental problems of optical flow

Flying insects as inspiration to AI for small drones

How do honeybees land on flowers or avoid obstacles? One would expect such questions to be mostly of interest to biologists. However, the rise of small electronics and robotic systems has also made them relevant to robotics and Artificial Intelligence (AI). For example, small flying robots are extremely restricted in terms of the sensors and processing that they can carry onboard. If these robots are to be as autonomous as the much larger self-driving cars, they will have to use an extremely efficient type of artificial intelligence – similar to the highly developed intelligence possessed by flying insects.

Optical flow

One of the main tricks up the insect’s sleeve is the extensive use of ‘optical flow’: the way in which objects move in their view. They use it to land on flowers and avoid obstacles or predators. Insects use surprisingly simple and elegant optical flow strategies to tackle complex tasks. For example, for landing, honeybees keep the optical flow divergence (how quickly things get bigger in view) constant when going down. By following this simple rule, they automatically make smooth, soft landings.
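To make this landing rule concrete, here is a minimal simulation sketch of constant-divergence landing. It assumes the divergence can be measured directly as the ratio of descent speed to height; the setpoint and gain below are illustrative values, not taken from the article.

```python
# Constant-divergence landing: keep D = v / h at a fixed setpoint.
# All numbers are illustrative.

dt = 0.01            # simulation time step [s]
h, v = 10.0, 0.5     # height [m], descent speed [m/s, positive down]
D_sp = 0.3           # desired divergence [1/s]
k = 2.0              # proportional gain on the divergence error

t = 0.0
while h > 0.05:
    D = v / h                   # observed optical flow divergence
    v += k * (D_sp - D) * dt    # accelerate to hold D at the setpoint
    h -= v * dt
    t += dt

print(f"touched down after {t:.1f} s at {v:.3f} m/s")
```

Because v is driven toward D_sp times h, the descent speed shrinks in proportion to the remaining height, producing exactly the smooth, ever-gentler approach described above.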

I started my work on optical flow control out of enthusiasm for such elegant, simple strategies. However, developing the control methods to actually implement these strategies on flying robots turned out to be far from trivial. For example, when I first worked on optical flow landing, my flying robots would not actually land; instead, they started to oscillate, continuously moving up and down just above the landing surface.

Bees on the left, a drone on the right
Honeybees are a fertile source of inspiration for the AI of small drones. They are able to perform an impressive repertoire of complex behaviors with very limited processing (~960,000 neurons). Drones are in their turn very interesting “models” for biology. Testing out hypotheses from biology on drones can bring novel insights into the problems faced and solved by flying insects like honeybees.

Fundamental problems

Optical flow has two fundamental problems that have been widely described in the growing literature on bio-inspired robotics. The first problem is that optical flow only provides mixed information on distances and velocities – not on distance or velocity separately. To illustrate: if one of two landing drones flies twice as high and twice as fast as the other, they experience exactly the same optical flow. However, for good control these two drones should actually react differently to deviations in the optical flow divergence. If a drone does not adapt its reactions to its height when landing, it will never quite arrive, and will instead start to oscillate above the landing surface.
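The ambiguity is easy to check with made-up numbers:

```python
# Two drones with the same optical flow divergence but different states.
h_white, v_white = 10.0, 2.0   # twice as high and twice as fast
h_red,   v_red   =  5.0, 1.0

print(v_white / h_white, v_red / h_red)   # 0.2 0.2 -> indistinguishable

# Yet a feedback gain tuned for the white drone at 10 m is, roughly
# speaking, twice too aggressive for the red drone at 5 m, which is what
# triggers the oscillations described above.
```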

The second problem is that optical flow is very small and carries little information in the direction in which a robot is moving. This is very unfortunate for obstacle avoidance, because it means that the obstacles straight ahead of the robot are the hardest ones to detect! Both problems are illustrated in the figures below.

Illustration of the two problems
Left: Problem 1: The white drone is twice as high and descends twice as fast as the red drone. However, both see the same optical flow divergence, since in both cases the object in view gets twice as big. This can be seen from the colored triangles: at the highest position the field of view captures the full landing platform, while at the lowest position it covers only half of it.
Right: Problem 2: The drone moves straight forward, in the direction of its forward-looking camera, so the focus of expansion is straight ahead. Objects close to this direction, like the red obstacle, generate very little flow. This is illustrated by the red lines in the figure: their angles with respect to the camera stay very similar. Objects further from this direction, like the green obstacle, generate considerable flow. Indeed, the green lines show that the angle grows quickly as the drone moves forward.
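Problem 2 can also be quantified. Under pure forward motion at speed V, a point at range d and at angle theta from the flight direction slides across the image at approximately (V / d) · sin(theta), so the flow vanishes toward the focus of expansion. A small sketch with assumed numbers:

```python
import math

# Translational optical flow magnitude: (V / d) * sin(theta), where theta
# is the angle between the viewing direction and the direction of motion.
# Speed and range below are made up.

V, d = 2.0, 5.0   # forward speed [m/s], range to obstacle [m]
for theta_deg in (2, 10, 30, 60):
    flow = V / d * math.sin(math.radians(theta_deg))
    print(f"{theta_deg:>2} deg off-axis: {flow:.3f} rad/s")

# An obstacle 2 degrees off-axis (like the red one) generates ~25x less
# flow than one 60 degrees off-axis (like the green one) at the same range.
```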

Learning visual appearance as the solution

In an article published today in Nature Machine Intelligence [1], we propose a solution to both problems. The main idea is that both problems of optical flow disappear if the robots can interpret not only optical flow, but also the visual appearance of objects in their environment. This solution becomes evident from the figures above: the rectangular insets show the images captured by the drones. For the first problem, the image perfectly captures the difference in height between the white and red drone: the landing platform is simply larger in the red drone's image. For the second problem, the red obstacle appears as large as the green one in the drone's image; given that the obstacles have the same physical size, they must be equally close to the drone.

Exploiting visual appearance as captured by an image would allow robots to see distances to objects in the scene similarly to how we humans can estimate distances in a still picture. This would allow drones to immediately pick the right control gain for optical flow control and it would allow them to see obstacles in the flight direction. The only question was: How can a flying robot learn to see distances like that?

The key to this question lay in a theory I devised a few years ago [2], which showed that flying robots can actively induce optical flow oscillations to perceive distances to objects in the scene. In the approach proposed in the Nature Machine Intelligence article, the robots use such oscillations to learn what the objects in their environment look like at different distances. In this way, a robot can, for example, learn how fine the texture of grass looks from different heights during landing, or how thick tree trunks appear at different distances when navigating in a forest.
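Below is a heavily simplified, hypothetical sketch of this self-supervised scheme. The texture cue, the noise level, and the assumed proportionality between the oscillation-onset gain and height are placeholders; the actual features and estimator in the article differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def texture_cue(h):
    # Placeholder appearance cue: apparent size of ground texture
    # elements, which shrinks with height (plus measurement noise).
    return 0.8 / h + 0.01 * rng.standard_normal()

# Self-supervised data collection: at each height, the robot raises its
# control gain until self-induced oscillations appear; that onset gain is
# the distance-dependent label (assumed here to be proportional to height).
heights = np.linspace(1.0, 10.0, 50)
cues = np.array([texture_cue(h) for h in heights])
onset_gain = 0.5 * heights

# Fit gain = a * (1 / cue) + b by linear least squares.
A = np.column_stack([1.0 / cues, np.ones_like(cues)])
(a, b), *_ = np.linalg.lstsq(A, onset_gain, rcond=None)

# After learning, appearance alone yields a control-relevant distance
# estimate: a usable gain for the current view.
print("predicted gain at ~4 m:", a / texture_cue(4.0) + b)
```

Predicting a gain rather than a metric distance is what makes the estimate "scaled" in the control-relevant sense discussed below.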

Relevance to robotics and applications

Implementing this learning process on flying robots led to much faster, smoother optical flow landings than we had ever achieved before. Moreover, for obstacle avoidance, the robots were now also able to see obstacles in the flight direction very clearly. This not only improved obstacle detection performance, but also allowed our robots to speed up. We believe the proposed methods will be very relevant to resource-constrained flying robots, especially those operating in rather confined environments, such as flying through greenhouses to monitor crops or keeping track of stock in warehouses.

It is interesting to compare our way of learning distances with recent methods for single-camera (monocular) distance perception in computer vision. There, self-supervised learning of monocular distance perception is done with the help of projective geometry and the reconstruction of images. This results in impressively accurate, dense distance maps. However, these maps are "unscaled": they can show that one object is twice as far away as another, but cannot convey distances in an absolute sense.
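The scale ambiguity has a simple geometric cause: with a pinhole camera, multiplying all depths and the inter-frame translation by the same factor reproduces exactly the same images, so a photometric reconstruction loss cannot recover absolute scale. A minimal demonstration with made-up numbers:

```python
import numpy as np

def project(p):
    # Pinhole projection with unit focal length.
    return p[:2] / p[2]

point = np.array([1.0, 0.5, 4.0])   # 3-D point in the first camera frame
t = np.array([0.2, 0.0, 0.1])       # camera translation between frames

for s in (1.0, 2.0, 10.0):          # global scale factor
    # Scaling the scene and the motion together leaves the pixel unchanged.
    print(project(s * point - s * t))
```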

In contrast, our proposed method provides "scaled" distance estimates. Interestingly, the scaling is not in terms of meters but in terms of the control gains that would cause the drone to oscillate, which makes it very relevant for control. This is much like the way we humans perceive distances: for us, too, it may be more natural to reason in terms of actions ("Is this object within reach?", "How many steps do I roughly need to get there?") than in terms of meters. It is hence strongly reminiscent of the perception of "affordances", a concept put forward by Gibson, who also introduced the concept of optical flow [3].

A drone flying
Picture of our drone during the obstacle avoidance experiments. At first, the drone flies forward while trying to keep its height constant by keeping the vertical optical flow equal to zero. If it gets close to an obstacle, it starts to oscillate, and this increasing oscillation when approaching an obstacle is used to learn to see distance by means of visual appearance. After learning, the drone can fly faster and more safely. For the experiments we used a Parrot Bebop 2 drone, replacing its onboard software with the Paparazzi open-source autopilot and performing all computation onboard, on the drone's native processor.

Relevance to biology

The findings are not only relevant to robotics, but also provide a new hypothesis for insect intelligence. Typical honeybee experiments start with a learning phase, in which the honeybees exhibit various oscillatory behaviors as they get acquainted with a new environment and related novel cues such as artificial flowers. The final measurements presented in articles typically take place after this learning phase has finished and focus predominantly on the role of optical flow. The learning process presented here forms a novel hypothesis on how flying insects improve their navigational skills over their lifetime, and suggests that we should set up more studies to investigate and report on this learning phase.

On behalf of all authors,
Guido de Croon, Christophe De Wagter, and Tobias Seidl.

WSR: A new Wi-Fi-based system for collaborative robotics

Researchers at Harvard University have recently devised a system based on Wi-Fi sensing that could enhance the collaboration between robots operating in unmapped environments. This system, presented in a paper pre-published on arXiv, can essentially emulate antenna arrays in the air as a robot moves freely in a 2-D or 3-D environment.

#326: Deep Sea Mining, with Benjamin Pietro Filardo

In this episode, Abate follows up with Benjamin Pietro Filardo, founder of Pliant Energy Systems and NACROM, the North American Consortium for Responsible Ocean Mining. Pietro talks about the deep sea mining industry, an untapped market with massive potential for growth. He discusses the currently proposed solutions for deep sea mining, which are environmentally destructive, and offers an alternative: swarm robots that could mine the depths of the ocean while creating minimal disturbance to this mysterious habitat.

Benjamin “Pietro” Filardo
After several years in the architectural profession, Pietro founded Pliant Energy Systems to explore renewable energy concepts he first pondered while earning his first degree in marine biology and oceanography. With funding from four federal agencies he has broadened the application of these concepts into marine propulsion and a highly novel robotics platform.

A technique that allows robots to estimate the pose of objects by touching them

Humans are able to find objects in their surroundings and detect some of their properties simply by touching them. While this skill is particularly valuable for blind individuals, it can also help people with no visual impairments to complete simple tasks, such as locating and grabbing an object inside a bag or pocket.

MakinaRocks, Hyundai Robotics sign MOU to ‘advance AI-based industrial robot arm anomaly detection’

"As an increasing number of customers request AI-integrated robots, an AI-based analysis platform for robot functions is an indispensable necessity. We plan to expand our userbase with robots that can autonomously predict failures and perform quality management."

Women in Robotics Update: introducing our 2021 Board of Directors

Women in Robotics is a grassroots community involving women from across the globe. Our mission is to support women working in robotics and women who would like to work in robotics. We formed an official 501(c)(3) non-profit organization in 2020, headquartered in Oakland, California. We'd like to introduce our 2021 Board of Directors:

Andra Keay, Women in Robotics President

Managing Director at Silicon Valley Robotics | Visiting Scholar at CITRIS People and Robots Lab | Startup Advisor & Investor

Andra Keay founded Women in Robotics originally under the umbrella of Silicon Valley Robotics, the non-profit industry group supporting innovation and commercialization of robotics technologies. Andra's background is in human-robot interaction and communication theory. She is a trained futurist; the founder of the Robot Launch global startup competition, the Robot Garden maker space, and Women in Robotics; and a mentor, investor and advisor to startups, investors, accelerators and think tanks, with a strong interest in commercializing socially positive robotics and AI. Andra speaks regularly at leading technology conferences and is Secretary-General of the International Alliance of Robotics Associations. She is also a Visiting Scholar with the UC's CITRIS People and Robots Research Group.

Allison Thackston

Roboticist, Software Engineer & Manager – Waymo

Allison Thackston is the Chair of the Women in Robotics Website SubCommittee and Co-Chair of our New Chapter Formation SubCommittee. She is also a Founding Member of the ROS2 Technical Steering Committee. Prior to working at Waymo, she worked at Nuro and was the Manager of Shared Autonomy and a Principal Research Scientist in Intelligent Manipulation at Toyota Research Institute. She has an MS in Robotics and Mechanical Engineering from the University of Hawaii and a BS in Electrical Engineering from Georgia Tech. With a passion for robots and robotic technologies, she brings energy, dedication, and smarts to all the challenges she faces.

Ariel Anders

Roboticist – Robust.AI

Ariel Anders is a black feminist roboticist who enjoys spending time with her family and artistic self-expression. Anders is the first roboticist hired at Robust.AI, an early-stage robotics startup building the world's first industrial-grade cognitive engine. Anders received a BS in Computer Engineering from UC Santa Cruz and her Doctorate in Computer Science from MIT, where she taught project-based collaborative robotics courses, developed an iOS app for people with vision impairment, and received a grant to install therapy lamps across campus. Her research focused on reliable robotic manipulation with the vision of enabling household helpers.

Cynthia Yeung

Robotics Executive & COO, Advisor, Speaker

Cynthia Yeung is the Chair of the Women in Robotics Mentoring Program SubCommittee, which will be piloting shortly. She is also a mentor and advisor to robotics companies, accelerators and venture capital firms, and speaks at leading technology conferences. Cynthia studied Entrepreneurship at Stanford and Systems Engineering at UPenn, and did a triple major at The Wharton School, UPenn, where she was a Benjamin Franklin Scholar and a Joseph Wharton Scholar. She has led Strategic and International Partnerships at organizations such as Google and Capital One, led Product Partnerships at SoftBank Robotics and Checkmate.io, and was COO of CafeX. In her own words: "I practice radical candor. I build teams to make myself obsolete. I create value to better human society. I edit robotics research papers for love."

Hallie Siegel

Associate Director, Strategy & Operations at University of Toronto

Hallie Siegel is the driving force behind the emerging robotics network in Canada, centered at the University of Toronto. She is a communications professional serving the technology, innovation and research sectors, specifically robotics, automation and AI. She has a Masters in Strategic Foresight and Innovation from OCADU, where she was a Dean's Scholar. Hallie was also the first Managing Editor at Robohub.org, the site for robotics news and views, after doing science communications for Raffaello D'Andrea's lab at ETH Zurich. In her spare time, she is a multidisciplinary artist and Chair of the Women in Robotics Vision Workshops.

Kerri Fetzer-Borelli

Head of Diversity, Equity, Inclusion & Community Engagement at Toyota Research Institute

Kerri Fetzer-Borelli is the Co-Chair of the Women in Robotics New Chapter Formation SubCommittee. They have worked as a scientific data collector for the military, as a welder in nuclear power plants, and as the Manager of Autonomous Vehicle Testing and later of Prototyping and Robotics Operations at Toyota Research Institute, where they now lead DEI and Community Engagement. Kerri mobilizes cross-functional teams to solve complex, abstract problems by distilling strategic, actionable items and workflows from big ideas.

Laura Stelzner

Robotics Software Engineer at RIOS

Laura Stelzner is the Chair of the Women in Robotics Community Management SubCommittee, increasing activity and engagement in our online community. By day, she is in charge of software at the emerging robotics startup RIOS, which provides factory automation as a service, deploying AI-powered, dexterous robots on factory assembly lines. Prior to RIOS, Laura worked at Toyota Research Institute, Space Systems Loral, Amazon Labs, Electric Movement and Raytheon. She has a BS in Computer Engineering from UC Santa Cruz and an MS in Computer Science from Stanford.

Laurie Linz, Women in Robotics Treasurer

Software Development Engineer in Test at Alteryx

Laurie Linz is the Women in Robotics Treasurer, as well as the founder of the Boulder/Denver, Colorado WiR Chapter. When not working as a software developer or QA tester, Laurie can be found with her hands on an Arduino, a drone, or a camera. As she says, "I like to build things, break things and solve puzzles all day! Thankfully, development and testing allow me to do that. Fred Brooks was right when he wrote that the programmer gains the 'sheer joy of making things' and spoke of 'castles in the air, from air', as we are only limited by the bounds of human imagination."

Lisa Winter

Head of Hardware at Quartz

A roboticist since childhood, Lisa has over 20 years' experience designing and building robots. She has competed in Robot Wars and BattleBots competitions since 1996, and is a current judge on BattleBots. She holds the position of Head of Hardware at Quartz, an early-stage startup working on the future of construction, and her rugged hardware can be seen attached to tower cranes all around California. In her free time she volunteers her prototyping skills to the Marine Mammal Center, aiding in the rehabilitation of hundreds of animals each year. She is a Founding Board Member of Women in Robotics and Chair of the Artwork/Swag SubCommittee.

Sue Keay, Women in Robotics Secretary

CEO at Queensland AI Hub and Chair of the Board of Directors of Robotics Australia Group

Sue is currently CEO of Queensland AI Hub, after leading cyber-physical systems research for CSIRO's Data61. Previously, she set up the world's first robotic vision research centre. She led the development of Australia's first robotics roadmap, the Robotics Australia Network and the Queensland Robotics Cluster. A Graduate of the Australian Institute of Company Directors, she founded and chairs the Board of Robotics Australia Group. Sue also serves on the Boards of CRC ORE and Queensland AI Hub, and represents Australia in the International Alliance of Robotics Associations.

With such a go-getting Board of Directors, you can be assured that Women in Robotics is preparing for an active 2021. As of January 1, 2021, we had 1,270 members in our online community, 900 additional newsletter subscribers, and six active chapters across the USA, Canada, the UK and Australia. All Women in Robotics events abide by our Code of Conduct, which we offer for use at any robotics event or conference.

Our focus for 2021 is on:

  • Project Inspire – our annual "30 women in robotics you need to know about" list, plus regular updates, spotlights, and Wikipedia pages for women in robotics.
  • Project Connect – forming new chapters, promoting our online community, and enjoying regular member-led activities and events, under a Code of Conduct.
  • Project Advance – piloting a mentoring program, providing educational resources for women in robotics, and improving accountability metrics in our workplaces.

We'd also like to thank our two Founding Board Members, Sabine Hauert of the Hauert Lab at the University of Bristol, UK, and founder of Robohub.org, and Sarah Osentoski, SVP of Engineering at Iron Ox, who are leaving the WiR Board but will be leading our new Women in Robotics Advisory Board, another new initiative for 2021.

You can subscribe to our newsletter to keep updated on our activities, sign up for our speaker database or volunteering opportunities, or show your support as an ally. Please support our activities with a one-off or recurring donation (tax-deductible in the USA).
