Archive 08.11.2022


New VR system lets you share sights on the move without causing VR sickness

Researchers from Tokyo Metropolitan University have engineered a virtual reality (VR) remote collaboration system which lets users on Segways share not only what they see but also the feeling of acceleration as they move. Riders equipped with cameras and accelerometers can feed back their sensations to a remote user seated in a modified wheelchair and wearing a VR headset. User surveys showed a significant reduction in VR sickness, promising a better experience for remote collaboration activities.

General purpose robots should not be weaponized: An open letter to the robotics industry and our communities

Over the course of the past year, Open Robotics has taken time from our day-to-day efforts to work with our colleagues in the field to consider how the technology we develop could negatively impact society as a whole. In particular, we were concerned with the weaponization of mobile robots. After a lot of thoughtful discussion, deliberation, and debate with our colleagues at organizations like Boston Dynamics, Clearpath Robotics, Agility Robotics, ANYbotics, and Unitree, we have co-authored and signed an open letter to the robotics community entitled, “General Purpose Robots Should Not Be Weaponized.” You can read the letter, in its entirety, here. Additional media coverage of the letter can be found in Axios, and The Robot Report.

The letter codifies internal policies we’ve had at Open Robotics since our inception and we think it captures the sentiments of much of the ROS community. For our part, we have pledged that we will not weaponize mobile robots, and we do not support others doing so either. We believe that the weaponization of robots raises serious ethical issues and harms public trust in technologies that can have tremendous benefits to society. This is but a first step, and we look forward to working with policy makers, the robotics community, and the general public, to continue to promote the ethical use of robots and prohibit their misuse. This is but one of many discussions that must happen between robotics professionals, the general public, and lawmakers about advanced technologies, and quite frankly, we think it is long overdue.

Due to the permissive nature of the licenses we use for ROS, Gazebo, and our other projects, it is difficult, if not impossible, for us to limit the use of the technology we develop to build weaponized systems. However, we do not condone such efforts, and we will have no part in directly assisting those who do with our technical expertise or labor. This has been our policy from the start, and will continue to be our policy. We encourage the ROS community to take a similar stand and to work with their local lawmakers to prevent the weaponization of robotic systems. Moreover, we hope the entire ROS community will take time to reflect deeply on the ethical implications of their work, and help others better understand both the positive and negative outcomes that are possible in robotics.

How shoring up drones with artificial intelligence helps surf lifesavers spot sharks at the beach

A close encounter between a white shark and a surfer. Author provided.

By Cormac Purcell (Adjunct Senior Lecturer, UNSW Sydney) and Paul Butcher (Adjunct Professor, Southern Cross University)

Australian surf lifesavers are increasingly using drones to spot sharks at the beach before they get too close to swimmers. But just how reliable are they?

Discerning whether that dark splodge in the water is a shark or just, say, seaweed isn’t always straightforward and, in reasonable conditions, drone pilots generally make the right call only 60% of the time. While this has implications for public safety, it can also lead to unnecessary beach closures and public alarm.

Engineers are trying to boost the accuracy of these shark-spotting drones with artificial intelligence (AI). While they show great promise in the lab, AI systems are notoriously difficult to get right in the real world, and so they remain out of reach for surf lifesavers. And importantly, overconfidence in such software can have serious consequences.

With these challenges in mind, our team set out to build the most robust shark detector possible and test it in real-world conditions. By using masses of data, we created a highly reliable mobile app for surf lifesavers that could not only improve beach safety, but help monitor the health of Australian coastlines.

A white shark being tracked by a drone. Author provided.

Detecting dangerous sharks with drones

The New South Wales government is investing more than A$85 million in shark mitigation measures over the next four years. Of all approaches on offer, a 2020 survey showed drone-based shark surveillance is the public’s preferred method to protect beach-goers.

The state government has been trialling drones as shark-spotting tools since 2016, and with Surf Life Saving NSW since 2018. Trained surf lifesaving pilots fly the drone over the ocean at a height of 60 metres, watching the live video feed on portable screens for the shape of sharks swimming under the surface.

Identifying sharks by carefully analysing the video footage in good conditions seems easy. But water clarity, sea glitter (sea-surface reflection), animal depth, pilot experience and fatigue all reduce the reliability of real-time detection to a predicted average of 60%. This reliability falls further when conditions are turbid.

Pilots also need to confidently identify the species of shark and tell the difference between dangerous and non-dangerous animals, such as rays, which are often misidentified.

Identifying shark species from the air.

AI-driven computer vision has been touted as an ideal tool to virtually “tag” sharks and other animals in the video footage streamed from the drones, and to help identify whether a species nearing the beach is cause for concern.

AI to the rescue?

Early results from previous AI-enhanced shark-spotting systems have suggested the problem has been solved, as these systems report detection accuracies of over 90%.

But scaling these systems to make a real-world difference across NSW beaches has been challenging.

AI systems are trained to locate and identify species using large collections of example images and perform remarkably well when processing familiar scenes in the real world.

However, problems quickly arise when they encounter conditions not well represented in the training data. As any regular ocean swimmer can tell you, every beach is different – the lighting, weather and water conditions can change dramatically across days and seasons.

Animals can also frequently change their position in the water column, which means their visible characteristics (such as their outline) change, too.

All this variation makes it crucial for training data to cover the full gamut of conditions, or that AI systems be flexible enough to track the changes over time. Such challenges have been recognised for years, giving rise to the new discipline of “machine learning operations”.

Essentially, machine learning operations explicitly recognises that AI-driven software requires regular updates to maintain its effectiveness.
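In practice, that regular-update loop can start with something very simple: track a rolling accuracy metric on freshly labelled frames and flag the model for retraining when the metric drifts below an acceptable level. Here is a minimal sketch of that idea; the class name, window size and threshold are illustrative, not part of any deployed system.

```python
from collections import deque


class DriftMonitor:
    """Track rolling detection accuracy on newly labelled frames and
    flag when the model looks due for retraining. All names and
    thresholds here are illustrative assumptions."""

    def __init__(self, window: int = 500, min_accuracy: float = 0.75):
        # Each entry is True if the model's call on a frame matched
        # the expert label, False otherwise.
        self.results = deque(maxlen=window)
        self.min_accuracy = min_accuracy

    def record(self, correct: bool) -> None:
        self.results.append(correct)

    def needs_retraining(self) -> bool:
        # Wait until the window is full before drawing conclusions.
        if len(self.results) < self.results.maxlen:
            return False
        accuracy = sum(self.results) / len(self.results)
        return accuracy < self.min_accuracy
```

A monitor like this would sit alongside the deployed detector, fed by whatever trickle of human-verified frames the operation produces.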

Examples of the drone footage used in our huge dataset.

Building a better shark spotter

We aimed to overcome these challenges with a new shark detector mobile app. We gathered a huge dataset of drone footage, and shark experts then spent weeks inspecting the videos, carefully tracking and labelling sharks and other marine fauna in the hours of footage.

Using this new dataset, we trained a machine learning model to recognise ten types of marine life, including different species of dangerous sharks such as great white and whaler sharks.
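The point of distinguishing species is to decide whether a sighting warrants action. As a rough illustration of how a classifier's per-detection output might feed such a decision, here is a hypothetical sketch; the species list, label strings and confidence threshold are assumptions, not the app's actual labels.

```python
# Hypothetical label sets. The article names white and whaler sharks as
# dangerous; the remaining class names are illustrative stand-ins.
DANGEROUS = {"white_shark", "whaler_shark", "bull_shark"}
HARMLESS = {"ray", "turtle", "dolphin", "seal", "grey_nurse_shark"}


def should_alert(detections, min_conf=0.6):
    """Return True if any detection is a dangerous species above a
    confidence threshold. `detections` is a list of (label, confidence)
    pairs, one per object found in the current frame."""
    return any(label in DANGEROUS and conf >= min_conf
               for label, conf in detections)
```

For example, a confident white-shark detection triggers an alert, while a ray, or a low-confidence shark detection, does not.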

And then we embedded this model into a new mobile app that can highlight sharks in live drone footage and predict the species. We worked closely with the NSW government and Surf Lifesaving NSW to trial this app on five beaches during summer 2020.

A drone in Surf Life Saving NSW livery preparing to go on patrol. Author provided.

Our AI shark detector did quite well. It identified dangerous sharks on a frame-by-frame basis 80% of the time, in realistic conditions.
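A frame-by-frame score of this kind is straightforward to compute once each frame carries both the detector's call and an expert label. A minimal sketch (the function name and boolean encoding are illustrative):

```python
def frame_accuracy(predictions, labels):
    """Fraction of video frames on which the detector's call
    (dangerous shark present / absent) matches the expert label.
    Both inputs are lists of booleans, one entry per frame."""
    assert len(predictions) == len(labels), "one prediction per frame"
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)
```

Run over held-out footage from unseen beaches and seasons, a function like this gives exactly the kind of "external data" test described above.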

We deliberately went out of our way to make our tests difficult by challenging the AI to run on unseen data taken at different times of year, or from different-looking beaches. These critical tests on “external data” are often omitted in AI research.

A more detailed analysis turned up common-sense limitations: white, whaler and bull sharks are difficult to tell apart because they look similar, while small animals (such as turtles and rays) are harder to detect in general.

Spurious detections (like mistaking seaweed for a shark) are a real concern for beach managers, but we found the AI could easily be “tuned” to eliminate these by showing it empty ocean scenes of each beach.
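One simple way such per-beach tuning could work, assuming the detector emits a confidence score for every detection, is to raise the alert threshold just above the highest score the model produces on frames known to contain no animals. This sketch is a hypothetical illustration of the idea, not the method used in the app.

```python
def tune_threshold(empty_scene_scores, margin=0.05, default=0.5):
    """Pick a per-beach confidence threshold just above the highest
    score the detector produced on known-empty ocean frames, so that
    recurring false positives (e.g. seaweed) are suppressed.
    `empty_scene_scores` holds the detector's confidences on those
    empty frames; names and defaults are illustrative."""
    if not empty_scene_scores:
        return default  # no calibration footage for this beach yet
    return min(max(empty_scene_scores) + margin, 0.99)
```

The trade-off is the usual one: a higher threshold removes false alarms but can also suppress genuine low-confidence detections, so the margin would need validating against labelled shark footage from the same beach.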

Example of where the AI gets it wrong – seaweed identified as sharks. Author provided.

The future of AI for shark spotting

In the short term, AI is now mature enough to be deployed in drone-based shark-spotting operations across Australian beaches. But, unlike regular software, it will need to be monitored and updated frequently to maintain its high reliability of detecting dangerous sharks.

An added bonus is that such a machine learning system for spotting sharks would also continually collect valuable ecological data on the health of our coastline and marine fauna.

In the longer term, getting the AI to look at how sharks swim and using new AI technology that learns on-the-fly will make AI shark detection even more reliable and easy to deploy.

The NSW government has new drone trials for the coming summer, testing the usefulness of efficient long-range flights that can cover more beaches.

AI can play a key role in making these flights more effective, enabling greater reliability in drone surveillance, and may eventually lead to fully-automated shark-spotting operations and trusted automatic alerts.

The authors acknowledge the substantial contributions from Dr Andrew Colefax and Dr Andrew Walsh at Sci-eye.

This article appeared in The Conversation.

17 Robots, Teams From Around the World Vie for $8m in Prizes, November 4-5 in Long Beach, CA

The ANA Avatar XPRIZE challenges teams to develop physical, human-operated robotic avatar systems that can execute tasks and replicate a human’s senses, actions and presence to a remote location in real-time, leading to a more connected world.

Implementing monocular visual-tactile sensors for robust manipulation

Tactile perception provides essential information for humans to perceive the world physically. Tactile sensing likewise plays an important role in improving planning and control for robotic manipulators, enabling complex robotic manipulation.

How Do Cryptocurrency Values Change During a Bull Market?

The cryptocurrency market differs from traditional markets in its incredibly high volatility. While banks issue conventional currencies, and their supply can be increased when needed, the issuance of crypto assets does not depend on governments or banks. Prices on crypto exchanges change almost every minute. Sharp daily volatility allows traders to take profits in portions many times within...

The post How Do Cryptocurrency Values Change During a Bull Market? appeared first on 1redDrop.

ep.365: Precise Navigation using LEO Satellites, with Tyler Reid

Dr. Tyler Reid, co-founder and CTO of Xona Space Systems, discusses a new type of global navigation satellite system (GNSS). Xona Space Systems plans to provide centimeter-level positioning accuracy and will serve the emerging autonomous vehicle community, where precise navigation is key. Reid discusses the advantages and technical challenges of a low Earth orbit (LEO) solution.

Tyler Reid

Tyler Reid is co-founder and CTO of Xona Space Systems. Previously, Tyler worked as a Research Engineer at the Ford Motor Company in localization and mapping for self-driving cars. He has also worked as an engineer at Google and as a lecturer at Stanford University, where he co-taught the GPS course. Tyler received his PhD (2017) and MSc (2012) in Aeronautics and Astronautics from Stanford and B.Eng. (’10) in Mechanical Engineering from McGill.

Robots come out of the research lab

This year’s Swiss Robotics Day – an annual event run by the EPFL-led National Centre of Competence in Research (NCCR) Robotics – will be held at the Beaulieu convention center in Lausanne. For the first time, this annual event will take place over two days: the first day, on 4 November, will be reserved for industry professionals, while the second, on 5 November, will be open to the public.

Visitors at this year’s Swiss Robotics Day are in for a glimpse of some exciting new technology: a robotic exoskeleton that enables paralyzed patients to ski, a device the width of a strand of hair that can be guided through a human vein, a four-legged robot that can walk over obstacles, an artificial skin that can diagnose early-stage Parkinson’s, a swarm of flying drones, and more.

The event, now in its seventh year, was created by NCCR Robotics in 2015. It has expanded into a leading conference for the Swiss robotics industry, bringing together university researchers, businesses and citizens from across the country. For Swiss robotics experts, the event provides a chance to meet with peers, share ideas, explore new business opportunities and look for promising new hires. That’s what they’ll do on Friday, 4 November – the day reserved for industry professionals.

On Saturday, 5 November, the doors will open to the general public. Visitors of all ages can discover the latest inventions coming out of Swiss R&D labs and fabricated by local companies – including some startups. The event will feature talks and panel discussions on topics such as ethics in robotics, space robotics, robotics in art and how artificial intelligence can be used to promote sustainable development – all issues that will shape the future of the industry. PhD students will provide a snapshot of where robotics research stands today, while school-age children can sign up for robot-building workshops. Teachers can take part in workshops given by the Roteco robot teaching community and see how robotics technology can support learning in the classroom.

In the convention center’s 5,000 m² exhibit hall, some 70 booths will be set up with all sorts of robot demonstrations, complete with an area for flying drones. Technology developed as part of the Cybathlon international competition will be on display; this competition was introduced by NCCR Robotics in 2016 to encourage research on assistance systems for people with disabilities. Silke Pan will give a dance performance with a robotic exoskeleton, choreographed by Antoine Le Moal of the Béjart ballet company. Talks will be given in French and English. Entrance is free of charge but registration is required.

Laying the foundation for future success

The 2022 Swiss Robotics Day will mark the end of NCCR Robotics, capping 12 years of cutting-edge research. The center was funded by the Swiss National Science Foundation and has sponsored R&D at over 30 labs in seven Swiss institutions: EPFL, ETH Zurich, the University of Zurich, IDSIA-SUPSI-USI in Lugano, the University of Bern, EMPA and the University of Basel. NCCR Robotics has given rise to 16 spin-offs in high-impact fields like portable robots, drones, search-and-rescue systems and education. Together the spin-offs have raised over CHF 100 million in funding and some of them, like Flyability and ANYbotics, have grown into established businesses creating hundreds of high-tech jobs. The center has also rolled out several educational and community initiatives to further the teaching of robotics in Switzerland.

After the center closes, some of its activities – especially those related to technology transfer – will be carried out by the Innovation Booster Robotics program sponsored by Innosuisse and housed at EPFL. This program, initially funded for three years, is designed to promote robotics in universities and the business world.

A day for industry professionals only

The first day of the event, 4 November, is intended for robotics-industry businesses, investors, researchers, students and journalists. It will kick off with a talk by Robin Murphy, a world-renowned expert in rescue robotics and a professor at Texas A&M University; she will be followed by Auke Ijspeert from EPFL’s Biorobotics Laboratory, Elena García Armada from the Center for Automation and Robotics in Spain, Raffaello D’Andrea (a pioneer in robotics-based inventory management) from ETH Zurich, Thierry Golliard from Swiss Post and Adrien Briod, the co-founder of Flyability.

In the afternoon, a panel discussion will explore how robots and artificial intelligence are changing the workplace. Experts will include Dario Floreano from NCCR Robotics and EPFL, Rafael Lalive from the University of Lausanne, Alisa Rupenyan-Vasileva from ETH Zurich, Agnès Petit Markowski from Mobbot and Pierre Dillenbourg from EPFL. Event participants will also have a chance to network that afternoon. The day will conclude with an awards ceremony to designate Switzerland’s best Master’s thesis on robotics. The booths and robot demonstrations will take place on both days of the event.

A virtual glimpse of NCCR Robotics research

NCCR Robotics is developing a new generation of robots that can work side by side with humans – fighting disability, responding to emergencies and transforming education. Check out the videos below to see them in more detail.

Improving the autonomous navigation of mobile robots in crowded spaces using people as sensors

A team of researchers from the University of Illinois Urbana-Champaign and Stanford University, led by Prof. Katie Driggs-Campbell, has recently developed a new deep reinforcement learning-based method that could improve the ability of mobile robots to safely navigate crowded spaces. Their method, introduced in a paper pre-published on arXiv, is based on the idea of using people in the robot's surroundings as indicators of potential obstacles.
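The actual method is a learned deep reinforcement learning policy; purely to illustrate the intuition behind "people as sensors", here is a hypothetical geometric heuristic: if observed pedestrians sharply change heading near the same spot, that spot is treated as a likely hidden obstacle. Every name and threshold below is an assumption for illustration, not the authors' algorithm.

```python
import math


def infer_hidden_obstacle(tracks, turn_threshold=0.5):
    """Toy 'people as sensors' heuristic. Each track is a list of
    (x, y) positions for one observed pedestrian. Wherever a track
    bends by more than `turn_threshold` radians, record the bend
    point; return the mean of those points as an inferred obstacle
    location, or None if nobody turned sharply."""
    turn_points = []
    for track in tracks:
        for a, b, c in zip(track, track[1:], track[2:]):
            v1 = (b[0] - a[0], b[1] - a[1])
            v2 = (c[0] - b[0], c[1] - b[1])
            n1, n2 = math.hypot(*v1), math.hypot(*v2)
            if n1 == 0 or n2 == 0:
                continue  # stationary segment, no heading defined
            dot = v1[0] * v2[0] + v1[1] * v2[1]
            # Angle between successive headings, clamped for safety.
            angle = math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))
            if angle > turn_threshold:
                turn_points.append(b)
    if not turn_points:
        return None
    n = len(turn_points)
    return (sum(p[0] for p in turn_points) / n,
            sum(p[1] for p in turn_points) / n)
```

A learned policy can of course exploit far subtler cues than sharp turns, which is what makes the deep RL formulation attractive.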

Supporting innovation with automation: Researcher develops autonomous hot cell tool

Scientific progress is anything but automatic. The path to new discoveries is not a straight line. But while the route to nuclear energy breakthroughs may be circuitous, automated solutions can enhance the efficiency of the research process and arrive at innovations a little sooner.