Archive 30.07.2023


Enabling autonomous exploration

CMU’s Autonomous Exploration Research Team has developed a suite of robotic systems and planners enabling robots to explore more quickly, probe the darkest corners of unknown environments, and create more accurate and detailed maps — all without human help.

By Aaron Aupperlee

A research group in Carnegie Mellon University’s Robotics Institute is creating the next generation of explorers — robots.

The Autonomous Exploration Research Team has developed a suite of robotic systems and planners enabling robots to explore more quickly, probe the darkest corners of unknown environments, and create more accurate and detailed maps. The systems allow robots to do all this autonomously, finding their way and creating a map without human intervention.

“You can set it in any environment, like a department store or a residential building after a disaster, and off it goes,” said Ji Zhang, a systems scientist in the Robotics Institute. “It builds the map in real-time, and while it explores, it figures out where it wants to go next. You can see everything on the map. You don’t even have to step into the space. Just let the robots explore and map the environment.”

The team has worked on exploration systems for more than three years. They’ve explored and mapped several underground mines, a parking garage, the Cohon University Center, and several other indoor and outdoor locations on the CMU campus. The system’s computers and sensors can be attached to nearly any robotic platform, transforming it into a modern-day explorer. The group uses a modified motorized wheelchair and drones for much of its testing.

Robots can explore in three modes using the group’s systems. In one mode, a person can control the robot’s movements and direction while autonomous systems keep it from crashing into walls, ceilings or other objects. In another mode, a person can select a point on a map and the robot will navigate to that point. The third mode is pure exploration. The robot sets off on its own, investigates the entire space and creates a map.
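The three modes can be pictured as a thin dispatch layer sitting on top of the same mapping and planning stack. The sketch below is purely illustrative: the mode names mirror the description above, but the robot and planner interfaces (collision_guard, navigate_to, next_exploration_goal) are hypothetical names, not the team's open-source API.

```python
from enum import Enum, auto

class Mode(Enum):
    ASSISTED_TELEOP = auto()   # a person drives; autonomy only prevents collisions
    WAYPOINT = auto()          # a person picks a point on the map; the robot navigates to it
    EXPLORE = auto()           # the robot chooses its own goals and maps the whole space

def control_step(mode, robot, planner, joystick_cmd=None, goal=None):
    """One control cycle. The robot and planner interfaces here are hypothetical."""
    if mode is Mode.ASSISTED_TELEOP:
        # pass the operator's command through a collision guard
        cmd = planner.collision_guard(joystick_cmd, robot.local_map())
    elif mode is Mode.WAYPOINT:
        # plan a route to the user-selected point on the current map
        cmd = planner.navigate_to(goal, robot.pose(), robot.map())
    else:  # Mode.EXPLORE
        # the planner picks the next viewpoint that best extends the map
        next_goal = planner.next_exploration_goal(robot.map(), robot.pose())
        cmd = planner.navigate_to(next_goal, robot.pose(), robot.map())
    robot.apply(cmd)
```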

“This is a very flexible system to use in many applications, from delivery to search-and-rescue,” said Howie Choset, a professor in the Robotics Institute.

The group combined a 3D scanning lidar sensor, forward-looking camera and inertial measurement unit sensors with an exploration algorithm to enable the robot to know where it is, where it has been and where it should go next. The resulting systems are substantially more efficient than previous approaches, creating more complete maps while reducing the algorithm run time by half.
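The "where should it go next" decision is the heart of autonomous exploration. The team's TARE planner answers it hierarchically, but the underlying idea can be illustrated with a much simpler frontier rule: head for mapped free space that borders unknown space. The runnable toy below assumes an occupancy grid with -1 for unknown, 0 for free and 1 for occupied; it is a conceptual sketch, not the group's algorithm.

```python
import numpy as np

# Toy occupancy grid: -1 = unknown, 0 = free, 1 = occupied.
grid = -np.ones((20, 20), dtype=int)
grid[5:15, 5:15] = 0          # area already mapped as free
grid[9, 5:10] = 1             # a wall inside the mapped area

def frontier_cells(grid):
    """Free cells that border at least one unknown cell."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != 0:
                continue
            neighborhood = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighborhood == -1).any():
                frontiers.append((r, c))
    return frontiers

def next_goal(grid, robot_pos):
    """Pick the nearest frontier cell as the next exploration goal."""
    frontiers = frontier_cells(grid)
    if not frontiers:
        return None  # nothing left to explore: the map is complete
    return min(frontiers,
               key=lambda p: (p[0] - robot_pos[0]) ** 2 + (p[1] - robot_pos[1]) ** 2)

print(next_goal(grid, robot_pos=(10, 10)))
```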

The new systems work in low-light, treacherous conditions where communication is spotty, like caves, tunnels and abandoned structures. A version of the group’s exploration system powered Team Explorer, an entry from CMU and Oregon State University in DARPA’s Subterranean Challenge. Team Explorer placed fourth in the final competition but won the Most Sectors Explored Award for mapping more of the route than any other team.

“All of our work is open-sourced. We are not holding anything back. We want to strengthen society with the capabilities of building autonomous exploration robots,” said Chao Cao, a Ph.D. student in robotics and the lead operator for Team Explorer. “It’s a fundamental capability. Once you have it, you can do a lot more.”

The group’s most recent work appeared in Science Robotics, which published “Representation Granularity Enables Time-Efficient Autonomous Exploration in Large, Complex Worlds” online. Past work has received top awards at prestigious robotics conferences. “TARE: A Hierarchical Framework for Efficiently Exploring Complex 3D Environments” won the Best Paper and Best Systems Paper awards at the Robotics: Science and Systems conference in 2021. It was the first time in the conference’s history that a paper received both awards. “FAR Planner: Fast, Attemptable Route Planner Using Dynamic Visibility Update” won the Best Student Paper Award at the International Conference on Intelligent Robots and Systems in 2022.

More information is available on the group’s website.

Genesis Motion Solutions – LiveDrive LDD Direct Drive Housed Rotary Motors

Welcome to the next generation of robotics and machinery. LiveDrive® LDD is a unique direct drive motor that overcomes almost every limitation of servo-geared solutions: no gearbox, no downtime, no contamination. The patented motor eliminates the need for servo gearheads, simplifying robot and machine architecture and allowing up to a 50% reduction in length compared with a typical geared motor, while delivering high performance, accuracy, efficiency, and a lower total cost of ownership.

Reinforcement learning allows underwater robots to locate and track marine objects

A research team has shown for the first time that reinforcement learning—i.e., a neural network that learns the best action to perform at each moment based on a series of rewards—allows autonomous vehicles and underwater robots to locate and carefully track marine objects and animals.
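The core of that learning loop (try an action, observe a reward, improve the value estimate) can be shown with a tabular Q-learning toy. The runnable example below keeps a drifting target centered in one dimension; it only illustrates the reward-driven update, not the deep networks or acoustic sensing used in the actual underwater work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D tracking task: keep a drifting target centered (relative offset of 0).
# States discretize the target's offset from -2 to +2; actions move the tracker
# left, keep it still, or move it right.
N_STATES, N_ACTIONS = 5, 3
q = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

def env_step(state, action):
    """Apply the tracker's motion, let the target drift, and return reward."""
    offset = (state - 2) - (action - 1)    # moving toward the target shrinks the offset
    offset += int(rng.integers(-1, 2))     # random target drift of -1, 0 or +1
    offset = int(np.clip(offset, -2, 2))
    reward = -abs(offset)                  # reward is highest when the target stays centered
    return offset + 2, reward

state = 2                                  # start with the target centered
for _ in range(20000):
    # epsilon-greedy action selection
    action = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(q[state].argmax())
    next_state, reward = env_step(state, action)
    # Q-learning update: nudge the estimate toward reward + discounted best future value
    q[state, action] += alpha * (reward + gamma * q[next_state].max() - q[state, action])
    state = next_state

print(q.round(2))   # the learned policy steers toward the target to keep it centered
```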

3D printed robotic gripper doesn’t need electronics to function

A new soft robotic gripper is not only 3D printed in one print, it also doesn't need any electronics to work. The device was developed by a team of roboticists at the University of California San Diego, in collaboration with researchers at the BASF corporation, who detailed their work in Science Robotics.

Research improves quadruped bounding with efficient learning method

In a study published in a special issue of the journal IET Cyber-Systems and Robotics, researchers from Zhejiang University with experience in legged robot motion and control pre-trained a neural network (NN) using data from a robot operated by conventional model-based controllers.
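Pre-training a policy on data from a conventional controller is essentially behavior cloning: the network first imitates the model-based controller, and reinforcement learning then refines it on the real task. The sketch below uses a toy PD controller and a linear policy in place of the paper's robot and neural network, purely to illustrate the idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Model-based controller": a hand-tuned PD law for a toy 1-D joint.
def pd_controller(pos_err, vel_err, kp=8.0, kd=0.5):
    return kp * pos_err + kd * vel_err

# Collect (state, action) pairs by running the conventional controller.
states = rng.uniform(-1.0, 1.0, size=(5000, 2))      # [position error, velocity error]
actions = pd_controller(states[:, 0], states[:, 1])

# "Pre-training": fit the policy to imitate the controller (behavior cloning).
# A linear policy stands in here for the neural network used in the paper.
w, *_ = np.linalg.lstsq(states, actions, rcond=None)
print("learned gains ~", w)   # recovers roughly [8.0, 0.5]

# The pre-trained policy would then be refined with reinforcement learning on the
# actual bounding task, which is where the sample-efficiency gain comes from.
```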

Researchers develop low-cost sensor to enhance robots’ sense of touch

Achieving human-level dexterity during manipulation and grasping has been a long-standing goal in robotics. To accomplish this, having a reliable sense of tactile information and force is essential for robots. A recent study, published in IEEE Robotics and Automation Letters, describes the L3 F-TOUCH sensor that enhances the force sensing capabilities of classic tactile sensors. The sensor is lightweight, low-cost, and wireless, making it an affordable option for retrofitting existing robot hands and graspers.

Pangolin the inspiration for medical robot

Scientists at the Max Planck Institute for Intelligent Systems in Stuttgart have developed a magnetically controlled soft medical robot with a unique, flexible structure inspired by the body of a pangolin. The robot remains freely movable despite its built-in hard metal components. Depending on the applied magnetic field, it can change its shape to move and can emit heat when needed, enabling functions such as selective cargo transport and release as well as mitigation of bleeding.

Pangolins are fascinating creatures. This animal looks like a walking pine cone, as it is the only mammal completely covered with hard scales. The scales are made of keratin, just like our hair and nails. The scales overlap and are directly connected to the underlying soft skin layer. This special arrangement allows the animals to curl up into a ball in case of danger.

While pangolins have many other unique features, researchers from the Physical Intelligence Department at the Max Planck Institute for Intelligent Systems in Stuttgart, led by Prof. Dr. Metin Sitti, were particularly fascinated by how pangolins can curl up their scale-covered bodies in a flash. They took the animal as a model and developed a flexible robot made of soft and hard components that, just like the animal, becomes a sphere in the blink of an eye – with the additional feature that the robot can emit heat when needed.

In a research paper published in Nature Communications, first author Ren Hao Soon and his colleagues present a robot design that is no more than two centimeters long and consists of two layers: a soft layer made of a polymer studded with small magnetic particles and a hard component made of metal elements arranged in overlapping layers. Thus, although the robot contains solid metal components, it is still soft and flexible enough for use inside the human body.

Fig. 1: The pangolin-inspired untethered magnetic robot. Conceptual illustration of the robot operating in the small intestine; it is actuated with a low-frequency magnetic field and heated remotely with a high-frequency magnetic field. The pangolin’s body consists of individual overlapping hard keratin scales, and the robot mirrors this overlapping design.

When the robot is exposed to a low-frequency magnetic field, the researchers can roll up the robot and move it back and forth as they wish. The metal elements stick out like the animal’s scales, without hurting any surrounding tissue. Once it is rolled up, the robot can transport particles such as medicines. The vision is that such a small machine will one day travel through our digestive system, for example.

Doubly useful: freely movable and hot

When the robot is exposed to a high-frequency magnetic field, it heats up to over 70 °C thanks to the built-in metal. Thermal energy is used in several medical procedures, such as treating thrombosis, stopping bleeding and removing tumor tissue. Untethered robots that can move freely, even though they are made of hard elements such as metal, and that can also emit heat are rare. The pangolin robot is therefore considered promising for modern medicine. It could one day reach even the narrowest and most sensitive regions of the body in a minimally invasive and gentle way and emit heat as needed. That is a vision of the future. Already today, the researchers show in a video how they can flexibly steer the robot through animal tissue and artificial organs.
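The same robot thus responds to two field regimes: a low-frequency field deforms and steers it, while a high-frequency field heats it through the embedded metal elements. The toy selector below makes that split explicit; the frequency and amplitude values are placeholders for illustration only, not the parameters reported in the paper.

```python
from dataclasses import dataclass

@dataclass
class FieldCommand:
    frequency_hz: float   # placeholder value, not from the paper
    amplitude_mt: float   # placeholder value, not from the paper
    purpose: str

def command_for(task: str) -> FieldCommand:
    """Select a magnetic-field regime for the pangolin-inspired robot (toy logic)."""
    if task in ("roll_up", "unroll", "translate"):
        # Low-frequency fields exert torques on the embedded magnetic particles,
        # deforming and steering the robot without significant heating.
        return FieldCommand(frequency_hz=10.0, amplitude_mt=20.0, purpose="actuation")
    if task == "heat":
        # High-frequency fields induce currents in the metal scales, heating the
        # robot (reported to exceed 70 °C) for thermal therapies.
        return FieldCommand(frequency_hz=300e3, amplitude_mt=10.0, purpose="heating")
    raise ValueError(f"unknown task: {task}")

print(command_for("roll_up"))
print(command_for("heat"))
```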

Researchers develop machine-learning technique that can efficiently learn to control a robot

Researchers from MIT and Stanford University have devised a new machine-learning approach that could be used to control a robot, such as a drone or autonomous vehicle, more effectively and efficiently in dynamic environments where conditions can change rapidly.

Wet surface? No problem for gecko adhesion based robot

Geckos' unique ability to climb across anything from a dry desert floor to a cold mountain top without leaving any sticky residue behind has inspired many wall-climbing robots, but for the first time, researchers at Carnegie Mellon University have brought that adhesion to wet surfaces.

Psychology graduate explores human preferences when considering autonomous robots as companions, teammates

With fierce debate swirling over the promise versus the perceived dangers of artificial intelligence (AI) and autonomous robots, Nicole Moore of the University of Alabama in Huntsville (UAH) has published a study that is especially timely.