Archive 30.07.2022


#ICRA2022 awards finalists and winners

Credits: Wise Owl Multimedia

In this post we bring you all the paper awards finalists and winners presented during the 2022 edition of the IEEE International Conference on Robotics and Automation (ICRA).

ICRA 2022 Outstanding Paper

ICRA 2022 Outstanding Student Paper

ICRA 2022 Outstanding Automation Paper

ICRA 2022 Outstanding Coordination Paper

ICRA 2022 Outstanding Deployed Systems Paper

ICRA 2022 Outstanding Dynamics and Control Paper

ICRA 2022 Outstanding Interaction Paper

ICRA 2022 Outstanding Learning Paper

ICRA 2022 Outstanding Locomotion Paper

ICRA 2022 Outstanding Manipulation Paper

ICRA 2022 Outstanding Mechanisms and Design Paper

ICRA 2022 Outstanding Navigation Paper

ICRA 2022 Outstanding Planning Paper

A new type of soft robotic actuator that can be scaled down to just one centimeter

A team of researchers at Istituto Italiano di Tecnologia's Bioinspired Soft Robotics Laboratory has developed a new pleat-based soft robotic actuator that can be used in a variety of sizes, down to just 1 centimeter. In their paper published in the journal Science Robotics, the group describes the technology behind their new actuator and how well it worked when they tested it under varied circumstances.

r/Robotics Showcase: An event for members of all ages and abilities to share and discuss their passion for robotics

This post presents the program for the 2nd annual Reddit Robotics Showcase! We’re delighted to once again offer an event for robotics enthusiasts of all ages and abilities from across the world to share their passion and their projects.

This year’s event builds on the enthusiasm we have seen from our community, continuing to provide a unique opportunity for roboticists around the world to share and discuss their work, regardless of age or ability. The primary purpose of the event is to showcase the multitude of projects underway in the r/Robotics Reddit community. Topics span all areas of robotics, including simulation, navigation, control, perception, and mechatronic design. We will use the showcase to present discussion pieces and foster conversation among active members of the robotics community around the world. The showcase will also feature invited roboticists from research and industry, who will discuss what they see as technical challenges and interesting directions for robotics.

The showcase is free and online, and is to be held on July 30th & 31st 2022, livestreamed via the Reddit Robotics Showcase YouTube channel. For more information, please check out the official website.

2022 Program

All times are listed in Eastern Daylight Time (EDT), UTC-4.

Saturday, 30th of July
Session 1: Industrial and Applied Robotics
10:00 – 11:00 Matt Whelan (Ocado Technology) – The Ocado 600 Series Robot
11:00 – 11:15 Nye Mech Works (HAPPA) – Real Power Armor
11:15 – 11:30 3D Printed 6-Axis Robot Arm
11:30 – 12:00 Vasily Morzhakov (Rembrain) – Cloud Platform for Smart Robotics
12:00 – 12:15 Bridging the Gap Between Physical and Digital with BOTSZY
12:15 – 12:45 Advoard – Autonomous Fleet Robots for Logistics
Lunch Break

Session 2: Mobile Robots
14:00 – 15:00 TBC
15:00 – 15:30 Julius Sustarevas – Armstone: Autonomous Mobile 3D Printer
15:30 – 15:45 Camera Controller Hexapod & Screw/Augur All-Terrain Robot
15:45 – 16:15 Keegan Neave – NE-Five
16:15 – 16:30 Dimitar – Gravis and Ricardo
16:30 – 16:40 Kamal Carter – Aim-Hack Robot
16:40 – 16:55 Calvin – BeBOT Real Time

Sunday, 31st of July
Session 1: Bio-Inspired Robots
10:00 – 11:00 Dr. Matteo Russo (Rolls-Royce UTC in Manufacturing and On-Wing Technology) – Entering the Maze: Snake-Like Robots from Aerospace to Industry
11:00 – 11:10 Humanoid, Hexapod, and Legged Robot Control
11:10 – 11:25 Halid Yildirim – Design of a Modular Quadruped Robot Dog
11:25 – 11:35 Hexapod as Robot Pet
11:35 – 12:05 Jakub Bartoszek – Honey Badger Quadruped
12:05 – 12:20 Lutz Freitag – 01. RFC Berlin
12:20 – 12:35 Hamburg Bit-Bots
12:35 – 12:50 William Kerber – Human Mode Robotics – Lynx Quadruped and AI Training
12:50 – 13:00 Sanjeev Hegde – Juggernaut
Lunch Break

Session 2: Human Robot Interaction
14:00 – 15:00 Dr. Ruth Aylett (The National Robotarium) – Social Agents and Human Robot Interaction
15:00 – 15:30 Senmag Robotics – High Fidelity Haptics Made Accessible
15:30 – 15:40 Hand Controlled Artificial Hand
15:40 – 16:10 The Shadow Robot Company
16:10 – 16:25 Maël Abril – 6 Axis Dynamixel Robot Arm
16:25 – 16:35 Control 4WD Raspberry Pi Robot Car Using Hand Gestures
16:35 – 17:05 Tentacular – Interactive Robotic Art

Human-like features in robot behavior: Response time variability can be perceived as human-like

Humans behave and act in a way that other humans recognize as human-like. If humanness has specific features, is it possible to replicate those features on a machine such as a robot? Researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) tried to answer that question by implementing a non-verbal Turing test in a human-robot interaction task. They involved human participants and the humanoid robot iCub in a joint action experiment. They found that a specific feature of human behavior, namely variability in response timing, can be transferred to the robot in such a way that participants cannot tell whether they are interacting with a person or a machine.
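The key ingredient is variability in timing rather than any particular average delay. As a rough illustration (not code from the study), a robot's response delays could be drawn from a distribution fitted to human reaction-time data instead of being fixed; the distribution and its parameter values below are placeholders.

```python
import random

# Hypothetical illustration (not code from the IIT study): instead of reacting
# after a fixed delay, the robot draws each response time from a distribution
# fit to recorded human reaction times, so its timing varies the way a
# person's would.

# Assumed parameters of a log-normal fit to human response times (seconds);
# placeholder values, not from the paper.
MU, SIGMA = -0.7, 0.35
MIN_RT, MAX_RT = 0.25, 1.5   # clip to a plausible range


def humanlike_response_time() -> float:
    """Sample one response delay (seconds) with human-like variability."""
    rt = random.lognormvariate(MU, SIGMA)
    return min(max(rt, MIN_RT), MAX_RT)


def machinelike_response_time() -> float:
    """Constant delay: the kind of timing people tend to judge as robotic."""
    return 0.5


if __name__ == "__main__":
    print([round(humanlike_response_time(), 2) for _ in range(5)])
    print([round(machinelike_response_time(), 2) for _ in range(5)])
```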

DayDreamer: An algorithm to quickly teach robots new behaviors in the real world

Training robots to complete tasks in the real world can be a very time-consuming process, which typically involves building a fast and efficient simulator, performing numerous trials in it, and then transferring the behaviors learned during those trials to the real world. In many cases, however, the performance achieved in simulation does not match that attained in the real world, due to unpredictable changes in the environment or task.
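DayDreamer sidesteps the simulator by having the robot learn a world model from its own real-world experience and train its policy on rollouts imagined inside that model. The sketch below is only a schematic of that general loop, not the authors' implementation; every class and method name is hypothetical.

```python
# Schematic sketch of a world-model learning loop of the kind DayDreamer uses
# (not the authors' code). The robot alternates between acting in the real
# world and improving a learned model plus policy, so no hand-built simulator
# is needed. All objects passed in are hypothetical stand-ins.

def train_on_real_robot(env, world_model, policy, replay_buffer, steps=10_000):
    obs = env.reset()
    for step in range(steps):
        # Act in the real world with the current policy.
        action = policy.act(obs)
        next_obs, reward, done = env.step(action)
        replay_buffer.add(obs, action, reward, next_obs, done)
        obs = env.reset() if done else next_obs

        # Learn the dynamics from real experience...
        batch = replay_buffer.sample()
        world_model.update(batch)

        # ...and improve the policy on rollouts imagined inside the model,
        # which is far cheaper than collecting more real trials.
        imagined_rollouts = world_model.imagine(policy, horizon=15)
        policy.update(imagined_rollouts)
```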

Q&A: Warehouse robots that feel by sight

Ted Adelson. Photo courtesy of the Department of Brain and Cognitive Sciences.

By Kim Martineau | MIT Schwarzman College of Computing

More than a decade ago, Ted Adelson set out to create tactile sensors for robots that would give them a sense of touch. The result? A handheld imaging system powerful enough to visualize the raised print on a dollar bill. The technology was spun off into GelSight to answer an industry need for low-cost, high-resolution imaging.

An expert in both human and machine vision, Adelson was pleased to have created something useful. But he never lost sight of his original dream: to endow robots with a sense of touch. In a new Science Hub project with Amazon, he’s back on the case. He plans to build out the GelSight system with added capabilities to sense temperature and vibrations. A professor in MIT’s Department of Brain and Cognitive Sciences, Adelson recently sat down to talk about his work.

Q: What makes the human hand so hard to recreate in a robot?

A: A human finger has soft, sensitive skin, which deforms as it touches things. The question is how to get precise sensing when the sensing surface itself is constantly moving and changing during manipulation.

Q: You’re an expert on human and computer vision. How did touch grab your interest?

A: When my daughters were babies, I was amazed by how skillfully they used their fingers and hands to explore the world. I wanted to understand the way they were gathering information through their sense of touch. Being a vision researcher, I naturally looked for a way to do it with cameras.

Q: How does the GelSight robot finger work? What are its limitations?

A: A camera captures an image of the skin from inside, and a computer vision system calculates the skin’s 3D deformation. GelSight fingers offer excellent tactile acuity, far exceeding that of human fingers. However, the need for an inner optical system limits the sizes and shapes we can achieve today.

Q: How did you come up with the idea of giving a robot finger a sense of touch by, in effect, giving it sight?

A: A camera can tell you about the geometry of the surface it is viewing. By putting a tiny camera inside the finger, we can measure how the skin geometry is changing from point to point. This tells us about tactile properties like force, shape, and texture.

Q: How did your prior work on cameras figure in?

A: My prior research on the appearance of reflective materials helped me engineer the optical properties of the skin. We create a very thin matte membrane and light it with grazing illumination so all the details can be seen.
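For readers curious how shading under known lights becomes geometry, the sketch below shows the standard photometric-stereo recipe that sensors of this kind build on. It is a simplified illustration under a Lambertian assumption, not GelSight's actual pipeline, and both function names are invented for the example.

```python
import numpy as np

# Rough sketch of the classic photometric-stereo idea behind GelSight-style
# sensors (not GelSight's own code): with a matte membrane lit from several
# known directions, per-pixel brightness lets us recover surface normals, and
# deviation from the flat membrane indicates where contact deforms the skin.

def estimate_normals(images, light_dirs):
    """images: (k, H, W) grayscale frames, one per light direction.
    light_dirs: (k, 3) unit vectors pointing toward each light."""
    k, H, W = images.shape
    I = images.reshape(k, -1)                              # (k, H*W)
    # Lambertian model: I = L @ (albedo * normal); least-squares per pixel.
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)     # (3, H*W)
    norms = np.linalg.norm(G, axis=0, keepdims=True) + 1e-8
    return (G / norms).reshape(3, H, W)

def contact_map(normals, flat_normals, threshold=0.05):
    """Flag pixels whose normals deviate from the undeformed membrane."""
    deviation = 1.0 - np.sum(normals * flat_normals, axis=0)  # 1 - cosine
    return deviation > threshold
```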

Q: Did you know there was a market for measuring 3D surfaces?

A: No. My postdoc Kimo Johnson posted a YouTube video showing GelSight’s capabilities about a decade ago. The video went viral, and we got a flood of email with interesting suggested applications. People have since used the technology for measuring the microtexture of shark skin, packed snow, and sanded surfaces. The FBI uses it in forensics to compare spent cartridge casings.

Q: What’s GelSight’s main application?  

A: Industrial inspection. For example, an inspector can press a GelSight sensor against a scratch or bump on an airplane fuselage to measure its exact size and shape in 3D. This application may seem quite different from the original inspiration of baby fingers, but it shows that tactile sensing can have many uses. As for robotics, tactile sensing is mainly a research topic right now, but we expect it to increasingly be useful in industrial robots.

Q: You’re now building in a way to measure temperature and vibrations. How do you do that with a camera? How else will you try to emulate human touch?

A: You can convert temperature to a visual signal that a camera can read by using liquid crystals, the molecules that make mood rings and forehead thermometers change color. For vibrations we will use microphones. We also want to extend the range of shapes a finger can have. Finally, we need to understand how to use the information coming from the finger to improve robotics.
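As a toy illustration of the liquid-crystal idea (not the project's actual design), the hue of a thermochromic patch seen by the camera could be mapped back to temperature through a calibration curve; every number below is invented for the example.

```python
import numpy as np

# Hypothetical illustration: a thermochromic patch shifts hue with
# temperature, so a camera reading of the patch can be mapped back to degrees
# through a calibration curve. Calibration values are made up for the sketch.

CAL_HUE = np.array([0.95, 0.75, 0.55, 0.35, 0.15])   # observed hue (0-1)
CAL_TEMP = np.array([20.0, 25.0, 30.0, 35.0, 40.0])  # temperature in °C

def temperature_from_hue(hue: float) -> float:
    """Interpolate the calibration curve to estimate temperature."""
    # np.interp needs increasing x values, so flip the descending hue scale.
    return float(np.interp(hue, CAL_HUE[::-1], CAL_TEMP[::-1]))

print(temperature_from_hue(0.6))   # ≈ 28.75 °C with these made-up values
```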

Q: Why are we sensitive to temperature and vibrations, and why is that useful for robotics?

A: Identifying material properties is an important aspect of touch. Sensing temperature helps you tell whether something is metal or wood, and whether it is wet or dry. Vibrations can help you distinguish a slightly textured surface, like unvarnished wood, from a perfectly smooth surface, like wood with a glossy finish.

Q: What’s next?

A: Making a tactile sensor is the first step. Integrating it into a useful finger and hand comes next. Then you have to get the robot to use the hand to perform real-world tasks.

Q: Evolution gave us five fingers and two hands. Will robots have the same?

A: Different robots will have different kinds of hands, optimized for different situations. Big hands, small hands, hands with three fingers or six fingers, and hands we can’t even imagine today. Our goal is to provide the sensing capability, so that the robot can skillfully interact with the world.

Virtual co-embodiment of a joint body with left and right limbs controlled by two persons

What factors influence the embodiment felt towards parts of our bodies controlled by others? Using a new "joint avatar" whose left and right limbs are controlled by two people simultaneously, researchers have revealed that the visual information necessary to predict the partner's intentions behind limb movements can significantly enhance the sense of embodiment towards partner-controlled limbs during virtual co-embodiment. This finding may contribute to enhancing the sense of embodiment towards autonomous prosthetic limbs.

Enhancing the safety of autonomous vehicles in critical scenarios

Researchers at Ulm University in Germany have recently developed a new framework that could help make self-driving cars safer in urban and highly dynamic environments. The framework, presented in a paper pre-published on arXiv, is designed to identify potential threats around the vehicle in real time.