Archive 11.04.2023

Page 5 of 6

Innocence over utilitarianism: Heightened moral standards for robots in rescue dilemmas

The Moralities of Intelligent Machines research group headed by Michael Laakasuo investigates people's moral views on imaginary rescue situations where the rescuer is either a human or a robot specifically designed for the task. The rescuer has to decide whether to save, for example, one innocent victim of a boating accident or two individuals whose irresponsible behavior caused the accident.

Brain-inspired intelligent robotics: Theoretical analysis and systematic application

Robots have become a key indicator of a country's competitive strength in science and technology. Robotic systems have advanced alongside progress in mechanical engineering, control, and artificial intelligence. However, current robotic systems still have significant limitations and cannot satisfy the demands of a growing number of applications. To address these problems, researchers have constructed a brain-inspired intelligent robotic system.

A framework to enable touch-enhanced robotic grasping using tactile sensors

To successfully cooperate with humans on manual tasks, robots should be able to grasp and manipulate a variety of objects without dropping or damaging them. Recent research efforts in the field of robotics have thus focused on developing tactile sensors and controllers that could provide robots with the sense of touch and bring their object manipulation capabilities closer to those of humans.

Robotic hand can identify objects with just one grasp

MIT researchers developed a soft-rigid robotic finger that incorporates powerful sensors along its entire length, enabling them to produce a robotic hand that could accurately identify objects after only one grasp. Image: Courtesy of the researchers

By Adam Zewe | MIT News Office

Inspired by the human finger, MIT researchers have developed a robotic hand that uses high-resolution touch sensing to accurately identify an object after grasping it just one time.

Many robotic hands pack all their powerful sensors into the fingertips, so an object must be in full contact with those fingertips to be identified, which can take multiple grasps. Other designs use lower-resolution sensors spread along the entire finger, but these don’t capture as much detail, so multiple regrasps are often required.

Instead, the MIT team built a robotic finger with a rigid skeleton encased in a soft outer layer that has multiple high-resolution sensors incorporated under its transparent “skin.” The sensors, which use a camera and LEDs to gather visual information about an object’s shape, provide continuous sensing along the finger’s entire length. Each finger captures rich data on many parts of an object simultaneously.

Using this design, the researchers built a three-fingered robotic hand that could identify objects after only one grasp, with about 85 percent accuracy. The rigid skeleton makes the fingers strong enough to pick up a heavy item, such as a drill, while the soft skin enables them to securely grasp a pliable item, like an empty plastic water bottle, without crushing it.

These soft-rigid fingers could be especially useful in an at-home-care robot designed to interact with an elderly individual. The robot could lift a heavy item off a shelf with the same hand it uses to help the individual take a bath.

“Having both soft and rigid elements is very important in any hand, but so is being able to perform great sensing over a really large area, especially if we want to consider doing very complicated manipulation tasks like what our own hands can do. Our goal with this work was to combine all the things that make our human hands so good into a robotic finger that can do tasks other robotic fingers can’t currently do,” says mechanical engineering graduate student Sandra Liu, co-lead author of a research paper on the robotic finger.

Liu wrote the paper with co-lead author and mechanical engineering undergraduate student Leonardo Zamora Yañez and her advisor, Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in the Department of Brain and Cognitive Sciences and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the RoboSoft Conference.

A human-inspired finger

The robotic finger comprises a rigid, 3D-printed endoskeleton that is placed in a mold and encased in a transparent silicone “skin.” Making the finger in a mold removes the need for fasteners or adhesives to hold the silicone in place.

The researchers designed the mold with a curved shape so the robotic fingers are slightly curved when at rest, just like human fingers.

“Silicone will wrinkle when it bends, so we thought that if we have the finger molded in this curved position, when you curve it more to grasp an object, you won’t induce as many wrinkles. Wrinkles are good in some ways — they can help the finger slide along surfaces very smoothly and easily — but we didn’t want wrinkles that we couldn’t control,” Liu says.

The endoskeleton of each finger contains a pair of detailed touch sensors, known as GelSight sensors, embedded into the top and middle sections, underneath the transparent skin. The sensors are placed so the range of the cameras overlaps slightly, giving the finger continuous sensing along its entire length.

The GelSight sensor, based on technology pioneered in the Adelson group, is composed of a camera and three colored LEDs. When the finger grasps an object, the camera captures images as the colored LEDs illuminate the skin from the inside.


Using the illuminated contours that appear in the soft skin, an algorithm performs backward calculations to map the contours on the grasped object’s surface. The researchers trained a machine-learning model to identify objects using raw camera image data.
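The backward calculation described above can be sketched in the style of GelSight-like photometric stereo: a calibration lookup maps each pixel's color (produced by the three colored LEDs) to a local surface gradient, and the gradients are integrated into a height map. This is a hypothetical illustration of the general technique, not the authors' pipeline; the calibration table and both functions are invented for the example.

```python
import numpy as np

def rgb_to_gradients(image, calib):
    """Map each pixel's RGB value to a surface gradient (gx, gy) via a
    precomputed calibration lookup table. In practice such a table is
    built by pressing an object of known geometry against the sensor;
    here `calib` is a plain dict keyed by quantized RGB triples."""
    h, w, _ = image.shape
    grads = np.zeros((h, w, 2))
    for y in range(h):
        for x in range(w):
            key = tuple((image[y, x] // 8).astype(int))  # quantize RGB
            grads[y, x] = calib.get(key, (0.0, 0.0))
    return grads

def integrate_depth(grads):
    """Crude depth reconstruction: cumulative sums of the gradients
    along each axis, averaged (a stand-in for Poisson integration)."""
    gx, gy = grads[..., 0], grads[..., 1]
    return 0.5 * (np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0))
```

A real implementation would use a denser calibration, vectorized lookups, and a proper Poisson solver for the integration step, but the data flow (color, then gradients, then depth) is the same.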

As they fine-tuned the finger fabrication process, the researchers ran into several obstacles.

First, silicone has a tendency to peel off surfaces over time. Liu and her collaborators found they could limit this peeling by adding small curves along the hinges between the joints in the endoskeleton.

When the finger bends, the bending of the silicone is distributed along the tiny curves, which reduces stress and prevents peeling. They also added creases to the joints so the silicone is not squashed as much when the finger bends.

While troubleshooting their design, the researchers realized wrinkles in the silicone prevent the skin from ripping.

“The usefulness of the wrinkles was an accidental discovery on our part. When we synthesized them on the surface, we found that they actually made the finger more durable than we expected,” she says.

Getting a good grasp

Once they had perfected the design, the researchers built a robotic hand using two fingers arranged in a Y pattern with a third finger as an opposing thumb. The hand captures six images when it grasps an object (two from each finger) and sends those images to a machine-learning algorithm which uses them as inputs to identify the object.
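The single-grasp identification step can be sketched minimally as follows, assuming a nearest-neighbor comparison over the six concatenated images. The paper trains a machine-learning model on raw camera data; the classifier, database, and labels here are illustrative stand-ins.

```python
import numpy as np

def grasp_features(images):
    """Concatenate the six tactile images from one grasp
    (two per finger, three fingers) into one feature vector."""
    assert len(images) == 6
    return np.concatenate([img.ravel() for img in images])

def identify(grasp_images, reference_db):
    """Nearest-neighbor identification: compare this grasp's feature
    vector against stored examples. `reference_db` is a hypothetical
    list of (label, feature_vector) pairs."""
    f = grasp_features(grasp_images)
    return min(reference_db, key=lambda kv: np.linalg.norm(f - kv[1]))[0]
```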

Because the hand has tactile sensing covering all of its fingers, it can gather rich tactile data from a single grasp.

“Although we have a lot of sensing in the fingers, maybe adding a palm with sensing would help it make tactile distinctions even better,” Liu says.

In the future, the researchers also want to improve the hardware to reduce the amount of wear and tear in the silicone over time and add more actuation to the thumb so it can perform a wider variety of tasks.


This work was supported, in part, by the Toyota Research Institute, the Office of Naval Research, and the SINTEF BIFROST project.

A new design that equips robots with proprioception and a tail

Researchers at Carnegie Mellon University (CMU)'s Robomechanics Lab recently introduced two new approaches that could help to improve the ability of legged robots to move on rocky or extreme terrains. These two approaches, outlined in a paper pre-published on arXiv, are inspired by the innate proprioception abilities and tail mechanics of animals.

Robotic flies to swarm 24/7 in RoboHouse


Yes, you heard that correctly: the goal is permanent airtime. Robotic flies roaming a room in RoboHouse with no human guidance – achieved within six months. In the future, 24/7 swarms like these may revolutionise aircraft inspection. Imagine a fighter jet enveloped by hundreds of nano drones that build up a detailed picture in minutes. It’s a challenging mission, but not all challenges are equal. So we asked each Crazyflies team member: What is your favourite problem?

Lennart #myfavouritedesignproblem

Okay, maybe permanent flying is a slight exaggeration, since at some point batteries need recharging, but it remains the essence of the design. For team member Lennart, this is the main challenge: “We want to optimise the charging process so that you have as many drones in the air as possible with a minimum amount of charging pads.”

Each Crazyflie can buzz off for seven minutes before needing a 35-minute recharge. Wireless charging pads eliminate the need for human intervention, the alternative being manual battery replacement.
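Those two numbers already fix the fleet arithmetic: with seven minutes of flight per 42-minute cycle, each drone is airborne only 1/6 of the time, so keeping one drone continuously in the air takes six drones, five of which are charging at any moment. A back-of-the-envelope sizing helper (invented here, not the team's tooling):

```python
import math

FLIGHT_MIN = 7    # flight time per charge (from the article)
CHARGE_MIN = 35   # recharge time per cycle (from the article)

def fleet_for_continuous_coverage(airborne_target):
    """Minimum drones and charging pads to keep `airborne_target`
    drones continuously in the air, assuming perfectly staggered
    cycles and one pad per charging drone."""
    cycle = FLIGHT_MIN + CHARGE_MIN
    drones_per_slot = math.ceil(cycle / FLIGHT_MIN)   # 42/7 = 6
    drones = airborne_target * drones_per_slot
    pads = drones - airborne_target  # the rest are on pads
    return drones, pads
```

The optimisation Lennart describes is about beating this worst case, for example by sharing pads across imperfectly staggered cycles.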

Seppe #myfavouritedesignproblem

But challenges go way further than just battery strategy. Student Seppe identifies his favourite obstacle-to-overcome in collision avoidance: “This does not only include collisions between drones, but also with stationary objects,” Seppe tells us. “By deploying sensors and proper coding, these risks are minimised. Yet the strength of a robust system doesn’t lie in reducing risks, it lies in handling them when they happen.”

Servaas #myfavouritedesignproblem

Servaas’s favourite challenge ties in with that of his colleague: round-trip latency. Or in plain English: the time it takes for the flying AI-insects to send their observations and receive commands in return. “Depending on how much time this transfer of information takes up, we could for instance let the drones react to more unpredictable objects such as humans.” Actual flies might well count as such objects too.
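Why latency caps reactivity is easy to quantify: a drone covers distance during the round trip, so an obstacle must be detected farther away than that distance plus a stopping margin. The sketch below uses made-up numbers, not measurements from the Crazyflies.

```python
def reaction_distance(speed_m_s, round_trip_latency_s):
    """Distance a drone travels between observing an obstacle and
    acting on the resulting command."""
    return speed_m_s * round_trip_latency_s

def can_avoid(speed_m_s, latency_s, sensing_range_m, stop_margin_m=0.1):
    """The drone can react in time only if the obstacle is detected
    beyond the distance covered during the latency, plus a margin."""
    return reaction_distance(speed_m_s, latency_s) + stop_margin_m < sensing_range_m
```

At 1 m/s with 50 ms of round-trip latency the drone commits only 5 cm of blind travel; at higher speeds or latencies the blind distance quickly exceeds a short sensing range, which is why shaving the round trip matters.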

The robotic flies are tested in a drone cage to support further development and help the team reach its goals.

Andreas #myfavouritedesignproblem

Floating away from technical aspects, Andreas defines solving real-world problems as his goal: “Designing an autonomous, 24/7 flying drone swarm is cool, but we also want to have an actual impact through real-world application.” Andreas seeks to fulfil this wish by doing market research and identifying problems that still lack a solution. One such application could be the inspection of large or difficult-to-access infrastructure like bridges or power lines.

Andrea #myfavouritedesignproblem

Coming from outside robotics, fifth team member Andrea faced the challenge of getting familiar with all the software involved. Luckily, Andrea managed to learn the tools of the trade, and now counts the AI-insects’ autonomy among the next exciting challenges to be tackled.

Recently this student team even received the NLF prize for their work, an award by the Dutch Air and Aerospace Foundation.

The drones

But wait, the team is not yet complete. There are a hundred other individuals who are, quite literally, also team members: the students have included the Crazyflies themselves, naming them ‘members 6 to 105’. These drones are going to inspect infrastructure all by themselves, stopping only occasionally to recharge their batteries.

Cyberzoo

If all goes well, the Crazyflies could become part of the Crazy Zoo robot exhibition on TU Delft Campus, an initiative by Chris Verhoeven, theme leader swarm robots at TU Delft. For now though, the students have a lot of work on their hands to realise their dreams and live up to the challenges. We have no doubt they will fly high.

The post Robotic flies to swarm 24/7 in RoboHouse appeared first on RoboHouse.

Teaching robots to improve controls for flight systems and other applications that demand quick responses

Commercial airplanes can be controlled by autopilot. But what happens if a wing gets damaged or an engine malfunctions? Is it possible to design a software system with a feedback loop—a system that quickly tests how controls operate on the damaged vessel and makes adjustments on the fly to give it the best chance of landing safely?
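The kind of on-the-fly adjustment described above can be illustrated with a toy one-dimensional example (invented here, not the researchers' system): a control loop that keeps re-estimating how effective its commands actually are, so that when "damage" halves the plant's responsiveness, the loop re-identifies it and keeps tracking the target.

```python
target = 10.0
state = 0.0
b_true = 1.0   # actual control effectiveness (unknown to the loop)
b_hat = 1.0    # the loop's online estimate of b_true

def control(state, target, b_hat):
    """Command that would reach the target in one step if the
    effectiveness estimate were exact."""
    return (target - state) / b_hat

def adapt(b_hat, u, observed_delta, lr=0.5):
    """Re-estimate effectiveness from the response the last
    command actually produced."""
    if u != 0.0:
        b_hat += lr * (observed_delta / u - b_hat)
    return b_hat

for step in range(40):
    if step == 20:                 # sudden damage: effectiveness halves
        b_true, state = 0.5, 0.0   # and the vehicle is knocked off target
    u = control(state, target, b_hat)
    delta = b_true * u             # the plant's actual response
    state += delta
    b_hat = adapt(b_hat, u, delta)
```

After the damage at step 20, the estimate converges back toward the true (halved) effectiveness and the state returns to the target, which is the essence of the feedback-loop idea in the article.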

ep.365: ReRun: An Open Source Package For Beautiful Visualizations, with Nikolaus West

Nico, Emil, and Moritz founded Rerun with the mission of making powerful visualization tools free and easily accessible for roboticists. Nico and Emil talk about how these powerful tools help debug the complex problem scopes faced by roboticists. Tune in for more.

Nikolaus West
Co-Founder & CEO
Niko is a second-time founder and software engineer with a computer vision background from Stanford. He’s fanatic about bringing great computer vision and robotics products to the physical world.

Emil Ernerfeldt
Co-Founder & CTO
Emil fell in love with coding over 20 years ago and hasn’t looked back since. He’s the creator of egui, an easy-to-use immediate mode GUI in Rust, that we’re using to build Rerun. He brings a strong perspective from the gaming industry, with a focus on great and blazing fast tools.

Links
