Robots have long been seen as a bad bet for Silicon Valley investors—too complicated, capital-intensive and "boring, honestly," says venture capitalist Modar Alaoui.
Biological muscles act as flexible actuators, generating force naturally and with an impressive range of motion. Unsurprisingly, scientists and engineers have been striving to build artificial muscles that mimic these abilities. A new review study, published in Nature, takes a deep dive into recent developments in fiber-type artificial muscles, one of the most lifelike classes of artificial muscle developed so far.
In the horticultural world, some vines are especially grabby. As they grow, the woody tendrils can wrap around obstacles with enough force to pull down entire fences and trees.
Robots now see the world with an ease that once belonged only to science fiction. They can recognize objects, navigate cluttered spaces and sort thousands of parcels an hour. But ask a robot to touch something gently, safely or meaningfully, and the limits appear instantly.
A new study published in Nature Communications details a hybrid robot that combines the wind-driven mobility of tumbleweeds with active quadcopter control, offering a new paradigm for energy-efficient terrestrial exploration.
Meet the robotic dog with a memory like an elephant and the instincts of a seasoned first responder.
Whether you're reaching for a mug, a pencil or someone's hand, you don't need to consciously instruct each of your fingers on where they need to go to get a proper grip.
Over the past decades, roboticists have introduced a wide range of advanced systems that can move around in their surroundings and complete various tasks. Most of these robots can effectively collect images and other data from their surroundings, using computer vision algorithms to interpret them and plan their future actions.
Generative AI and robotics are moving us ever closer to the day when we can ask for an object and have it created within a few minutes. In fact, MIT researchers have developed a speech-to-reality system, an AI-driven workflow that allows them to provide input to a robotic arm and "speak objects into existence," creating things like furniture in as little as five minutes.
Researchers at the University of Maryland, Baltimore County (UMBC) have extracted the building blocks of the precise hand gestures used in the classical Indian dance form Bharatanatyam—and found a richer "alphabet" of movement than that of natural grasps. The work could improve how we teach hand movements to robots and offer humans better tools for physical therapy.
Octopuses change their body color and texture in the blink of an eye to blend perfectly into their surroundings when evading predators or capturing prey. They transform their bodies to match the colors of nearby corals or seaweed, turning blue or red, and move by softly curling their arms as they snatch prey.
EPFL scientists have integrated discarded crustacean shells into robotic devices, leveraging the strength and flexibility of natural materials for robotic applications.
In the future, tiny flying robots could be deployed to aid in the search for survivors trapped beneath the rubble after a devastating earthquake. Like real insects, these robots could flit through tight spaces larger robots can't reach, while simultaneously dodging stationary obstacles and pieces of falling rubble.
Imagine a continuum soft robotic arm bending around a bunch of grapes or broccoli, adjusting its grip in real time as it lifts the object. Unlike traditional rigid robots, which generally avoid contact with the environment as much as possible and stay far away from humans for safety reasons, this arm senses subtle forces, stretching and flexing in ways that mimic the compliance of a human hand. Its every motion is calculated to avoid excessive force while completing the task efficiently.
Embodied artificial intelligence (AI) systems are robotic agents that rely on machine learning algorithms to sense their surroundings, plan their actions and execute them. Key components of these systems are visual perception modules, which allow them to analyze and interpret images captured by cameras.