Eyes are said to be the mirror of the soul. Eyes and gaze direction guide attention, evoke emotions and activate the brain's social perception mechanisms. Researchers at Tampere University and the University of Bremen conducted a study examining how people perceive the minds of humanoid robots. Mind perception refers to the way humans detect and infer that other people, beings or even objects possess consciousness, emotions and cognitive states.
Last year, Norwegian-US tech company 1X announced a strange new product: "the world's first consumer-ready humanoid robot designed to transform life at home."
While the statement, "Humanoid robots are coming," might cause anxiety for some, for one Georgia Tech research team, working with humanlike robots couldn't be more exciting. The researchers have developed a new "thinking" technology for two-legged robots, improving their balance and agility.
Mobile robots must continuously estimate their position to navigate autonomously. However, satellite-based navigation systems are not always reliable: signals may degrade near buildings or become unavailable indoors. To operate safely and efficiently, robots must interpret their surroundings using onboard sensors and robust localization algorithms.
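To make the idea of onboard position estimation concrete, here is a minimal sketch of a one-dimensional Kalman filter, a standard localization building block that fuses noisy wheel-odometry predictions with noisy external position measurements. This is a generic illustration, not the method used by any system mentioned above; all noise values and readings are assumed for the example.

```python
# Minimal 1D Kalman filter sketch: a robot estimates its position by fusing
# noisy wheel-odometry predictions with noisy position measurements.
# All numeric values here are illustrative assumptions.

def kalman_step(x, p, u, z, q=0.05, r=0.2):
    """One predict-update cycle.
    x, p : current position estimate and its variance
    u    : odometry-reported displacement since the last step
    z    : measured position (e.g., from a range sensor)
    q, r : assumed process and measurement noise variances
    """
    # Predict: move by the odometry reading; uncertainty grows.
    x_pred = x + u
    p_pred = p + q
    # Update: blend prediction and measurement by relative confidence.
    k = p_pred / (p_pred + r)          # Kalman gain in [0, 1]
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Usage: the robot starts at 0.0 with high confidence, commands 1.0 m
# moves, and receives slightly noisy position readings along the way.
x, p = 0.0, 0.01
for z in (1.1, 1.9, 3.05):
    x, p = kalman_step(x, p, u=1.0, z=z)
```

The key property is that the estimate tracks the true position even though neither the odometry nor the sensor is trusted on its own, which is exactly why robots degrade gracefully when one signal (such as satellite navigation) drops out.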
Human hands are a wonder of nature and unmatched in the animal kingdom. They can twist caps, flick switches, handle tiny objects with ease, and perform thousands of tasks every day. Robot hands struggle to keep up. They typically lack a sense of touch, can't move many fingers at once, and lose track of what they are holding when their fingers block their camera's view. Scientists have now developed a smarter way to train a robot's brain to give its hands human-like dexterity.
Collaborative teams of artificial intelligence-powered robots carrying extinguishing equipment could fight fires remotely, without placing firefighting crews directly in potentially dangerous situations. An initial soft trial of the technology has proved successful.
Standing on a tower overlooking the cliffs of the Cortina downhill course is someone just as involved in the biggest skiing races of the Winter Olympics as Mikaela Shiffrin and Breezy Johnson.
The compound eyes of the humble fruit fly are a marvel of nature. They are wide-angle and can process visual information several times faster than the human eye. Inspired by this biological masterpiece, researchers at the Chinese Academy of Sciences have developed an insect-scale compound eye that can both see and smell, potentially improving how drones and robots navigate complex environments and avoid obstacles.
EPFL roboticists have shown that when a modular robot shares power, sensing, and communication resources among its individual units, it is significantly more resistant to failure than traditional robotic systems, where the breakdown of one element often means a loss of functionality.
Researchers at Universidad Carlos III de Madrid (UC3M) have developed a new methodology for a robot to learn how to move its arms autonomously by combining a type of observational learning with intercommunication between its limbs. This work represents a further step toward achieving more natural and easily teachable service robots capable of performing assistive tasks in domestic environments, such as setting and clearing the table, ironing, or tidying up the kitchen.
Whether chasing skiers as they fly down the mountain or tracking the luge as it tears around bends, new drone-mounted cameras are offering Winter Olympics viewers a wild ride.
What is it about a cheetah's build that enables it to run so fast? What gives the wolf its exceptional endurance? While these questions can be partly answered through animal experiments, many contributing factors can't be isolated from one another. Now, a new tool has arrived: a highly customizable, open-source robot design called The Robot of Theseus, or TROT, developed at the University of Michigan.
Penn Engineers have developed a system that lets robots see around corners using radio waves processed by AI, a capability that could improve the safety and performance of driverless cars as well as robots operating in cluttered indoor settings like warehouses and factories.
Soft robots made out of flexible, biocompatible materials are in high demand in industries from health care to manufacturing, but precisely designing and controlling such robots for specific purposes is a perennial challenge. What if you could 3D print a soft robot with predictable shape-morphing capabilities already built in? Harvard 3D printing experts have shown it's possible.
A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music. The new technology features multiple wheeled robots about the size of soccer balls that trail colored light as they move within a fixed area on the floor in response to key features of music including tempo and chord progression. A camera records the coordinated light trails as they snake within that area, which serves as the canvas for the creation of a "painting," or visual representation of the emotional content of a particular piece of music.
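A mapping like the one described, from musical features to robot motion and light, might be sketched as follows. The specific function, thresholds, and color choices are hypothetical illustrations, not the Waterloo team's actual design.

```python
# Hypothetical sketch: map musical features (tempo, chord quality) to a
# robot's driving speed and light color. The mapping is illustrative only.

def music_to_motion(tempo_bpm, chord_quality):
    """Return (speed in m/s, RGB light color) for the given music features.
    tempo_bpm     : beats per minute of the current passage
    chord_quality : 'major' or 'minor'
    """
    # Faster music -> faster motion, clamped to an assumed safe range.
    speed = min(0.5, max(0.05, tempo_bpm / 400.0))
    # Warm light for major chords, cool light for minor (a common
    # emotional-color convention, assumed here for illustration).
    color = (255, 180, 0) if chord_quality == "major" else (0, 90, 255)
    return speed, color

speed, color = music_to_motion(120, "minor")  # a brisk minor-key passage
```

Each wheeled robot would re-evaluate such a mapping as the music plays, so the recorded light trails encode tempo in their length and emotional character in their color.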