The eyes of raptors can accurately perceive prey from kilometers away. Is it possible to model camera technology after birds' eyes? Researchers have developed a new type of camera inspired by the structure and function of birds' eyes. A research team led by Prof. Kim Dae-Hyeong at the Center for Nanoparticle Research within the Institute for Basic Science (IBS), in collaboration with Prof. Song Young Min at the Gwangju Institute of Science and Technology (GIST), has developed a perovskite-based camera specializing in object detection.
To complete real-world tasks in home environments, offices and public spaces, robots should be able to effectively grasp and manipulate a wide range of objects. In recent years, developers have created various machine learning–based models designed to enable skilled object manipulation in robots.
Cambridge researchers have shown that members of the public can quickly learn to use a third thumb—a controllable prosthetic extra thumb—to pick up and manipulate objects.
Snake-inspired robots could have various advantages over conventional wheeled or legged robots. For instance, slithering robots can adapt the shape of their body, enter narrow spaces, and move freely in environments that are inaccessible to both humans and other robots.
Inside a lab in Boston University's College of Engineering, a robot arm drops small, plastic objects into a box placed perfectly on the floor to catch them as they fall. One by one, these tiny structures—feather-light, cylindrical pieces, no bigger than an inch tall—fill the box. Some are red, others blue, purple, green, or black.
"I'll have you eating out of the palm of my hand" is an unlikely utterance you'll hear from a robot. Why? Most of them don't have palms.
Using more robots to close labor gaps in the hospitality industry may backfire and cause more human workers to quit, according to a Washington State University study.
What features does a robotic guide dog need? Ask the blind, say the authors of a recent paper. Led by researchers at the University of Massachusetts Amherst, a study identifying how to develop robot guide dogs with insights from guide dog users and trainers won a Best Paper Award at CHI 2024, the Conference on Human Factors in Computing Systems.
Researchers Indrek Must and Kadri-Ann Valdur of the Institute of Technology of the University of Tartu have created a robot leg modeled after the leg of a cucumber spider. The soft robot created in cooperation with the Italian Institute of Technology could, in the future, move where humans cannot.
A team of engineers and roboticists at Hong Kong University of Science and Technology has developed an electronic compound eye designed to give robots the ability to swarm efficiently and inexpensively.
A team of researchers at Delft University of Technology has developed a drone that flies autonomously using neuromorphic image processing and control based on the workings of animal brains. Animal brains use less data and energy compared to current deep neural networks running on graphics processing units (GPUs).
Brain-machine interfaces are devices that enable direct communication between the brain's electrical activity and an external device, such as a computer or a robotic limb, allowing people to control machines with their thoughts.
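As an illustration of the idea behind such interfaces, the toy sketch below decodes a simulated single-channel EEG signal into a binary machine command. All names, the 8–12 Hz mu band, and the power threshold are illustrative assumptions, not the method of any system mentioned above; real brain-machine interfaces rely on multi-channel recordings, spatial filtering, and trained classifiers.

```python
import numpy as np

def decode_command(eeg_window, fs=250.0, mu_band=(8.0, 12.0), threshold=100.0):
    """Toy decoder: estimate mu-band (8-12 Hz) power in one EEG channel
    and map it to a binary command. Purely illustrative of turning brain
    electrical activity into a machine command."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    band = (freqs >= mu_band[0]) & (freqs <= mu_band[1])
    band_power = spectrum[band].mean()
    # Mu-rhythm power over motor cortex typically drops during imagined
    # movement ("event-related desynchronization"), so low power -> "move".
    return "move" if band_power < threshold else "rest"

# Synthetic one-second signals: a strong 10 Hz idle rhythm vs. weak noise.
fs = 250.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
idle = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
imagery = 0.1 * rng.standard_normal(t.size)

print(decode_command(idle, fs))     # strong mu rhythm -> "rest"
print(decode_command(imagery, fs))  # suppressed mu rhythm -> "move"
```

The sketch uses a fixed spectral-power threshold only because it keeps the example self-contained; the same signal-to-command pipeline is what a learned decoder would replace.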
Research at Uppsala University and Karolinska Institutet could pave the way for prosthetic hands and robots that can feel touch like a human hand. The study has been published in the journal Science. The technology could also be used to help restore lost function to patients after a stroke.
A robot, designed to mimic the motion of a snail, has been developed by researchers at the University of Bristol.
Sitting in a circle on the nursery floor, a group of Swiss three-year-olds ask a robot called Nao questions about giraffes and broccoli.