A decontamination robot funded by the Office of Naval Research (ONR) and designed by several local universities was recently tested in Richmond, Va. The robot, originally developed for shipboard firefighting and maintenance tasks, has now been enlisted in the fight against COVID-19.
Robot maker Agility Robotics, a spinoff created by researchers from Oregon State University, has announced that its Digit robot is now available for purchase. The human-like robot has been engineered to perform manual labor, such as removing boxes from shelves and loading them onto a truck, and can be bought directly from Agility for $250,000.
Over the past few decades, technological advances have enabled the development of increasingly sophisticated, immersive and realistic video games. One of the most noteworthy of these advances is virtual reality (VR), which allows users to experience games or other simulated environments as if they were actually navigating them, through the use of wearable electronic devices.
A rectangular robot as tiny as a few human hairs can travel throughout a colon by doing back flips, Purdue University engineers have demonstrated in live animal models.
Using a brain-inspired approach, scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed a way to give robots the artificial intelligence (AI) to recognize pain and to self-repair when damaged.
The study of developmental biology is getting a robotic helping hand.
What if you could instruct a swarm of robots to paint a picture? The concept may sound far-fetched, but a recent study in the open-access journal Frontiers in Robotics and AI has shown that it is possible. The robots in question move about a canvas, leaving color trails in their wake, and in a first for robot-created art, an artist can select areas of the canvas to be painted a certain color and the robot team will oblige in real time. The technique illustrates the potential of robotics in creating art and could be an interesting tool for artists.
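The study's actual control pipeline isn't spelled out here, so the following is only a minimal toy sketch of the idea: simulated robots random-walk over a grid canvas and deposit their color only inside regions an artist has assigned to that color. Everything in it (the REGIONS map, grid size, robot count) is a hypothetical stand-in, not the researchers' system.

```python
"""Toy sketch of artist-directed swarm painting: robots wander a grid
canvas and leave a color trail only where the artist wants that color."""
import random

WIDTH, HEIGHT = 40, 20

# Hypothetical artist input: each color is assigned a rectangular region
# (x0, y0, x1, y1) of the canvas that should receive that color.
REGIONS = {
    "R": (0, 0, 19, 19),    # left half of the canvas gets color "R"
    "B": (20, 0, 39, 19),   # right half gets color "B"
}

canvas = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

def in_region(x, y, region):
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

# Three robots per color, each starting at a random position.
robots = [{"color": c, "x": random.randrange(WIDTH), "y": random.randrange(HEIGHT)}
          for c in REGIONS for _ in range(3)]

for _ in range(2000):  # simulation steps
    for bot in robots:
        # Random walk: move one cell in a random direction, clamped to the canvas.
        bot["x"] = min(WIDTH - 1, max(0, bot["x"] + random.choice((-1, 0, 1))))
        bot["y"] = min(HEIGHT - 1, max(0, bot["y"] + random.choice((-1, 0, 1))))
        # Deposit paint only inside the region assigned to this robot's color.
        if in_region(bot["x"], bot["y"], REGIONS[bot["color"]]):
            canvas[bot["y"]][bot["x"]] = bot["color"]

print("\n".join("".join(row) for row in canvas))
```

Printing the grid after a few thousand steps shows each rectangle gradually filling with its assigned color, which is the essence of the real-time region assignment described above.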
Robots are gradually making their way into hospitals and other clinical facilities, providing basic assistance to doctors and patients. To facilitate their widespread use in health care settings, however, robotics researchers need to ensure that users feel at ease with robots and accept the help they can offer. This could be achieved by developing robots that communicate in empathetic and compassionate ways.
To interact with humans and assist them in their day-to-day life, robots should have both verbal and non-verbal communication capabilities. In other words, they should be able to understand both what a user is saying and what their behavior indicates, adapting their speech, behavior and actions accordingly.
Engineers at the University of California San Diego have built a squid-like robot that can swim untethered, propelling itself by generating jets of water. The robot carries its own power source inside its body. It can also carry a sensor, such as a camera, for underwater exploration.
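As a rough illustration of the physics at play, the thrust of a water jet can be estimated from momentum flux: water leaves the nozzle at a mass flow rate of density times nozzle area times jet speed, and thrust is that flow rate times the jet speed. The numbers below are hypothetical, not the UCSD robot's specifications.

```python
"""Back-of-envelope estimate of water-jet propulsion (hypothetical numbers).
Thrust F = mdot * v_jet, where mdot = rho * A * v_jet is the mass flow rate."""
rho = 1000.0          # density of water, kg/m^3
nozzle_area = 1e-4    # hypothetical nozzle cross-section, m^2 (1 cm^2)
v_jet = 2.0           # hypothetical jet speed, m/s
robot_mass = 0.5      # hypothetical robot mass, kg

mdot = rho * nozzle_area * v_jet    # mass expelled per second, kg/s
thrust = mdot * v_jet               # momentum carried away per second, N
accel = thrust / robot_mass         # resulting acceleration, m/s^2

print(f"mass flow {mdot:.3f} kg/s, thrust {thrust:.3f} N, accel {accel:.3f} m/s^2")
```

Note that jet speed enters twice, once in the mass flow rate and once in the momentum per unit mass, so doubling the jet speed quadruples the thrust.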
Robots can be amazing tools for search-and-rescue missions and environmental studies, but eventually they must return to a base to recharge their batteries and upload their data. That can be a challenge if your robot is an autonomous underwater vehicle (AUV) exploring deep ocean waters.
A team of researchers from the University of California, the University of North Carolina at Chapel Hill and Pacific Northwest National Laboratory has found that insects use natural oscillations to stabilize their flight. In their study, published in the journal Science Robotics, the researchers used what they describe as "a type of calculus" (chronological calculus) to better understand the factors involved in keeping flapping-winged insects aloft. Matěj Karásek, with Delft University of Technology, has published a Focus piece in the same journal issue describing the team's work on this new effort.
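Chronological calculus itself is beyond a short snippet, but the underlying phenomenon (stability created by fast oscillation) has a classic textbook analogue, the Kapitza pendulum: an inverted pendulum whose pivot vibrates rapidly up and down stays balanced even though the static system would topple. The sketch below simulates that analogue; it illustrates vibrational stabilization in general, not the insect-flight model in the paper, and all parameter values are arbitrary.

```python
"""Kapitza pendulum: an inverted pendulum stabilized by fast pivot vibration,
a toy analogue of oscillation-based stabilization."""
import math

g, L = 9.81, 0.2          # gravity (m/s^2) and pendulum length (m)
a, omega = 0.02, 200.0    # pivot vibration amplitude (m) and frequency (rad/s)
dt, t_end = 1e-4, 10.0    # integration step and duration (s)

def max_tilt(amp):
    """Integrate the pendulum (phi = angle from upright) with the pivot
    vibrating vertically at amplitude amp; return the largest tilt reached."""
    phi, vel, t = 0.1, 0.0, 0.0   # start 0.1 rad from upright, at rest
    worst = abs(phi)
    while t < t_end:
        # In the vibrating frame the effective gravity is
        # g - amp*omega^2*cos(omega*t); positive values push phi away from 0.
        acc = (g - amp * omega**2 * math.cos(omega * t)) / L * math.sin(phi)
        vel += acc * dt           # semi-implicit Euler step
        phi += vel * dt
        worst = max(worst, abs(phi))
        t += dt
    return worst

# Time-averaged analysis predicts stability when (a*omega)^2 > 2*g*L.
print(f"(a*omega)^2 = {(a * omega) ** 2:.1f}  vs  2*g*L = {2 * g * L:.2f}")
print(f"max tilt with vibration:    {max_tilt(a):.2f} rad (stays near upright)")
print(f"max tilt without vibration: {max_tilt(0.0):.2f} rad (falls over)")
```

With the pivot vibrating, the simulated pendulum's tilt stays bounded near zero; with the vibration switched off, it falls past horizontal, which is the oscillation-for-stability effect in miniature.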
Would you trust your life to an autonomous vehicle? Do you understand how it will respond in dangerous situations? Are you willing to get in without knowing the risks?
The way humans interpret the behavior of AI-endowed artificial agents, such as humanoid robots, depends on specific individual attitudes that can be detected from neural activity. Researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) demonstrated that people's bias toward robots—that is, attributing intentionality or considering them as "mindless things"—can be correlated with distinct brain activity patterns. The research results have been published in Science Robotics and are important for understanding the way humans engage with robots, while also considering their acceptance in healthcare applications and daily life.
Vanderbilt University engineers have determined that their back-assist exosuit, a clothing-like device that supports human movement and posture, can reduce fatigue in lower back muscles by 29 to 47 percent on average. The exosuit's functionality presents a promising development for people who work in physically demanding fields and are at risk for back pain, including medical professionals and frontline workers.