Researchers have developed a mind-reading system that decodes neural signals from the brain during arm movement. The method, described in the journal Applied Soft Computing, can be used to control a robotic arm through a brain-machine interface (BMI).
It's been roughly 23 years since one of the first robotic animals trotted onto the scene, defying classical notions of our cuddly four-legged friends. Since then, a barrage of walking, dancing, and door-opening machines have commanded a presence: a sleek mixture of batteries, sensors, metal, and motors. Missing from their list of cardio activities was one both loved and loathed by humans (depending on whom you ask), and one that proved slightly trickier for the bots: learning to run.
If a Tyrannosaurus rex living 66 million years ago had a leg structure similar to that of an ostrich running in the savanna today, then we can assume bird legs have stood the test of time—a good example of evolutionary selection.
Robotic lawn mowers are one form of robotic assistance that society has accepted. But there are currently few concepts for robotic assistance in other tasks, especially those involving close proximity to humans, such as housekeeping and care. The Fraunhofer Institute for Machine Tools and Forming Technology IWU uses innovative switchable stiffness in robots to combine the required strength with the necessary safety. At the Hannover Messe Preview on March 16, 2022, and at the Hannover Messe from May 30 to June 2, 2022, the researchers will present a robot arm that could help support people in their immediate surroundings.
Feeling and moving in a place without being there is the main goal of the new iCub robot advanced telexistence system, also called the iCub3 avatar system, developed by researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) in Genova, Italy. The new system was tested in an online demonstration involving a human operator based at IIT in Genova and a new version of the humanoid robot, the iCub 3, visiting the Italian Pavilion at the 17th International Architecture Exhibition—La Biennale di Venezia; the two sites are 300 km apart, and the communication relied on a basic optical fiber connection. Researchers demonstrated that the system transmits the operator's locomotion, manipulation, voice, and facial expressions to the robotic avatar, while returning visual, auditory, haptic, and touch feedback. This is the first test of a legged humanoid robot being used for remote tourism and conveying the experience to a human operator. The system is a prototype and may be further developed for other scenarios, including disaster response, healthcare, and metaverse applications.
Skoltech researchers and their colleagues from ESPCI Paris, Chiba University, and the Japan Agency for Marine-Earth Science and Technology have used a 3D simulation to show that small fish swimming in a school can sense the position and tail beat of their neighbors as water pressure variations along the sides of their bodies. This mechanism is thought to enable fish to maximize swimming efficiency in a group even in complete darkness, when no visual cues are available. Understanding the group motion of fish is useful for predicting their migration and for designing aquatic research robots that mimic fish behavior, either for the energy-saving benefits of moving in a group or to blend in with the ocean creatures they are studying. The paper is published in Frontiers in Robotics and AI.
Over the past few decades, computer scientists have developed increasingly advanced techniques to train and operate robots. Collectively, these methods could facilitate the integration of robotic systems in an increasingly wide range of real-world settings.
In recent years, roboticists have developed mobile robots with a wide range of anatomies and capabilities. A class of robotic systems that has proved particularly promising for navigating unstructured and dynamic environments is legged robots (i.e., robots with two or more legs that often resemble animals).
At the 2022 International Robot Exhibition in Tokyo, officials and engineers at Kawasaki unveiled Bex, a quadruped robot that can walk, roll around, and even carry a human passenger on its back. At the exhibition, Bex was configured to look like an ibex, the type of wild goat from which it gets its name.
Hard, cold metal is usually what comes to mind when one imagines a robot. While these rigid automatons have their advantages, researchers are now exploring "soft" or continuum robots that can provide flexibility and compliance where traditional robots cannot. One such researcher is LSU Mechanical Engineering (ME) Assistant Professor Hunter Gilbert.
Ben-Gurion University of the Negev researchers argue in a new paper that previous tests of virtual reality versus social robots for cognitive training treat the comparison as apples to apples when it really ought to be apples to oranges.
Physicists have discovered a new way to coat soft robots in materials that allow them to move and function in a more purposeful way. The research, led by the UK's University of Bath, is described today in Science Advances.
A new approach to producing realistic expressions of pain on robotic patients could help to reduce error and bias during physical examination.
Scientists from the Faculty of Engineering, Information and Systems at the University of Tsukuba have devised a text-message mediation robot that can help users control their anger when receiving upsetting news. This device may help improve social interactions as we move toward a world of increasingly digital communication.
To monitor and navigate real-world environments, machines and robots should be able to gather images and measurements under different background lighting conditions. In recent years, engineers worldwide have thus been trying to develop increasingly advanced sensors that could be integrated into robots, surveillance systems, and other technologies that benefit from sensing their surroundings.