Human beings readily recognize emotions in others. Robots and virtual agents, even when they can communicate with humans through speech, largely process only the literal, logical content of what is said, which greatly restricts human-robot interaction (HRI). Consequently, a great deal of HRI research concerns recognizing emotion from speech. But first, how do we describe emotions?
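One common answer, sketched here purely for illustration (the specific categories and the valence/arousal scales below are assumptions, not details from the research described in this round-up), is to model emotions either as discrete categories or as points along continuous dimensions:

```python
from dataclasses import dataclass
from enum import Enum


class BasicEmotion(Enum):
    """A categorical model: each utterance receives one discrete label."""
    NEUTRAL = "neutral"
    HAPPY = "happy"
    SAD = "sad"
    ANGRY = "angry"
    FEARFUL = "fearful"


@dataclass
class DimensionalEmotion:
    """A dimensional model: emotion as a point in valence/arousal space."""
    valence: float  # -1.0 (very negative) .. +1.0 (very positive)
    arousal: float  # 0.0 (calm) .. 1.0 (highly excited)


# Example: the same excited, positive utterance in both representations.
label = BasicEmotion.HAPPY
point = DimensionalEmotion(valence=0.8, arousal=0.7)
```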
Army researchers developed a technique that allows robots to remain resilient when faced with intermittent communication losses on the battlefield.
A team of researchers affiliated with Seoul National University, Harvard University and Hankook Tire and Technology Co. Ltd. has developed a tire based on an origami design that allows its shape to change while the vehicle is moving. In their paper published in the journal Science Robotics, the group describes the new tire design and how well it performed in testing.
The five-meter-long Lexus RX-450h leads a rather contemplative life at Empa. It never takes long trips. Instead, the SUV dutifully makes its rounds on a special track just 180 meters long in a secluded backyard of the Empa campus. The scenery is not particularly spectacular: the Mobileye camera behind the windshield sees freshly painted lane markings on aging concrete; the Velodyne lidar scans the glass front of the same lab building at every turn; and the Delphi radar behind the Lexus' radiator grille routinely measures the distance to five tin trash cans set up on either side of the course.
Guide dogs, trained to help humans move through their environments, have played a critical role in society for many decades. These highly trained animals have proved to be valuable assistants for visually impaired individuals, allowing them to navigate indoor and outdoor environments safely.
In recent years, computer scientists and roboticists have developed a variety of technological tools to aid human agents during critical missions, such as military operations or search and rescue efforts. Unmanned aerial vehicles (UAVs) have proved to be particularly valuable in these cases, as they can often enter remote or dangerous areas that are inaccessible to humans.
Anyone with children knows that while controlling one child can be hard, controlling many at once can be nearly impossible. Getting swarms of robots to work collectively can be equally challenging, unless researchers carefully choreograph their interactions—like planes in formation—using increasingly sophisticated components and algorithms. But what can be reliably accomplished when the robots on hand are simple, inconsistent, and lack sophisticated programming for coordinated behavior?
Army and Arizona State University researchers identified a set of approaches to help scientists assess how well autonomous systems and humans communicate.
The Swiss robotics company ANYbotics has announced the launch of a new end-to-end robotic inspection system for the energy and industrial-processing sectors. The solution aims to improve safety at production sites and reduce downtime.
A team of researchers working at the Barcelona Institute of Science and Technology has developed a skeletal-muscle-based biohybrid soft robot that can swim faster than other skeletal-muscle-based biobots. In their paper published in the journal Science Robotics, the group describes building and testing their soft robot.
Ever wondered why your virtual home assistant doesn't understand your questions? Or why your navigation app took you on the side street instead of the highway? In a study published April 21st in the journal iScience, Italian researchers designed a robot that "thinks out loud" so that users can hear its thought process and better understand the robot's motivations and decisions.
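The study's own architecture is not detailed here, but a minimal, purely hypothetical sketch of the "thinking out loud" idea is a decision loop that narrates each step of its scoring before acting; the options, scores, and speak() helper below are assumptions for illustration:

```python
def speak(thought: str) -> None:
    # Stand-in for a text-to-speech call on a real robot.
    print(f"[robot] {thought}")


def choose_action(options: dict[str, float]) -> str:
    """Pick the highest-utility option, narrating the reasoning as it goes."""
    speak(f"I have {len(options)} options: {', '.join(options)}.")
    best_name, best_score = None, float("-inf")
    for name, score in options.items():
        speak(f"Option '{name}' has an estimated utility of {score:.2f}.")
        if score > best_score:
            best_name, best_score = name, score
    speak(f"I am choosing '{best_name}' because it has the highest utility.")
    return best_name


choose_action({"set the table": 0.6, "ask for clarification": 0.8})
```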
To enable the efficient operation of unmanned aerial vehicles (UAVs) in situations where a global positioning system (GPS) or an external positioning device (e.g., a laser reflector) is unavailable, researchers must develop techniques that automatically estimate a robot's pose. If the environment in which a drone operates does not change very often and a 3D map of that environment can be built, map-based robot localization techniques can be fairly effective.
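As a rough illustration of what map-based localization means (a toy 2D sketch under assumed conditions, not the technique from any particular study), one can estimate a pose by searching candidate poses and scoring how well the current scan, transformed by each candidate, lines up with a known map of points:

```python
import math

import numpy as np


def score_pose(scan_xy: np.ndarray, map_xy: np.ndarray,
               x: float, y: float, theta: float) -> float:
    """Negative mean nearest-map-point distance of the transformed scan."""
    c, s = math.cos(theta), math.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    world = scan_xy @ rot.T + np.array([x, y])
    # Brute-force nearest neighbour; a k-d tree would be used in practice.
    dists = np.linalg.norm(world[:, None, :] - map_xy[None, :, :], axis=2)
    return -float(dists.min(axis=1).mean())


def localize(scan_xy, map_xy, x_range, y_range, theta_range):
    """Return the candidate (x, y, theta) whose transformed scan best fits the map."""
    best, best_score = None, -math.inf
    for x in x_range:
        for y in y_range:
            for theta in theta_range:
                s = score_pose(scan_xy, map_xy, x, y, theta)
                if s > best_score:
                    best, best_score = (x, y, theta), s
    return best


# Toy example: two "walls" of map points and a scan taken from pose (1.0, 0.5, 0).
map_xy = np.array([[i * 0.1, 0.0] for i in range(50)] +
                  [[i * 0.1, 5.0] for i in range(50)])
true_pose = (1.0, 0.5, 0.0)
scan_xy = map_xy - np.array(true_pose[:2])  # what the robot would observe
print(localize(scan_xy, map_xy,
               x_range=np.linspace(0, 2, 11),
               y_range=np.linspace(0, 1, 11),
               theta_range=[0.0]))
```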
Researchers at the University of Twente have discovered that primary school children in both regular and special needs schools make strides when they learn together with a robot. On 30 April, Daniel Davison and Bob Schadenberg will both obtain their Ph.D.s from UT for comparable research carried out in different contexts.
Multi-robot systems have recently been used to tackle a variety of real-world problems, for instance, helping human users to monitor environments and access secluded locations. To navigate unknown and dynamic environments efficiently, these robotic systems should be guided by path planners that can identify collision-free trajectories for the individual robots in a team.
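To make the notion of a path planner concrete, here is a minimal single-robot sketch (an illustrative assumption; real multi-robot planners must additionally deconflict the robots' trajectories in time) that finds a collision-free route on an occupancy grid using breadth-first search:

```python
from collections import deque


def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, avoiding obstacle cells (1s)."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no collision-free path exists


grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(plan_path(grid, start=(0, 0), goal=(2, 0)))
```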
Spoken dialogue is the most natural way for people to interact with complex autonomous agents such as robots. Future Army operational environments will require technology that allows artificially intelligent agents to understand and carry out commands and to interact with humans as teammates.