Archive 23.06.2023


Robot Talk Episode 54 – Robotics and science fiction

In this special live recording of the Robot Talk podcast at the Great Exhibition Road Festival, Claire chatted to Glyn Morgan (Science Museum), Bani Anvari (University College London) and Thrishantha Nanayakkara (Imperial College London) to explore how our intelligent friends from the world of science fiction match up with the reality of state-of-the-art robotics and artificial intelligence.

Glyn Morgan is a curator of exhibitions at the Science Museum, most recently of “Science Fiction: Voyage to the Edge of Imagination” (open until August 20th). He also teaches a course on science fiction at Imperial College and has published widely on many aspects of the genre, writing for the Los Angeles Review of Books, the Royal Society, and the Science Fiction Research Association, amongst others. His research explores the interface between science fiction and other disciplines, from history to psychology and beyond, and the ways science fiction can be used as a cognitive tool to help us understand ourselves and our society.

Bani Anvari is a Full Professor of Intelligent Mobility at the Centre for Transport Studies in the Faculty of Engineering at University College London (UCL), and the founder and director of Intelligent Mobility at UCL. Her vision is to enable humans to trust and fully exploit the benefits of future mobility services through new technology and innovation. Her research focuses on intelligent mobility, exploring interactions with semi- and fully autonomous vehicles in various contexts, and draws heavily on robotics and AI.

Thrishantha Nanayakkara is a Professor of Robotics and the Director of the Morphlab at the Dyson School of Design Engineering (DSDE), Imperial College London. His group has used soft robots to understand how compliance of the body helps to stabilise dynamic interactions with the environment. He is, and has been, principal investigator on projects worth more than £5 million that have pushed the boundaries of our understanding of how conditioning the body improves the efficacy of action and perception in human-human and human-robot interactions.

How robots could help verify compliance with nuclear arms agreements

Ensuring that countries abide by future nuclear arms agreements will be a vital task. Inspectors may have to count warheads or confirm the removal of nuclear weapons from geographical areas; such hotspots could include underground bunkers, where inspectors would need to confirm that no weapons are present at all. Now, researchers at Princeton University and the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) have devised an automated way to help ensure compliance.

City buildings could blow air taxi future off course

The air taxi market is almost ready for take-off, with companies such as Boeing, Hyundai, Airbus and Toyota building fleets that could soon have commuters flitting through the sky. Europe and the U.S. have both drafted new rules to pave the way for air taxis to begin operations within the decade, and Australia's Civil Aviation Safety Authority (CASA) is set to follow suit.

Scientists develop magnetically controlled soft medical robot inspired by the pangolin

Pangolins are fascinating creatures. This animal looks like a walking pine cone, as it is the only mammal completely covered with hard scales. The scales are made of keratin, just like our hair and nails. The scales overlap and are directly connected to the underlying soft skin layer. This special arrangement allows the animals to curl up into a ball in case of danger.

Robots with tact

[Image: a woman sitting on a carpet with a laptop on her knees, using a prosthetic bionic hand. Picture: Adobe Stock/shurkin_son]

Artificial hands, even the most sophisticated prostheses, are still far inferior to human hands. What they lack are the tactile abilities crucial for dexterity. Other challenges include linking sensing to action within the robotic system, and effectively linking it to the human user. Prof. Dr. Philipp Beckerle from FAU has joined with international colleagues to summarize the latest findings in this field of robotics and to establish an agenda for future research. Their piece in the research journal Science Robotics suggests a sensorimotor control framework for haptically enabled robotic hands, inspired by principles of the human central nervous system. Their aim is to link tactile sensing to movement in human-centred, haptically enabled artificial hands. According to the European and American team of researchers, this approach promises improved dexterity for humans controlling robotic hands.

Tactile sensing needs to play a bigger role

“Human manual dexterity relies critically on touch”, explains Prof. Dr. Philipp Beckerle, head of FAU’s Chair of Autonomous Systems and Mechatronics (ASM). “Humans with intact motor function but insensate fingertips can find it very difficult to grasp or manipulate things.” This, he says, indicates that tactile sensing is necessary for human dexterity. “Bioinspired design suggests that lessons from human haptics could enhance the currently limited dexterity of artificial hands. But robotic and prosthetic hands make little use of the many tactile sensors nowadays available and are hence much less dexterous.”

Beckerle, a mechatronics engineer, has just had the paper “A hierarchical sensorimotor control framework for human-in-the-loop robotic hands” published in the research journal Science Robotics. In it, he and his international colleagues describe how advanced technologies now provide not only mechatronic and computational components for anthropomorphic limbs, but also sensing components. The scientists therefore suggest that such recently developed tactile sensing technologies could be incorporated into a general concept of “electronic skins”. “These include dense arrays of normal-force-sensing tactile elements, in contrast to fingertips with a more comprehensive force perception”, the paper reads. “This would provide a directional force-distribution map over the entire sensing surface, and complex three-dimensional architectures, mimicking the mechanical properties and multimodal sensing of human fingertips.” Tactile sensing systems mounted on mechatronic limbs could therefore provide robotic systems with the complex representations needed to characterize, identify and manipulate objects, for example.
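To make the idea of a force-distribution map concrete, here is a minimal Python sketch of how readings from such a taxel array might be reduced to features a grasp controller could use. The array size, the noise threshold and all names are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical 16x16 grid of normal-force taxels on one fingertip pad,
# illustrating the "force-distribution map" idea. Array size, noise
# threshold and all names are assumptions, not from the paper.
TAXEL_ROWS, TAXEL_COLS = 16, 16

def contact_summary(taxel_forces: np.ndarray, noise_floor: float = 0.05) -> dict:
    """Reduce a raw taxel map to features a grasp controller could use:
    net normal force, force-weighted contact centroid, and contact area."""
    forces = np.where(taxel_forces > noise_floor, taxel_forces, 0.0)
    net_force = forces.sum()
    if net_force == 0.0:
        return {"net_force": 0.0, "centroid": None, "area_taxels": 0}
    rows, cols = np.indices(forces.shape)
    centroid = (float((rows * forces).sum() / net_force),
                float((cols * forces).sum() / net_force))
    return {"net_force": float(net_force),
            "centroid": centroid,                    # where the object presses
            "area_taxels": int((forces > 0).sum())}  # crude contact-area estimate

# Example: a simulated light touch near the centre of the pad.
rng = np.random.default_rng(0)
frame = rng.normal(0.0, 0.02, (TAXEL_ROWS, TAXEL_COLS)).clip(min=0.0)
frame[6:9, 7:10] += 0.8
print(contact_summary(frame))
```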

Human principles as inspiration for future designs

To achieve haptically informed and dexterous machines, the researchers secondly propose taking inspiration from the principles of the hierarchically organised human central nervous system (CNS). The CNS controls which signals the brain receives from the tactile senses and which it sends back to the body. The authors propose a conceptual framework in which a bioinspired touch-enabled robot shares control with the human, to a degree that the human sets. Principles of the framework include parallel processing of tasks, integration of feedforward and feedback control, and a dynamic balance between subconscious and conscious processing. These could be applied not only in the design of bionic limbs, but also in that of virtual avatars or remotely navigated telerobots.
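The shared-control idea can be illustrated with a small sketch: a grip-force command that blends the human's feedforward input with a low-level, reflex-like tactile feedback loop, weighted by a degree of autonomy the human sets. The gains, signals and names below are assumptions chosen for illustration, not the authors' framework.

```python
def grip_command(human_cmd: float,
                 measured_force: float,
                 target_force: float,
                 autonomy: float,
                 kp: float = 2.0) -> float:
    """Blend human input with a tactile feedback loop.

    autonomy in [0, 1]: 0 = the human drives directly (feedforward only),
    1 = the robot regulates grip force itself (feedback only).
    """
    feedback = kp * (target_force - measured_force)  # low-level, reflex-like
    return (1.0 - autonomy) * human_cmd + autonomy * feedback

# The human sets the balance, e.g. mostly autonomous grip regulation:
print(grip_command(human_cmd=0.3, measured_force=1.2,
                   target_force=2.0, autonomy=0.8))
```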

Yet another challenge remains: effectively interfacing a human user with touch-enabled robotic hands. “Enhancing haptic robots with high-density tactile sensing can substantially improve their capabilities but raises questions about how best to transmit these signals to a human controller, how to navigate shared perception and action in human-machine systems”, the paper reads. It also remains largely unclear how to manage agency and task assignment to maximize utility and user experience in human-in-the-loop systems. “Particularly challenging is how to exploit the varied and abundant tactile data generated by haptic devices. Yet, human principles provide inspiration for the future design of mechatronic systems that can function like humans, alongside humans, and even as replacement parts for humans.”

Philipp Beckerle’s Chair is part of both FAU’s Department of Electrical Engineering, Electronics and Information Technology and its Department of Artificial Intelligence in Biomedical Engineering. “Our mission at ASM is to research human-centric mechatronics and robotics and strive for solutions that combine the desired performance with user-friendly interaction properties”, Beckerle explains. “Our focus is on wearable systems such as prostheses or exoskeletons, cognitive systems such as collaborative or humanoid robots, and generally on tasks with close human-robot interaction. The human factors are crucial in such scenarios in order to meet the user’s needs and to achieve a synergetic interface as well as interaction between humans and machines.”

Apart from Prof. Dr. Beckerle, scientists from the universities of Genoa, Pisa and Rome, Aalborg, Bangor and Pittsburgh, as well as Imperial College London and the University of Southern California, Los Angeles, contributed to the paper.

RoboCat: A self-improving robotic agent

Robots are quickly becoming part of our everyday lives, but they’re often only programmed to perform specific tasks well. While harnessing recent advances in AI could lead to robots that could help in many more ways, progress in building general-purpose robots is slower in part because of the time needed to collect real-world training data. Our latest paper introduces a self-improving AI agent for robotics, RoboCat, that learns to perform a variety of tasks across different arms, and then self-generates new training data to improve its technique.
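As a rough illustration of that loop, the sketch below fine-tunes a generalist agent on a handful of demonstrations of a new task, lets it practise, folds the self-generated episodes back into the training set, and retrains. The Agent stub and every name in it are placeholders, not DeepMind's RoboCat implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Placeholder for a multi-task policy; stands in for the real model."""
    experience: list = field(default_factory=list)

    def finetune(self, demos):
        self.experience.extend(demos)         # adapt to the new task

    def practise(self, task, n=100):
        # Self-generate new training data by attempting the task.
        return [f"{task}-episode-{i}" for i in range(n)]

    def train(self, dataset):
        self.experience = list(dataset)       # retrain the generalist

def self_improvement_cycle(agent, task, demos, dataset, rounds=3):
    agent.finetune(demos)                     # a few human demonstrations
    for _ in range(rounds):
        dataset.extend(agent.practise(task))  # practice data grows the set
        agent.train(dataset)                  # each round improves technique
    return agent

agent = self_improvement_cycle(Agent(), "stack-blocks",
                               demos=["demo-0", "demo-1"], dataset=[])
print(len(agent.experience))  # 300 self-generated episodes
```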
