
Matrox® Imaging Library (MIL) X

From traditional 2D and 3D computer-vision tools to deep-learning-based ones, Matrox® Imaging Library (MIL) X has your vision development needs covered. This SDK features algorithms and application tools field-proven in more than 60 key industries. MIL CoPilot—an interactive environment for training deep neural networks, prototyping, and code generation—shortens project ramp-up time.

A robot that can put a surgical gown on a supine mannequin

A pair of researchers working in the Personal Robotics Laboratory at Imperial College London has taught a robot to put a surgical gown on a supine mannequin. In their paper published in the journal Science Robotics, Fan Zhang and Yiannis Demiris described the approach they used to teach the robot to partially dress the mannequin. Júlia Borràs, with Institut de Robòtica i Informàtica Industrial, CSIC-UPC, has published a Focus piece in the same journal issue outlining the difficulties in getting robots to handle soft material and the work done by the researchers on this new effort.

Touchy subject: 3D printed fingertip ‘feels’ like human skin

Robotic hand with a 3D-printed tactile fingertip on the little (pinky) finger. The white rigid back to the fingertip is covered with the black flexible 3D-printed skin.

Machines can beat the world’s best chess player, but they cannot handle a chess piece as well as an infant. This lack of robot dexterity is partly because artificial grippers lack the fine tactile sense of the human fingertip, which is used to guide our hands as we pick up and handle objects.

Two papers published in the Journal of the Royal Society Interface give the first in-depth comparison of an artificial fingertip with neural recordings of the human sense of touch. The research was led by Nathan Lepora, Professor of Robotics & AI (Artificial Intelligence) in the University of Bristol’s Department of Engineering Maths, based at the Bristol Robotics Laboratory.

“Our work helps uncover how the complex internal structure of human skin creates our human sense of touch. This is an exciting development in the field of soft robotics – being able to 3D-print tactile skin could create robots that are more dexterous or significantly improve the performance of prosthetic hands by giving them an in-built sense of touch,” said Professor Lepora.

Cut-through section on the 3D-printed tactile skin. The white plastic is a rigid mount for the flexible black rubber skin. Both parts are made together on an advanced 3D-printer. The ‘pins’ on the inside of the skin replicate dermal papillae that are formed inside human skin.

Professor Lepora and colleagues created the sense of touch in the artificial fingertip using a 3D-printed mesh of pin-like papillae on the underside of the compliant skin, which mimic the dermal papillae found between the outer epidermal and inner dermal layers of human tactile skin. The papillae are made on advanced 3D-printers that can mix together soft and hard materials to create complicated structures like those found in biology.

“We found our 3D-printed tactile fingertip can produce artificial nerve signals that look like recordings from real, tactile neurons. Human tactile nerves transmit signals from various nerve endings called mechanoreceptors, which can signal the pressure and shape of a contact. Classic work by Phillips and Johnson in 1981 first plotted electrical recordings from these nerves to study ‘tactile spatial resolution’ using a set of standard ridged shapes used by psychologists. In our work, we tested our 3D-printed artificial fingertip as it ‘felt’ those same ridged shapes and discovered a startlingly close match to the neural data,” said Professor Lepora.

“For me, the most exciting moment was when we looked at our artificial nerve recordings from the 3D-printed fingertip and they looked like the real recordings from over 40 years ago! Those recordings are very complex with hills and dips over edges and ridges, and we saw the same pattern in our artificial tactile data,” said Professor Lepora.

While the research found a remarkably close match between the artificial fingertip and human nerve signals, the artificial fingertip was not as sensitive to fine detail. Professor Lepora suspects this is because the 3D-printed skin is thicker than real skin, and his team is now exploring how to 3D-print structures on the microscopic scale of human skin.
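At its core, the comparison reported in the papers lines up two spatial response profiles: the artificial fingertip’s output and the classic neural recordings, each taken as a ridged stimulus moves across the skin. Below is a minimal Python sketch of that style of analysis; the numbers are invented stand-ins, not the study’s data.

```python
import numpy as np

# Hypothetical spatial response profiles: mean response of each sensing
# element as a ridged stimulus is swept across the fingertip. In the real
# study these would come from the 3D-printed fingertip's internal markers
# and from classic microneurography recordings (Phillips & Johnson, 1981).
artificial = np.array([0.12, 0.45, 0.90, 0.60, 0.20, 0.55, 0.85, 0.40])
neural = np.array([0.10, 0.50, 0.95, 0.55, 0.25, 0.50, 0.80, 0.35])

def zscore(x):
    # Normalize so amplitude differences between the two measurement
    # systems do not dominate the comparison.
    return (x - x.mean()) / x.std()

# Pearson correlation between the normalized profiles: a value near 1
# means the artificial signal reproduces the shape of the neural
# response (the "hills and dips over edges and ridges") quite closely.
r = np.corrcoef(zscore(artificial), zscore(neural))[0, 1]
print(f"profile correlation: {r:.2f}")
```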

“Our aim is to make artificial skin as good – or even better – than real skin,” said Professor Lepora.

PAPERS

On-the-fly reconfigurable magnetic slime used as a robot

A team of researchers affiliated with a host of entities in China has created a type of magnetic slime that can be configured on the fly to perform a variety of robotic tasks. In their paper published in the journal Advanced Functional Materials, the group describes their slime, possible uses for it and the actions they have taken to make it less toxic.

Robots dress humans without the full picture

Robots are already adept at certain things, such as lifting objects that are too heavy or cumbersome for people to manage. Another application they're well suited for is the precision assembly of items like watches that have large numbers of tiny parts—some so small they can barely be seen with the naked eye.

Drones and driverless cars could help with Ukraine’s humanitarian crisis

The Russian invasion of Ukraine has led to a serious humanitarian crisis. Of Ukraine's 44 million people, almost one-quarter have been displaced. Around 3.7 million have escaped to neighboring European countries, while around 6.5 million are estimated to be displaced inside Ukraine. Tragically, deaths and injuries continue to rise.

A new approach that could improve how robots interact in conversational groups

To effectively interact with humans in crowded social settings, such as malls, hospitals, and other public spaces, robots should be able to actively participate in both group and one-to-one interactions. Most existing robots, however, have been found to perform much better when communicating with individual users than with groups of conversing humans.

Learn How To Protect Your Business with AI for Visual Inspection – Pleora Webinar Live April 6, 2022

This webinar will discuss how two manufacturers – a distillery and an electronics assembly operation – are using camera-based visual inspection to protect their brand and make manual processes repeatable, consistent, and traceable.

Exoskeletons with personalize-your-own settings

Leo Medrano, a PhD student in the Neurobionics Lab at the University of Michigan, tests out an ankle exoskeleton on a two-track treadmill. Researchers were able to give the exoskeleton user direct control to tune its behavior, allowing them to find the right torque and timing settings for themselves.

By Dan Newman

To transform human mobility, exoskeletons need to interact seamlessly with their user, providing the right level of assistance at the right time to cooperate with our muscles as we move.

To help achieve this, University of Michigan researchers gave users direct control to customize the behavior of an ankle exoskeleton.

Not only was the process faster than the conventional approach, in which an expert would decide the settings, but it may have incorporated preferences an expert would have missed. For instance, user height and weight, which are commonly used metrics for tuning exoskeletons and robotic prostheses, had no effect on preferred settings.

“Instead of a one-size-fits-all level of power, or using measurements of muscle activity to customize an exoskeleton’s behavior, this method uses active user feedback to shape the assistance a person receives,” said Kim Ingraham, first author of the study in Science Robotics, and a recent mechanical engineering Ph.D. graduate.

Experts usually tune the wide-ranging settings of powered exoskeletons to take into account the varied characteristics of human bodies, gait biomechanics and user preferences. This can be done by crunching quantifiable data, such as metabolic rate or muscle activity, to minimize the energy expended by a user, or more simply by asking the user to repeatedly compare pairs of settings to find which feels best.

What minimizes energy expenditure, however, may not be the most comfortable or useful. And asking the user to select between choices for numerous settings could be too time-consuming, and it obscures how those settings might interact with each other to affect the user experience.
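For concreteness, the pairwise-comparison procedure mentioned above boils down to a simple A/B loop. The Python sketch below is illustrative only: `ask_user_prefers` is a hypothetical callback standing in for a real walking trial under each setting.

```python
def pairwise_tune(candidates, ask_user_prefers):
    """Pick a preferred exoskeleton setting by repeated A/B comparison.

    candidates:       list of (peak_torque, timing) setting tuples.
    ask_user_prefers: callback (a, b) -> True if the user prefers a over b;
                      in practice the user walks under both settings first.
    """
    best = candidates[0]
    comparisons = 0
    for challenger in candidates[1:]:
        comparisons += 1
        # Keep whichever of the two settings the user reports feels better.
        if ask_user_prefers(challenger, best):
            best = challenger
    return best, comparisons
```

Even this linear scheme needs one walking bout per comparison, and the candidate list grows combinatorially as parameters are added, which is exactly the cost noted above.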

By allowing the user to directly manipulate the settings, preferences that are difficult to detect or measure could be accounted for by the users themselves. Users could quickly and independently decide what features are most important—for example, trading off comfort, power or stability, and then selecting the settings to best match those preferences without the need for an expert to retune.

“To be able to choose and have control over how it feels is going to help with user satisfaction and adoption of these devices in the future,” Ingraham said. “No matter how much an exoskeleton helps, people won’t wear them if they are not enjoyable.”

By allowing the user to directly manipulate the exoskeleton’s settings using a tablet while on a treadmill, preferences that are difficult to detect or measure, such as comfort, could be accounted for by the users themselves. Courtesy Kim Ingraham

To test the feasibility of such a system, the research team outfitted users with Dephy powered ankle exoskeletons and a touch screen interface that displayed a blank grid. Selecting any point on the grid set the exoskeleton’s torque output along one axis and the timing of that torque along the other.
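A minimal sketch of how such a grid interface might map a touch onto exoskeleton settings; the parameter names and ranges here are assumptions for illustration, not Dephy’s actual limits.

```python
# Assumed parameter ranges, for illustration only.
PEAK_TORQUE_NM = (2.0, 30.0)    # assistance magnitude, newton-metres
TIMING_PCT_GAIT = (40.0, 60.0)  # torque onset timing, % of gait cycle

def grid_to_settings(x, y):
    """Map a touch at (x, y), each in [0, 1], onto the two tuned parameters."""
    lo_t, hi_t = PEAK_TORQUE_NM
    lo_g, hi_g = TIMING_PCT_GAIT
    return {
        "peak_torque_nm": lo_t + x * (hi_t - lo_t),   # one grid axis
        "timing_pct_gait": lo_g + y * (hi_g - lo_g),  # the other grid axis
    }

# Example: a touch near the upper-right corner of the blank grid requests
# strong, late assistance. The grid itself is unlabeled, consistent with
# users being blinded to which parameters they were tuning.
print(grid_to_settings(0.9, 0.8))
```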

When told to find their preference while walking on a treadmill, users who had no previous experience with an exoskeleton were, on average, able to confirm their optimal settings in about one minute and 45 seconds.

“We were surprised at how precisely people were able to identify their preferences, especially because they were totally blinded to everything that was happening—we didn’t tell them what parameters they were tuning, so they were only selecting their preferences based on how they felt the device was assisting them,” Ingraham said.

In addition, user preference changed over the course of the experiment. As the first-time users gained more experience with the exoskeleton, they preferred a higher level of assistance. And, those already experienced with exoskeletons preferred a much greater level of assistance than the first-time users.

These findings could help determine how often an exoskeleton needs retuning as a user gains experience, and they support the idea of incorporating direct user input into preference-based control for the best experience.

The ankle exoskeleton, from Dephy Inc., provides assistance when stepping off with the foot. An expert usually tunes the machine’s wide-ranging settings to take into account the varied characteristics of human bodies, gait biomechanics, and user preferences.

“This is fundamental work in exploring how to incorporate people’s preference into exoskeleton control,” said Elliott Rouse, senior author of the study, an assistant professor of mechanical engineering and a core faculty member of the Robotics Institute. “This work is motivated by our desire to develop exoskeletons that go beyond the laboratory and have a transformative impact on society.

“Next is answering why people prefer what they prefer, and how these preferences affect their energy, their muscle activity, and their physiology, and how we could automatically implement preference-based control in the real world. It’s important that assistive technologies provide a meaningful benefit to their user.”

The research was supported by the National Science Foundation, the D. Dan and Betty Kahn Foundation and the Carl Zeiss Foundation in cooperation with the German Scholars Organization, in addition to hardware and technical assistance from Dephy Inc. Ingraham is now a postdoctoral researcher at the University of Washington.

Extra: Interview with the research team

What is the history of this research question?

One of the most challenging parts of designing assistive robotic technologies is understanding how we should apply assistance to the human body in order to best meet the user’s goals. Much of the research to date has focused on designing the assistance provided by lower-limb robotic exoskeletons to reduce the energy required to walk. While reducing the energy required to walk may be valuable for applications that require users to walk long distances, there are many other factors that people may wish to prioritize when using a robotic exoskeleton in their daily lives. Users may want to prioritize any number of subjective metrics, like comfort, balance, stability, or effort. In our research, we wanted to capture some of these metrics simultaneously by asking individual users to find their preference in how the exoskeleton assists them.

Why should people care about this?

For exoskeletons to transform human mobility, they need to act synergistically with their user, providing meaningful assistance without interfering with their normal walking mechanics. Moreover, these devices must be comfortable to wear, and user satisfaction must be high in order for people to want to use exoskeletons during their daily routines. Therefore, understanding what users prefer in the context of exoskeleton assistance is crucial to the development and translation of these technologies. Additionally, human mobility is complex, and we constantly encounter new terrains, situations, and environments that require us to adapt our gait in novel ways. It is impossible to capture in the lab, or even predict, all the situations that individuals will encounter using an exoskeleton in their daily lives. Therefore, giving users direct control over some elements of their exoskeleton assistance allows the user to provide a rich source of situation-specific information that can help the machine decide how best to assist the user in a given moment.

What excites you most about this finding?

Our study showed that people have clear preferences in how they want a lower-limb exoskeleton to assist them, and that they find these preferences quickly and reliably based only on their perception of how the device was assisting them. This finding opens the doors to understanding the complex interactions between the human and the machine, and will directly inform how we design exoskeleton assistance in the future.

What are your next steps? What should other researchers do next?

We are excited about understanding why users preferred a particular assistance profile and how preferred assistance relates to biomechanical, behavioral, and energetic outcomes.
