Archive 04.12.2020


#CYBATHLON2020GlobalEdition winners of the powered exoskeleton race (with interview)

Team Angel Robotics 1
Winning team: Angel Robotics with pilot Byeong-Uk Kim

The latest edition of CYBATHLON took place on 13-14 November 2020. This competition, created by ETH Zurich and run as a non-profit project, aims to advance the research and development of assistive technology by involving developers, people with disabilities, and the general public. We had the chance to interview the winning team of the powered exoskeleton race, Angel Robotics from South Korea.

In this race, pilots from nine teams, each with a complete thoracic or lumbar spinal cord injury, competed using an exoskeleton. This wearable, powered support enables them to walk and master other everyday tasks. Indeed, the motivation behind this race is that “the use of exoskeletons is still rare; they are currently mainly used for physiotherapy in hospitals and rehabilitation centers. Exoskeletons dramatically increase the mobility of people with paraplegia, which consequently improves their overall physical and psychological health and therefore might represent a welcome addition to a wheelchair”, as the organizers of CYBATHLON state. This race involved:

(1) Sitting down & standing up from a sofa, and stacking cups while standing next to a table to test the range of motion and strength in the knee and hip joints, and stability.

(2) Slaloming around furniture without displacing it to test precision of steps and agility.

(3) Crossing uneven terrain to test precision of steps and adaptation of step lengths and widths.

(4) Climbing and descending stairs to test range of motion and strength in the knee and hip joints, and step precision.

(5) Walking across a tilted path to test the lateral range of motion in hip and foot joints, and stability.

(6) Climbing a ramp, opening and closing the door in the middle of the ramp, and descending the ramp to test the range of motion in foot, knee and hip joints, stability and maneuvering in confined spaces.

Race tasks
Powered exoskeleton race tasks. Credit: CYBATHLON

The top three teams were the company Angel Robotics (1) from South Korea with pilot Byeong-Uk Kim, TWIICE from the EPFL research group REHAssist with pilot Silke Pan, and Angel Robotics (2) with pilot Lee Joo-Hyun. Remarkably, all three achieved the maximum score of 100 points, so the podium was decided by finishing time. If you can’t wait to see how tight the races were, you can watch them in the recorded livestream below (from time 3:10:30).

You can see the results from the rest of the teams in this discipline here, or watch the recorded livestreams of both days on their website.

Interview with Kyoungchul Kong – Team leader of the Angel Robotics team

We had the pleasure of interviewing Kyoungchul Kong, team leader and CEO of Angel Robotics (1&2). He is also an Associate Professor at KAIST (Korea Advanced Institute of Science and Technology).

Kyoungchul Kong
Kyoungchul Kong – Team leader of Angel Robotics team


D. C. Z.: What does it mean for your team to have won in your CYBATHLON category?

K.K.: WalkON Suit, the powered exoskeleton of Angel Robotics, has seen various dramatic technical advances. Since the first Cybathlon in 2016, its walking speed has become as fast as that of people without disabilities. The most important feature of WalkON Suit is its balance; because the center of mass is placed over the feet while standing straight, the wearer can stand without much effort for a long time. These superior functionalities of WalkON Suit were proved by winning the Gold and Bronze medals at Cybathlon 2020.

D. C. Z.: And what does it mean for people with disabilities?

K.K.: While winning the Gold medal is glorious, winning two medals is especially meaningful. The physical conditions of the two pilots (i.e., the Gold medalist and the Bronze medalist) of Team Angel Robotics were extremely different. One was a male with a very strong upper body, while the other was a female with much less muscle. Such different people could be successfully assisted by WalkON Suit, which means that the powered exoskeleton is not a technology optimized for a single user, but one that can be used by many people with different body conditions.

D. C. Z.: What challenges still remain?

K.K.: In order to bring WalkON Suit into the real life of the people who need this technology, it has to be much improved in terms of wearability, price, and weight. The user should be able to put on the robot without anyone else’s help. It should be light enough to handle while sitting in a wheelchair. The price is another critical issue considering practical conditions. Under these constraints, the functionalities and performance of the robot must not deteriorate. These are the challenges we are working hard to overcome.

Copying beetle wings to design MAVs that can recover from midair collisions

A pair of researchers at Konkuk University has designed a miniaturized micro air vehicle (MAV) capable of recovering from midair collisions. In their paper published in the journal Science, Hoang Vu Phan and Hoon Cheol Park describe their study of collision recovery in rhinoceros beetles and how they applied their findings to the design of a new kind of MAV.

RealAnt: A low-cost quadruped robot that can learn via reinforcement learning

Over the past decade or so, roboticists and computer scientists have tried to use reinforcement learning (RL) approaches to train robots to efficiently navigate their environment and complete a variety of basic tasks. Building affordable robots that can support and manage the exploratory controls associated with RL algorithms, however, has so far proved to be fairly challenging.

Robot hands one step closer to human thanks to AI algorithms

The Shadow Robot Dexterous Hand is a robot hand with size, shape and movement capabilities similar to those of a human hand. To give the robotic hand the ability to learn how to manipulate objects, researchers from WMG, University of Warwick, have developed new AI algorithms.

Improved remote control of robots

Sometimes you need to get human knowledge and skills to places that are hazardous or difficult for people to access. The project entitled Predictive Avatar Control and Feedback (PACOF) is creating a robotic system that allows the robot operator to experience the location just as the robot does. Three researchers representing three different disciplines of the University of Twente's EEMCS faculty are working together on this project.

Robots Partnering With Humans: at FPT Industrial Factory 4.0 is Already a Reality Thanks to Collaboration With Comau

What will the factory of the future be like? How will it be organized? To answer these and many other questions about the Industry 4.0 production model, all it takes is a trip to the FPT Industrial Driveline plant in Turin.

Ultra-sensitive and resilient sensor for soft robotic systems

Sensor sleeve
Graduate student Moritz Graule demonstrates a fabric arm sleeve with embedded sensors. The sensors detect the small changes in Graule’s forearm muscle through the fabric. Such a sleeve could be used in everything from virtual reality simulations and sportswear to clinical diagnostics for neurodegenerative diseases like Parkinson’s Disease. Credit: Oluwaseun Araromi/Harvard SEAS

By Leah Burrows / SEAS communications

Newly engineered slinky-like strain sensors for textiles and soft robotic systems survive the washing machine, cars and hammers.

Think about your favorite t-shirt, the one you’ve worn a hundred times, and all the abuse you’ve put it through. You’ve washed it more times than you can remember, spilled on it, stretched it, crumpled it up, maybe even singed it leaning over the stove once. We put our clothes through a lot, and if the smart textiles of the future are going to survive all that we throw at them, their components are going to need to be resilient.

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering have developed an ultra-sensitive, seriously resilient strain sensor that can be embedded in textiles and soft robotic systems. The research is published in Nature.

“Current soft strain gauges are really sensitive but also really fragile,” said Oluwaseun Araromi, Ph.D., a Research Associate in Materials Science and Mechanical Engineering at SEAS and the Wyss Institute and first author of the paper. “The problem is that we’re working in an oxymoronic paradigm — highly sensitive sensors are usually very fragile, and very strong sensors aren’t usually very sensitive. So, we needed to find mechanisms that could give us enough of each property.”

In the end, the researchers created a design that looks and behaves very much like a Slinky.

“A Slinky is a solid cylinder of rigid metal but if you pattern it into this spiral shape, it becomes stretchable,” said Araromi. “That is essentially what we did here. We started with a rigid bulk material, in this case carbon fiber, and patterned it in such a way that the material becomes stretchable.”

The pattern is known as a serpentine meander, because its sharp ups and downs resemble the slithering of a snake. The patterned conductive carbon fibers are then sandwiched between two pre-strained elastic substrates. The overall electrical conductivity of the sensor changes as the edges of the patterned carbon fiber come out of contact with each other, similar to the way the individual spirals of a slinky come out of contact with each other when you pull both ends. This process happens even with small amounts of strain, which is the key to the sensor’s high sensitivity.
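As an intuition aid (not taken from the paper), the toy model below treats the patterned carbon fiber as a bank of parallel electrical contacts that open one by one as strain increases; because each lost contact removes a parallel conduction path, the resistance changes noticeably even at small strains. The parameter values and the linear contact-loss assumption are purely illustrative.

```python
# Toy illustration of the contact-opening mechanism: treat the patterned
# carbon fiber as N parallel electrical contacts that progressively open
# as strain grows, so resistance rises sharply even at small strains.
# All numbers and the linear contact-loss assumption are hypothetical.

def sensor_resistance(strain, n_contacts=100, r_contact=1.0, strain_full_open=0.05):
    """Approximate sensor resistance (ohms) at a given dimensionless strain."""
    # Fraction of contacts still closed falls linearly to zero at strain_full_open.
    closed_fraction = max(0.0, 1.0 - strain / strain_full_open)
    closed_contacts = max(1e-9, n_contacts * closed_fraction)  # avoid divide-by-zero
    # Closed contacts act as resistors in parallel.
    return r_contact / closed_contacts

if __name__ == "__main__":
    for strain in (0.0, 0.01, 0.02, 0.04):
        print(f"strain {strain:.2%} -> resistance {sensor_resistance(strain):.4f} ohm")
```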

Close-up of the sensor material
A close-up view of the sensor’s patterned conductive carbon fibers. The fibers are sandwiched between two prestrained elastic substrates. The overall electrical conductivity of the sensor changes as the edges of the patterned carbon fiber come out of contact with each other. Credit: James Weaver/Harvard SEAS

Unlike current highly sensitive stretchable sensors, which rely on exotic materials such as silicon or gold nanowires, this sensor doesn’t require special manufacturing techniques or even a clean room. It could be made using any conductive material.

The researchers tested the resiliency of the sensor by stabbing it with a scalpel, hitting it with a hammer, running it over with a car, and throwing it in a washing machine ten times. The sensor emerged from each test unscathed. To demonstrate its sensitivity, the researchers embedded the sensor in a fabric arm sleeve and asked a participant to make different gestures with their hand, including a fist, open palm, and pinching motion. The sensors detected the small changes in the subject’s forearm muscle through the fabric and a machine learning algorithm was able to successfully classify these gestures.
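The article does not say which machine learning method the researchers used to classify the gestures, so the sketch below is only one plausible setup: window-level summary features computed from a strain-sensor channel, fed into a standard scikit-learn classifier and trained on synthetic stand-in data. The feature set, classifier choice, and all numbers are assumptions for illustration.

```python
# Minimal sketch of gesture classification from wearable strain-sensor windows.
# Synthetic data stands in for real recordings; labels: 0 = fist, 1 = open palm, 2 = pinch.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# 300 windows of 50 strain samples each, with a class-dependent offset.
X_raw = rng.normal(size=(300, 50)) + np.repeat(np.arange(3), 100)[:, None] * 0.5
y = np.repeat(np.arange(3), 100)

# Simple per-window features: mean, standard deviation, peak-to-peak range.
X = np.column_stack([X_raw.mean(axis=1), X_raw.std(axis=1), np.ptp(X_raw, axis=1)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```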

“These features of resilience and the mechanical robustness put this sensor in a whole new camp,” said Araromi.

Such a sleeve could be used in everything from virtual reality simulations and sportswear to clinical diagnostics for neurodegenerative diseases like Parkinson’s Disease. Harvard’s Office of Technology Development has filed to protect the intellectual property associated with this project.

“The combination of high sensitivity and resilience are clear benefits of this type of sensor,” said senior author Robert Wood, Ph.D., Associate Faculty member at the Wyss Institute, and the Charles River Professor of Engineering and Applied Sciences at SEAS. “But another aspect that differentiates this technology is the low cost of the constituent materials and assembly methods. This will hopefully reduce the barriers to get this technology widespread in smart textiles and beyond.”

Sensor twist
This ultra-sensitive resilient strain sensor can be embedded in textiles and soft robotic systems. Credit: Oluwaseun Araromi/Harvard SEAS

“We are currently exploring how this sensor can be integrated into apparel due to the intimate interface to the human body it provides,” says co-author and Wyss Associate Faculty member Conor Walsh, Ph.D., who also is the Paul A. Maeder Professor of Engineering and Applied Sciences at SEAS. “This will enable exciting new applications by being able to make biomechanical and physiological measurements throughout a person’s day, not possible with current approaches.”


The research was co-authored by Moritz A. Graule, Kristen L. Dorsey, Sam Castellanos, Jonathan R. Foster, Wen-Hao Hsu, Arthur E. Passy, James C. Weaver, Senior Staff Scientist at SEAS, and Joost J. Vlassak, the Abbott and James Lawrence Professor of Materials Engineering at SEAS. It was funded through the university’s strategic research alliance with Tata. The 6-year, $8.4M alliance was established in 2016 to advance Harvard innovation in fields including robotics, wearable technologies, and the internet of things (IoT).

#324: Embodied Interactions: from Robotics to Dance, with Kim Baraka

In this episode, our interviewer Lauren Klein speaks with Kim Baraka about his PhD research to enable robots to engage in social interactions, including interactions with children with Autism Spectrum Disorder. Baraka discusses how robots can plan their actions across multiple modalities when interacting with humans, and how models from psychology can inform this process. He also tells us about his passion for dance, and how dance may serve as a testbed for embodied intelligence within Human-Robot Interaction.

Kim Baraka

Kim Baraka is a postdoctoral researcher in the Socially Intelligent Machines Lab at the University of Texas at Austin, and an incoming Assistant Professor in the Department of Computer Science at Vrije Universiteit Amsterdam, where he will be part of the Social Artificial Intelligence Group. Baraka recently graduated with a dual PhD in Robotics from Carnegie Mellon University (CMU) in Pittsburgh, USA, and the Instituto Superior Técnico (IST) in Lisbon, Portugal. At CMU, Baraka was part of the Robotics Institute and was advised by Prof. Manuela Veloso. At IST, he was part of the Group on AI for People and Society (GAIPS), and was advised by Prof. Francisco Melo.

Dr. Baraka’s research focuses on computational methods that inform artificial intelligence within Human-Robot Interaction. He develops approaches for knowledge transfer between humans and robots in order to support mutually beneficial relationships between them. Specifically, he has conducted research in assistive interactions, where the robot or human helps their partner achieve a goal, and in teaching interactions. Baraka is also a contemporary dancer, with an interest in leveraging lessons from dance to inform advances in robotics, or vice versa.

P.S. If you enjoy listening to experts in robotics and asking them questions, we recommend that you check out Talking Robotics. They have a virtual seminar on Dec 11 where they will be discussing how to conduct remote research for Human-Robot Interaction, something that is very relevant to researchers working from home due to COVID-19.
