Archive 05.08.2022


UBR-1 on ROS2 Humble

It has been a while since I’ve posted to the blog, but lately I’ve actually been working on the UBR-1 again after a somewhat long hiatus. In case you missed the earlier posts in this series:

ROS2 Humble

The latest ROS2 release came out just a few weeks ago. ROS2 Humble targets Ubuntu 22.04 and is also a long term support (LTS) release, meaning that both the underlying Ubuntu operating system and the ROS2 release get a full 5 years of support.

Since installing operating systems on robots is often a pain, I only use the LTS releases and so I had to migrate from the previous LTS, ROS2 Foxy (on Ubuntu 20.04). Overall, there aren’t many changes to the low-level ROS2 APIs as things are getting more stable and mature. For some higher level packages, such as MoveIt2 and Navigation2, the story is a bit different.

Visualization

One of the nice things about the ROS2 Foxy release was that it targeted the same operating system as the final ROS1 release, Noetic. This allowed users to have both ROS1 and ROS2 installed side-by-side. If you’re still developing in ROS1, that means you probably don’t want to upgrade all your computers quite yet. While my robot now runs Ubuntu 22.04, my desktop is still running 18.04.

Therefore, I had to find a way to visualize ROS2 data on a computer that did not have the latest ROS2 installed. Initially I tried Foxglove Studio, but didn’t have any luck getting it to actually connect using the native ROS2 interface (the rosbridge-based interface did work). Foxglove is certainly interesting, but so far it’s not really an RVIZ replacement – they appear to be more focused on offline data visualization.
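
For reference, the rosbridge connection that did work is just the standard rosbridge websocket server running on the robot. A minimal sketch of starting it (the ros-humble-rosbridge-suite package name and launch file name are my assumptions here – check the rosbridge_suite documentation for your setup):

sudo apt-get install ros-humble-rosbridge-suite
# serves a websocket on port 9090 by default, which Foxglove can connect to
ros2 launch rosbridge_server rosbridge_websocket_launch.xml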

I then moved onto running rviz2 inside a docker environment – which works well when using the rocker tool:

sudo apt-get install python3-rocker
sudo rocker --net=host --x11 osrf/ros:humble-desktop rviz2

If you are using an NVIDIA card, you’ll need to add --nvidia along with --x11.
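
For example, the equivalent of the command above with the NVIDIA runtime enabled:

sudo rocker --net=host --x11 --nvidia osrf/ros:humble-desktop rviz2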

In order to properly visualize and interact with my UBR-1 robot, I needed the ubr1_description package in my workspace for the meshes and my rviz configurations. To accomplish this, I created my own docker image, largely based on the underlying ROS docker images:

ARG WORKSPACE=/opt/workspace

FROM osrf/ros:humble-desktop

# install build tools
RUN apt-get update && apt-get install -q -y --no-install-recommends \
    python3-colcon-common-extensions \
    git-core \
    && rm -rf /var/lib/apt/lists/*

# get ubr code
ARG WORKSPACE
WORKDIR $WORKSPACE/src
RUN git clone https://github.com/mikeferguson/ubr_reloaded.git \
    && touch ubr_reloaded/ubr1_bringup/COLCON_IGNORE \
    && touch ubr_reloaded/ubr1_calibration/COLCON_IGNORE \
    && touch ubr_reloaded/ubr1_gazebo/COLCON_IGNORE \
    && touch ubr_reloaded/ubr1_moveit/COLCON_IGNORE \
    && touch ubr_reloaded/ubr1_navigation/COLCON_IGNORE \
    && touch ubr_reloaded/ubr_msgs/COLCON_IGNORE \
    && touch ubr_reloaded/ubr_teleop/COLCON_IGNORE

# install dependencies
ARG WORKSPACE
WORKDIR $WORKSPACE
RUN . /opt/ros/$ROS_DISTRO/setup.sh \
    && apt-get update && rosdep install -q -y \
    --from-paths src \
    --ignore-src \
    && rm -rf /var/lib/apt/lists/*

# build ubr code
ARG WORKSPACE
WORKDIR $WORKSPACE
RUN . /opt/ros/$ROS_DISTRO/setup.sh \
    && colcon build

# setup entrypoint
COPY ./ros_entrypoint.sh /

ENTRYPOINT ["/ros_entrypoint.sh"]
CMD ["bash"]

The image derives from humble-desktop and then adds the build tools and clones my repository. I then ignore the majority of packages, install dependencies and then build the workspace. The ros_entrypoint.sh script handles sourcing the workspace configuration.

#!/bin/bash
set -e

# setup ros2 environment
source "/opt/workspace/install/setup.bash"
exec "$@"

I could then create the docker image and run rviz inside it:

docker build -t ubr:main .
sudo rocker --net=host --x11 ubr:main rviz2

The full source of these docker configs is in the docker folder of my ubr_reloaded repository. NOTE: The updated code in the repository also adds a late-breaking change to use CycloneDDS as I’ve had numerous connectivity issues with FastDDS that I have not been able to debug.
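
If you want to make the same switch in your own environment, the usual mechanism on Humble is to install the Cyclone DDS RMW package and select it via an environment variable (this is a sketch of the standard approach, not the exact change in my repository):

sudo apt-get install ros-humble-rmw-cyclonedds-cpp
# set this consistently on every machine (and in any containers) that should talk to each other
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp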

Visualization on MacOSX

I also frequently want to be able to interact with my robot from my Macbook. While I previously installed ROS2 Foxy on my Intel-based Macbook, the situation has changed considerably: MacOSX has been downgraded to Tier 3 support, and the new Apple M1 silicon (and Apple’s various other locking mechanisms) makes it harder and harder to set up ROS2 directly on the Macbook.

As with the Linux desktop, I tried out Foxglove – however it is a bit limited on Mac. The MacOSX environment does not allow opening the required ports, so direct ROS2 topic streaming does not work and you have to use rosbridge. I found I was able to visualize certain topics, but switching between topics frequently broke.

At this point, I was about to give up, until I noticed that Ubuntu 22.04 arm64 is a Tier 1 platform for ROS2 Humble. I proceeded to install the arm64 version of Ubuntu inside Parallels (Note: I was cheap and initially tried to use the VMWare technology preview, but was unable to get the installer to even boot). There are a few tricks here as there is no arm64 desktop installer, so you have to install the server edition and then upgrade it to a desktop. There is a detailed description of this workflow on askubuntu.com. Installing ros-humble-desktop from arm64 Debians was perfectly easy.
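
The rough shape of that workflow is sketched below; the askubuntu.com post has the authoritative server-to-desktop steps, and the ROS2 portion follows the standard Humble apt installation instructions:

# after installing Ubuntu 22.04 server (arm64) in the VM, convert it to a desktop
sudo apt update && sudo apt install ubuntu-desktop

# add the ROS2 apt repository (standard Humble install steps)
sudo apt install curl gnupg lsb-release
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key \
  -o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(lsb_release -cs) main" \
  | sudo tee /etc/apt/sources.list.d/ros2.list

# install ROS2 Humble desktop
sudo apt update
sudo apt install ros-humble-desktop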

rviz2 runs relatively quickly inside the Parallels VM, but overall it was not quite as quick or stable as using rocker on Ubuntu. However, it is really nice to be able to do some ROS2 development when traveling with only my Macbook.

Migration Notes

Note: each of the links in this section is to a commit or PR that implements the discussed changes.

In the core ROS API, there are only a handful of changes – and most of them are actually simply fixing potential bugs. The logging macros have been updated for security purposes and require c-strings like the old ROS1 macros did. Additionally the macros are now better at detecting invalid substitution strings. Ament has also gotten better at detecting missing dependencies. The updates I made to robot_controllers show just how many bugs were caught by this more strict checking.

image_pipeline has had some minor updates since Foxy, mainly to improve consistency between plugins and so I needed to update some topic remappings.

Navigation has the most updates. amcl model type names have been changed since the models are now plugins. The API of costmap layers has changed significantly, and so a number of updates were required just to get the system started. I then made a more detailed pass through the documentation and found a few more issues and improvements with my config, especially around the behavior tree configuration.

I also decided to do a proper port of graceful_controller to ROS2, starting from the latest ROS1 code since a number of improvements have happened in the past year since I had originally ported to ROS2.

Next steps

There are still a number of new features to explore with Navigation2, but my immediate focus is going to shift towards getting MoveIt2 setup on the robot, since I can’t easily swap between ROS1 and ROS2 anymore after upgrading the operating system.

Allowing social robots to learn relations between users’ routines and their mood

Social robots, robots that can interact with humans and assist them in their daily lives, are gradually being introduced in numerous real-world settings. These robots could be particularly valuable for helping older adults to complete everyday tasks more autonomously, thus potentially enhancing their independence and well-being.

ep.358: Softbank: How Large Companies Approach Robotics, with Brady Watkins

A lot of times on our podcast we dive into startups and smaller companies in robotics. Today’s talk is unique in that Brady Watkins gives us insight into how a big company like Softbank Robotics looks at the robotics market.

we think scale first, (the) difference from a startup is our goal isn’t to think what’s the first 10 to 20, but we need to think what’s the first 20,000 look like. – Brady Watkins

Brady Watkins

Brady Watkins is the President and General Manager at Softbank Robotics America. During his career at Softbank, he helped to scale and commercialize Whiz, the collaborative robot vacuum designed to work alongside cleaning teams. Watkins played a key role in scaling production to 20,000 units deployed globally.

Prior to his time at SBRA, Watkins was the Director of Sales, Planning, and Integration at Ubisoft, where he held several positions over the course of 10 years.

Links

Using Remote Operation to Help Solve Labor Shortage Issues in the Supply Chain

Our remotely operated forklifts allow operators to do their jobs from up to thousands of miles away. This is critically important to our customers, who have been dealing for decades with a labor shortage that recently has become even more acute than ever before.

Two hands are better than one

The HOPE Hand.

What are you doing right now other than scrolling through this article? Do you have a cup of coffee in one hand, your phone in the other? Maybe your right hand is using your laptop mouse and your left hand is holding a snack. Have you ever thought about how often we are using both of our hands? Having two healthy human hands allows us to carry too many grocery bags in one hand and unlock our apartment door with the other, and perform complex bimanual coordination like playing Moonlight Sonata by Beethoven on the piano (well, maybe not all of us can do that). Having two hands also allows us to do some of the most simple tasks in our daily lives, like holding a jar of peanut butter and unscrewing the lid, or putting our hair up in a ponytail.

If you take some time to think about how often both of your hands are occupied, you might start to realize that life could be challenging if you were missing the functionality of one or both of your most useful end effectors (as we call them in robotics). This thought experiment is the reality for someone like my friend Clare, who got into a car accident when she was 19. The impact of the crash resulted in a traumatic brain injury (TBI) that left the right side of her body partially paralyzed. In addition to re-learning how to walk, she also had to learn how to navigate her life with one functioning hand.

To get a better idea of what that would be like, Clare says, “Tie your dominant hand behind your back. Try it!”

There are other neurological conditions, in addition to TBI, that can result in paralysis: stroke, spinal cord injury, cerebral palsy. When we hear the words paralysis, partial paralysis, or hemiparesis (partial paralysis of one side of the body), we might envision someone’s limb being weak and hanging at their side. This manifestation of motor impairment is only the case for a fraction of the hemiparetic population. For others like Clare, their hand and elbow are reflexively kept in a flexed position, or flexor synergy pattern, meaning that their hand is tightly closed in a fist regardless of whether they try to open it or close it. They have little to no ability to voluntarily extend their fingers, and the amount of muscle force keeping the hand closed changes from moment to moment. If we think back to the peanut butter jar example, imagine having to use your able hand to pry open the fingers of your impaired hand to get them around the jar of peanut butter.

Thankfully, there are occupational therapists that can train individuals to adapt their approaches to activities of daily living, and physical therapists that keep their hands and limbs stretched and mobile. But also, the robotics community has been working on their own technology-based contributions to the recovery and long-term independence of individuals with hand impairments due to neurological injury. There are decades of research in the field of wearable assistive and rehabilitation devices, creating new prosthetics and exoskeletons to help individuals overcome their physical impairments. However, we came across a gap in the research when we began working with Clare and other individuals with similar hand impairments.

Most of the assistive hand exoskeletons currently being developed or commercially sold focus on restoring the user’s ability to grasp with their impaired hand. However, Clare actually needed an exoskeletal device that extended her fingers, against varying levels of resistance due to increased resting muscle tone or spasticity (click here to read about our finger extension force study). As a result, we developed the HOPE hand, a hand orthosis with powered extension to better serve individuals like Clare who need assistance opening their hand, to provide them with improved capabilities to perform activities of daily living, and to help them re-gain their independence.

The HOPE Hand is a cable-driven hand exoskeleton that uses pushing and pulling forces along the back of the hand to open and close the fingers individually. Each finger has two cables running in parallel along the back of the finger, which prevent medial/lateral movement and stabilize the joint at the base of the finger. The cables are guided by rigid links that are attached to the finger using medical tape and connect to a worm gear driven by a DC motor. The index finger and middle finger are actuated individually, while the pinky and ring finger are coupled, so they move together. The thumb has two degrees of freedom: flexion and extension are performed similarly to the other fingers, while abduction/adduction (to move the thumb into a position across from the fingers for a power grasp) is positioned manually by the user. This mechanical design allows the user to perform functional grasps by overcoming the increased muscle tone in their hand.

When we asked Clare to test out the HOPE Hand, she was happy to contribute to the research! She was able to pick up and move colored blocks from one designated area to another with the HOPE Hand, compared to not being able to pick up any blocks without the assistive device. We continued to work with Clare, her mom, and her physical therapist throughout our iterative device development process so she could give feedback and we could improve the device. We found that involving end users like Clare from the beginning was critical to designing a device which is safe, comfortable, and most importantly, usable. It was also the best way for us to understand her daily challenges and the nature of her hand impairment.

Clare’s physical therapist says “It engages the patient’s brain in ways that she would not be exposed to without participating in research, which could forge new neural pathways. I can watch the way she reacts and responds to the new technology and that enables me to create different rehab strategies to enhance recovery.”

While the HOPE Hand isn’t quite available to consumers yet, our collaborative team of patients, clinicians, caregivers, and academic researchers is making progress. One of the current challenges we are tackling, along with the rest of the wearable device industry, is how the device recognizes the user’s intention to move. The majority of electric prostheses and hand exoskeletons use residual muscle activity (myoelectric control) as an indicator of intention to move. However, the unreliable muscle activity that can be present due to neurological conditions like traumatic brain injury can make this form of control challenging. Because of this, researchers are diving into alternative control methods such as computer vision and brain activity. We have implemented a voice control alternative, which also gives users an opportunity to practice their speech, as it is common for conditions like stroke and TBI to result in speech impairments such as aphasia, in addition to motor impairments. It has been valuable for us to consider the complexities of our targeted users, to create a device that could potentially help in more ways than one.

They say many hands make light work, but let’s start by restoring the functionality of the original two, so Clare can open that jar of peanut butter as fast as you can click on the next article while sipping your morning coffee.

#ICRA2022 awards finalists and winners

Credits: Wise Owl Multimedia

In this post we bring you all the paper awards finalists and winners presented during the 2022 edition of the IEEE International Conference on Robotics and Automation (ICRA).

ICRA 2022 Outstanding Paper

ICRA 2022 Outstanding Student Paper

ICRA 2022 Outstanding Automation Paper

ICRA 2022 Outstanding Coordination Paper

ICRA 2022 Outstanding Deployed Systems Paper

ICRA 2022 Outstanding Dynamics and Control Paper

ICRA 2022 Outstanding Interaction Paper

ICRA 2022 Outstanding Learning Paper

ICRA 2022 Outstanding Locomotion Paper

ICRA 2022 Outstanding Manipulation Paper

ICRA 2022 Outstanding Mechanisms and Design Paper

ICRA 2022 Outstanding Navigation Paper

ICRA 2022 Outstanding Planning Paper
