UBR-1 on ROS2 Humble
It has been a while since I’ve posted to the blog, but lately I’ve actually been working on the UBR-1 again after a somewhat long hiatus. In case you missed the earlier posts in this series:
ROS2 Humble
The latest ROS2 release came out just a few weeks ago. ROS2 Humble targets Ubuntu 22.04 and is also a long-term support (LTS) release, meaning that both the underlying Ubuntu operating system and the ROS2 release get a full 5 years of support.
Since installing operating systems on robots is often a pain, I only use the LTS releases and so I had to migrate from the previous LTS, ROS2 Foxy (on Ubuntu 20.04). Overall, there aren’t many changes to the low-level ROS2 APIs as things are getting more stable and mature. For some higher level packages, such as MoveIt2 and Navigation2, the story is a bit different.
Visualization
One of the nice things about the ROS2 Foxy release was that it targeted the same operating system as the final ROS1 release, Noetic. This allowed users to have both ROS1 and ROS2 installed side-by-side. If you’re still developing in ROS1, that means you probably don’t want to upgrade all your computers quite yet. While my robot now runs Ubuntu 22.04, my desktop is still running Ubuntu 20.04.
Therefore, I had to find a way to visualize ROS2 data on a computer that did not have the latest ROS2 installed. Initially I tried Foxglove Studio, but didn’t have any luck getting it to connect using the native ROS2 interface (the rosbridge-based interface did work). Foxglove is certainly interesting, but so far it’s not really an RVIZ replacement; it appears more focused on offline data visualization.
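If you want to try the rosbridge route yourself, it is straightforward to set up; a minimal sketch, assuming the standard rosbridge_suite packages are installed on the robot (the default websocket port is 9090):
# on the robot (ROS2 Humble)
sudo apt-get install ros-humble-rosbridge-suite
ros2 launch rosbridge_server rosbridge_websocket_launch.xml
# then point Foxglove at ws://<robot-ip>:9090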
I then moved on to running rviz2 inside a docker environment, which works well when using the rocker tool:
sudo apt-get install python3-rocker
sudo rocker --net=host --x11 osrf/ros:humble-desktop rviz2
If you are using an NVIDIA card, you’ll need to add --nvidia along with --x11.
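For example, the NVIDIA variant looks like this (a sketch; the flag is documented by rocker):
sudo rocker --net=host --x11 --nvidia osrf/ros:humble-desktop rviz2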
In order to properly visualize and interact with my UBR-1 robot, I needed to add the ubr1_description package to my workspace to get the meshes and my rviz configurations. To accomplish this, I created my own docker image, largely based on the underlying ROS docker images:
ARG WORKSPACE=/opt/workspace
FROM osrf/ros:humble-desktop
# install build tools
RUN apt-get update && apt-get install -q -y --no-install-recommends \
python3-colcon-common-extensions \
git-core \
&& rm -rf /var/lib/apt/lists/*
# get ubr code
ARG WORKSPACE
WORKDIR $WORKSPACE/src
RUN git clone https://github.com/mikeferguson/ubr_reloaded.git \
&& touch ubr_reloaded/ubr1_bringup/COLCON_IGNORE \
&& touch ubr_reloaded/ubr1_calibration/COLCON_IGNORE \
&& touch ubr_reloaded/ubr1_gazebo/COLCON_IGNORE \
&& touch ubr_reloaded/ubr1_moveit/COLCON_IGNORE \
&& touch ubr_reloaded/ubr1_navigation/COLCON_IGNORE \
&& touch ubr_reloaded/ubr_msgs/COLCON_IGNORE \
&& touch ubr_reloaded/ubr_teleop/COLCON_IGNORE
# install dependencies
ARG WORKSPACE
WORKDIR $WORKSPACE
RUN . /opt/ros/$ROS_DISTRO/setup.sh \
&& apt-get update && rosdep install -q -y \
--from-paths src \
--ignore-src \
&& rm -rf /var/lib/apt/lists/*
# build ubr code
ARG WORKSPACE
WORKDIR $WORKSPACE
RUN . /opt/ros/$ROS_DISTRO/setup.sh \
&& colcon build
# setup entrypoint
COPY ./ros_entrypoint.sh /
ENTRYPOINT ["/ros_entrypoint.sh"]
CMD ["bash"]
The image derives from humble-desktop and then adds the build tools and clones my repository. I then ignore the majority of packages, install dependencies, and then build the workspace. The ros_entrypoint.sh script handles sourcing the workspace configuration.
#!/bin/bash
set -e
# setup ros2 environment
source "/opt/workspace/install/setup.bash"
exec "$@"
I could then create the docker image and run rviz inside it:
docker build -t ubr:main .
sudo rocker --net=host --x11 ubr:main rviz2
The full source of these docker configs is in the docker folder of my ubr_reloaded repository. NOTE: The updated code in the repository also adds a late-breaking change to use CycloneDDS, as I’ve had numerous connectivity issues with FastDDS that I have not been able to debug.
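For anyone wanting to make the same DDS switch, the usual approach on Humble is to install the CycloneDDS RMW layer and select it via an environment variable; a minimal sketch:
sudo apt-get install ros-humble-rmw-cyclonedds-cpp
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
The export needs to be in effect for every ROS2 process (robot and visualization machine alike), so it typically ends up in .bashrc or the docker entrypoint.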
Visualization on MacOSX
I also frequently want to be able to interact with my robot from my MacBook. While I previously installed ROS2 Foxy on my Intel-based MacBook, the situation has changed considerably: MacOSX has been downgraded to Tier 3 support, and the new Apple M1 silicon (along with Apple’s various other locking mechanisms) makes it harder and harder to set up ROS2 directly on the MacBook.
As with the Linux desktop, I tried out Foxglove – however, it is a bit limited on Mac. The MacOSX environment does not allow opening the required ports, so direct ROS2 topic streaming does not work and you have to use rosbridge. I found I was able to visualize certain topics, but switching between topics frequently broke.
At this point, I was about to give up, until I noticed that Ubuntu 22.04 arm64 is a Tier 1 platform for ROS2 Humble. I proceeded to install the arm64 version of Ubuntu inside Parallels (Note: I was cheap and initially tried to use the VMWare technology preview, but was unable to get the installer to even boot). There are a few tricks here as there is no arm64 desktop installer, so you have to install the server edition and then upgrade it to a desktop. There is a detailed description of this workflow on askubuntu.com. Installing ros-humble-desktop from arm64 Debians was perfectly easy.
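The install follows the standard apt-based flow from the ROS2 documentation; roughly (a sketch, with key and repository paths per the official instructions):
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
sudo apt-get update && sudo apt-get install ros-humble-desktop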
rviz2 runs relatively quickly inside the Parallels VM, but overall it was not quite as quick or stable as using rocker on Ubuntu. However, it is really nice to be able to do some ROS2 development when traveling with only my MacBook.
Migration Notes
Note: each of the links in this section is to a commit or PR that implements the discussed changes.
In the core ROS API, there are only a handful of changes – and most of them actually fix potential bugs. The logging macros have been updated for security purposes and require C-strings, as the old ROS1 macros did. Additionally, the macros are now better at detecting invalid substitution strings. Ament has also gotten better at detecting missing dependencies. The updates I made to robot_controllers show just how many bugs were caught by this stricter checking.
image_pipeline has had some minor updates since Foxy, mainly to improve consistency between plugins, and so I needed to update some topic remappings.
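For anyone making similar updates, remapping a topic in ROS2 is done through --ros-args; a hypothetical sketch, with invented node and topic names for illustration:
# remap the node's 'image' topic to the camera's actual topic (names are illustrative)
ros2 run my_camera_pkg my_rectify_node --ros-args -r image:=/head_camera/rgb/image_raw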
Navigation has the most updates. amcl model type names have been changed since the models are now plugins. The API of costmap layers has changed significantly, and so a number of updates were required just to get the system started. I then made a more detailed pass through the documentation and found a few more issues and improvements with my config, especially around the behavior tree configuration.
I also decided to do a proper port of graceful_controller to ROS2, starting from the latest ROS1 code, since a number of improvements have landed in the year since my original ROS2 port.
Next steps
There are still a number of new features to explore with Navigation2, but my immediate focus is going to shift towards getting MoveIt2 setup on the robot, since I can’t easily swap between ROS1 and ROS2 anymore after upgrading the operating system.
ep.358: Softbank: How Large Companies Approach Robotics, with Brady Watkins
A lot of times on our podcast we dive into startups and smaller companies in robotics. Today’s talk is unique in that Brady Watkins gives us insight into how a big company like Softbank Robotics looks at the robotics market.
we think scale first, (the) difference from a startup is our goal isn’t to think what’s the first 10 to 20, but we need to think what’s the first 20,000 look like. – Brady Watkins
Brady Watkins
Brady Watkins is the President and General Manager at Softbank Robotics America. During his career at Softbank, he helped to scale and commercialize Whiz, the collaborative robot vacuum designed to work alongside cleaning teams. Watkins played a key role in scaling the production to 20,000 units deployed globally.
Prior to his time at SBRA, Watkins was the Director of Sales, Planning, and Integration at Ubisoft, where he held several positions over the course of 10 years.
Links
- Download mp3
- Subscribe to Robohub using iTunes, RSS, or Spotify
- Support us on Patreon
Two hands are better than one
What are you doing right now other than scrolling through this article? Do you have a cup of coffee in one hand, your phone in the other? Maybe your right hand is using your laptop mouse and your left hand is holding a snack. Have you ever thought about how often we are using both of our hands? Having two healthy human hands allows us to carry too many grocery bags in one hand and unlock our apartment door with the other, and perform complex bimanual coordination like playing Moonlight Sonata by Beethoven on the piano (well, maybe not all of us can do that). Having two hands also allows us to do some of the most simple tasks in our daily lives, like holding a jar of peanut butter and unscrewing the lid, or putting our hair up in a ponytail.
If you take some time to think about how often both of your hands are occupied, you might start to realize that life could be challenging if you were missing the functionality of one or both of your most useful end effectors (as we call them in robotics). This thought experiment is the reality for someone like my friend Clare, who got into a car accident when she was 19. The impact of the crash resulted in a traumatic brain injury (TBI) that left the right side of her body partially paralyzed. In addition to re-learning how to walk, she also had to learn how to navigate her life with one functioning hand.
To get a better idea of what that would be like, Clare says, “Tie your dominant hand behind your back. Try it!”
There are other neurological conditions, in addition to TBI, that can result in paralysis: stroke, spinal cord injury, cerebral palsy. When we hear the words paralysis, partial paralysis, or hemiparesis (partial paralysis of one side of the body), we might envision someone’s limb being weak and hanging at their side. This manifestation of motor impairment is only the case for a fraction of the hemiparetic population. For others like Clare, their hand and elbow are reflexively kept in a flexed position, or flexor synergy pattern, meaning that their hand is tightly closed in a fist regardless of whether they try to open or close it. They have little to no ability to voluntarily extend their fingers, and the amount of muscle force keeping the hand closed changes from moment to moment. If we think back to the peanut butter jar example, imagine having to use your able hand to pry open the fingers of your impaired hand to get them around the jar of peanut butter.
Thankfully, there are occupational therapists who can train individuals to adapt their approaches to activities of daily living, and physical therapists who keep their hands and limbs stretched and mobile. The robotics community has also been working on its own technology-based contributions to the recovery and long-term independence of individuals with hand impairments due to neurological injury. There are decades of research in the field of wearable assistive and rehabilitation devices, creating new prosthetics and exoskeletons to help individuals overcome their physical impairments. However, we came across a gap in the research when we began working with Clare and other individuals with similar hand impairments.
Most of the assistive hand exoskeletons currently being developed or commercially sold focus on restoring the user’s ability to grasp with their impaired hand. However, Clare actually needed an exoskeletal device that extended her fingers against varying levels of resistance due to increased resting muscle tone or spasticity (click here to read about our finger extension force study). As a result, we developed the HOPE Hand, a hand orthosis with powered extension, to better serve individuals like Clare who need assistance opening their hand, to provide them with improved capabilities to perform activities of daily living, and to help them regain their independence.
The HOPE Hand is a cable-driven hand exoskeleton that uses pushing and pulling forces along the back of the hand to open and close the fingers individually. Each finger has two cables running parallel along the back of the finger which prevents medial/lateral movement and stabilizes the joint at the base of the finger. The cables are guided by rigid links that are attached to the finger using medical tape, and connect to a worm gear driven by a DC motor. The index finger and middle finger are actuated individually, and the pinky and ring finger are coupled, so they move together. The thumb has two degrees of freedom; flexion and extension are performed similarly to the other fingers, and abduction/adduction (to move the thumb into a position across from the fingers for a power grasp), is positioned manually by the user. This mechanical design allows the user to perform functional grasps by overcoming the increased muscle tone in their hand.
When we asked Clare to test out the HOPE Hand, she was happy to contribute to the research! She was able to pick up and move colored blocks from one designated area to another with the HOPE Hand, compared to not being able to pick up any blocks without the assistive device. We continued to work with Clare, her mom, and her physical therapist throughout our iterative device development process so she could give feedback and we could improve the device. We found that involving end users like Clare from the beginning was critical to designing a device which is safe, comfortable, and most importantly, usable. It was also the best way for us to understand her daily challenges and the nature of her hand impairment.
Clare’s physical therapist says, “It engages the patient’s brain in ways that she would not be exposed to without participating in research, which could forge new neural pathways. I can watch the way she reacts and responds to the new technology, and that enables me to create different rehab strategies to enhance recovery.”
While the HOPE Hand isn’t quite available to consumers yet, our collaborative team of patients, clinicians, caregivers, and academic researchers is making progress. One of the current challenges we are tackling, along with the rest of the wearable device industry, is how the device recognizes the user’s intention to move. The majority of electric prostheses and hand exoskeletons use residual muscle activity (myoelectric control) as an indicator of intention to move. However, the unreliable muscle activity that can be present due to neurological conditions like traumatic brain injury can make this form of control challenging. Because of this, researchers are diving into alternative control methods such as computer vision and brain activity. We have implemented a voice control alternative, which also gives users an opportunity to practice their speech, as it is common for conditions like stroke and TBI to result in speech impairments such as aphasia, in addition to motor impairments. It has been valuable for us to consider the complexities of our targeted users, to create a device that could potentially help in more ways than one.
They say many hands make light work, but let’s start by restoring the functionality of the original two, so Clare can open that jar of peanut butter as fast as you can click on the next article while sipping your morning coffee.
#ICRA2022 awards finalists and winners
In this post we bring you all the paper awards finalists and winners presented during the 2022 edition of the IEEE International Conference on Robotics and Automation (ICRA).
ICRA 2022 Outstanding Paper
- ‘Translating Images into Maps‘, by Saha, Avishkar; Mendez Maldonado, Oscar Alejandro; Russell, Chris; Bowden, Richard. (WINNER)
- ‘CT-ICP: Real-time Elastic LiDAR Odometry with Loop Closure‘, by Dellenbach, Pierre; Deschaud, Jean-Emmanuel; Jacquet, Bastien; Goulette, François.
- ‘D2A-BSP: Distilled Data Association Belief Space Planning with Performance Guarantees Under Budget Constraints‘, by Shienman, Moshe; Indelman, Vadim.
ICRA 2022 Outstanding Student Paper
- ‘Offline Learning of Counterfactual Perception as Prediction for Real-World Robotic Reinforcement Learning‘, by Jin, Jun; Graves, Daniel; Haigh, Cameron; Luo, Jun; Jagersand, Martin.
- ‘Dynamic Underactuated Manipulator Using a Flexible Body with a Structural Anisotropy‘, by Maruo, Akihiro; Shibata, Akihide; Higashimori, Mitsuru.
- ‘Interactive Robotic Grasping with Attribute-Guided Disambiguation‘, by Yang, Yang; Lou, Xibai; Choi, Changhyun. (WINNER)
ICRA 2022 Outstanding Automation Paper
- ‘SPIN Road Mapper: Extracting Roads from Aerial Images via Spatial and Interaction Space Graph Reasoning for Autonomous Driving‘, by Bandara, Wele Gedara Chaminda; Valanarasu, Jeya Maria Jose; Patel, Vishal.
- ‘ARChemist: Autonomous Robotic Chemistry System Architecture‘, by Fakhruldeen, Hatem; Pizzuto, Gabriella; Jakub, Glowacki; Cooper, Andrew Ian.
- ‘Precise 3D Reconstruction of Plants from UAV Imagery Combining Bundle Adjustment and Template Matching‘, by Marks, Elias Ariel; Magistri, Federico; Stachniss, Cyrill. (WINNER)
ICRA 2022 Outstanding Coordination Paper
- ‘Learning Scalable Policies over Graphs for Multi-Robot Task Allocation using Capsule Attention Networks‘, by Paul, Steve; Ghassemi, Payam; Chowdhury, Souma.
- ‘Decentralized Model Predictive Control for Equilibrium-based Collaborative UAV Bar Transportation‘, by Castro Sundin, Roberto; Roque, Pedro; Dimarogonas, Dimos V. (WINNER)
- ‘A Deep Reinforcement Learning Environment for Particle Robot Navigation and Object Manipulation‘, by Shen, Jeremy; Xiao, Erdong; Liu, Yuchen; Feng, Chen.
ICRA 2022 Outstanding Deployed Systems Paper
- ‘Autonomous Teamed Exploration of Subterranean Environments using Legged and Aerial Robots‘, by Kulkarni, Mihir; Dharmadhikari, Mihir Rahul; Tranzatto, Marco; Zimmermann, Samuel; Reijgwart, Victor; De Petris, Paolo; Nguyen, Huan; Khedekar, Nikhil Vijay; Papachristos, Christos; Ott, Lionel; Siegwart, Roland; Hutter, Marco; Alexis, Kostas. (WINNER)
- ‘Learning Model Predictive Control for Quadrotors‘, by Li, Guanrui; Tunchez, Alex; Loianno, Giuseppe.
- ‘Optimizing Terrain Mapping and Landing Site Detection for Autonomous UAVs‘, by Proença, Pedro F.; Delaune, Jeff; Brockers, Roland.
ICRA 2022 Outstanding Dynamics and Control Paper
- ‘Cooperative Modular Single Actuator Monocopters Capable of Controlled Passive Separation‘, by Cai, Xinyu; Win, Shane Kyi Hla; Win, Luke Soe Thura; Sufiyan, Danial; Foong, Shaohui. (WINNER)
- ‘Real-time Optimal Landing Control of the MIT Mini Cheetah‘, by Jeon, Se Hwan; Kim, Sangbae; Kim, Donghyun.
- ‘Real-Time Multi-Contact Model Predictive Control via ADMM‘, by Aydinoglu, Alp; Posa, Michael.
ICRA 2022 Outstanding Interaction Paper
- ‘Effects of Interfaces on Human-Robot Trust: Specifying and Visualizing Physical Zones‘, by Hudspeth, Marisa; Balali, Sogol; Grimm, Cindy; Sowell, Ross T.
- ‘Synergistic Scheduling of Learning and Allocation of Tasks in Human-Robot Team‘, by Vats, Shivam; Kroemer, Oliver; Likhachev, Maxim.
- ‘Human-Robot Shared Control for Surgical Robot Based on Context-Aware Sim-to-Real Adaptation‘, by Zhang, Dandan; Wu, Zicong; Chen, Junhong; Zhu, Ruiqi; Munawar, Adnan; Xiao, Bo; Guan, Yuan; Su, Hang; Guo, Yao; Fischer, Gregory Scott; Lo, Benny Ping Lai; Yang, Guang-Zhong. (WINNER)
ICRA 2022 Outstanding Learning Paper
- ‘Symphony: Learning Realistic and Diverse Agents for Autonomous Driving Simulation‘, by Igl, Maximilian; Kim, Daewoo; Kuefler, Alex; Mougin, Paul; Shah, Punit; Shiarlis, Kyriacos; Anguelov, Dragomir; Palatucci, Mark; White, Brandyn; Whiteson, Shimon.
- ‘TartanDrive: A Large-Scale Dataset for Learning Off-Road Dynamics Models‘, by Triest, Samuel; Sivaprakasam, Matthew; Wang, Sean J.; Wang, Wenshan; Johnson, Aaron; Scherer, Sebastian.
- ‘Augmenting Reinforcement Learning with Behavior Primitives for Diverse Manipulation Tasks‘, by Nasiriany, Soroush; Liu, Huihan; Zhu, Yuke. (WINNER)
ICRA 2022 Outstanding Locomotion Paper
- ‘Scalable Minimally Actuated Leg Extension Bipedal Walker Based on 3D Passive Dynamics‘, by Islam, Sharfin; Carter, Kamal; Yim, Justin K.; Kyle, James; Bergbreiter, Sarah; Johnson, Aaron.
- ‘Omni-Roach: A legged robot capable of traversing multiple types of large obstacles and self-righting‘, by Mi, Jonathan; Wang, Yaqing; Li, Chen.
- ‘Trajectory Optimization Formulation with Smooth Analytical Derivatives for Track-leg and Wheel-leg Ground Robots‘, by Mane, Adwait; Swart, Dylan; White, Jason; Hubicki, Christian. (WINNER)
ICRA 2022 Outstanding Manipulation Paper
- ‘Online Object Model Reconstruction and Reuse for Lifelong Improvement of Robot Manipulation‘, by Lu, Shiyang; Wang, Rui; Miao, Yinglong; Mitash, Chaitanya; Bekris, Kostas E.
- ‘Multi-view object pose distribution tracking for pre-grasp planning on mobile robots‘, by Naik, Lakshadeep; Iversen, Thorbjørn Mosekjær; Kramberger, Aljaz; Wilm, Jakob; Krüger, Norbert.
- ‘Manipulation of unknown objects via contact configuration regulation‘, by Doshi, Neel; Taylor, Orion; Rodriguez, Alberto. (WINNER)
ICRA 2022 Outstanding Mechanisms and Design Paper
- ‘TaTa: A Universal Jamming Gripper with High-Quality Tactile Perception and Its Application to Underwater Manipulation‘, by Li, Shoujie; Yin, Xianghui; Xia, Chongkun; Ye, Linqi; Wang, Xueqian; Liang, Bin.
- ‘Design of a Biomimetic Tactile Sensor for Material Classification‘, by Dai, Kevin; Wang, Xinyu; Rojas, Allison M.; Harber, Evan; Tian, Yu; Paiva, Nicholas; Gnehm, Joseph; Schindewolf, Evan; Choset, Howie; Webster-Wood, Victoria; Li, Lu. (WINNER)
- ‘A Wearable Fingertip Cutaneous Haptic Device with Continuous Omnidirectional Motion Feedback‘, by Zhang, Peizhi; Kamezaki, Mitsuhiro; Hattori, Yutaro; Sugano, Shigeki.
ICRA 2022 Outstanding Navigation Paper
- ‘Event-Triggered Tracking Control Scheme for Quadrotors with External Disturbances: Theory and Validations‘, by Gao, Pengcheng; Wang, Gang; Li, Qingdu; Zhang, Jianwei; Shen, Yantao.
- ‘EDPLVO: Efficient Direct Point-Line Visual Odometry‘, by Zhou, Lipu; Huang, Guoquan (Paul); Mao, Yinian; Wang, Shengze; Kaess, Michael. (WINNER)
- ‘Confidence-based Robot Navigation under Sensor Occlusion with Deep Reinforcement Learning‘, by Ryu, Hyeongyeol; Yoon, Minsung; Park, Daehyung; Yoon, Sung-eui.
ICRA 2022 Outstanding Planning Paper
- ‘Optimizing Trajectories with Closed-Loop Dynamic SQP‘, by Singh, Sumeet; Slotine, Jean-Jacques E.; Sindhwani, Vikas.
- ‘Trajectory Distribution Control for Model Predictive Path Integral Control using Covariance Steering‘, by Yin, Ji; Zhang, Zhiyuan; Theodorou, Evangelos; Tsiotras, Panagiotis.
- ‘Non-Gaussian Risk Bounded Trajectory Optimization for Stochastic Nonlinear Systems in Uncertain Environments‘, by Han, Weiqiao; M. Jasour, Ashkan; Williams, Brian. (WINNER)