Page 2 of 3
1 2 3

ep.351: Early Days of ICRA Competitions, with Bill Smart

Bill Smart, Professor of Mechanical Engineering and Robotics at Oregon State University, helped start competitions as part of ICRA. In this episode, Bill dives into the high-level decisions involved in creating a meaningful competition. The conversation explores how competitions showcase research, potential ideas for future competitions, the exciting phase robotics is currently in, and the intersection of robotics, ethics, and law.

Bill Smart

Dr. Smart does research in the areas of robotics and machine learning. In robotics, Smart is particularly interested in improving the interactions between people and robots; enabling robots to be self-sufficient for weeks and months at a time; and determining how they can be used as personal assistants for people with severe motor disabilities. In machine learning, Smart is interested in developing strategies for teaching robots to act effectively (or even optimally), based on long-term interactions with the world and given intermittent and at times incorrect feedback on their performance.

Links

Duckietown Competition Spotlight

At ICRA 2022, competitions are a core part of the conference. We shine a spotlight on influential competitions in robotics. In this episode, Dr. Liam Paull talks about the Duckietown Competition, where robots drive rubber ducky passengers around an autonomous driving track.

Dr. Liam Paull

Liam Paull is an assistant professor at l’Université de Montréal and the head of the Montreal Robotics and Embodied AI Lab (REAL). His lab focuses on robotics problems including building representations of the world (such as for simultaneous localization and mapping), modeling of uncertainty, and building better workflows to teach robotic agents new tasks (such as through simulation or demonstration). Prior to this, Liam was a research scientist at MIT CSAIL, where he led the TRI-funded autonomous car project. He was also a postdoc in the marine robotics lab at MIT, where he worked on SLAM for underwater robots. He obtained his PhD from the University of New Brunswick in 2013, where he worked on robust and adaptive planning for underwater vehicles. He is a co-founder and director of the Duckietown Foundation, which is dedicated to making engaging robotics learning experiences accessible to everyone. The Duckietown class was originally taught at MIT, but the platform is now used at numerous institutions worldwide.

Links

Autonomous Need for Speed

Apex.AI is driving advances in ROS 2 to make it viable for use in autonomous vehicles. The changes they are implementing bridge the gap between the automotive world and robotics.

Joe Speed, VP of Product at Apex.AI, dives into the current multi-year process of bringing a car to market, and how Apex.AI will transform this into the shorter development cycles we see with modern technology. This technology was showcased at the Indy Autonomous Challenge, where million-dollar autonomous cars raced each other on a track.

Apex.OS is a certified software framework and SDK for autonomous systems that enables software developers to write safe, certified autonomous driving applications compatible with ROS 2.

Joe Speed

Joe Speed is VP of Product & Chief Evangelist at Apex.AI. Prior to joining Apex.AI, Joe was a member of the Open Robotics ROS 2 TSC, the Autoware Foundation TSC, and the Eclipse OpenADx SC, and was Field CTO at ADLINK Technology, driving robotics and autonomy.

Joe has spent his career developing and advocating for open source at organizations including the Linux Foundation and IBM, where he launched IBM IoT and co-founded the IBM AutoLAB automotive incubator. Joe helped make MQTT, the IoT protocol, open source and convinced automakers to adopt it.

Joe is working to do the same for Apex.AI’s safe ROS 2 distribution and ROS middleware. Joe has developed a dozen advanced technology vehicles but is most proud of helping develop an accessible autonomous bus for older adults and people with disabilities.

Links

Mimicking the Five Senses, On Chip

Machine learning at the edge is gaining steam. BrainChip is accelerating this with their Akida architecture, which mimics the human brain by incorporating the five human senses on a machine-learning-enabled chip.

Their chips let roboticists and IoT developers run ML on device for low-latency, low-power, and low-cost machine-learning-enabled products. This opens up a new product category where everyday devices can affordably become smart devices.
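Akida is a neuromorphic processor, meaning it computes with spikes rather than dense matrix multiplies. As a rough intuition for that style of event-driven computation (this is a generic textbook model, not BrainChip's API or architecture), here is a minimal leaky integrate-and-fire neuron in Python:

```python
# Illustrative leaky integrate-and-fire (LIF) neuron: the basic unit of the
# spiking, event-driven computation that neuromorphic chips implement in silicon.
def lif_neuron(spikes, weight=1.0, leak=0.9, threshold=1.5):
    """Return the time steps at which the neuron fires, given binary input spikes."""
    potential = 0.0
    fired_at = []
    for t, s in enumerate(spikes):
        potential = potential * leak + weight * s  # integrate input, with leak
        if potential >= threshold:                 # fire and reset
            fired_at.append(t)
            potential = 0.0
    return fired_at

print(lif_neuron([1, 1, 0, 0, 1, 1, 1]))
```

Because the neuron only does work when a spike arrives, sparse inputs translate directly into low power draw, which is the core appeal for edge devices.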

Rob Telson

Rob is an AI thought leader and Vice President of Worldwide Sales at BrainChip, a global tech company that has developed artificial intelligence that learns like a brain, whilst prioritizing efficiency, ultra-low power consumption, and continuous learning. Rob has over 20 years of sales expertise in licensing intellectual property and selling EDA technology, and he attended Harvard Business School.

Links

Event Cameras – An Evolution in Visual Data Capture

Over the past decade, camera technology has made gradual but significant improvements thanks to the mobile phone industry. This has accelerated multiple industries, including robotics. Today, Davide Scaramuzza discusses a step change in camera innovation that has the potential to dramatically accelerate vision-based robotics applications.

Davide Scaramuzza deep dives on event cameras, which operate fundamentally differently from traditional cameras. Instead of sampling every pixel on an imaging sensor at a fixed frequency, the “pixels” on an event camera all operate independently, and each responds to changes in illumination. This technology unlocks a multitude of benefits, including extremely high-speed imaging, removal of the concept of a “framerate”, removal of data corruption due to having the sun in the sensor, reduced data throughput, and low power consumption. Tune in for more.
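The standard model of a single event-camera pixel can be sketched in a few lines: an event fires whenever the log intensity drifts more than a contrast threshold away from its value at the last event. This is a toy model for intuition, not any vendor's sensor pipeline, and the threshold `C` here is an illustrative parameter:

```python
import math

# Toy model of one event-camera pixel: an event (sample_index, polarity)
# fires whenever log intensity moves >= C away from the last reference level.
def pixel_events(intensities, C=0.2):
    events = []
    ref = math.log(intensities[0])  # log intensity at the last event
    for i, I in enumerate(intensities[1:], start=1):
        delta = math.log(I) - ref
        while abs(delta) >= C:      # a large step emits a burst of events
            polarity = 1 if delta > 0 else -1
            events.append((i, polarity))
            ref += polarity * C
            delta = math.log(I) - ref
    return events

# Constant brightness produces no events; a step change produces a burst.
print(pixel_events([100, 100, 100, 150, 150]))
```

Note how a static scene generates no data at all, which is where the reduced throughput and power consumption come from, while a fast-moving edge is captured with microsecond-scale granularity instead of being quantized to a frame rate.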

Davide Scaramuzza

Davide Scaramuzza is a Professor of Robotics and Perception at both the Department of Informatics (University of Zurich) and the Department of Neuroinformatics (joint between the University of Zurich and ETH Zurich), where he directs the Robotics and Perception Group. His research lies at the intersection of robotics, computer vision, and machine learning, using standard cameras and event cameras, and aims to enable autonomous, agile navigation of micro drones in search-and-rescue applications.

Links

A Robot You Swallow

Torrey Smith, Co-Founder of Endiatx, is changing the reputation endoscopies have for being uncomfortable. At Endiatx, they are developing a pill-sized robot that you swallow, which will then livestream your digestive system for a doctor to view. Our interviewer Abate dives in.

Torrey Smith
Torrey Smith is the Co-Founder & CEO of Endiatx, a medical robotics company that manufactures tiny robotic pills capable of active movement inside the human stomach, controlled over the internet. Prior to launching Endiatx, he developed medical devices in the areas of endometrial ablation, atherectomy, therapeutic hypothermia, sleep apnea, and vascular closure.

An aerospace engineer by training, he takes a keen interest in the deep tech sector and is a proud mentor of up-and-coming founders at the Founder Institute. He is also the principal founder of the international arts collective known as Sextant, and he has had his art featured in the Smithsonian.

Links

NVIDIA and ROS Teaming Up To Accelerate Robotics Development

Amit Goel, Director of Product Management for Autonomous Machines at NVIDIA, discusses the new collaboration between Open Robotics and NVIDIA. The collaboration will improve the way ROS and NVIDIA’s products, such as Isaac Sim and the Jetson line of embedded boards, operate together.

NVIDIA’s Isaac Sim lets developers build robust and scalable simulations, dramatically reducing the cost of capturing real-world data and speeding up development time.

Their Jetson line of embedded boards is core to many robotics architectures, leveraging hardware-optimized chips for machine learning, computer vision, video processing, and more.

The improvements to ROS will allow robotics companies to better utilize the available computational power, while still developing on the robotics-centric platform familiar to many.

Amit Goel

Amit Goel is Director of Product Management for Autonomous Machines at NVIDIA, where he leads the product development of NVIDIA Jetson, the most advanced platform for AI computing at the edge.

Amit has more than 15 years of experience in the technology industry working in both software and hardware design roles. Prior to joining NVIDIA in 2011, he worked as a senior software engineer at Synopsys, where he developed algorithms for statistical performance modeling of digital designs.

Amit holds a Bachelor of Engineering in electronics and communication from Delhi College of Engineering, a Master of Science in electrical engineering from Arizona State University, and an MBA from the University of California at Berkeley.

Links

#337: Autonomously Mapping the Seafloor, with Anthony DiMare

Bedrock Ocean AUV

Anthony DiMare and Charles Chiau deep dive into how Bedrock Ocean is innovating in the world of marine surveys. At Bedrock Ocean, they are developing an Autonomous Underwater Vehicle (AUV) that is able to map the seafloor autonomously and at a high resolution. They are also developing a data platform to access, process, and visualize seafloor data captured by other companies.

Bedrock Ocean is solving two problems in the industry of Marine Surveying.
1. The vast majority of the seafloor is completely unmapped
2. The data that is captured from the seafloor is not standardized or centralized.

Seafloor surveys conducted by two different companies, using the same or different hardware, can vary significantly in the calculated seafloor profile.

Anthony DiMare
Anthony previously founded Nautilus Labs, a leading maritime technology company advancing the efficiency of ocean commerce through artificial intelligence. While at Nautilus, Anthony helped global companies solve challenges with distributed, siloed maritime data systems and built the early team that launched Nautilus Platform into large publicly listed shipping companies.

Charles Chiau
Charles, Bedrock’s CTO, was previously at SpaceX where he helped design the avionics systems for Crew Dragon. He also was a system integration engineer at Reliable Robotics working on their autonomous aviation system and was the CTO of DeepFlight where he worked on manned submersibles including ones for Tom Perkins, Richard Branson, and Steve Fossett.

#335: Autonomous Aircraft by Xwing, with Maxime Gariel

xwing autonomous aircraft

Abate talks to Maxime Gariel, CTO of Xwing about the autonomous flight technology they are developing.


At Xwing, they retrofit traditional aircraft to include multiple sensors such as cameras, lidar, and radar. Using sensor fusion algorithms, they create an exceptionally accurate model of the environment. This model of the environment and advanced path planning and control algorithms allow the plane to autonomously navigate in the airport, take off, fly to a destination, and land, all without a person on board.
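The core idea behind fusing readings from cameras, lidar, and radar is to weight each sensor by how much you trust it. As a minimal sketch of that principle (a one-dimensional inverse-variance fusion step, not Xwing's actual algorithm, and with made-up numbers):

```python
def fuse(est1, var1, est2, var2):
    """Fuse two noisy estimates of the same quantity, weighting each by the
    inverse of its variance (the optimal linear blend under Gaussian noise)."""
    k = var1 / (var1 + var2)   # gain: trust the lower-variance sensor more
    est = est1 + k * (est2 - est1)
    var = (1 - k) * var1       # fused variance is smaller than either input
    return est, var

# Hypothetical example: radar ranges a target at 105 m (variance 25),
# lidar ranges it at 100 m (variance 4); the fused estimate leans on lidar.
est, var = fuse(105.0, 25.0, 100.0, 4.0)
print(round(est, 2), round(var, 2))
```

The same update, applied recursively as measurements stream in, is the scalar version of the Kalman filter that underpins most aircraft state estimators.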

Maxime Gariel
Maxime Gariel is the CTO of Xwing, a San Francisco-based startup whose mission is to dramatically increase human mobility using fully autonomous aerial vehicles. Xwing is developing a detect-and-avoid system for unmanned and remotely piloted vehicles. Maxime is a pilot, but he is passionate about making airplanes fly themselves.

Maxime joined Xwing from Rockwell Collins where he was a Principal GNC Engineer. He worked on autonomous aircraft projects including DARPA Gremlins and the AgustaWestland SW4 Solo autonomous helicopter. Before becoming Chief Engineer of the SW4 Solo’s flight control system, he was in charge of the system architecture, redundancy, and safety for the project.

Before Rockwell Collins, he worked on ADS-B based conflict detection as a postdoc at MIT and on autoland systems for airliners at Thales. Maxime earned his MS and PhD in Aerospace Engineering from Georgia Tech and his BS from ISAE-Supaéro (France).

Links

#334: Intel RealSense Enabling Computer Vision and Machine Learning At The Edge, with Joel Hagberg

Intel RealSense Facial Scanning
Intel RealSense ID was designed with privacy as a top priority. Purpose-built for user protection, Intel RealSense ID processes all facial images locally and encrypts all user data. (Credit: Intel Corporation)

Intel RealSense is known in the robotics community for its plug-and-play stereo cameras. These cameras make gathering 3D depth data a seamless process, with easy integrations into ROS to simplify the software development for your robots. From the RealSense team, Joel Hagberg talks about how they built this product, which allows roboticists to perform computer vision and machine learning at the edge.
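What makes a stereo depth camera so convenient is that each pixel already carries a metric depth, so recovering a 3D point is just the pinhole camera model inverted. The sketch below shows that deprojection step in plain Python; the intrinsics are hypothetical values roughly typical of a 640x480 depth stream, and this mirrors (but is not) the deprojection helper a depth SDK would provide:

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a depth pixel (u, v) with depth in meters into a 3D point in
    the camera frame, using the pinhole model with no lens distortion.
    fx, fy are focal lengths in pixels; (cx, cy) is the principal point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical intrinsics for a 640x480 stream: a pixel 80 px right and
# 60 px below the principal point, seen at 2 m depth.
print(deproject(400, 300, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0))
```

Running this over every pixel of a depth frame yields the point cloud that downstream ROS nodes consume for obstacle avoidance, grasping, or mapping.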

Joel Hagberg
Joel Hagberg leads the Intel® RealSense™ Marketing, Product Management, and Customer Support teams. He joined Intel in 2018 after a few years as an Executive Advisor working with startups in the IoT, AI, Flash Array, and SaaS markets. Before his Executive Advisor role, Joel spent two years as Vice President of Product Line Management at Seagate Technology with responsibility for their $13B product portfolio. He joined Seagate from Toshiba, where Joel spent 4 years as Vice President of Marketing and Product Management for Toshiba’s HDD and SSD product lines. Joel joined Toshiba with Fujitsu’s storage business acquisition, where Joel spent 12 years as Vice President of Marketing, Product Management, and Business Development. Joel’s business development efforts at Fujitsu focused on building emerging market business units in security, biometric sensors, H.264 HD video encoders, 10GbE chips, and digital signage. Joel earned his bachelor’s degree in Electrical Engineering and Math from the University of Maryland. Joel also graduated from Fujitsu’s Global Knowledge Institute Executive MBA leadership program.

Links

#330: Construction Site Automation by Dusty Robotics, with Tessa Lau

FieldPrinter by Dusty Robotics

Abate interviews Tessa Lau about her startup Dusty Robotics, which is innovating in the field of construction.

At Dusty Robotics, they developed a robot that automates the layout of floor plans on construction sites. Typically, this is done manually with a tape measure and printed plans, a difficult task that can take a team of two a week to complete. Time-consuming tasks like this are incredibly expensive on a construction site, where multiple teams are waiting for the task to be completed, and any errors in this process are even more time-consuming to fix. By using a robot to automatically convert 3D building models into markings on the floor, both the time required and the error rate are dramatically reduced.

Dr. Tessa Lau

Dr. Tessa Lau is an experienced entrepreneur with expertise in AI, machine learning, and robotics. She is currently Founder/CEO at Dusty Robotics, a construction robotics company building robot-powered tools for the modern construction workforce. Prior to Dusty, she was CTO/co-founder at Savioke, where she orchestrated the deployment of 75+ delivery robots into hotels and high-rises. Previously, Dr. Lau was a Research Scientist at Willow Garage, where she developed simple interfaces for personal robots. She also spent 11 years at IBM Research working in business process automation and knowledge capture. More generally, Dr. Lau is interested in technology that gives people super-powers, and building businesses that bring that technology into people’s lives. Dr. Lau was recognized as one of the Top 5 Innovative Women to Watch in Robotics by Inc. in 2018 and one of Fast Company’s Most Creative People in 2015. Dr. Lau holds a PhD in Computer Science from the University of Washington.

Links

#326: Deep Sea Mining, with Benjamin Pietro Filardo

In this episode, Abate follows up with Benjamin Pietro Filardo, founder of Pliant Energy Systems and NACROM, the North American Consortium for Responsible Ocean Mining. Pietro talks about the deep sea mining industry, an untapped market with massive potential for growth. Pietro discusses the currently proposed solutions for deep sea mining, which are environmentally destructive, and he offers an alternative solution using swarm robots that could mine the depths of the ocean while creating minimal disturbance to this mysterious habitat.

Benjamin “Pietro” Filardo
After several years in the architectural profession, Pietro founded Pliant Energy Systems to explore renewable energy concepts he first pondered while earning his first degree in marine biology and oceanography. With funding from four federal agencies he has broadened the application of these concepts into marine propulsion and a highly novel robotics platform.


Links

#325: The Advantage of Fins, with Benjamin Pietro Filardo

Abate interviews Benjamin “Pietro” Filardo, CEO and founder of Pliant Energy Systems. At PES, they developed a novel form of actuation using two undulating fins on a robot. These fins present multiple benefits over traditional propeller systems, including excellent energy efficiency, low water turbulence, and an ability to maneuver in water, on land, and on ice. Aside from its benefits on a robot, Pietro also talks about its advantages for harnessing energy from moving water.

Benjamin “Pietro” Filardo
After several years in the architectural profession, Pietro founded Pliant Energy Systems to explore renewable energy concepts he first pondered while earning his first degree in marine biology and oceanography. With funding from four federal agencies he has broadened the application of these concepts into marine propulsion and a highly novel robotics platform.


Links

#321: Empowering Farmers Through RootAI, with Josh Lessing

In this episode, Abate interviews Josh Lessing, co-founder and CEO of RootAI. At RootAI, they are developing a system that tracks data on the farm and autonomously harvests crops using soft grippers and computer vision. Lessing talks about the path they took to build a product with good market fit and how they brought a venture-capital-backed startup to market.

Josh Lessing

Josh is one of the world’s leading minds on developing robotics and AI systems for the food industry, previously serving as the Director of R&D at Soft Robotics Inc. His current venture, Root AI, is integrating advanced robotics, vision systems and machine perception to automate agriculture. Josh was a Postdoctoral Fellow in Materials Science & Robotics at Harvard University, having earned his Ph.D. studying Biophysics & Physical Chemistry at the Massachusetts Institute of Technology and received an Sc.B. in Chemistry from Brown University.

Links

#313: Solid State Lidar – the 3D Camera, with Erin Bishop

Sense Photonics

In this episode, Abate interviews Erin Bishop from Sense Photonics about the technology in their “solid state” LiDAR sensors that allows them to detect objects more accurately and over a larger field of view than traditional scanning LiDAR. Erin dives into the technical details of solid-state LiDAR and discusses the applications and industries of the technology.
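Whether the scene is scanned point by point or illuminated all at once, flash LiDAR ultimately ranges each pixel the same way: by timing a light pulse's round trip. The arithmetic is tiny, and a sketch makes the scale of the timing problem concrete (the 200 ns figure below is just an illustrative number):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_s):
    """Range to a target from a time-of-flight echo.
    The pulse covers the distance twice, hence the division by 2."""
    return C * round_trip_s / 2.0

# An echo arriving 200 ns after the flash puts the target near 30 m away.
print(round(tof_range(200e-9), 3))
```

Note the implied precision requirement: resolving range to a few centimeters means timing photon arrivals to a few hundred picoseconds, which is why the detector electronics, rather than any moving mirror, are the hard part of a solid-state design.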

Erin Bishop


Erin operates at the intersection of product management, project engineering, customer development, and product-market fit at 3D camera company Sense Photonics. Over the past several years, she has worked to build market appetite for telepresence robots, indoor mobile robots, and robotic picking in warehouses. Erin has made appearances at CES, UX Week, RoboBusiness, HardwareCon, and Mobile Future Forward.

Erin creates partnerships that result in large-scale deployment of the company’s revolutionary solid-state FLASH architecture. Prior to joining Sense, she held product manager, marketing, and research roles with several robotics companies, including Industrial Perception, Inc. – which was acquired by Google in 2013 – Adept Technology, iRobot, and DEKA. Erin earned her Master’s in Mechanical Engineering from UMass Lowell, where she performed extensive research in Human-Robot Interaction.

Links
