
ep.366: Deep Learning Meets Trash: Amp Robotics’ Revolution in Materials Recovery, with Joe Castagneri

In this episode, Abate flew to Denver, Colorado, to get a behind-the-scenes look at the future of recycling with Joe Castagneri, the head of AI at Amp Robotics. With Materials Recovery Facilities (MRFs) processing a staggering 25 tons of trash per hour, robotic sorting is the clear long-term solution.

Recycling is a for-profit industry. When the margins don't make sense, items simply aren't recycled. This is why Amp's mission of using robotics and AI to bring down the cost of recycling, and to expand the range of items that can be sorted for recycling, is so impactful.

Joe Castagneri
Joe Castagneri holds a Master of Science in Applied Mathematics and an undergraduate degree in Physics. He first joined the team at Amp Robotics in 2016, while still in university, working on machine learning models to identify recyclables in video streams of trash in Materials Recovery Facilities (MRFs). Today, he is the Head of AI at Amp Robotics, where he is changing the economics of recycling through automation.

ep.365: Rerun: An Open Source Package For Beautiful Visualizations, with Nikolaus West

Nico, Emil, and Moritz founded Rerun with the mission of making powerful visualization tools free and easily accessible to roboticists. Nico and Emil talk about how these tools help debug the complex problems roboticists face. Tune in for more.

Nikolaus West
Co-Founder & CEO
Niko is a second-time founder and software engineer with a computer vision background from Stanford. He's fanatical about bringing great computer vision and robotics products to the physical world.

Emil Ernerfeldt
Co-Founder & CTO
Emil fell in love with coding over 20 years ago and hasn't looked back since. He's the creator of egui, an easy-to-use immediate-mode GUI library in Rust that Rerun is built with. He brings a strong perspective from the gaming industry, with a focus on great, blazing-fast tools.


ep.364: Shaking Up The Sheetmetal Industry, with Ed Mehr

Conventional sheet metal manufacturing is highly inefficient for the low-volume production typical of the space industry. At Machina Labs, they developed a novel method of forming sheet metal using two robotic arms to bend the metal into different geometries. This method cuts the time to produce large sheet metal parts from several months to a few hours. Ed Mehr, Co-Founder and CEO of Machina Labs, explains this revolutionary manufacturing process.

Ed Mehr


Ed Mehr is the co-founder and CEO of Machina Labs. He has an engineering background in smart manufacturing and artificial intelligence. In his previous position at Relativity Space, he led the team in charge of developing the world's largest metal 3D printer. Relativity Space uses 3D printing to make rocket parts rapidly and with the flexibility for multiple iterations. Ed was previously the CTO at Cloudwear (now Averon), and has also worked at SpaceX, Google, and Microsoft.


ep.363: Going out on a Bionic Limb, with Joel Gibbard

Many people associate prosthetic limbs with skin-toned imitations of human limbs: something built to blend into a society where people have all of their limbs, while still serving functional use cases. On the other end of the spectrum are the highly optimized prosthetics used by athletes, built for speed and low weight and looking nothing like a human limb.

For a child under 12 years old, neither of these categories of prosthetics particularly appeals. Open Bionics, founded by Joel Gibbard and Samantha Payne, was started to create a third category of prosthetics: one that targets the fun, imaginative side of children while still meeting their daily functional requirements.

Through partnerships with Disney and Lucasfilm, Open Bionics has built an array of imagination-capturing prosthetic limbs that are straight-up cool.

Joel Gibbard dives into why they founded Open Bionics, and why you might want to invest as the company prepares to open investment to the general public for the first time.

Joel Gibbard

Joel Gibbard lives in Bristol, UK, and graduated with a first-class honors degree in Robotics from the University of Plymouth.

He co-founded Open Bionics alongside Samantha Payne with the goal of bringing advanced, accessible bionic arms to the market. Open Bionics offers the Hero Arm, which is available in the UK, USA, France, Australia, and New Zealand. Open Bionics is revolutionizing the prosthetics industry through its line of inspiration-capturing products.


ep.361: Recycling: An Opaque Industry, with Areeb Malik

Recycling in the United States is a for-profit industry. Recyclers profit by taking recyclable material, refining it, and reselling it to companies at a lower price than producing the material from scratch.

If you look at the demand side of the recycling industry, an array of multi-billion-dollar companies like Coca-Cola and PepsiCo are incentivized to buy recycled material to reduce their materials costs.

If you look at the supply side, ~300 million tons of trash are generated annually in the United States. Estimates suggest that up to 75% of that is recyclable.

On paper, it seems clear that maximizing the amount of trash the US recycles is in everyone's interest. One issue, though: less than a third of that trash actually ends up recycled.
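A quick back-of-the-envelope calculation makes the scale of the gap concrete. This sketch uses only the round figures quoted above, and treats "less than a third" as an upper bound on what is recycled:

```python
total_trash_tons = 300e6        # annual US trash generation (round figure from the episode)
recyclable_share = 0.75         # upper-bound estimate of the recyclable fraction
recycled_tons = total_trash_tons / 3        # "less than a third" actually recycled

recyclable_tons = total_trash_tons * recyclable_share   # 225 million tons recyclable
gap_tons = recyclable_tons - recycled_tons              # at least 125 million tons unrealized

print(f"Recyclable: {recyclable_tons / 1e6:.0f}M tons; "
      f"recycled: at most {recycled_tons / 1e6:.0f}M tons; "
      f"gap: at least {gap_tons / 1e6:.0f}M tons per year")
```

In other words, well over a hundred million tons of recyclable material goes unrecovered every year, which is the volume Glacier is targeting.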

Areeb, co-founder of Glacier, breaks down the multi-layered reasons why the recycling industry cannot handle this volume of trash, and what Glacier is doing to address the problem.

Areeb Malik

Areeb Malik is the Co-Founder of Glacier, and he is on a personal mission to fight climate change and extract value from the $123B worth of recyclables that fill the landfills and oceans. Before founding Glacier, Areeb was a Software Engineer at Facebook, where he used Machine Learning and Computer Vision to build out new product features.


ep.360: Building Communities Around AI in Africa, with Benjamin Rosman

At ICRA 2022, Benjamin Rosman delivered a keynote presentation on an organization he co-founded called the Deep Learning Indaba.

The Deep Learning Indaba is based in South Africa, and its mission is to strengthen artificial intelligence and machine learning communities across Africa. It hosts yearly meetups in different countries on the continent, and promotes grassroots communities in each country that run their own local events.

What is Indaba?

Indaba is a Zulu word for a gathering or meeting. Such meetings are held throughout southern Africa and serve several functions: to listen to and share news of members of the community, to discuss common interests and issues facing the community, and to give advice and coach others.

Benjamin Rosman

Benjamin Rosman is an Associate Professor in the School of Computer Science and Applied Mathematics at the University of the Witwatersrand, South Africa, where he runs the Robotics, Autonomous Intelligence, and Learning (RAIL) Laboratory and is the Director of the National E-Science Postgraduate Teaching and Training Platform (NEPTTP).

He is a founder and organizer of the Deep Learning Indaba machine learning summer school, with a focus on strengthening African machine learning. He was a 2017 recipient of a Google Faculty Research Award in machine learning, and a 2021 recipient of a Google Africa Research Award. In 2020, he was made a Senior Member of the IEEE.


Spotlights on Three Exhibitors from CVPR 2022

Retrocausal:

The team at Retrocausal built a computer vision platform that allows any manufacturer to rapidly set up an activity recognition pipeline to assist with manual assemblies. Using any standard camera pointed at an assembly station, the system can learn the assembly procedure and give real-time feedback.

Deci:

At Deci, they are tackling the problem of taking a Machine Learning model optimized for a single hardware platform and adapting it to work on a different hardware platform.

When developing an ML model that runs on the edge, teams go through a time-consuming development cycle to optimize the model to run at peak performance. Over time, however, hardware platforms get updated, and companies need to decide whether to update their hardware and dedicate engineering resources to manually tweaking a multitude of settings.

Deci lets companies upload their models to the cloud and automatically optimizes them to run on the variety of hardware platforms those companies operate in their facilities.

StegAI:

Dr. Eric Wengrowski co-founded StegAI after publishing a CVPR 2019 paper on embedding "fingerprints" in images and videos that are invisible to the human eye but detectable by their software.

Their product modulates the pixels in an image in such a way that even after compressing, resizing, or printing out the images, their deep learning algorithms will still be able to detect the origin of the image.

This powerful tool is used by social media firms to protect against potential copyright infringement when users upload content to their platforms.

ep.358: Softbank: How Large Companies Approach Robotics, with Brady Watkins

On our podcast we often dive into startups and smaller companies in robotics. Today's talk is unique in that Brady Watkins gives us insight into how a big company like Softbank Robotics looks at the robotics market.

"We think scale first. (The) difference from a startup is our goal isn't to think what's the first 10 to 20, but we need to think what's the first 20,000 look like." – Brady Watkins

Brady Watkins

Brady Watkins is the President and General Manager at Softbank Robotics America. During his career at Softbank, he helped scale and commercialize Whiz, the collaborative robot vacuum designed to work alongside cleaning teams. Watkins played a key role in scaling production to 20,000 units deployed globally.

Prior to his time at SBRA, Watkins was the Director of Sales, Planning, and Integration at Ubisoft, where he held several positions over the course of 10 years.


Underwater Human-Robot Interaction #ICRA2022

How do people communicate when they are underwater? With body language, of course.

Marine environments present a unique set of challenges that render several technologies developed for land applications completely useless. Communicating with sound, at least in the way people use sound to communicate, is one of them.

Michael Fulton tackles this challenge in his ICRA 2022 presentation by using body language to communicate with an autonomous underwater vehicle (AUV). Tune in for more.


Michael Fulton

Michael Fulton is a Ph.D. Candidate at the University of Minnesota Twin Cities. His research focuses primarily on underwater robotics, particularly applications where robots work with humans: human-robot interaction and robot perception using computer vision and deep learning, with the intent of creating systems that can work collaboratively with humans in challenging environments.

Exhibitors from ICRA 2022

At ICRA 2022, the researchers weren’t the only ones working with cutting-edge technology. We spoke to the exhibitors to get real-life demos of their products.

Tangram Vision

Tangram Vision builds a hardware-agnostic sensor fusion platform. It streamlines the development and deployment of critical sensor infrastructure, such as calibration, fusion, and monitoring, for any number of cameras, depth sensors, LiDARs, radars, and IMUs. Their co-founder, Adam Rodnitzky, walks us through the platform.

FLX Solutions

Matt Bilsky, Founder and CEO of FLX Solutions, gives us a live demo of their robot, the FLX BOT. Bilsky applied his Ph.D. in Mechanical Engineering to create a novel, highly compact robot designed to reach and inspect parts of a building that a human cannot. The FLX BOT is one inch in diameter and is made of modular links that can be attached one after another to extend the robot's reach and degrees of freedom.

Exyn Technologies

Exyn Technologies specializes in aerial robotics, drone swarms, multi-modal sensor fusion, 3D mapping, obstacle avoidance, and autonomous navigation & planning.

Exyn’s focus is on developing software for aerial robots so they can operate in GPS-denied environments, without human control, prior information, or pre-existing infrastructure (e.g. no motion capture system).

Pollen Robotics

Pollen Robotics developed Reachy, an open-source humanoid robot with a quirky appearance. Reachy's primary user base is researchers studying fields such as teleoperation. Using mixed autonomy, Reachy can be teleoperated by a human while still using onboard intelligence to autonomously infer the actions the operator wants it to perform.

ep.357: Origin Story of the OAK-D, with Brandon Gilles

Brandon Gilles, Founder and CEO of Luxonis, tells us his story about how Luxonis designed one of the most versatile perception platforms on the market.

Brandon took the lessons learned from his time at Ubiquiti, which transformed networking with network-on-a-chip architectures, and applied that mastery of embedded hardware and software to the OAK-D camera and the broader OAK line of products.

To refer to the OAK-D as a stereo vision camera tells only part of the story. Aside from depth sensing, the OAK-D leverages the Intel Myriad X to perform perception computations directly on the camera with a highly power-efficient architecture.

Customers can also instantly leverage a wide array of open-source computer vision and AI packages that are pre-calibrated to the optics system.

Additionally, by using a system-on-a-module design, the Luxonis team can easily churn out variations of the hardware platform to fit a wide variety of customer use cases. Tune in for more.

Brandon Gilles

Brandon Gilles is the Founder and CEO of Luxonis, maker of the OAK-D line of cameras. He comes from a background in electrical and RF engineering, and spent his early career as the UniFi Lead at Ubiquiti, where his team helped bring Ubiquiti's highly performant and power-efficient UniFi products to market.


ep.356: Controlling a Drone After Sudden Rotor Failure #ICRA2022, with Sihao Sun

Dr. Sihao Sun discusses his award-winning research in the area of controlling the flight of a drone when faced with a sudden rotor failure.

Typical research in this area addresses the case where one of the four rotors of a quadrotor suddenly and spontaneously stops working. It does not fully account for the real-life scenarios in which rotor failure is common: collisions with other drones, walls, or birds, and operation in degraded GPS environments.

Dr. Sihao Sun

Dr. Sihao Sun is a postdoctoral researcher at the Robotics and Perception Group (RPG) at the University of Zurich, directed by Prof. Davide Scaramuzza. He is currently working on control and perception for aerial robots (drones).

In December 2020, he received his PhD in Aerospace Engineering from the Control and Simulation Group at Delft University of Technology. His work on quadrotor fault-tolerant flight control has been featured by media outlets such as IEEE Spectrum.


ep.355: SLAM fused with Satellite Imagery (ICRA 2022), with John McConnell

Autonomous underwater vehicles face challenging environments where GPS navigation is rarely possible. John McConnell discusses his research, presented at ICRA 2022, on fusing overhead imagery with traditional SLAM algorithms. The result is more robust localization and mapping, with less of the drift commonly seen in SLAM algorithms.

Satellite imagery can be obtained for free or low cost through Google or Mapbox, creating an easily deployable framework for companies in industry to implement.


ep.353: Autonomous Flight Demo with CMU AirLab – ICRA Day 1, with Sebastian Scherer

Sebastian Scherer from CMU's AirLab gives us a behind-the-scenes demo at ICRA of their autonomous flight control AI. Their approach aims to cooperate with human pilots and act the way they would.

The team took this approach to create a more natural, less intrusive way for human and AI pilots to share a single airport. They describe it as a Turing Test: ideally, a human pilot would be unable to distinguish an AI from a person operating the plane.

Their communication system works in parallel with a six-camera hardware package based on the Nvidia AGX dev kit, which measures the angular speed of objects moving across the video frames.

In this setting, high angular velocity means low risk, since the object is flying quickly perpendicular to the camera plane. Low angular velocity indicates high risk, since the object could be flying directly at the plane, headed for a collision.
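This is the classic "constant bearing, decreasing range" rule for collision avoidance, and the core of it can be sketched in a few lines. This is not the AirLab team's actual implementation; the function names, focal length, and risk threshold below are all hypothetical, chosen only to illustrate the heuristic:

```python
import math

def angular_rate(p0, p1, dt, focal_px):
    """Approximate the angular rate (rad/s) of a tracked object from its pixel
    positions in two consecutive frames, using a small-angle approximation:
    angle ~ pixel displacement / focal length (in pixels)."""
    pixels_moved = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    return (pixels_moved / focal_px) / dt

def collision_risk(omega_rad_s, threshold=0.05):
    """A near-constant bearing (low angular rate) suggests the object may be
    on a collision course; fast apparent motion suggests it will pass clear."""
    return "high" if omega_rad_s < threshold else "low"

# The object that barely moves in the image between frames is the risky one.
slow = angular_rate((320, 240), (321, 240), dt=0.1, focal_px=500)  # ~0.02 rad/s
fast = angular_rate((320, 240), (340, 240), dt=0.1, focal_px=500)  # ~0.4 rad/s
```

Here `collision_risk(slow)` would flag the slow-moving object as high risk and the fast-moving one as low risk, matching the intuition described above.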


ep.352: Robotics Grasping and Manipulation Competition Spotlight, with Yu Sun

Yu Sun, Professor of Computer Science and Engineering at the University of South Florida, created and organized the Robotic Grasping and Manipulation Competition. Yu talks about the impact robots will have in domestic environments, the disparity between industry and academia showcased by competitions, and the commercialization of research.

Yu Sun

Yu Sun is a Professor in the Department of Computer Science and Engineering at the University of South Florida (Assistant Professor 2009-2015, Associate Professor 2015-2020, Associate Chair of Graduate Affairs 2018-2020). He was a Visiting Associate Professor at Stanford University from 2016 to 2017, and received his Ph.D. degree in Computer Science from the University of Utah in 2007. Then he had his Postdoctoral training at Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA (2007-2008) and the University of Utah (2008-2009).

He initiated the IEEE RAS Technical Committee on Robotic Hands, Grasping, and Manipulation and served as its first co-Chair. He has also served as an Associate Editor and Senior Editor for several journals and conferences, including IEEE Transactions on Robotics, IEEE Robotics and Automation Letters (RA-L), ICRA, and IROS.

