iRobot Education expands its free coding platform with social-emotional learning, multi-language support
iRobot Corp. unveiled new coding resources through iRobot Education that promote more inclusive, equitable access to STEM education and support social-emotional development. iRobot also updated its iRobot Coding App with the introduction of Python coding support and a new 3D Root coding robot simulator environment that is well suited to hybrid and remote learning.
The updates coincide with the annual National Robotics Week, a time when kids, parents and teachers across the nation tap into the excitement of robotics for STEM learning.
Supporting Social and Emotional Learning
The events of the past year changed the traditional learning environment, with students, families and educators adapting to hybrid and remote classrooms. Conversations on the critical importance of diversity, equity and inclusion have also taken on increased importance in the classroom. To address this, iRobot Education has introduced social and emotional learning (SEL) lessons to its Learning Library that tie SEL competencies, like peer interaction and responsible decision-making, into coding and STEM curricula. These SEL lessons, such as The Kind Playground, Seeing the Whole Picture and Navigating Conversations, provide educators with new resources that help students build emotional intelligence and become responsible global citizens through a STEM lens.
Language translations for iRobot Coding App
More students can now enjoy the free iRobot Coding App with the introduction of Spanish, French, German, Czech and Japanese language support. iRobot’s mobile and web coding app offers three progressively challenging levels of coding language that advance users from graphical coding to hybrid coding, followed by full-text coding. Globally, users can now translate the graphical and hybrid block coding levels into their preferred language, helping beginners and experts alike hone their language and computational thinking skills.
Introducing Python coding language support
One of the most popular coding languages, Python is now available to iRobot Coding App users in level 3, full-text coding. This new functionality provides an avenue to gain more complex coding experience in a coding language that is currently used in both academic and professional capacities worldwide, preparing the next generation of students for STEM curriculums and careers.
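For a sense of what level-3 full-text coding looks like, here is a minimal Python sketch in the style of iRobot's open-source Root SDK. The module, class, and method names below are assumptions drawn from that SDK for illustration and may differ from what the Coding App itself exposes.

```python
# Illustrative sketch only: module/class/method names are assumed from
# iRobot's open-source Python SDK and may not match the in-app environment.
from irobot_edu_sdk.backend.bluetooth import Bluetooth
from irobot_edu_sdk.robots import event, Root

robot = Root(Bluetooth())  # connect to the first Root robot found over BLE

@event(robot.when_play)
async def play(robot):
    # Drive a square: move forward 16 cm, then turn left 90 degrees, four times.
    for _ in range(4):
        await robot.move(16)
        await robot.turn_left(90)

robot.play()
```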
New Root Coding Robot 3D Simulator
Ready to code in 3D? The iRobot Coding App is uniquely designed to help kids learn coding at home and in school, with a virtual Root coding robot available for free within the app. iRobot updated the virtual Root SimBot with a fun and interactive 3D experience, allowing students to control their programmable Root® coding robot right on the screen.
“The COVID-19 pandemic has had, and continues to have, a tangible impact on students who’ve been learning in remote environments, which is why we identified solutions to nurture and grow SEL skills in students,” said Colin Angle, chairman and CEO of iRobot. “The expansion of these new iRobot Education resources, which are free to anyone, will hopefully facilitate greater inclusivity and accessibility for those who want to grow their coding experience and pursue STEM careers.”
In celebration of National Robotics Week, iRobot Education will also release weekly coding challenges focused on learning how to use code to communicate. Each challenge can be completed online in the iRobot Coding App with the Root SimBot or in-person with a Root coding robot. The weekly challenges build upon each other and include guided questions to facilitate discussions about the coding process, invite reflections, and celebrate new learning.
For more information on iRobot Education, Root coding robots and the iRobot Coding App, visit: https://edu.irobot.com/.
Information about National Robotics Week can be found at www.nationalroboticsweek.org.
How to build a robotics startup: getting the team right
This episode is about understanding why you can’t build your startup alone, and the criteria for properly selecting your co-founders.
In this podcast series, we are going to explain how to create a robotics startup step by step.
We are going to learn how to select your co-founders and your team, how to look for investors, how to test your ideas, how to get customers, how to reach your market, and how to build your product… starting from zero, how to build a successful robotics startup.
I’m Ricardo Tellez, CEO and co-founder of The Construct, a robotics startup that delivers the best learning experience for becoming a ROS developer, that is, for learning how to program robots with ROS.
Our company is now five years old, with a team of 10 people working from around the world. We have more than 100,000 students, and dozens of universities around the world use our online academy to provide a teaching environment for their students.
We bootstrapped our startup, but we also (unsuccessfully) tried getting investors. We have done a few pivots and finally arrived where we are today.
With all this experience, I’m going to teach you how to build your own startup. We are going to go through the process by creating another startup ourselves, so you can witness the creation of a robotics startup and see along the way how to create your own.
Subscribe to the podcast using any of the following methods
- ROS Developers Podcast on iTunes
- ROS Developers Podcast on Stitcher
- ROS Developers Podcast on Spotify
Or watch the video
A robot that senses hidden objects
By Daniel Ackerman | MIT News Office
In recent years, robots have gained artificial vision, touch, and even smell. “Researchers have been giving robots human-like perception,” says MIT Associate Professor Fadel Adib. In a new paper, Adib’s team is pushing the technology a step further. “We’re trying to give robots superhuman perception,” he says.
The researchers have developed a robot that uses radio waves, which can pass through walls, to sense occluded objects. The robot, called RF Grasp, combines this powerful sensing with more traditional computer vision to locate and grasp items that might otherwise be blocked from view. The advance could one day streamline e-commerce fulfillment in warehouses or help a machine pluck a screwdriver from a jumbled toolkit.
The research will be presented in May at the IEEE International Conference on Robotics and Automation. The paper’s lead author is Tara Boroushaki, a research assistant in the Signal Kinetics Group at the MIT Media Lab. Her MIT co-authors include Adib, who is the director of the Signal Kinetics Group; and Alberto Rodriguez, the Class of 1957 Associate Professor in the Department of Mechanical Engineering. Other co-authors include Junshan Leng, a research engineer at Harvard University, and Ian Clester, a PhD student at Georgia Tech.
As e-commerce continues to grow, warehouse work is still usually the domain of humans, not robots, despite sometimes-dangerous working conditions. That’s in part because robots struggle to locate and grasp objects in such a crowded environment. “Perception and picking are two roadblocks in the industry today,” says Rodriguez. Using optical vision alone, robots can’t perceive the presence of an item packed away in a box or hidden behind another object on the shelf — visible light waves, of course, don’t pass through walls.
But radio waves can.
For decades, radio frequency (RF) identification has been used to track everything from library books to pets. RF identification systems have two main components: a reader and a tag. The tag is a tiny computer chip that gets attached to — or, in the case of pets, implanted in — the item to be tracked. The reader then emits an RF signal, which gets modulated by the tag and reflected back to the reader.
The reflected signal provides information about the location and identity of the tagged item. The technology has gained popularity in retail supply chains — Japan aims to use RF tracking for nearly all retail purchases in a matter of years. The researchers realized this profusion of RF could be a boon for robots, giving them another mode of perception.
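As a rough illustration of that reader/tag exchange, the toy Python below simulates a tag backscattering the reader's carrier and the reader recovering a range estimate from the round-trip phase. This is a cartoon of RF ranging under simplified assumptions, not how commercial readers or the paper's system are built.

```python
import numpy as np

C = 3e8        # speed of light, m/s
FREQ = 915e6   # a typical UHF RFID carrier frequency (US ISM band)

def tag_backscatter(reader_pos, tag_pos, tag_id):
    """Toy model: the tag reflects the carrier, stamping it with its ID.
    The reader observes the round-trip phase of the reflected signal."""
    d = np.linalg.norm(np.asarray(tag_pos) - np.asarray(reader_pos))
    wavelength = C / FREQ
    phase = (4 * np.pi * d / wavelength) % (2 * np.pi)  # round trip covers 2d
    return tag_id, phase

def range_from_phase(phase):
    """Invert the phase to a distance. Real systems must also resolve the
    ambiguity: phase wraps every half wavelength (~16 cm at 915 MHz)."""
    wavelength = C / FREQ
    return phase * wavelength / (4 * np.pi)

tag_id, phase = tag_backscatter([0.0, 0.0], [0.10, 0.05], tag_id="BOOK-42")
print(tag_id, round(range_from_phase(phase), 3), "m")
```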
“RF is such a different sensing modality than vision,” says Rodriguez. “It would be a mistake not to explore what RF can do.”
RF Grasp uses both a camera and an RF reader to find and grab tagged objects, even when they’re fully blocked from the camera’s view. It consists of a robotic arm attached to a grasping hand. The camera sits on the robot’s wrist. The RF reader stands independent of the robot and relays tracking information to the robot’s control algorithm. So, the robot is constantly collecting both RF tracking data and a visual picture of its surroundings. Integrating these two data streams into the robot’s decision making was one of the biggest challenges the researchers faced.
“The robot has to decide, at each point in time, which of these streams is more important to think about,” says Boroushaki. “It’s not just eye-hand coordination, it’s RF-eye-hand coordination. So, the problem gets very complicated.”
The robot initiates the seek-and-pluck process by pinging the target object’s RF tag for a sense of its whereabouts. “It starts by using RF to focus the attention of vision,” says Adib. “Then you use vision to navigate fine maneuvers.” The sequence is akin to hearing a siren from behind, then turning to look and get a clearer picture of the siren’s source.
With its two complementary senses, RF Grasp zeroes in on the target object. As it gets closer and even starts manipulating the item, vision, which provides much finer detail than RF, dominates the robot’s decision making.
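RF Grasp's actual controller is more sophisticated than this, but the sketch below captures the handoff the researchers describe: trust the through-obstruction RF fix while far from the target, then let the finer visual estimate dominate as the gripper closes in. The function, its arguments, and the handoff threshold are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def fused_target_estimate(rf_pos, visual_pos, gripper_pos, handoff_dist=0.3):
    """Blend RF and visual estimates of the target's position.

    rf_pos: coarse position from the RF tag (available even when occluded)
    visual_pos: finer camera-based estimate, or None while the item is hidden
    handoff_dist: illustrative distance (m) at which vision fully takes over
    """
    if visual_pos is None:
        return np.asarray(rf_pos)  # target still occluded: RF alone guides us
    dist = np.linalg.norm(np.asarray(gripper_pos) - np.asarray(rf_pos))
    # Weight shifts from RF (w -> 0) to vision (w -> 1) as the gripper nears.
    w = float(np.clip(1.0 - dist / handoff_dist, 0.0, 1.0))
    return (1 - w) * np.asarray(rf_pos) + w * np.asarray(visual_pos)
```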
RF Grasp proved its efficiency in a battery of tests. Compared to a similar robot equipped with only a camera, RF Grasp was able to pinpoint and grab its target object with about half as much total movement. Plus, RF Grasp displayed the unique ability to “declutter” its environment — removing packing materials and other obstacles in its way in order to access the target. Rodriguez says this demonstrates RF Grasp’s “unfair advantage” over robots without penetrative RF sensing. “It has this guidance that other systems simply don’t have.”
RF Grasp could one day perform fulfillment in packed e-commerce warehouses. Its RF sensing could even instantly verify an item’s identity without the need to manipulate the item, expose its barcode, then scan it. “RF has the potential to improve some of those limitations in industry, especially in perception and localization,” says Rodriguez.
Adib also envisions potential home applications for the robot, like locating the right Allen wrench to assemble your Ikea chair. “Or you could imagine the robot finding lost items. It’s like a super-Roomba that goes and retrieves my keys, wherever the heck I put them.”
The research is sponsored by the National Science Foundation, NTT DATA, Toppan, Toppan Forms, and the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS).