The European Robotics Week (ERW) is proving a major success, with around 1,200 interactive robotics-related events across Europe showing how robots will impact the way we work, live, and learn both now and in the future. Every year the ERW moves its central event to a new location, which hosts an ecosystem of engaging activities. From 16 to 18 November, Augsburg was in the spotlight, hosting the central event of the European Robotics Week 2018, with 1,500 visitors coming to the exhibition over the three days.
On Friday, 16 November, ERW2018 started with an opening ceremony at the Augsburg Town Hall, gathering high-level representatives from industry, research, policy and politics.
Photo: Lucilla Sioli, Director for “Artificial Intelligence and Digital Industry” at DG Connect, European Commission
“The European Commission has been investing in robotics research for over a decade. SPARC, the public-private partnership between the EC and euRobotics, has been particularly successful in bringing together the European robotics community. Europe aims to become world leader in cutting-edge technology and human-centric artificial intelligence (AI). The EC will publish in December a “Coordinated plan on Artificial Intelligence”, developed with Member States so as to address together societal needs such as healthcare, transport or climate change. Additionally, Horizon Europe (the research and innovation program) and the newly-proposed “Digital Europe” programme will address AI. The latter plans to devote about one quarter of its € 9 billion to AI, especially to increase Europe’s capacity in testing facilities as well as in digital capabilities of SMEs through support to digital innovation hubs.
We are also aware of the challenges related to robotics and AI. First, there are the ethical issues, like privacy, accountability, transparency, bias, etc. The Commission tasked a High-Level expert Group to draft a set of ethical guidelines on AI by March next year. Stakeholders from all over Europe are invited to contribute through the “EU Alliance platform”, available online. Guidelines will also help address fears associated to robots, very common in the media. The European Robotics Week can also help and show what robots can really do. With over 1000 events in more than 30 countries, we hope that the European Robotics Week will attract young people. It will be their opportunity to choose careers in this field. We need more young people active in these areas. Shortages in skills in engineering, or in data science, risk to block developments and need to be addressed”, explained Lucilla Sioli, Director for “Artificial Intelligence and Digital Industry” at DG Connect, European Commission.
From Friday 16 to Sunday 18 November, a robot exhibition, several public talks and activities for adults and children were offered at the Zeughaus.
Photo: Vito di Bari, world’s premier futurist, innovation strategist and inspirational keynote speaker
“Automation means more work in less time. Consequently, employment goes down, which means prices go down as well. In response to this, demand rises, and employment goes up again. There is one thing we cannot do: we cannot upload new software to people. We need new people with new skills. This has happened already with the automobile industry, the ATM and computers. This is the automation paradox. By 2030, 47% of jobs could be automated; however, 565 million more jobs will be created. Not only are we not going to lose jobs because of robots, but jobs will double. Robots will be man’s best friends in the future”, said Vito di Bari, world’s premier futurist, innovation strategist and inspirational keynote speaker, during his presentation “Robots, AI and the Future – Technology vs. Humanity”, held on 17 November at the Zeughaus.
Photo: Group of children and teachers from Bosnia and Herzegovina
A group of 30 children and 10 teachers from Bosnia and Herzegovina, supported by euRobotics and the British Embassy in Sarajevo, visited Augsburg and the robot exhibition. “The Central event of ERW2018 in Augsburg was the very first opportunity for children from Bosnia and Herzegovina to see and interact with real industrial and humanoid robots. For some of them this was the very first visit to another country. The educational trip paves the way for the future organisation of even more robotics events in Bosnia and Herzegovina”, said Maja Hadziselimovic, National Coordinator of the European Robotics Week in Bosnia and Herzegovina.
Around 1,200 events showing how robots will impact the way we work, live, and learn have been announced so far. Robotics researchers, universities and industry opened their doors Europe-wide, from Spain in the very west to Romania in the east, and from Finland in the far north to the southern reaches of Cyprus.
In Southern Europe, the Italian Scuola di Robotica is organising over 360 events, and in Spain, 300 events are planned, out of which 190 are in Catalonia. Cyprus is organising over 60 events.
In Western Europe, Germany is organising over 130 events on various topics, with open-door days at companies and research institutes alike. In Brussels, Belgium, La maison du livre is offering some 20 creative events about robots and artificial intelligence, combining cultural, economic, environmental and artistic perspectives.
Eastern Europe is focusing on extending access to new technologies for all. In Romania, e-Civis, the organiser of the European Robotics Forum 2019, is teaching teachers how to integrate robotics into the curriculum and has planned some 60 events. Bosnia and Herzegovina and Hungary both organise kids’ workshops on programming as well as competitions.
ERW was conceived out of the European robotics community’s desire to bring robotics research and development closer to the public and to build the future robotics society. Well over 500,000 people across Europe have taken part in ERW since its first edition in 2011. The European Robotics Week is organised under SPARC, the public-private partnership for robotics between euRobotics and the European Commission.
European Robotics Week 2018 (ERW2018) took place around Europe on 16-25 November 2018.
A couple of months ago I interviewed Joel Esposito about the state of robotics education for the ROS Developers Podcast #21. On that podcast, Joel talks about his research on how robotics is taught around the world. He identifies a set of common subjects that need to be covered for students to learn robotics, and a list of resources that people are using to teach them. Most important of all, he points out the importance of having students practice what they learn with robots.
From my point of view, robotics is about doing, not about reading. A robotics class cannot be about learning how to compute a Jacobian matrix to solve the inverse kinematics. Computing the Jacobian has nothing to do with robotics in itself; it is just a mathematical tool we use to solve a robotics problem. That is why a robotics class should not focus on how to compute the Jacobian but on how to move the end effector to a given location.
I want to move this robotic arm’s end effector to that point, how do I do it?
Well, you need to represent the current configuration of the arm in a matrix!
Ah, ok, how do I do that?
You need to use a matrix to encode its current state. Let’s do it. Take that simulated robot arm and create a ROS node that subscribes to /joint_states and captures the joint values. From those values, build a matrix that stores the arm’s configuration (a minimal sketch of such a node appears after this exchange). The node has to update the matrix at every time step, so if I move the arm, the matrix must change.
[…]
Done! So what do I have to do now if I want to move the robot gripper close to this bottle?
You need to compute the Jacobian. Let’s do it! Create a program that lets you manually enter a desired position for the end effector. Then, based on that data, compute the Jacobian.
How do I compute the Jacobian?
I love that you ask me that!
…
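For readers who want to picture that first exercise from the dialogue, here is a minimal sketch of such a node, assuming a simulated arm that publishes sensor_msgs/JointState messages on /joint_states; the node and variable names are mine, for illustration only:

```python
#!/usr/bin/env python
# Minimal sketch: keep an up-to-date matrix of joint values from /joint_states.
# Assumes a simulated arm publishing sensor_msgs/JointState; names are illustrative.
import numpy as np
import rospy
from sensor_msgs.msg import JointState


class ArmStateTracker(object):
    def __init__(self):
        self.joint_positions = None  # refreshed on every /joint_states message

    def joint_states_callback(self, msg):
        # Store the current configuration as an n x 1 matrix (column vector).
        self.joint_positions = np.array(msg.position).reshape(-1, 1)


if __name__ == '__main__':
    rospy.init_node('arm_state_tracker')
    tracker = ArmStateTracker()
    rospy.Subscriber('/joint_states', JointState, tracker.joint_states_callback)
    rospy.spin()  # the stored matrix changes whenever the arm moves
```

From there, the Jacobian exercise builds on the same node: the stored configuration becomes the input to whatever kinematics routine the students write next.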
I understand that learning and studying theory is a big part of robotics, but not the biggest part. The biggest part should be practicing and doing with robots. For that, I propose using ROS as the base system, the enabler, that allows practical practice (does that even exist?!) while learning robotics.
The proposal
Teaching subjects as complex as inverse kinematics or robot navigation should not be (just) a theoretical matter. Practice should be embedded into the teaching of those complex robotics subjects.
I propose to teach ROS at the same time as we teach robotics, and to use it throughout the robotics semester as the tool with which students build and implement the subject being taught. The idea is to use ROS to let students actually practice what they are learning. For instance, if we are covering the different algorithms for obstacle avoidance, we also provide a simulated robot and have the students create a ROS program that actually implements the algorithm on that robot. With this approach, the student’s learning is not only theoretical but also includes practicing what is being taught.
The advantage of using ROS in this procedure is that ROS already provides a lot of working infrastructure, so you, as a teacher, can concentrate on teaching the students how to create the small part that is required for the specific subject you are teaching.
We have so many tools at present that were not available just 5 years ago… Let’s make use of them to increase the quality and quantity of student learning! Instead of using PowerPoint slides, let’s use interactive Python notebooks. Instead of using a single robot for the whole class, let’s provide a robot simulation to each student. Instead of showing an equation on the screen, let’s give students its implementation in a ROS node for them to modify.
Students learn robotics with ROS
Not a new method
What I preach is what I practice. This is the method I am using in my Robot Navigation class at the University of LaSalle in Barcelona. In this class, I teach the basics of how to make a robot move autonomously from one point in space to another while avoiding obstacles. You know, SLAM, particle filters, Dynamic Window Approaches and the like. As part of the class, students must learn ROS and use it to implement some of the theoretical concepts I explain to them, for example, how to compute odometry based on the values provided by the encoders.
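To make that encoder exercise concrete, here is a minimal sketch of a dead-reckoning update for a differential-drive robot. The wheel radius, wheel separation and ticks-per-revolution values are placeholders I made up, not the parameters of any particular robot used in the class:

```python
import math

def update_odometry(x, y, theta, d_ticks_left, d_ticks_right,
                    ticks_per_rev=4096, wheel_radius=0.035, wheel_separation=0.23):
    """One dead-reckoning step for a differential-drive robot (illustrative values)."""
    # Distance travelled by each wheel since the last update.
    d_left = 2.0 * math.pi * wheel_radius * d_ticks_left / ticks_per_rev
    d_right = 2.0 * math.pi * wheel_radius * d_ticks_right / ticks_per_rev
    d_center = (d_left + d_right) / 2.0               # forward displacement
    d_theta = (d_right - d_left) / wheel_separation   # change in heading
    # Integrate the pose with a simple mid-point approximation.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

Wrapping this in a ROS node that subscribes to the encoder topic and publishes the resulting pose is exactly the kind of small, self-contained assignment the method relies on.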
However, I’m not the only one using this method to teach robotics. For example, professor Ross Knepper from Cornell University explained to me in Using ROS to teach the foundations of robotics (ROS Developers Podcast #23) how he teaches ROS and robotics in parallel. He even goes further, forbidding students from using much of the existing ROS software, such as the ROS navigation stack or MoveIt!. His point is a very good one: he wants the students to actually learn how to do it, not just how to use something somebody else has built (like the navigation stack). But he still uses the ROS infrastructure.
How to teach robotics with ROS
The method would involve the following steps:
You start with a ROS class that explains the most basic ROS concepts, just what is required to start working
Then you start explaining the robotics subject of your choice, making the students implement what you are teaching in a ROS program, using robots. I advise using simulated robots.
Whenever a new ROS concept is required to continue implementing some algorithm, dedicate a class to that concept.
Continue with the next robotics subject
Add a real-robot project where students must apply everything they have learned in a single project with a real ROS-based robot
Add an exam
1. Explain the basic ROS concepts
For this step, you should describe the basic subjects of ROS that would allow a student to create ROS programs. Those subjects include the following:
ROS packages
Launching ROS nodes
Basic ROS command-line tools
ROS topic subscribers
ROS topic publishers
Rviz for visualization and debugging
It is not necessary to dedicate too much time to those subjects, just enough for the students to get to know ROS. Deeper knowledge and understanding of ROS will come from practicing throughout the different classes of the semester. While creating the software that implements the theoretical concepts, the students will keep practicing these basics and ingrain them more deeply.
Very important: explain those concepts using simulated robots and have the students apply each concept to the simulated robot. For example, make the students read from a topic to get the distance to the closest obstacle, or write to a topic to make the robot move around. Do not just explain what a topic is and show some code on a slide. Make them actually connect to a producer of data, that is, the simulated robot.
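As an illustration of that kind of exercise, here is a minimal sketch, assuming a simulated mobile robot that publishes a laser scan on /scan and accepts velocity commands on /cmd_vel; the topic names and the 0.5 m threshold are assumptions that depend on the simulation you use:

```python
#!/usr/bin/env python
# First topic exercise (sketch): read the distance to the closest obstacle from
# the laser scan and publish velocity commands. Topic names are assumptions.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


def scan_callback(msg):
    closest = min(msg.ranges)
    rospy.loginfo("Closest obstacle at %.2f m", closest)
    cmd = Twist()
    cmd.linear.x = 0.2 if closest > 0.5 else 0.0  # stop when something is near
    cmd_pub.publish(cmd)


if __name__ == '__main__':
    rospy.init_node('first_topic_exercise')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()
```

Watching the simulated robot stop in front of a wall teaches what a topic is far better than a diagram on a slide.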
I would also recommend that at this stage you skip custom messages, services and action clients. That is too much for such an introductory class, and it is very unlikely you are going to need them in your next robotics class.
2. Explain and implement the robotics subject
You can now proceed to teach your robotics subject. I recommend dividing the class time in two:
The first part explains the theory of the robotics subject
The second part implements that theory: the students create a program that actually does what you explained.
It may happen that, in order to implement that theoretical part, the student needs a lot of pre-made code that supports the specific point you are teaching. That means you have to prepare your class even more thoroughly and provide the student with all that support code. The good news is that, by using ROS, it is very likely you will find that code already written by someone else. Hence, find or develop that code, and provide it as a ROS package to your students. More about where to find the code and how to provide it to your students below.
An interesting point to include here is one professor Knepper makes: he creates some exercises with deliberate errors, so the students learn to recognize situations where the system is not performing correctly and, more importantly, develop the skills needed to figure out how to solve those errors. Take that point into consideration too.
3. Add new ROS concept
At some point, you may need to create a custom ROS message for the robotics subject you are teaching. Or you may need a ROS service, because you want to provide, say, a face recognition service. Whatever you need to explain about ROS, now is the best moment. Use that necessity to explain the concept. What I mean is that explaining a concept like ROS action servers when the students have no idea what it could be used for is a bad idea. The optimal moment to explain a new ROS concept is when it is clear to everyone that the concept is needed.
I have found that explaining complex concepts like ROS services when the students don’t need them yet makes them very difficult to understand, even if you provide good examples of usage. The students cannot feel the pain of not having those concepts in their programs. Only when they are in a situation that actually requires the concept will they feel that pain, and only then does the knowledge get integrated into their heads.
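When that moment arrives, a tiny, concrete example is usually enough. Here is a minimal sketch of a ROS service server using the standard std_srvs/Trigger type; the node name, service name and reply text are placeholders of mine, standing in for whatever one-shot task (such as face recognition) the class needs:

```python
#!/usr/bin/env python
# Minimal ROS service sketch using the standard std_srvs/Trigger type.
# Node name, service name and reply text are illustrative placeholders.
import rospy
from std_srvs.srv import Trigger, TriggerResponse


def handle_trigger(request):
    # In class, this is where the one-shot task (e.g. face recognition) would run.
    return TriggerResponse(success=True, message="recognition finished")


if __name__ == '__main__':
    rospy.init_node('example_service_server')
    rospy.Service('/run_recognition', Trigger, handle_trigger)
    rospy.spin()
```

Students can then call it with rosservice call /run_recognition from the command line and immediately see why a request/reply interaction fits a service better than a topic.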
4. Continue with the next robotics subject
Keep pushing this way. Move to the next subject. Provide a different robot simulation. Ask the students to implement the concept. And keep going until you finish the whole course.
5. Real robots project
Testing with simulators is good because it provides a lifelike experience. However, the real test is when you use a real robot. Universities usually do not have the budget for a robot per student, which is why we have been promoting the use of simulations so much. However, some real-robot experience is necessary, and most universities can afford a few robots for the whole class. That is where the robotics project comes into action.
Define a robotics project that encapsulates the knowledge of the whole course. Have the students form groups that will test their results on the real robot. This approach has many advantages: students have to bring all the lessons together into a single application, they have to make it work on a real robot, and they have to work in teams.
In order to provide this step to the students, you will need to prepare the following:
A simulation of the environment where the real robot will operate. This allows the students to test most of their code in the simulator, which is faster. It also allows you to get some rest (otherwise the students may require your presence at the real-robot lab all the time).
You cannot avoid providing some hours per week for the students to practice with the real robot, so schedule that.
Finally, decide on a day when you will evaluate each team’s project. That is demo day. On demo day, each group has to show how their code makes the robot perform as it is supposed to.
For example, at LaSalle we use two Turtlebots for 15 students. The students form teams of two to do the project. Then, on demo day, their robot has to be able to serve coffee in a real coffee shop in Barcelona (thank you Costa Coffee for your support!). All the students go to the coffee shop with their programs, and we bring the two robots. While one team demonstrates on one robot, the next team prepares on the other.
In case you need ROS-certified robots, let me recommend my friends’ online shop: the ROS Components shop. That is where I buy robots or parts when I need them. No commission for me for recommending it! I just think they are great.
6. Evaluation
One very important point in the whole process is how to evaluate the students. In my opinion, the evaluation must be continuous; otherwise the students do nothing until the day before the exam, and afterwards everyone complains.
At LaSalle, I give a one-hour exam every month, covering the topics I have taught during that month and in the previous months.
In our case, the exams are completely practical. The students have to make the robot do something related to the subjects taught. For example, they have to make the robot build a map of the environment using an extended Kalman filter. I may provide the implementation of the extended Kalman filter for direct use, but the student must know how to use it: how to capture the proper data from the robot, how to feed that data to the filter, and how to use the output to build the map.
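To give an idea of what “knowing how to use it” means in practice, the structure of a student’s solution looks roughly like the sketch below. The filter class is a do-nothing stand-in for the implementation handed out in class, and the topic names are assumptions; only the data flow matters here:

```python
#!/usr/bin/env python
# Sketch of the data flow expected in the mapping exam. The real filter is
# provided by the teacher; a do-nothing stub stands in for it here, and all
# names (topics, methods) are illustrative rather than the actual exam code.
import rospy
from nav_msgs.msg import Odometry
from sensor_msgs.msg import LaserScan


class ProvidedEKF(object):
    """Stand-in for the extended Kalman filter handed out in class."""
    def predict(self, twist):
        pass  # propagate the state estimate with the latest velocity

    def update(self, ranges):
        pass  # correct the state estimate with the latest observation


ekf = ProvidedEKF()


def odom_callback(msg):
    ekf.predict(msg.twist.twist)   # prediction step driven by odometry


def scan_callback(msg):
    ekf.update(msg.ranges)         # correction step driven by the laser scan


if __name__ == '__main__':
    rospy.init_node('ekf_mapping_exam')
    rospy.Subscriber('/odom', Odometry, odom_callback)
    rospy.Subscriber('/scan', LaserScan, scan_callback)
    rospy.spin()
```

The exam then evaluates whether the students wire these pieces together correctly and turn the filter’s output into a usable map.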
As an example, here you can find a ROSject containing the last exam I gave to the students about dead reckoning navigation. The ROSject contains everything they needed to do the exam, including instructions, scores, and robot simulations. That leads us to the next point.
How to provide the material to the students
If I have convinced you to use a practical method to teach robotics with ROS, at this point you are probably concerned about two things:
How am I going to create all that material?
How can I provide my students with a running environment in which to execute all those simulations and that ROS code?
Very good concerns.
Preparing the material is a lot of work. And, more importantly, there is a lot of risk: when you prepare such material, there is a high probability that what you prepared will not work on a student’s computer. That is why I propose using an online platform to prepare all the material and to share it inside the same platform, so you can be sure it will work no matter who uses it.
I propose to use the ROS Development Studio. That is a platform that we have developed at our company, The Construct, for the easy creation, testing and distribution of ROS code.
The ROS Development Studio (or ROSDS for short) allows you to create the material using a web browser on any type of computer. It already provides the simulations you may need (and you can also add your own). It provides a Python notebook structure that you can fill with the material for the students, and it allows you to include any code your students may require in the form of ROS packages.
But the most interesting point of the ROSDS is that it allows you to share all your material by sending a simple web link to the students. That is what we call a ROSject. A ROSject is a complete ROS project that includes simulations, notebooks and code in a single web link. Whenever you provide this link to anybody, they get a copy of the whole content and can execute it under exactly the same conditions in which you created it.
This sharing feature also makes it very easy for students to share their programs with you for evaluation in exams, or for you to help them when they get stuck while studying. Since the ROSject contains all the material, it is very easy for you to correct an exam under the same conditions in which the student created the program, without having to copy files or set up your computer environment. Just tell the students to share their ROSject link with you.
We use ROSjects at LaSalle to provide the different lessons and exercises. For example, we have a ROSject with the simulation of the coffee shop where the students will do the project evaluation. We also use ROSjects to create the exams (as you can see in the previous section).
Summarizing: ROSjects allow you to encapsulate each lesson in a complete unit that includes the explanations, the simulations and the code. And all of that for free.
Still, you will have to create the ROSjects for your own classes. This situation should end in the near future, as more and more universities create their ROSjects and publish them online for anybody to use.
However, if you do not want to wait until they exist, I can recommend our online academy, the Robot Ignite Academy, where we provide ready-made ROS-based courses about deep learning, manipulation, robot perception, and more. The only drawback is that the academy has a certain cost per student. Still, it is highly recommended because it simplifies your preparation. We use the Robot Ignite Academy at LaSalle, and many other universities around the world use it too, such as Clarkson University (USA), University of Michigan (USA), Chukyo University (Japan), University of Sydney (Australia), University of Luxembourg (Luxembourg), Université de Reims (France) and University of Alicante (Spain).
Additional topics you may need to cover
Finally, I would like to make a point about a problem that I have identified when using this method.
I find that most of the people who come to our Robot Navigation class have zero knowledge of programming in either Python or C++, and no knowledge of how to use the Linux shell. After interviewing several robotics teachers around the world, I found that this is also the case in other countries.
If that is your case, I would first recommend that you reduce the programming scope to Python. Do not go for C++: it is too complex to start teaching alongside robotics, while Python is very powerful and easy to learn.
I usually create an initial class about Python and the Linux shell. I teach this class and then, the very next week, give the students an exam on Python and the Linux shell. The purpose of the exam is to stress to the students the importance of mastering those programming skills. They are the basis for everything else.
Another problem you may find is that the students have trouble understanding English. Most ROS documentation is in English, and the way to communicate in the ROS community is in English (whether we like it or not). You may feel tempted to write the course notes in your mother tongue and to provide documentary resources only in your own language. I suggest you do not. Push your students to learn English. The community needs a common language, and at present it is English.
Webinar about this subject
If you want to know more and discuss the subject, let me suggest you come to the webinar we are holding on 29 November about this matter.
I started this post talking about the findings of Joel Esposito. If you listen to the podcast interview, you will see that his final conclusion is actually NOT to use ROS to teach robotics. I’m sure there are other teachers with the same opinion. Other teachers, like professor Knepper, advocate the opposite. Those are points of view, like mine in this article. I recommend you listen to the podcast interview with Joel so you can understand why he suggests that.
It is up to you to decide. You now have several opinions here. Which one works best for you? Please leave your answer in the comments so we can start a debate about it.
The Nuclear and Applied Robotics Group, based at the University of Texas at Austin, has a mission to “develop and deploy advanced robotics in hazardous environments to minimize risk for the human operator.”
Funded by a five-year, $3.5 million NIH grant, the academic-industry partnership aims to develop an MRI-compatible robotic technology to provide minimally invasive brain tumor therapy that is ready for clinical trials.
Look around and you’ll likely see something that runs on an electric motor. Powerful and efficient, they keep much of our world moving, everything from our computers to refrigerators to the automatic windows in our cars. But these qualities change for the worse when such motors are shrunk down to sizes smaller than a cubic centimeter.
“At very small scales, you get a heater instead of a motor,” said Jakub Kedzierski, a staff member in MIT Lincoln Laboratory’s Chemical, Microsystem, and Nanoscale Technologies Group. Today, no motor exists that is both highly efficient and powerful at microsizes. And that’s a problem, because motors on that scale are needed to put miniaturized systems into motion — microgimbals that can point lasers to a fraction of a degree over thousands of miles, tiny drones that can squeeze into wreckage to find survivors, or even bots that can crawl through the human digestive tract.
To help power systems like these, Kedzierski and his team are making a new type of motor called a microhydraulic actuator. The actuators move with a level of precision, efficiency, and power that has not yet been possible at the microscale. A paper describing this work was published in the September 2018 issue of Science Robotics.
The microhydraulic actuators use a technique called electrowetting to achieve motion. Electrowetting applies an electrical voltage to water droplets on a solid surface to distort the surface tension of the liquid. The actuators take advantage of this distortion to force water droplets inside of the actuator to move, and with them, the entire actuator.
“Think about a droplet of water on a window; the force of gravity distorts it, and it moves down,” said Kedzierski. “Here, we use voltage to cause the distortion, which in turn produces motion.”
The actuator is constructed in two layers. The bottom layer is a sheet of metal with electrodes stamped into it. This layer is covered with a dielectric, an insulator that becomes polarized when an electric field is applied. The top layer is a sheet of polyimide, a strong plastic, that has shallow channels drilled into it. The channels guide the path of dozens of water droplets that are applied in between the two layers and are aligned with the electrodes. To hold off evaporation, the water is premixed with a solution of lithium chloride, which depresses the water’s vapor pressure enough for the micrometer-sized droplets to last for months. The droplets keep their rounded shape (instead of being squashed between the layers) due to their surface tension and relatively small size.
The actuator comes to life when voltage is applied to the electrodes, though not to all of them at once. It’s done in a cycle of turning on two electrodes per droplet at a time. With no voltage, a single water droplet rests neutrally on two electrodes, 1 and 2. But apply a voltage to electrodes 2 and 3, and suddenly the droplet is deformed, stretching to touch the energized electrode 3 and pulling off of electrode 1.
This horizontal force in one droplet isn’t enough to move the actuator. But with this voltage cycle being applied in unison to the electrodes underneath every drop in the array, the entire polyimide layer slides over to appease the drops’ attraction to the energized electrodes. Keep cycling the voltage through, and droplets continue to walk over the electrodes and the layer continues to slide over; turn the voltage off, and the actuator stops in its tracks. The voltage, then, becomes a powerful tool to precisely control the actuator’s movement.
But how does the actuator stand up against other types of motors? The two metrics used to measure performance are power density, the amount of power the motor produces relative to its weight, and efficiency, the measure of wasted energy. One of the best electric motors in terms of efficiency and power density is the motor of the Tesla Model S sedan. When the team tested the microhydraulic actuators, they found them to be just behind the Model S in power density (at 0.93 kilowatts per kilogram) and efficiency (60 percent efficient at maximum power density). They widely exceeded piezoelectric actuators and other types of microactuators.
“We’re excited because we are meeting that benchmark, and we are still improving as we scale to smaller sizes,” Kedzierski said. The actuators improve at smaller sizes because surface tension remains the same regardless of the water droplet size — and smaller droplets make room for even more droplets to squeeze in and exert their horizontal force on the actuator. “Power density just shoots up. It’s like having a rope whose strength doesn’t weaken as it gets thinner,” he added.
The latest actuator, the one edging close to the Model S, had a separation of 48 micrometers between droplets. The team is now shrinking that down to 30 micrometers. They project that, at that scale, the actuator will match the Tesla Model S in power density, and, at 15 micrometers, eclipse it.
Scaling the actuators down is just one part of the equation. The other aspect the team is actively working on is 3-D integration. Right now, a single actuator is a two-layer system, thinner than a plastic bag and flexible like one too. They want to stack the actuators in a scaffold-like system that can move in three dimensions.
Kedzierski envisions such a system mimicking our bodies’ muscle matrix, the network of tissues that allow our muscles to achieve instantaneous, powerful, and flexible motion. Ten times more powerful than muscle, the actuators were inspired by muscle in many ways, from their flexibility and lightness to their composition of fluid and solid components.
And just as muscle is an excellent actuator at the scale of an ant or an elephant, these microhydraulic actuators, too, could have a powerful impact not just at the microscale, but at the macro.
“One might imagine,” said Eric Holihan, who has been assembling and testing the actuators, “the technology being applied to exoskeletons,” built with the actuators working as lifelike muscle, configured into flexible joints instead of gears. Or an aircraft wing could shapeshift on electrical command, with thousands of actuators sliding past each other to change the wing’s aerodynamic form.
While their imaginations are churning, the team faces challenges in developing large systems of the actuators. One challenge is how to distribute power at that volume. A parallel effort at the laboratory that is developing microbatteries to integrate with the actuators could help solve that issue. Another challenge is how to package the actuators so that evaporation is eliminated.
“Reliability and packaging will continue to be the predominant questions posed to us about the technology until we demonstrate a solution,” said Holihan. “This is something that we look to attack head-on in the coming months.”
With an industry-leading manufacturing process, Universal Robots can meet delivery deadlines for the Section 179 tax deduction incentive, which can dramatically reduce the cost of collaborative robots used to address repetitive and difficult-to-fill manufacturing jobs.
Japan's affection for robots is no secret. But is the feeling mutual in the country's amazing androids? Roboticists are now a step closer to giving androids more expressive faces with which to communicate.
Music plays an important role in most people's lives, regardless of genre, and in a wide variety of contexts, from celebrations and parties to simply providing background while a task is being performed. Until very recently, music was only heard when musicians played it live; the ability to record music displaced that live performance to some degree, and then the invention of electronic musical instruments and digitisation changed our appreciation of music yet again.
The company is expected to dissolve the project later this year and relocate employees involved in development to other departments within Alphabet or help them find jobs elsewhere.
The newest version of MVTec HALCON is here to solve all of your machine vision tasks with the utmost speed and robustness! Deep learning functions, like pixel-precise semantic segmentation or object detection, help you identify and classify objects and flaws more flexibly and easily than ever before – HALCON extracts relevant image information automatically. Try it for free here!
Industry 4.0 represents a giant leap in the evolution of machines, an evolution that has already driven leaps in productivity and quality of life over several hundred years. Today, our machines are connected to each other, to their physical environments, and to people.
Autonomous mobile robots are a simple, efficient and cost-effective way to automate material handling and in-house transportation tasks in nearly any situation where employees would previously have been required to push carts around the facility.