Archive 31.05.2018


ANYbotics wins ICRA 2018 Robot Launch competition!

The four-legged design of ANYmal allows the robot to conquer difficult terrain such as gravel, sand, and snow. Photo credit: ETH Zurich / Andreas Eggenberger.

ANYbotics led the way in the ICRA 2018 Robot Launch Startup Competition on May 22, 2018 at the Brisbane Convention & Exhibition Centre in Australia. Although ANYbotics pitched last of the 10 startups presenting, they clearly won over the judges and audience. As competition winners, ANYbotics received a $3,000 prize from QUT bluebox, Australia’s robotics accelerator (currently taking applications for 2018!), plus Silicon Valley Robotics membership and mentoring from The Robotics Hub.

ANYbotics is a Swiss startup creating fabulous four-legged robots such as ANYmal, along with their core component, ANYdrive, a highly integrated modular robotic joint actuator. Founded in 2016 by a group of ETH Zurich engineers, ANYbotics is a spin-off company of the Robotic Systems Lab (RSL) at ETH Zurich.

ANYmal moves and operates autonomously in challenging terrain and interacts safely with the environment. As a multi-purpose robot platform, it is applicable on industrial indoor or outdoor sites for inspection and manipulation tasks, in natural terrain or debris areas for search and rescue tasks, or on stage for animation and entertainment. Its four legs allow the robot to crawl, walk, run, dance, jump, climb, carry — whatever the task requires.

ANYdrive is a highly integrated modular robotic joint actuator that guarantees

  • very precise, low-impedance torque control,
  • high impact robustness,
  • safe interaction, and
  • intermittent energy storage and peak power amplification.

Motor, gear, titanium spring, sensors, and motor electronics are incorporated in a compact, sealed (IP67) unit and connected by an EtherCAT and power bus. With ANYdrive joint actuators, any kinematic structure such as a robot arm or leg can be built without additional bearings, encoders or power electronics.

ANYdrive’s innovative design allows for highly dynamic movements and collision maneuvers without damage from impulsive contact forces, and at the same time for highly sensitive, force-controlled interaction with the environment. This is of special interest for robots that interact with humans, such as collaborative and mobile robots.
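For readers curious how a spring-based actuator turns spring deflection into controlled torque, here is a minimal sketch of series-elastic torque control under assumed names, gains and numbers; it illustrates the general principle only and is not ANYbotics’ implementation.

```python
# A minimal sketch of series-elastic torque control: joint torque is
# inferred from the deflection of the spring between motor and joint and
# regulated with a PI loop. All names, gains, and numbers are illustrative
# assumptions, not ANYbotics firmware.

from dataclasses import dataclass


@dataclass
class SeriesElasticActuator:
    spring_stiffness: float  # [Nm/rad] stiffness of the spring element
    kp: float                # proportional gain of the torque loop
    ki: float                # integral gain of the torque loop
    integral: float = 0.0    # running integral of the torque error

    def measured_torque(self, motor_angle: float, joint_angle: float) -> float:
        """Torque transmitted through the spring = stiffness * deflection."""
        return self.spring_stiffness * (motor_angle - joint_angle)

    def torque_control_step(self, torque_setpoint: float, motor_angle: float,
                            joint_angle: float, dt: float) -> float:
        """One PI update; returns the command sent to the motor driver."""
        error = torque_setpoint - self.measured_torque(motor_angle, joint_angle)
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral


# Example: request 10 Nm from a joint whose spring is slightly deflected.
actuator = SeriesElasticActuator(spring_stiffness=400.0, kp=2.0, ki=30.0)
command = actuator.torque_control_step(torque_setpoint=10.0,
                                       motor_angle=0.05, joint_angle=0.02,
                                       dt=0.001)
print(f"motor command: {command:.2f}")
```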

ICRA 2018 finalists and judges: Roland Siegwart from ETH Zurich, Juliana Lim from SG Innovate, Yotam Rosenbaum from QUT bluebox, Martin Duursma from Main Sequence Ventures and Chris Moehle from The Robotics Hub Fund.

The ICRA 2018 Robot Launch Startup Competition was judged by experienced roboticists, investors and entrepreneurs. Roland Siegwart is a professor at ETH Zurich’s Autonomous Systems Lab and cofounder of many successful robotics spin-offs. Juliana Lim is Head of Talent at SG Innovate, a Singapore venture capital arm specializing in pre-seed, seed, early-stage, and Series A investments in deep technologies, starting with artificial intelligence (AI) and robotics.

Yotam Rosenbaum is the ICT Entrepreneur in Residence at QUT bluebox, building on successful exits from global startups. Martin Duursma is a venture partner at Main Sequence Ventures, Australia’s new innovation fund specializing in AI, robotics and deep tech such as biotech, quantum computing and the space industry. Chris Moehle is the managing partner at The Robotics Hub Fund, which may invest up to $250,000 in the overall winner of the Robot Launch Startup Competition 2018.

Organized by Silicon Valley Robotics, the Robot Launch competition is in its fifth year and has seen hundreds of startups from more than 20 countries around the globe. The MC for the evening, Silicon Valley Robotics Director Andra Keay, said: “Some of the best robotics startups come from places like Switzerland or Australia, but to get funding and to grow fast, they usually need to spend some time in Silicon Valley.”

“The Robot Launch competition allows us to reach startups from all over the world and get them in front of top investors. Many of these startups have gone on to win major events and awards like TechCrunch Battlefield and CES Innovation Awards. So we know that robotics is also coming of age.”

As well as ANYbotics, the other nine startups gave great pitches. In order of appearance, they were:

  • Purple Robotics
  • Micromelon Robotics
  • EXGwear
  • HEBI Robotics
  • Abyss Solutions
  • EyeSyght
  • Niska Retail Robotics
  • Aubot
  • Sevensense

Purple Robotics creates drones for work that, thanks to their innovative design, fly three times longer than, or carry three times the payload of, existing commercial drones. They are not standard quadcopters, but they use the same battery technology. Purple Robotics drones are also gust-resistant, providing maximum stability in the air and enabling them to fly closer to structures.

Micromelon creates a seamless integration between visual and text coding, with the ability to translate between the two languages in real time. Students and teachers are able to quickly begin programming the wireless robots. The teacher dashboard and software are designed to work together to assist teachers who may have minimal coding experience to guide a class of students through the transition. Students can backtrack to blocks, see how the program looks as text, or view both at once, so they are supported throughout the entire journey.

EXGwear is currently developing a “hands-free”, intuitive interaction method in the form of a wearable device that is extremely compact, non-obtrusive, and comfortable to wear for long hours, helping people with disabilities solve their daily interaction problems with the environment. Its first product, EXGbuds, is a customizable earbud-like device based on patent-pending biosensing technology and a machine-learning-enabled app. It can measure eye-movement and facial-expression physiological signals with extremely high accuracy to generate user-specific actionable commands for seamless interaction with smart IoT and robotic devices.

HEBI Robotics produces Lego-like robotic building blocks. Its platform consists of hardware and software that make it easy to design, build and program world-class robots quickly. The hardware platform is robust, flexible, and safe, while the cross-platform software development tools take care of the difficult math required to develop a robot, so that the roboticist can focus on the creative aspects of robot design.

Abyss Solutions delivers key innovations in remotely operated vehicles (ROVs) and sensor technology to collect high-fidelity, multi-modal data comprehensively across underwater inspections. By pushing the state of the art in machine learning and data analytics, accurate and efficient condition assessments can be conducted and used to build an asset database. The database grows over repeat inspections, and the objectivity of the analytics enables automated change tracking. The output is a comprehensive asset representation that enables efficient risk management for critical infrastructure.

EyeSyght is TV for your fingers. As humans, we use our senses to gather information, analyse the environment around us and create a mental picture of our surroundings. But what about touch? When we operate our smartphones, tablets and computers, we interact with a flat piece of glass. Now, through the use of haptic feedback, electrical impulses and ultrasound, EyeSyght will enable any surface to render shapes, textures, depth, and much more.

Niska Retail Robotics is reimagining retail, starting with ice cream. “Customer demands are shifting away from products and towards services and experiences.” (CSIRO, 2017) Niska creates wonderful customer experiences with robot servers scooping out delicious gourmet ice cream for you, 24/7.

Aubot (‘au’ means ‘to meet’ in Japanese; pronounced “our-bot”) is focused on building robots that help us in our everyday lives. The company was founded in April 2013 by Marita Cheng, Young Australian of the Year 2012. Its first product, Teleport, is a telepresence robot that reduces people’s need to travel while allowing them greater freedom to explore new surroundings. In the future, Aubot aims to combine Teleport with its robotic arm, Jeva, to create a telepresence robot with an arm attached.

Sevensense (still based at the ETH Zurich Autonomous Systems Lab) provides a visual localization system tailored to the needs of professional service robots. The use of cameras instead of laser rangefinders enables the product to perform more reliably, particularly in dynamic and geometrically ambiguous environments, and allows for a cost advantage. In addition, the company offers market-specific application modules along with the engineering services to successfully apply the product on the customer’s machinery.

We thank all the startups for sharing their pitches with us – the main hall at ICRA was packed and we look forward to hearing from more startups in the next rounds of Robot Launch 2018.

#261: Cozmo, by Anki, with Andrew Neil Stein

In this episode, Abate interviews Andrew Stein from Anki. At Anki, they developed Cozmo, an engaging, lifelike, palm-sized robot that packs sophisticated robotic software. Cozmo recognizes people and objects around him and plays games with them. Cozmo is unique in that a large amount of development has gone into making his animations and behavior feel natural, in addition to the focus on classical robotic elements such as computer vision and object manipulation.

Andrew Neil Stein

Andrew Stein is the Head of Robotics & AI at Anki, where he began working on the Cozmo project more than four years ago as the team’s first member. He has contributed to several core systems of the product, including vision, cube manipulation, animation streaming, localization, high-level behaviors, and low-level actions. He earned his Ph.D. from the Robotics Institute at Carnegie Mellon University, and his Bachelor’s and Master’s degrees in Electrical and Computer Engineering from the Georgia Institute of Technology.

Nearly 1000 research videos from #ICRA2018

The International Conference on Robotics and Automation (ICRA) is the IEEE Robotics and Automation Society’s flagship conference and a premier international forum for robotics researchers to present their work. ICRA 2018 is just wrapping up in Brisbane, Australia.

Robohub will be bringing you stories and podcasts in the weeks ahead.

In the meantime, have a look at the #ICRA2018 tweets and nearly 1000 research spotlight videos from the conference!


Garbage-collecting aqua drones and jellyfish filters for cleaner oceans

An aqua drone developed by the WasteShark project can collect litter in harbors before it gets carried out into the open sea. Image credit – WasteShark

By Catherine Collins

The cost of sea litter in the EU has been estimated at up to €630 million per year. It is mostly composed of plastics, which take hundreds of years to break down in nature, and has the potential to affect human health through the food chain because plastic waste is eaten by the fish that we consume.

‘I’m an accidental environmentalist,’ said Richard Hardiman, who runs a project called WASTESHARK. He says that while walking at his local harbour one day he stopped to watch two men struggle to scoop litter out of the sea using a pool net. Their inefficiency bothered Hardiman, and he set about trying to solve the problem. It was only when he delved deeper into the issue that he realised how damaging marine litter, and plastic in particular, can be, he says.

‘I started exploring where this trash goes – ocean gyres (circular currents), junk gyres, and they’re just full of plastic. I’m very glad that we’re now doing something to lessen the effects,’ he said.

Hardiman developed an unmanned robot, an aqua drone that cruises around urban waters such as harbours, marinas and canals, eating up marine litter like a Roomba of the sea. The waste is collected in a basket which the WasteShark then brings back to shore to be emptied, sorted and recycled.

The design of the autonomous drone is modelled on a whale shark, the ocean’s largest known fish. These giant filter feeders swim around with their mouths open and lazily eat whatever crosses their path.

The WasteShark is powered by rechargeable electric batteries, ensuring that it doesn’t pollute the environment through oil spillage or exhaust fumes, and it is relatively silent, avoiding noise pollution. It produces zero carbon emissions and moves quite slowly, allowing fish and birds to simply swim away when it gets too close for comfort.

‘We’ve tested it in areas of natural beauty and natural parks where we know it doesn’t harm the wildlife,’ said Hardiman. ‘We’re quite fortunate in that, all our research shows that it doesn’t affect the wildlife around.’

WasteShark’s autonomous drone is modelled on a whale shark. Credit – RanMarine Technology

WasteShark is one of a number of new inventions designed to tackle the problem of marine litter. A project called CLAIM is developing five different kinds of technology, one of which is a plasma-based tool called a pyrolyser. 

Useful gas

CLAIM’s pyrolyser will use heat treatment to break down marine litter into a useful gas. Plasma is essentially ionised gas, capable of reaching temperatures of thousands of degrees. Such heat can break the chemical bonds between atoms, converting waste into a type of gas called syngas.

The pyrolyser will be mounted onto a boat collecting floating marine litter – mainly large items of plastic which, if left in the sea, will decay into microplastic – so that the gas can then be used as an eco-friendly fuel to power the boat, or to provide energy for heating in ports.

Dr Nikoleta Bellou of the Hellenic Centre for Marine Research, one of the project coordinators of CLAIM, said: ‘We know that we humans are actually the key drivers for polluting our oceans. Unlike organic material, plastic never disappears in nature and it accumulates in the environment, especially in our oceans. It poses a threat not only to the health of our oceans and to the coasts but to humans, and has social, economic and ecological impacts.’

The researchers chose areas in the Mediterranean and Baltic Seas to act as their case studies throughout the project, and will develop models that can tell scientists which areas are most likely to become litter hotspots. A range of factors influence how littered a beach may be – it’s not only affected by litter louts in the surrounding area but also by circulating winds and currents which can carry litter great distances, dumping the waste on some particular beaches rather than others.

CLAIM’s other methods to tackle plastic pollution include a boom – a series of nets criss-crossing a river that catches all the large litter that would otherwise travel to the sea. The nets are then emptied and the waste is collected for treatment with the pyrolyser. There have been problems with booms in the past, when bad weather conditions cause the nets to overload and break, but CLAIM will use automated cameras and other sensors that could alert relevant authorities when the nets are full.

Microplastics

Large plastic pieces that can be scooped out of the water are one thing, but tiny particles known as microplastics that are less than 5mm wide pose a different problem. Scientists on the GoJelly project are using a surprising ingredient to create a filter that prevents microplastics from entering the sea – jellyfish slime.

The filter will be deployed at waste water management plants, a known source of microplastics. The method has already proven to be successful in the lab, and now GoJelly is planning to upscale the biotechnology for industrial use.

Dr Jamileh Javidpour of the GEOMAR Helmholtz Centre for Ocean Research Kiel, who coordinates the project, said: ‘We have to be innovative to stop microplastics from entering the ocean.’

The GoJelly project kills two birds with one stone – tackling the issue of microplastics while simultaneously addressing the problem of jellyfish blooms, where the creatures reproduce in high enough levels to blanket an area of ocean.

Jellyfish are one of the most ancient creatures on the planet, having swum in Earth’s oceans during the time of the dinosaurs. On the whole, due to a decline in natural predators and changes in the environment, they are thriving. When they bloom, jellyfish can attack swimmers and fisheries.

Fishermen often throw caught jellyfish back into the sea as a nuisance but, according to Dr Javidpour, jellyfish can be used much more sustainably. Not only can their slime be used to filter out microplastics, they can also be used as feed for aquaculture, for collagen in anti-ageing products, and even in food.

In fact, part of the GoJelly project involves producing a cookbook, showing people how to make delicious dishes from jellyfish. While Europeans may not be used to cooking with jellyfish, in many Asian cultures they are a daily staple. However, Dr Javidpour stresses that the goal is not to replace normal fisheries.

‘We are mainly ecologists, we know the role of jellyfish as part of a healthy ecosystem,’ she said. ‘We don’t want to switch from classical fishery to jellyfish fishery, but it is part of our task to investigate if it is doable, if it is sustainable.’

The research in this article has been funded by the EU.

Fleet of autonomous boats could service some cities, reducing road traffic

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Senseable City Lab have designed a fleet of autonomous boats that offer high maneuverability and precise control.
Courtesy of the researchers
By Rob Matheson

The future of transportation in waterway-rich cities such as Amsterdam, Bangkok, and Venice — where canals run alongside and under bustling streets and bridges — may include autonomous boats that ferry goods and people, helping clear up road congestion.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Senseable City Lab in the Department of Urban Studies and Planning (DUSP), have taken a step toward that future by designing a fleet of autonomous boats that offer high maneuverability and precise control. The boats can also be rapidly 3-D printed using a low-cost printer, making mass manufacturing more feasible.

The boats could be used to taxi people around and to deliver goods, easing street traffic. In the future, the researchers also envision the driverless boats being adapted to perform city services overnight, instead of during busy daylight hours, further reducing congestion on both roads and canals.

“Imagine shifting some of infrastructure services that usually take place during the day on the road — deliveries, garbage management, waste management — to the middle of the night, on the water, using a fleet of autonomous boats,” says CSAIL Director Daniela Rus, co-author on a paper describing the technology that’s being presented at this week’s IEEE International Conference on Robotics and Automation.

Moreover, the boats — rectangular 4-by-2-meter hulls equipped with sensors, microcontrollers, GPS modules, and other hardware — could be programmed to self-assemble into floating bridges, concert stages, platforms for food markets, and other structures in a matter of hours. “Again, some of the activities that are usually taking place on land, and that cause disturbance in how the city moves, can be done on a temporary basis on the water,” says Rus, who is the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science.

The boats could also be equipped with environmental sensors to monitor a city’s waters and gain insight into urban and human health.

Co-authors on the paper are: first author Wei Wang, a joint postdoc in CSAIL and the Senseable City Lab; Luis A. Mateos and Shinkyu Park, both DUSP postdocs; Pietro Leoni, a research fellow, and Fábio Duarte, a research scientist, both in DUSP and the Senseable City Lab; Banti Gheneti, a graduate student in the Department of Electrical Engineering and Computer Science; and Carlo Ratti, a principal investigator and professor of the practice in the DUSP and director of the MIT Senseable City Lab.

Better design and control

The work was conducted as part of the “Roboat” project, a collaboration between the MIT Senseable City Lab and the Amsterdam Institute for Advanced Metropolitan Solutions (AMS). In 2016, as part of the project, the researchers tested a prototype that cruised around the city’s canals, moving forward, backward, and laterally along a preprogrammed path.

The ICRA paper details several important innovations: a rapid fabrication technique, a more efficient and agile design, and advanced trajectory-tracking algorithms that improve control, precision docking and latching, and other tasks.

To make the boats, the researchers 3-D-printed a rectangular hull with a commercial printer, producing 16 separate sections that were spliced together. Printing took around 60 hours. The completed hull was then sealed by adhering several layers of fiberglass.

Integrated onto the hull are a power supply, Wi-Fi antenna, GPS, and a minicomputer and microcontroller. For precise positioning, the researchers incorporated an indoor ultrasound beacon system and outdoor real-time kinematic GPS modules, which allow for centimeter-level localization, as well as an inertial measurement unit (IMU) module that monitors the boat’s yaw and angular velocity, among other metrics.

The boat has a rectangular shape, instead of the traditional kayak or catamaran shapes, to allow the vessel to move sideways and to attach itself to other boats when assembling other structures. Another simple yet effective design element was thruster placement. The four thrusters are positioned at the center of each side, instead of at the four corners, generating forward and backward forces. This makes the boat more agile and efficient, the researchers say.
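To illustrate why centred thrusters give the hull full planar control, the sketch below distributes a desired surge force, sway force and yaw moment across four side-mounted thrusters using a simple least-squares allocation. The layout, frame convention and numbers are assumptions made for the example, not the published Roboat design.

```python
# Sketch of thruster allocation for a rectangular hull with one thruster at
# the centre of each side. The layout, frame convention (x forward, y to
# port) and numbers are assumptions for illustration, not the published
# Roboat parameters.

import numpy as np

L, W = 4.0, 2.0  # hull length and width in metres (from the article)

# Columns: left, right thrusters (push along the hull), front, back
# thrusters (push sideways). Rows: surge force, sway force, yaw moment
# (x * Fy - y * Fx) about the hull centre.
B = np.array([
    [1.0,    1.0,   0.0,   0.0],
    [0.0,    0.0,   1.0,   1.0],
    [-W / 2, W / 2, L / 2, -L / 2],
])


def allocate(desired_wrench: np.ndarray) -> np.ndarray:
    """Least-squares thruster forces that realise [surge, sway, yaw moment]."""
    return np.linalg.pinv(B) @ desired_wrench


# Example: 20 N forward plus a 5 Nm turning moment.
print(allocate(np.array([20.0, 0.0, 5.0])))
```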

The team also developed a method that enables the boat to track its position and orientation more quickly and accurately. To do so, they built an efficient version of a nonlinear model predictive control (NMPC) algorithm, generally used to control and navigate robots within various constraints.

The NMPC and similar algorithms have been used to control autonomous boats before. But typically those algorithms are tested only in simulation or don’t account for the dynamics of the boat. The researchers instead incorporated into the algorithm simplified nonlinear mathematical models that account for a few known parameters, such as drag of the boat, centrifugal and Coriolis forces, and added mass due to accelerating or decelerating in water. The researchers also used an identification algorithm that estimates any unknown parameters as the boat is trained on a path.

Finally, the researchers used an efficient predictive-control platform to run their algorithm, which can rapidly determine upcoming actions and increases the algorithm’s speed by two orders of magnitude over similar systems. While other algorithms execute in about 100 milliseconds, the researchers’ algorithm takes less than 1 millisecond.
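For a sense of how such a controller works, the sketch below rolls a simplified planar boat model, with linear drag and Coriolis-style coupling, out over a two-second horizon and lets a generic optimizer choose the thrust sequence that keeps the boat on a reference path. The model structure, parameters, cost weights and solver are illustrative assumptions and do not reproduce the paper’s identified model or its fast custom solver.

```python
# Compact sketch of nonlinear model-predictive trajectory tracking for a
# planar surface vessel: roll a simplified dynamics model out over a short
# horizon and pick the thrust sequence that minimises tracking error.
# All parameters and weights below are toy values chosen for illustration.

import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.2, 10                 # 0.2 s control period, 2 s look-ahead
MASS, INERTIA = 15.0, 3.0             # toy mass [kg] and yaw inertia [kg m^2]
DRAG = np.array([6.0, 8.0, 2.0])      # linear drag in surge, sway, yaw


def step(state, control):
    """One Euler step of a simplified planar boat model.

    state   = [x, y, yaw, surge_vel, sway_vel, yaw_rate]
    control = [surge_force, sway_force, yaw_moment]
    """
    x, y, yaw, u, v, r = state
    fu, fv, mr = control
    dx = u * np.cos(yaw) - v * np.sin(yaw)          # body -> world velocity
    dy = u * np.sin(yaw) + v * np.cos(yaw)
    du = (fu - DRAG[0] * u + MASS * v * r) / MASS   # thrust, drag, Coriolis
    dv = (fv - DRAG[1] * v - MASS * u * r) / MASS
    dr = (mr - DRAG[2] * r) / INERTIA
    return state + DT * np.array([dx, dy, r, du, dv, dr])


def nmpc_control(state, reference):
    """Solve a short-horizon tracking problem and return the first control."""
    def cost(flat):
        controls = flat.reshape(HORIZON, 3)
        s, total = state.copy(), 0.0
        for k in range(HORIZON):
            s = step(s, controls[k])
            total += np.sum((s[:2] - reference[k]) ** 2)   # position error
            total += 1e-3 * np.sum(controls[k] ** 2)       # control effort
        return total

    result = minimize(cost, np.zeros(HORIZON * 3), method="L-BFGS-B",
                      bounds=[(-30.0, 30.0)] * (HORIZON * 3))
    return result.x.reshape(HORIZON, 3)[0]


# Example: from rest, track a straight line along the x-axis.
state = np.zeros(6)
reference = np.array([[0.3 * (k + 1), 0.0] for k in range(HORIZON)])
print("first thrust command:", nmpc_control(state, reference))
```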

Testing the waters

To demonstrate the control algorithm’s efficacy, the researchers deployed a smaller prototype of the boat along preplanned paths in a swimming pool and in the Charles River. Over the course of 10 test runs, the researchers observed average tracking errors — in positioning and orientation — smaller than tracking errors of traditional control algorithms.

That accuracy is thanks, in part, to the boat’s onboard GPS and IMU modules, which determine position and direction, respectively, down to the centimeter. The NMPC algorithm crunches the data from those modules and weighs various metrics to steer the boat true. The algorithm is implemented in a controller computer and regulates each thruster individually, updating every 0.2 seconds.

“The controller considers the boat dynamics, current state of the boat, thrust constraints, and reference position for the coming several seconds, to optimize how the boat drives on the path,” Wang says. “We can then find optimal force for the thrusters that can take the boat back to the path and minimize errors.”

The innovations in design and fabrication, as well as faster and more precise control algorithms, point toward feasible driverless boats used for transportation, docking, and self-assembling into platforms, the researchers say.

A next step for the work is developing adaptive controllers to account for changes in mass and drag of the boat when transporting people and goods. The researchers are also refining the controller to account for wave disturbances and stronger currents.

“We actually found that the Charles River has much more current than in the canals in Amsterdam,” Wang says. “But there will be a lot of boats moving around, and big boats will bring big currents, so we still have to consider this.”

The work was supported by a grant from AMS.

Making driverless cars change lanes more like human drivers do

At the International Conference on Robotics and Automation tomorrow, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new lane-change algorithm.
By Larry Hardesty

In the field of self-driving cars, algorithms for controlling lane changes are an important topic of study. But most existing lane-change algorithms have one of two drawbacks: Either they rely on detailed statistical models of the driving environment, which are difficult to assemble and too complex to analyze on the fly; or they’re so simple that they can lead to impractically conservative decisions, such as never changing lanes at all.

At the International Conference on Robotics and Automation tomorrow, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new lane-change algorithm that splits the difference. It allows for more aggressive lane changes than the simple models do but relies only on immediate information about other vehicles’ directions and velocities to make decisions.

“The motivation is, ‘What can we do with as little information as possible?’” says Alyssa Pierson, a postdoc at CSAIL and first author on the new paper. “How can we have an autonomous vehicle behave as a human driver might behave? What is the minimum amount of information the car needs to elicit that human-like behavior?”

Pierson is joined on the paper by Daniela Rus, the Viterbi Professor of Electrical Engineering and Computer Science; Sertac Karaman, associate professor of aeronautics and astronautics; and Wilko Schwarting, a graduate student in electrical engineering and computer science.

“The optimization solution will ensure navigation with lane changes that can model an entire range of driving styles, from conservative to aggressive, with safety guarantees,” says Rus, who is the director of CSAIL.

One standard way for autonomous vehicles to avoid collisions is to calculate buffer zones around the other vehicles in the environment. The buffer zones describe not only the vehicles’ current positions but their likely future positions within some time frame. Planning lane changes then becomes a matter of simply staying out of other vehicles’ buffer zones.

For any given method of computing buffer zones, algorithm designers must prove that it guarantees collision avoidance, within the context of the mathematical model used to describe traffic patterns. That proof can be complex, so the optimal buffer zones are usually computed in advance. During operation, the autonomous vehicle then calls up the precomputed buffer zones that correspond to its situation.

The problem is that if traffic is fast enough and dense enough, precomputed buffer zones may be too restrictive. An autonomous vehicle will fail to change lanes at all, whereas a human driver would cheerfully zip around the roadway.

With the MIT researchers’ system, if the default buffer zones are leading to performance that’s far worse than a human driver’s, the system will compute new buffer zones on the fly — complete with proof of collision avoidance.

That approach depends on a mathematically efficient method of describing buffer zones, so that the collision-avoidance proof can be executed quickly. And that’s what the MIT researchers developed.

They begin with a so-called Gaussian distribution — the familiar bell-curve probability distribution. That distribution represents the current position of the car, factoring in both its length and the uncertainty of its location estimation.

Then, based on estimates of the car’s direction and velocity, the researchers’ system constructs a so-called logistic function. Multiplying the logistic function by the Gaussian distribution skews the distribution in the direction of the car’s movement, with higher speeds increasing the skew.

The skewed distribution defines the vehicle’s new buffer zone. But its mathematical description is so simple — using only a few equation variables — that the system can evaluate it on the fly.
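As a rough illustration of that construction, the sketch below multiplies a Gaussian centred on the vehicle by a logistic function aligned with its heading, so the resulting buffer-zone density stretches ahead of the car and stretches further as speed grows. The isotropic Gaussian, the skew gain and the function names are assumptions made for the example, not the paper’s exact formulation.

```python
# Sketch of a skewed buffer-zone density: Gaussian around the vehicle,
# multiplied by a logistic function along its direction of travel so the
# buffer leans forward, more strongly at higher speed. Parameter names and
# values are illustrative assumptions.

import numpy as np


def buffer_zone_density(points, position, heading, speed,
                        sigma=2.0, skew_gain=0.5):
    """Unnormalised density whose high-value region forms the buffer zone.

    points   : (N, 2) array of query locations on the road plane
    position : (2,) vehicle centre
    heading  : heading angle in radians
    speed    : vehicle speed; larger speed -> stronger forward skew
    """
    offsets = points - position
    # Gaussian around the current position (length/uncertainty folded into sigma).
    gaussian = np.exp(-np.sum(offsets ** 2, axis=1) / (2.0 * sigma ** 2))
    # Signed distance along the direction of motion.
    direction = np.array([np.cos(heading), np.sin(heading)])
    along = offsets @ direction
    # Logistic skew: grows toward the front of the car, sharper with speed.
    logistic = 1.0 / (1.0 + np.exp(-skew_gain * speed * along))
    return gaussian * logistic


# Example: a car at the origin heading along +x at 15 m/s; a point 3 m ahead
# scores much higher than a point 3 m behind it.
pts = np.array([[3.0, 0.0], [-3.0, 0.0]])
print(buffer_zone_density(pts, position=np.zeros(2), heading=0.0, speed=15.0))
```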

The researchers tested their algorithm in a simulation including up to 16 autonomous cars driving in an environment with several hundred other vehicles.

“The autonomous vehicles were not in direct communication but ran the proposed algorithm in parallel without conflict or collisions,” explains Pierson. “Each car used a different risk threshold that produced a different driving style, allowing us to create conservative and aggressive drivers. Using the static, precomputed buffer zones would only allow for conservative driving, whereas our dynamic algorithm allows for a broader range of driving styles.”

This project was supported, in part, by the Toyota Research Institute and the Office of Naval Research.

Videos from European Robotics Forum 2018


The European Robotics Forum 2018 (ERF2018), the most influential meeting of the robotics community in Europe, took place in Tampere on 13-15 March 2018. ERF2018 brought together over 900 leading scientists, companies, and policymakers.

Under the theme “Robots and Us”, the more than 50 workshops covered current societal and technical themes, including human-robot collaboration and how robotics can improve industrial productivity and service-sector operations.

Click on the list below to watch the Opening Ceremony (13 March), the euRobotics Awards Ceremony (14 March), the Opening Reception (13 March), and the following workshops:

  • The new H2020 robotics projects in the SPARC strategy
  • EU Projects offering services
  • Innovation in H2020 projects – EC Innovation Radar Prize 2017
  • Success Stories – Step Change Results from FP7 Projects
  • Drafting a Robot Manifesto
  • Innovation with Robotics in Regional Clusters

Credits: Visual Outcasts, Tampere Talo, Olli Perttula

British robot-using online grocer licensing its technology to US Kroger chain

Ocado, the UK leader in home-delivered groceries from robot-run distribution centers, has established a licensing deal with US grocery chain Kroger (NYSE:KR) whereby Kroger will take a 5% stake in Ocado (an investment valued at ~$247.5 million) and Ocado will help Kroger set up systems to manage online ordering, fulfillment and delivery operations using Ocado-proven technologies. Under the deal, Kroger will build up to 20 Ocado-designed, robot-run warehouses over the first three years.

In a recent letter to Kroger stockholders, CEO Rodney McMullen said that Kroger is redeploying capital to improve its digital capabilities and enable customers to shop in the store, order online and pick up their order at the store, or have their groceries delivered to their homes. Although McMullen didn’t single out Amazon or any other competitors in the supermarket arena, Amazon’s acquisition of Whole Foods and Walmart’s price-cutting moves and partnership with online grocery delivery service Instacart are rapidly changing the landscape of grocery shopping.

“Kroger is right in the middle of such a reinvention,” McMullen said in the shareholder letter. “We are proactively addressing customer changes and we’re making strategic investments to create the future of retail: a seamless digital experience, customer-centric technology solutions, an enhanced associate experience, space-optimized stores and smart-priced products.”

Ocado has begun to commercialize its technologies and signed its first major deal outside the U.K. with Casino, the operator of French supermarket chain Monoprix. Then came Canada’s Sobeys in January, and this month it was the turn of Sweden’s ICA. The Kroger deal is the biggest yet, and Ocado’s share price is, at the time of writing, up 56% on the news.

Ocado invested $57.5 million in technology in 2017, up from $46 million the previous year. The company is developing and deploying proprietary technology, has a tech staff of 1,100, and uses about 500 robots interacting with each other on a stacked grid, which has allowed it to process more than 20,000 daily orders.

Earlier this year (in February) Ocado raised ~$192.5 million by selling shares. Back then the stock was priced at about 487 pence. Today it closed at 861 pence, an increase of 76%!
