
Cheetah III robot preps for a role as a first responder

Associate professor of mechanical engineering Sangbae Kim and his team at the Biomimetic Robotics Lab developed the quadruped robot, the MIT Cheetah.
Photo: David Sella

By Eric Brown

If you were to ask someone to name a new technology that emerged from MIT in the 21st century, there’s a good chance they would name the robotic cheetah. Developed by the MIT Department of Mechanical Engineering’s Biomimetic Robotics Lab under the direction of Associate Professor Sangbae Kim, the quadruped MIT Cheetah has made headlines for its dynamic legged gait, speed, jumping ability, and biomimetic design.

The dog-sized Cheetah II can run on four articulated legs at up to 6.4 meters per second, make mild running turns, and leap to a height of 60 centimeters. The robot can also autonomously determine how to avoid or jump over obstacles.

Kim is now developing a third-generation robot, the Cheetah III. Instead of improving the Cheetah’s speed and jumping capabilities, Kim is converting the Cheetah into a commercially viable robot with enhancements such as a greater payload capability, a wider range of motion, and a dexterous gripping function. The Cheetah III will initially act as a special inspection robot in hazardous environments such as a compromised nuclear plant or chemical factory. It will then evolve to serve other emergency response needs.

“The Cheetah II was focused on high speed locomotion and agile jumping, but was not designed to perform other tasks,” says Kim. “With the Cheetah III, we put a lot of practical requirements on the design so it can be an all-around player. It can do high-speed motion and powerful actions, but it can also be very precise.”

The Biomimetic Robotics Lab is also finishing up a smaller, stripped-down version of the Cheetah, called the Mini Cheetah, designed for robotics research and education. Other projects include a teleoperated humanoid robot called the Hermes that provides haptic feedback to human operators. There’s also an early-stage investigation into applying Cheetah-like actuator technology to address mobility challenges among the disabled and elderly.

Conquering mobility on land

“With the Cheetah project, I was initially motivated by copying land animals, but I also realized there was a gap in ground mobility,” says Kim. “We have conquered air and water transportation, but we haven’t conquered ground mobility because our technologies still rely on artificially paved roads or rails. None of our transportation technologies can reliably travel over natural ground or even man-made environments with stairs and curbs. Dynamic legged robots can help us conquer mobility on the ground.”

One challenge with legged systems is that they “need high torque actuators,” says Kim. “A human hip joint can generate more torque than a sports car, but achieving such condensed high torque actuation in robots is a big challenge.”

Robots tend to achieve high torque at the expense of speed and flexibility, says Kim. Factory robots use high torque actuators, but they are rigid and cannot absorb the impact energy that results from climbing steps. Hydraulically powered, dynamic legged robots, such as the larger, higher-payload quadruped BigDog from Boston Dynamics, can achieve very high force and power, but at the expense of efficiency. “Efficiency is a serious issue with hydraulics, especially when you move fast,” he adds.

A chief goal of the Cheetah project has been to create actuators that can generate high torque in designs that imitate animal muscles while also achieving efficiency. To accomplish this, Kim opted for electric rather than hydraulic actuators. “Our high torque electric motors have exceeded the efficiency of animals with biological muscles, and are much more efficient, cheaper, and faster than hydraulic robots,” he says.

Cheetah III: More than a speedster

Unlike the earlier versions, the Cheetah III design was motivated more by potential applications than pure research. Kim and his team studied the requirements for an emergency response robot and worked backward.

“We believe the Cheetah III will be able to navigate in a power plant with radiation in two or three years,” says Kim. “In five to 10 years it should be able to do more physical work like disassembling a power plant by cutting pieces and bringing them out. In 15 to 20 years, it should be able to enter a building fire and possibly save a life.”

In situations such as the Fukushima nuclear disaster, robots or drones are the only safe choice for reconnaissance. Drones have some advantages over robots, but they cannot apply large forces necessary for tasks such as opening doors, and there are many disaster situations in which fallen debris prohibits drone flight.

By comparison, the Cheetah III can apply human-level forces to the environment for hours at a time. It can often climb or jump over debris, or even move it out of the way. Compared to a drone, it’s also easier for a robot to closely inspect instrumentation, flip switches, and push buttons, says Kim. “The Cheetah III can measure temperatures or chemical compounds, or close and open valves.”

Advantages over tracked robots include the ability to maneuver over debris and climb stairs. “Stairs are some of the biggest obstacles for robots,” says Kim. “We think legged robots are better in man-made environments, especially in disaster situations where there are even more obstacles.”

The Cheetah III was slowed down a bit compared to the Cheetah II, but also given greater strength and flexibility. “We increased the torque so it can open the heavy doors found in power plants,” says Kim. “We increased the range of motion to 12 degrees of freedom by using 12 electric motors that can articulate the body and the limbs.”

This is still far short of the flexibility of animals, which have over 600 muscles. Yet, the Cheetah III can compensate somewhat with other techniques. “We maximize each joint’s work space to achieve a reasonable amount of reachability,” says Kim.

The design can even use the legs for manipulation. “By utilizing the flexibility of the limbs, the Cheetah III can open the door with one leg,” says Kim. “It can stand on three legs and equip the fourth limb with a customized swappable hand to open the door or close a valve.”

The Cheetah III has an improved payload capability to carry heavier sensors and cameras, and possibly even to drop off supplies to disabled victims. However, it’s a long way from being able to rescue them. The Cheetah III is still limited to a 20-kilogram payload, and can travel untethered for four to five hours with a minimal payload.

“Eventually, we hope to develop a machine that can rescue a person,” says Kim. “We’re not sure if the robot would carry the victim or bring a carrying device. Our current design can at least see if there are any victims or if there are any more potentially dangerous events.”

Experimenting with human-robot interaction

The semiautonomous Cheetah III can make ambulatory and navigation decisions on its own. However, for disaster work, it will primarily operate by remote control.

“Fully autonomous inspection, especially in disaster response, would be very hard,” says Kim. Among other issues, autonomous decision making often takes time, and can involve trial and error, which could delay the response.

“People will control the Cheetah III at a high level, offering assistance, but not handling every detail,” says Kim. “People could tell it to go to a specific location on the map, find this place, and open that door. When it comes to hand action or manipulation, the human will take over more control and tell the robot what tool to use.”

Humans may also be able to assist with more instinctive controls. For example, if the Cheetah uses one of its legs as an arm and then applies force, it’s hard to maintain balance. Kim is now investigating whether human operators can use “balance feedback” to keep the Cheetah from falling over while applying full force.

“Even standing on two or three legs, it would still be able to perform high force actions that require complex balancing,” says Kim. “The human operator can feel the balance, and help the robot shift its momentum to generate more force to open or hammer a door.”

The Biomimetic Robotics Lab is exploring balance feedback with another robot project called Hermes (Highly Efficient Robotic Mechanisms and Electromechanical System). Like the Cheetah III, it’s a fully articulated, dynamic legged robot designed for disaster response. Yet the Hermes is bipedal, and completely teleoperated by a human who wears a telepresence helmet and a full body suit. Both the robot and the suit are rigged with sensors and haptic feedback devices.

“The operator can sense the balance situation and react by using body weight or directly implementing more forces,” says Kim.

The latency required for such intimate real-time feedback is difficult to achieve with Wi-Fi, even when it’s not blocked by walls, distance, or wireless interference. “In most disaster situations, you would need some sort of wired communication,” says Kim. “Eventually, I believe we’ll use reinforced optical fibers.”

Improving mobility for the elderly

Looking beyond disaster response, Kim envisions an important role for agile, dynamic legged robots in health care: improving mobility for the fast-growing elderly population. Numerous robotics projects are targeting the elderly market with chatty social robots. Kim is imagining something more fundamental.

“We still don’t have a technology that can help impaired or elderly people seamlessly move from the bed to the wheelchair to the car and back again,” says Kim. “A lot of elderly people have problems getting out of bed and climbing stairs. Some elderly with knee joint problems, for example, are still pretty mobile on flat ground, but can’t climb down the stairs unassisted. That’s a very small fraction of the day when they need help. So we’re looking for something that’s lightweight and easy to use for short-time help.”

Kim is currently working on “creating a technology that could make the actuator safe,” he says. “The electric actuators we use in the Cheetah are already safer than other machines because they can easily absorb energy. Most robots are stiff, which would cause a lot of impact forces. Our machines give a little.”

By combining such safe actuator technology with some of the Hermes technology, Kim hopes to develop a robot that can help elderly people in the future. “Robots can not only address the expected labor shortages for elder care, but also the need to maintain privacy and dignity,” he says.

European digital innovation hub strengthens robotics development

Digital innovation hubs are vital for creating innovative solutions and employment. ROBOTT-NET is an example of a digital innovation hub, in which four leading Research and Technology Organisations (RTOs) in Europe aim to strengthen robotics development and the competitiveness of European manufacturing.

The main objective of ROBOTT-NET is to create a sustainable European infrastructure to support novel robotic technologies on their path to market.

ROBOTT-NET includes the Danish Technological Institute (DTI, Denmark), Fraunhofer IPA (Germany), Tecnalia (Spain) and the Manufacturing Technology Centre (MTC, UK).

The initiative combines European competencies in state-of-the-art applied robotics and enables companies to benefit from Danish, German, Spanish and British expertise, says project coordinator Kurt Nielsen of DTI’s Centre for Robot Technology.

By offering highly qualified consulting services, ROBOTT-NET provides easy access to specialist expertise and helps companies of all sizes bring their ideas to market and optimize their production.

During the project, ROBOTT-NET has arranged Open Labs in Denmark, Germany, England and Spain, where companies can learn about opportunities in robotics.

Any European company that wanted to use or produce robots was invited to apply for a voucher, the backbone of the program. Big manufacturers, garage start-ups and everything in between were eligible, as long as the idea was concrete enough.

Sixty-four projects were selected and received a voucher, which entitled each company to approximately 400 hours of free consulting with robotics experts from all over Europe at the four partner locations.

Among these 64 projects, ROBOTT-NET has assisted Orifarm in developing a vision system for identifying new medicine boxes, helped Air Liquide develop an autonomous mobile robot for handling high-pressure containers, and supported Trumpf in developing a robot gripper that can perform unstructured bin picking of large metal sheets.

Additionally, eight projects will be selected for a ROBOTT-NET pilot, which will help the companies develop their voucher work through the proof-of-concept level and accelerate it toward commercialization.

Check out the voucher page on ROBOTT-NET.eu and get inspired by the work that has already been done between European companies and Research and Technology Organisations.

Arizona bans Uber self-driving cars

The governor of Arizona has told Uber to “get an Uber” and stop testing in the state, with no instructions on how to come back.

Unlike the early, positive statements from Tempe police, this letter is harsh and to the point. It’s even more bad news for Uber, and the bad news is not over. Uber has not released any log data that makes it look better; the longer it takes to do that, the more it seems the data don’t tell a good story for the company.

In other news, both Mobileye and Velodyne have issued releases stating that their systems would have seen the pedestrian. Waymo has said the same, and I believe that all of them are correct. Waymo has a big press event scheduled for this week in New York, rumoured to announce some new shuttle operations there. I wonder how much consideration it gave to delaying the event, because in spite of its superior performance, a lot of the questions it will get at the press conference won’t be about the new project.

There are more signs that Uber’s self-driving project may receive the “death penalty,” or at the very least a very long and major setback, in a field where Uber thought “second place is first loser,” to quote Anthony Levandowski.

Sherlock Drones – automated investigators tackle toxic crime scenes

Using drones to gather information and samples from a hazardous scene can help incident commanders make critical decisions. Image credit – ROCSAFE

by Anthony King

Crimes that involve chemical, biological, radiological or nuclear (CBRN) materials pose a deadly threat not just to the target of the attack but to innocent bystanders and police investigators. Often, these crimes may involve unusual circumstances or they are terrorist-related incidents, such as an assassination attempt or the sending of poisons through the mail.

In the recent notorious case of poisoning in the UK city of Salisbury in March 2018, a number of first responders and innocent bystanders were treated in hospital after two victims of chemical poisoning were found unconscious on a park bench. One policeman who attended the scene became critically ill after apparent exposure to a suspected chemical weapon, said to be a nerve agent called Novichok. Police said a total of 21 people required medical care after the incident.

Past examples of rare but toxic materials at crime scenes include the 2001 anthrax letter attacks in the US and the Tokyo subway sarin gas attack in 1995. Following the radioactive poisoning of the former Russian spy Alexander Litvinenko in London, UK, in 2006, investigators detected traces of the toxic radioactive material polonium in many locations around the city.

Despite these dangers, crime scene investigators must begin their forensic investigations immediately. European scientists are developing robot and remote-sensing technology to provide safe ways to assess crime or disaster scenes and begin gathering forensic evidence.

Harm’s way

‘We will send robots into harm’s way instead of humans,’ explained Professor Michael Madden at the National University of Ireland Galway, who coordinates a research project called ROCSAFE. ‘The goal is to improve the safety of crime scene investigators.’

The ROCSAFE project, which ends in 2019, will deploy remote-controlled aerial and ground-based drones equipped with sensors to assess the scene of a CBRN event without exposing investigators to risk. This will help to determine the nature of the threat and gather forensics.

In the first phase of a response, a swarm of drones with cameras will fly into an area to allow investigators to view it remotely. Rugged sensors on the drones will check for potential CBRN hazards. In the second phase, ground-based robots will roll in to collect evidence, such as fingerprint or DNA samples.

The ROCSAFE aerial drone could assess crime or disaster scenes such as that of a derailed train carrying radioactive material. It will deploy sensors for immediate detection of radiation or toxic agents and collect air samples to test later in a laboratory. Meanwhile, a miniature lab-on-a-chip on the drone will screen returned samples for the presence of viruses or bacteria, for instance.

An incident command centre usually receives a huge volume of information in a short space of time – including real-time video and images from the scene. Commanders need to quickly process a lot of confusing information in an extreme situation, and so ROCSAFE is also developing smart software to lend a hand.

Rare events

‘These are rare events. This is nobody’s everyday job,’ said Prof. Madden. ‘We want to use artificial intelligence and probabilistic reasoning to reduce cognitive load and draw attention to things that might be of interest.’

As an example, image analysis software might flag an area with damaged vegetation, suggest a possible chemical spill and suggest that samples be taken. Information such as this could be presented to the commander on a screen in the form of a clickable map, in a way that makes their job easier.

Sometimes, the vital evidence itself could be contaminated. ‘There may be some physical evidence we need to collect – a gun or partly exploded material or a liquid sample,’ said Prof. Madden. ‘The robot will pick up, tag and bag the evidence, all in a way that will stand up in court.’ The researchers are constructing a prototype six-wheeled robot for this task that is about 1.5 metres long and can handle rough terrain.

Helping forensic teams deal with hazardous evidence is a new forensic toolbox called GIFT CBRN. The GIFT researchers have devised standard operating procedures for handling, packaging and analysing toxins such as ricin, which is deadly even in minute amounts.

When the anthrax powder attacks took place in the US in 2001, the initial response of the security services was slow, partly because of the unprecedented situation. GIFT scientists have drawn up ‘how to’ guides for investigators, so they can act quickly in response to an incident.

‘We want to find the bad guys quickly so we can stop them and arrest those involved,’ said Ed van Zalen at the Netherlands Forensic Institute, who coordinated the GIFT project.

Nerve agents

As well as containment, GIFT devised sensing technology such as a battery-powered boxed device that can be brought to a crime scene to identify nerve agents like sarin within an hour or two. This uses electrophoresis, a chemical technique that identifies charged molecules by applying an electric field and analysing their movements. Usually, samples must be collected and returned to the lab for identification, which takes far longer.

Additionally, they developed a camera to detect radioactive material that emits a potentially damaging form of radiation called alpha particles. This form of radiation is extremely difficult to detect; even a Geiger counter – used to detect most types of radiation – cannot pick it up. Polonium, the substance used to murder Litvinenko, emits alpha particles, which caused his radiation poisoning, but because it was a novel attack, the initial failure to detect it slowed the police investigation.

‘That detector would have been very helpful at the time of the Litvinenko case,’ said van Zalen.

The research in this article is funded by the EU.

Almost every thing that went wrong in the Uber fatality is both terrible and expected

Source: Uber

Today I’m going to examine how you attain safety in a robocar, and outline a contradiction in the things that went wrong for Uber and their victim. Each thing that went wrong is both important and worthy of discussion, but at the same time unimportant. Almost every thing that went wrong is something we want to prevent going wrong, but it’s also something we must expect will go wrong sometimes, and plan for.

In particular, I want to consider how things operate in spite of the fact that people will jaywalk, illegal or not, car systems will suffer failures and safety drivers will sometimes not be looking.

What’s new

First, an update on developments.

Uber has said it is cooperating fully, but we certainly haven’t heard anything more from them, or from the police. That’s because:

  • Police have indicated that the accident has been referred for criminal investigation, and the NTSB is also present.
  • The family (only a stepdaughter is identified) have retained counsel, and are demanding charges and considering legal action.

A new story in the New York Times is more damning for Uber. There we learn:

  • Uber’s performance has been substandard in Arizona. Its cars needed an intervention after 13 miles of driving on average. Other top companies like Waymo go many thousands of miles between interventions.
  • Uber just recently switched to having one safety driver instead of two, though it was still using two in the more difficult test situations. Almost all companies use two safety drivers, though Waymo has operations with just one, and quite famously, zero.
  • Uber has safety drivers use a tablet to log interventions and other data, and there are reports of safety drivers doing this while in self-drive mode. Wow.
  • Uber’s demos of this car have shown the typical “what the car sees” view to both passengers and the software operator. It shows a good 360-degree view, as you would expect from this sort of sensor suite, including good renderings of pedestrians on the sidewalk. Look at this YouTube example from two years ago.
  • Given that when operating normally, Uber’s system has the expected pedestrian detection, it is probably not the case that Uber’s car just doesn’t handle this fairly basic situation. We need to learn what specifically went wrong.
  • If Uber did have a general pedestrian detection failure at night, there should have been near misses long before this impact. Near misses are viewed as catastrophic by most teams, and get immediate attention, to avoid situations like this fatality.
  • If equipment such as LIDARs or cameras or radars were to fail, this would generally be obvious to the software system, which should normally cause an immediate alert asking the safety driver to take over. In addition, most designs have some redundancies, so they can still function at some lower level with a failed sensor while they wait for the safety driver — or even attempt to get off the road.
  • The NYT report indicates that new CEO Dara Khosrowshahi had been considering cancelling the whole project, and a big demo for him was pending. There was pressure to make the demo look good.

In time, either in this investigation or via lawsuits, we should see:

  • Logs of raw sensor data from the LIDAR, radar and camera arrays, along with tools to help laypeople interpret this data.
  • If those logs clearly show the victim well in advance of impact, why did the vehicle not react? What sort of failure was it?
  • Why was the safety driver staring at something down and to the right for so long? Why was there only one safety driver on this road at night at 40 mph?

That the victim was jaywalking is both important and unimportant

The law seems to be clear that the Uber had the right of way. The victim was tragically unwise to cross there without looking. The vehicle code may find no fault with Uber. In addition, as I will detail later, crosswalk rules exist for a reason, and both human drivers and robocars will treat crosswalks differently from non-crosswalks.

Even so, people will jaywalk, and robocars need to be able to handle that. Nobody can handle somebody leaping quickly off the sidewalk into your lane, but a person crossing 3.5 lanes of open road is something even the most basic cars should be able to handle, and all cars should be able to perceive and stop for a pedestrian standing in their lane on a straight non-freeway road. (More on this in a future article.)

The law says this as well. While the car has right of way, the law still puts a duty on the driver to do what they reasonably can to avoid hitting a jaywalker in the middle of the road.

That Uber’s system failed to detect the pedestrian is both important and unimportant

We are of course very concerned as to why the system failed. In particular, this sort of detect-and-stop is a very basic level of operation, expected of even the simplest early prototypes, and certainly of a vehicle from a well-funded team that has logged a million miles.

At the same time, cars must be expected to have failures, even failures as bad as this. In the early days of robocars, even at the best teams, major system failures happened. I’ve been in cars that suddenly tried to drive off the road. It happens, and you have to plan for it. The main fallback is the safety driver, though now that the industry is slightly more mature, it is also possible to use simpler automated systems (like ADAS “forward collision warning” and “lanekeeping” tools) to also guard against major failures.

We’re going to be very hard on Uber, and with justification, for having such a basic failure. “Spot a pedestrian in front of you and stop” has been moving into the “solved problem” category, particularly if you have a high-end LIDAR. But we should not forget there are lots of other things that can, and do, go wrong that are far from solved, and we must expect them to happen. These are prototypes. They are on the public roads because we know no other way to make them better, to find and solve these problems.

That the safety driver wasn’t looking is both important and unimportant

She clearly was not doing her job. The accident would have been avoided if she had been vigilant. But we must understand that safety drivers will sometimes look away, and miss things, and make mistakes.

That’s true for all of us when we drive, with our own life and others at stake. Many of us do crazy things like send texts, but even the most diligent are sometimes not paying enough attention for short periods. We adjust controls, we look at passengers, we look behind us and (as we should) check blindspots. Yet the single largest cause of accidents is “not paying attention.” What that really means is that two things went wrong at once — something bad happened while we were looking somewhere else. For us the probability of an accident is highly related to the product of those two probabilities.

The same is true for robocars with safety drivers. The cars will make mistakes. Sometimes the driver will not catch it. When both happen, an accident is possible. If the total probability of that is within the acceptable range (which is to say, the range for good human drivers) then testing is not putting the public at any extraordinary risk.

This means a team should properly have a sense of the capabilities of its car. If it’s needing interventions very frequently, as Uber reportedly was, it needs highly reliable safety driving. In most cases, the answer is to have two safety drivers: two sets of eyes potentially able to spot problems. Or even 1.3 sets of eyes, because the second operator is, on most teams, including Uber’s, mostly looking at a screen and only sometimes at the road. Still better than just one pair.
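
To make this arithmetic concrete, here is a toy calculation in Python. Only the 13-miles-per-intervention figure comes from the reporting above; every probability is an assumption invented for illustration. The point is simply that an accident needs both failures at once, so the rates multiply.

# Toy model: an accident requires a car failure AND an inattentive driver.
# The 13-mile figure is from the NYT report; the probabilities are invented.

miles_per_failure = 13        # reported: one intervention needed every ~13 miles
p_driver_misses = 0.01        # assumed: chance the lone safety driver isn't looking
p_second_misses = 0.30        # assumed: chance the second operator also isn't looking

# Expected miles between unhandled failures (i.e., potential accidents):
one_driver = miles_per_failure / p_driver_misses                       # 1,300 miles
two_drivers = miles_per_failure / (p_driver_misses * p_second_misses)  # ~4,333 miles

print(f"one safety driver:  a missed failure every {one_driver:,.0f} miles")
print(f"two safety drivers: a missed failure every {two_drivers:,.0f} miles")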

At the same time, since the goal is to get to zero safety drivers, it is not inherently wrong to just have one. There has to be a point where a project graduates to needing only one. Uber’s fault is, possibly, graduating far, far too soon.

To top all this, safety drivers, if the company is not careful, are probably more likely to fatigue and look away from the road than ordinary drivers in their own cars. After all, looking away is actually safer in a test car than it is in your own car. Tesla Autopilot owners are also notoriously bad at this. Perversely, the lower the intervention rate, the more likely it is people will get tempted. Companies have to combat this.

If you’re a developer trying out some brand new and untrusted software, you safety drive with great care. You keep your hands near the wheel. Your feet near the pedals. Your eyes on the lookout. You don’t do it for very long, and you are “rewarded” by having to do an intervention often enough that you never tire. To consider the extreme version of that, think about driving with adaptive cruise control. You still have to steer, so there’s no way you take your eyes off the road, even though your feet can probably relax.

Once your system gets to a high level (like Tesla’s Autopilot in simple situations, or Waymo’s car) you need to find other ways to maintain that vigilance. Some options include gaze-tracking systems that make sure eyes are on the road. I have also suggested that systems routinely simulate a failure by drifting out of their lane when it is safe to do so, correcting before it gets dangerous if for some reason the safety driver does not intervene. A safety driver who is grabbing the wheel 3 times an hour, and is scored on it, is much less likely to miss the one time a week they actually have to grab it for real.
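
As a thought experiment, here is what such a drill could look like in code. Everything in this sketch is hypothetical: the function names, the timeout, and the callback interfaces are mine, not any company's actual system, and a real vehicle would gate this behind far more safety checks.

import time

DRIFT_TIMEOUT_S = 2.0   # assumed: auto-correct if the driver hasn't reacted by then

def run_vigilance_drill(is_safe_to_drift, start_drift, cancel_drift, driver_corrected):
    """Inject a gentle, bounded lane drift and score the safety driver on it."""
    if not is_safe_to_drift():          # only drill when no traffic or obstacles are near
        return None
    start_drift()                       # begin a slow, limited drift within the lane margin
    t0 = time.monotonic()
    while time.monotonic() - t0 < DRIFT_TIMEOUT_S:
        if driver_corrected():          # driver grabbed the wheel: record reaction time
            return time.monotonic() - t0
        time.sleep(0.02)
    cancel_drift()                      # driver missed it: self-correct and log a miss
    return None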

That it didn’t brake at all — that’s just not acceptable

While we don’t have final confirmation, reports suggest the vehicle did not slow at all. Even if study of the accident reveals a valid reason for not detecting the victim 1.4 seconds out (as needed to fully stop), there are just too many different technologies that are all, independently, able to detect her at a shorter distance, which should have at least triggered some braking and reduced the severity.

The key word is independently. As explained above, failures happen. A proper system is designed to still do the best it can in the event of failures of independent components. Failure of the entire system should be extremely unlikely, because the entire system should not be a monolith. Even if the main perception system of the car fails for some reason (as may have happened here), that should result in alarm bells going off to alert the safety driver, and it should also result in independent safety systems kicking in to fire those alarms or even hit the brakes. The Volvo comes with such a system, but it was presumably disabled. Where possible, a system like that should be enabled, but used only to beep warnings at the safety driver. There should be a “reptile brain” at the low level of the car which, in the event of complete failure of all high level systems, knows enough to look at raw radar, LIDAR or camera data and sound alarms or trigger braking if the main system can’t.
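
In outline, such a fallback might look like the sketch below. The interfaces and thresholds are assumptions for illustration (a real "reptile brain" would run on separate hardware with its own power and sensor feeds); the essential property is that it depends on nothing in the main software stack.

import time

HEARTBEAT_TIMEOUT_S = 0.5     # assumed: main system must check in at least this often
MIN_CLEAR_DISTANCE_M = 20.0   # assumed: emergency-braking threshold at city speeds

def reptile_brain_step(last_heartbeat_s, nearest_obstacle_m, sound_alarm, apply_brakes):
    """One cycle of an independent low-level safety monitor."""
    main_alive = (time.monotonic() - last_heartbeat_s) < HEARTBEAT_TIMEOUT_S
    if not main_alive:
        sound_alarm("main perception system unresponsive")
    if nearest_obstacle_m < MIN_CLEAR_DISTANCE_M:
        sound_alarm("obstacle in path")
        if not main_alive:    # high-level stack can't react, so brake independently
            apply_brakes()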

All the classes of individual failures that happened to Uber could happen to a more sophisticated team in some fashion. In extreme bad luck they could even happen all at once. The system should be designed to make it very unlikely that they all happen at once, and so that the probability of that is less than the probability of a human having a crash.

More to come

So much to write here, so in the future look for thoughts on:

  • How humans and robocars will act differently when approaching a crosswalk and not approaching one, and why
  • More about how many safety drivers you have
  • What are reasonable “good practices” that any robocar should have, and what are exceptions to them
  • How do we deal with the fact that we probably have to overdrive our sensor range on high speed highways, as humans usually do?
  • More about interventions and how often they happen and why they happen
  • Does this mean the government should step in to stop bad actors like Uber? Or will the existing law (vehicle code, criminal law and tort law) punish them so severely — possibly with a “death penalty” for their project — that we can feel it’s working?

Quiet inroads in robotics: the Vecna story

Robotics is undergoing fundamental change in three core areas: collaboration, autonomous mobility and increasing intelligence.

Autonomous mobility technology is entering the industrial vehicle marketplace of AGVs, forklifts and tugs with new products, better navigation technologies and lower costs.

Forecasters Grand View Research and IDTechEx suggest that autonomous forklifts and tugs will emerge as the standard from 2022/2023 onwards, ultimately growing to represent 70% of annual mobile material handling equipment by 2037. The key to this transformation is unmanned mobile autonomy. These new mobile autonomous robots can achieve higher productivity and cost efficiencies because the technology largely reduces driver labor costs, increases safety, and lowers insurance rates and spoilage.

The Vecna Story

Cambridge, MA-based Vecna Technologies, founded in 1998 by a group of MIT scientists on a $5,000 shoestring investment from the founders, has self-funded its way into a profitable manufacturer, researcher and software firm serving the healthcare, logistics and remote presence marketplaces. It has amassed more than a hundred issued and pending patents and employs more than 200 people.

Earlier this year Vecna Technologies spun off 60 employees and the robotics business to found and operate Vecna Robotics, which works with a large number of partners and contractors. The new entity’s primary aim is to provide mixed fleets of mobile robotic solutions for:

  • Goods to person
  • Receiving to warehouse
  • Production cell to cell
  • Point to point gofering
  • Zone picking transport
  • Tote and case picking transport

Vecna already has a broad range of products serving these applications: from tuggers like those at FedEx (see video below) to the RC20, the lowest cost-per-performance mobile robot on the market, and several models in between. Thousands of Vecna robots are deployed worldwide: (1) in major manufacturing facilities doing line-side replenishment; (2) in major shipping companies moving non-conveyables and automating indoor and outdoor tuggers and lifts; and (3) in major 3PLs and retailers doing order fulfillment transport, both for store replenishment and for e-commerce.

A recent NY Times story exemplifies how these new Vecna Robotics autonomous mobile robots are impacting the world of material handling. In this case, Vecna robots are used by FedEx to handle large items that don’t fit on conveyor belts.

“When a truck filled with packages arrives, workers load the bulky items onto trailers hitched to a robot. Once these trailers are full, they press a button that sends the vehicle on its way. Equipped with laser-based sensors, cameras and other navigation tools, the robots stop when people or other vehicles get in the way. In some cases, they even figure out a new way to go.”

Vecna robots have vision systems that allow them to navigate safely around humans so that they can share common paths. And they have Autonomy Kit, a general-purpose robot brain that can turn any piece of equipment into a safe and efficient mobile robot. Everything from large earth-moving and construction equipment to forklifts, tuggers, floor cleaners, and even small order fulfillment and each-picking systems can easily be automated and operate in collaborative, human-filled environments. Further, all Vecna systems are directed by a smart centralized controller for optimization, traffic control and service. Because Vecna Robotics is finding so much demand (and success) in this sector, it is considering bringing in outside money to fund a more rapid expansion into the marketplace.

Meanwhile, Vecna Technologies, sans the robotics group, remains a leader in healthcare information technology providing patient portals, payment solutions, kiosks, mobile apps, telepresence and medical logistics, and “will continue to innovate and accelerate cutting edge solutions to our customers in the commercial and government healthcare markets,” says Vecna CTO Daniel Theobald.

Marketplace full of competitors, many from China

Source: Styleintelligence G2P Robotics, Feb 2018

As competitors sense the growing demand from distribution and fulfillment center executives in need of solutions to pick, pack and ship more parcels quickly, many startups and established companies are inventing or modifying their products to solve those problems and take advantage of the demand.

There is also increasing demand from factory managers who need flexibility to move goods within their facilities that cannot be handled economically by human workers or fixed conveyor systems.

Both markets are growing exponentially and, as can be seen in the two charts above, there are many players competing in the field. Further, the market is also fueled by approved investment priorities in capital purchases that were put off during and after the financial crisis of 2008-9. This can be seen in the VDC Research graphic on the right, which surveyed manufacturing executives about their capital purchasing plans for 2018-2020.

Vecna responded to those demands years ago when it began developing and expanding its line of robots and accompanying software. The refocusing that went into spinning off Vecna Robotics should help the company continue to be a big, innovative and progressive player in the mobile robotics market.

Robotic collaboration in timber construction

At Spatial Timber Assemblies, man and machine work together in both the planning and the manufacturing process. (Photograph: NFS Digital Fabrication / Roman Keller)

NCCR researchers are using a new method for digital timber construction in a real project for the first time. The load-bearing timber modules, which are prefabricated by robots, will be assembled on the top two floors at the DFAB HOUSE construction site.

Digitalisation has found its way into timber construction, with entire elements already being fabricated by computer-aided systems. The raw material is cut to size by the machines, but in most cases it still has to be manually assembled to create a plane frame. In the past, this fabrication process came with many geometric restrictions.

Under the auspices of the National Centre of Competence in Research (NCCR) Digital Fabrication, researchers from ETH Zurich’s Chair of Architecture and Digital Fabrication have developed a new, digital timber construction method that expands the range of possibilities for traditional timber frame construction by enabling the efficient construction and assembly of geometrically complex timber modules. Spatial Timber Assemblies evolved from a close collaboration with Erne AG Holzbau and will be used for the first time in the DFAB HOUSE project at the Empa and Eawag NEST research and innovation construction site in Dübendorf. It is also the first large-scale architectural project to use the construction robots developed by ETH Zurich’s new Robotic Fabrication Laboratory.

With robotic precision
The robot first takes a timber beam and guides it while it is sawed to size. After an automatic tool change, a second robot drills the required holes for connecting the beams. In the final step, the two robots work together and position the beams in the precise spatial arrangement based on the computer layout. To prevent collisions when positioning the individual timber beams, the researchers have developed an algorithm that constantly recalculates the path of motion for the robots according to the current state of construction. Workers then manually bolt the beams together.
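
The press release gives no implementation details, but in outline such a replanning loop might look like the following sketch. The planner and scene interfaces here are hypothetical stand-ins, not ETH Zurich's actual code; the essential idea is that every path is planned against the current state of construction, which grows with each placed beam.

def assemble(beams, plan_path, execute, built_scene):
    """Place beams one by one, replanning against everything already built."""
    for beam in beams:
        # The obstacle set includes all previously placed beams, so the
        # planned motion reflects the current state of construction.
        path = plan_path(beam.pick_pose, beam.place_pose, obstacles=built_scene)
        if path is None:
            raise RuntimeError(f"no collision-free path found for beam {beam.id}")
        execute(path)           # the two robots move in coordination along the path
        built_scene.add(beam)   # update the model before planning the next beam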

Longer lasting, more individual construction
Unlike traditional timber frame construction, Spatial Timber Assemblies can manage without reinforcement plates because the required rigidity and load-bearing capacity result from the geometric structure. Not only does this save material; it also opens up new creative possibilities. A total of six spatial, geometrically unique timber modules will be prefabricated in this way for the first time. Lorries will then transport them to the DFAB HOUSE construction site at the NEST in Dübendorf, where they will be joined to build a two-storey residential unit with more than 100 m2 of floor space. The complex geometry of the timber construction will remain visible behind a transparent membrane façade.

Integrated digital architecture
The robots use information from a computer-aided design model to cut and arrange the timber beams. This method was specially developed during the project and uses various input parameters to create a geometry consisting of 487 timber beams in total.

The fact that Spatial Timber Assemblies is being used for digital fabrication and also in design and planning offers a major advantage according to Matthias Kohler, Professor of Architecture and Digital Fabrication at ETH Zurich and the man spearheading the DFAB HOUSE project: “If any change is made to the project overall, the computer model can be constantly adjusted to meet the new requirements. This kind of integrated digital architecture is closing the gap between design, planning and execution.”

More Information in ETH Zurich Press Release
Detailed information about the building process, quotes as well as image and video material can be found in the extended press release by ETH Zurich.

Projects that will shape the future of robotics win the euRobotics Awards 2018

The European Robotics Forum 2018 (ERF2018) in Tampere brought together over 900 attendees from robotics academia and industry. To bridge the two, euRobotics hosted the Georges Giralt PhD Award 2017 & 2018 and the TechTransfer Award 2018, during a Gala Dinner event on 14 March, in Tampere, Finland.

The aim of the euRobotics Technology Transfer Award (now in its 15th year) is to showcase the impact of robotics research and to raise the profile of technology transfer between science and industry. Outstanding innovations in robot technology and automation that result from cooperative efforts between research and industry are eligible for the prize.

The First prize went to Germany’s Roboception team for “rc_visard – 3D perception & manipulation for robots made easy”, composed of Michael Suppa and Heiko Hirschmueller from Roboception GmbH, and Alin Albu-Schaeffer from the Institute of Robotics and Mechatronics at the German Aerospace Center (DLR).

“This award is a recognition of our institute’s continued efforts of supporting the go-to-market of technologies developed at our institute or – as in this case – derived thereof,” said Prof. Albu-Schaeffer. Dr. Suppa added that “since spinning Roboception off the DLR in 2015, a significant amount of thought, hard work and – first and foremost – unfailing commitment of our team have gone into bringing this technology from a research state to a market-ready product. And we are very proud to see these efforts recognized by the jury, and rewarded with this prestigious award.” The rc_visard is already in operational use in a number of customer projects across a variety of robotic domains. Prof. Albu-Schaeffer, himself an enthusiastic rc_visard user at his DLR institute, is convinced that “this product is one that will shape the future of robotics, thanks to its unique versatility.”

Watch an interview about the rc_visard filmed at Hannover Messe

The Second prize went to Germany, to the project Mobile Agricultural Robot Swarms (MARS), created by a team made up of Timo Blender and Christian Schlegel, from Hochschule Ulm, and Benno Pichlmaier, from AGCO GmbH. The MARS experiment aims at the development of small, streamlined mobile agricultural robot units to fuel a paradigm shift in farming practices.

The Third prize went to Smart Robots from Italy, a team made up of Paolo Rocco and Andrea Maria Zanchettin, from Politecnico di Milano, and Roberto Rossi, from Smart Robots s.r.l., Italy. Smart Robots provides advanced perception and intelligent capabilities to robots, enabling new, disruptive forms of collaboration with human beings.

The Finalist was the In situ Fabricator (IF), an autonomous construction robot from Switzerland, with a team made up of Kathrin Dörfler from Gramazio Kohler Research, ETH Zurich, NCCR Digital Fabrication, and Markus Giftthaler and Timothy Sandy from the Agile & Dexterous Robotics Lab, ETH Zurich, NCCR Digital Fabrication.

Technology Transfer Award winners and coordinator Martin Haegele. Credits: Visual Outcasts

The members of the jury of the TechTransfer Award were: Susanne Bieller (EUnited Robotics), Rainer Bischoff (KUKA), Georg von Wichert (Siemens), Herman Bruyninckx (KU Leuven) and Martin Haegele (Fraunhofer IPA). The euRobotics TechTransfer Award 2018 is funded by the EU’s Horizon 2020 Programme.

The Georges Giralt PhD Award showcased the best PhD theses defended during 2017 and 2018 in European universities, from all areas of robotics.

The 2017 edition saw 41 submissions from 11 countries. The jury, composed of 26 academics from 16 European countries, awarded:

Winner: Johannes Englsberger, Technical University of Munich, Germany – Combining reduced dynamics models and whole-body control for agile humanoid locomotion

Finalists:

  • Christian Forster, University of Zurich, Switzerland – Visual Inertial Odometry and Active Dense Reconstruction for Mobile Robots
  • Stefan Groothuis, University of Twente, the Netherlands – On the Modeling, Design, and Control of Compliant Robotic Manipulators
  • Meng Guo, KTH Royal Institute of Technology, Sweden – Hybrid Control of Multi-robot Systems under Complex Temporal Tasks
  • Nanda van der Stap, University of Twente, the Netherlands – Image-based endoscope navigation and clinical applications

Georges Giralt PhD Award 2017 winners and coordinator Gianluca Antonelli. Credits: Visual Outcasts

The 2018 edition saw 27 submissions from 11 countries. The jury, composed of 26 academics from 16 European countries, awarded:

Winners:

  • Frank Bonnet, École polytechnique fédérale de Lausanne (EPFL), Switzerland – Shoaling with fish: using miniature robotic agents to close the interaction loop with groups of zebrafish Danio rerio
  • Daniel Leidner, University of Bremen, Germany – Cognitive Reasoning for Compliant Robot Manipulation

Finalists:

  • Adrià Colomé Figueras, Universitat Politècnica de Catalunya, Spain – Bimanual Robot Skills: MP Encoding and Dimensionality Reduction
  • Đula Nađ, University of Zagreb, Croatia – Guidance and control of autonomous underwater agents with acoustically aided navigation
  • Angel Santamaria-Navarro, Universitat Politècnica de Catalunya, Spain – Visual Guidance of Unmanned Aerial Manipulators

Georges Giralt PhD Award 2018 winners and coordinator Gianluca Antonelli. Credits: Visual Outcasts

The ceremony also saw the handing out of the European Robotics League (ERL) Awards: ERL Industrial Robots, ERL Service Robots and ERL Emergency Robots. Points are awarded by attending local and major tournaments rather than a central event, with three main objectives: strengthening the European robotics industry, pushing innovative autonomous systems for emergency response, and addressing the societal challenges of aging populations in Europe. The European Robotics League is funded by the EU’s Horizon 2020 Programme.

Read the ERL Awards press release: European Robotics League winners revealed in Tampere, Finland (Season 2017/18)

Robot-mounted vacuum grippers flex their artificial muscles

A short electric pulse is all it takes to generate and release a powerful vacuum in the blink of an eye. The novel vacuum gripper developed by the research team led by Professor Stefan Seelecke at Saarland University enables robot arms to pick up objects and move them around freely in space. The system works without the need for compressed air to generate the vacuum; it is energy efficient, quiet and suitable for use in clean rooms. The specialists in intelligent material systems make use of artificial muscles: bundles of ultrafine shape-memory wires that are able to tense and relax just as real muscle fibres do. The wires also function as sensors and can sense, for example, when the gripper needs to readjust or tighten its grip.
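
Because the same shape-memory wire both actuates and senses (its electrical resistance changes as it contracts), a simple self-sensing control loop is conceivable. The sketch below is purely illustrative: the interfaces, numbers and the linear resistance model are assumptions of mine, not the Saarland team's implementation.

def control_step(measure_resistance_ohm, set_current_a, target_ohm, gain=0.5):
    """One cycle of a toy self-sensing loop for a shape-memory-alloy muscle.

    In this simplified model, the wire's resistance drops as it contracts,
    so resistance above target means the wire should be heated (contracted)
    further, and resistance below target means it can relax.
    """
    error = measure_resistance_ohm() - target_ohm
    current = max(0.0, min(2.0, gain * error))  # clamp to an assumed safe band
    set_current_a(current)                      # more current -> more heat -> contraction
    return error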