
#254: Collaborative Systems for Drug Discovery, with Peter Harris



In this episode, Abate interviews Peter Harris from HighRes Biosolutions about automation in the field of drug discovery. At HighRes Biosolutions they are developing modular robotic systems that work alongside scientists to automate laboratory tasks. Because the requirements of each biomedical research laboratory are so varied, the robotic systems are specifically tailored to meet the requirements of each lab.


Peter Harris
Peter Harris is the CEO of HighRes Biosolutions. Prior to HighRes, Peter was VP and Managing Director at Axel Johnson, Inc. He spent most of his career as President & CEO of Cadence, Inc., a high-technology medical device manufacturing and engineering firm that enables medical companies to bring better devices to market faster. Peter has been a Visiting Executive Lecturer at the Darden School of Business at the University of Virginia for over 10 years.



Robots in Depth with Peter Corke


In this episode of Robots in Depth, Per Sjöborg speaks with Peter Corke, distinguished professor of robotic vision at Queensland University of Technology and Director of the ARC Centre of Excellence for Robotic Vision. Peter is well known for his work in computer vision and has written one of the books that define the area. He talks about how serendipity led him to build a checkers-playing robot and then move on to robotics and machine vision. We get to hear how early experiments with “Blob Vision” got him interested in analyzing images, especially moving images, and about his long and interesting journey giving robots eyes to see the world.

The interview ends with Peter adding a new item to his CV, fashion model, when he shows us the ICRA 2018 T-shirt!

Programming drones to fly in the face of uncertainty

Researchers trail a drone on a test flight outdoors.
Photo: Jonathan How/MIT

Companies like Amazon have big ideas for drones that can deliver packages right to your door. But even putting aside the policy issues, programming drones to fly through cluttered spaces like cities is difficult. Being able to avoid obstacles while traveling at high speeds is computationally complex, especially for small drones that are limited in how much they can carry onboard for real-time processing.

Many existing approaches rely on intricate maps that aim to tell drones exactly where they are relative to obstacles, which isn’t particularly practical in real-world settings with unpredictable objects. If their estimated location is off by even just a small margin, they can easily crash.

With that in mind, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed NanoMap, a system that allows drones to consistently fly at 20 miles per hour through dense environments such as forests and warehouses.

One of NanoMap’s key insights is a surprisingly simple one: The system considers the drone’s position in the world over time to be uncertain, and actually models and accounts for that uncertainty.

“Overly confident maps won’t help you if you want drones that can operate at higher speeds in human environments,” says graduate student Pete Florence, lead author on a new related paper. “An approach that is better aware of uncertainty gets us a much higher level of reliability in terms of being able to fly in close quarters and avoid obstacles.”

Specifically, NanoMap uses a depth-sensing system to stitch together a series of measurements about the drone’s immediate surroundings. This allows it to not only make motion plans for its current field of view, but also anticipate how it should move around in the hidden fields of view that it has already seen.

“It’s kind of like saving all of the images you’ve seen of the world as a big tape in your head,” says Florence. “For the drone to plan motions, it essentially goes back into time to think individually of all the different places that it was in.”
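Florence’s “big tape” metaphor maps naturally onto a small data structure. Below is a minimal sketch in Python, not the authors’ code: it assumes a translation-only motion model, an invented 45-degree field of view, and illustrative names (ViewFrame, ViewHistory); a real system would chain full six-degree-of-freedom transforms. The property it illustrates is the one the article describes: queries hop backwards through previously seen views, and positional uncertainty grows with every hop.

```python
import math
from collections import deque
from dataclasses import dataclass

import numpy as np

FOV_HALF_ANGLE = math.radians(45)  # assumed camera half field of view


@dataclass
class ViewFrame:
    points: np.ndarray      # Nx3 depth returns, in this frame's coordinates
    rel_offset: np.ndarray  # translation from the next (newer) frame to this one
    rel_sigma: float        # std dev of that offset (odometry drift per hop)


class ViewHistory:
    """Rolling 'tape' of recent depth views. Rotations between frames are
    ignored to keep the sketch short; a real system would chain full SE(3)
    transforms."""

    def __init__(self, max_frames=20):
        self.frames = deque(maxlen=max_frames)  # newest first; oldest falls off

    def add_frame(self, points, rel_offset, rel_sigma):
        self.frames.appendleft(ViewFrame(np.asarray(points, float),
                                         np.asarray(rel_offset, float),
                                         rel_sigma))

    def query(self, p_body):
        """Hop backwards through past views until one's field of view contains
        the query point, accumulating translational uncertainty on the way.
        Returns (nearest observed point in that frame, 1-sigma position
        uncertainty) or (None, sigma) if the point was never seen."""
        p = np.asarray(p_body, float)
        var = 0.0
        for f in self.frames:
            p = p + f.rel_offset     # express the query in this older frame
            var += f.rel_sigma ** 2  # independent drift adds in variance
            in_fov = p[2] > 0 and math.atan2(np.hypot(p[0], p[1]), p[2]) < FOV_HALF_ANGLE
            if in_fov and len(f.points):
                dists = np.linalg.norm(f.points - p, axis=1)
                return f.points[np.argmin(dists)], math.sqrt(var)
        return None, math.sqrt(var)
```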

The team’s tests demonstrate the impact of uncertainty. For example, if NanoMap wasn’t modeling uncertainty and the drone drifted just 5 percent away from where it was expected to be, the drone would crash more than once every four flights. Meanwhile, when it accounted for uncertainty, the crash rate reduced to 2 percent.

The paper was co-written by Florence and MIT Professor Russ Tedrake alongside research software engineers John Carter and Jake Ware. It was recently accepted to the IEEE International Conference on Robotics and Automation, which takes place in May in Brisbane, Australia.

For years computer scientists have worked on algorithms that allow drones to know where they are, what’s around them, and how to get from one point to another. Common approaches such as simultaneous localization and mapping (SLAM) take raw data of the world and convert them into mapped representations.

But the outputs of SLAM methods aren’t typically used to plan motions. That’s where researchers often use methods like “occupancy grids,” in which many measurements are incorporated into one specific representation of the 3-D world.

The problem is that such data can be both unreliable and hard to gather quickly. At high speeds, computer-vision algorithms can’t make much of their surroundings, forcing drones to rely on inexact data from the inertial measurement unit (IMU) sensor, which measures things like the drone’s acceleration and rate of rotation.

The way NanoMap handles this is that it essentially doesn’t sweat the minor details. It operates under the assumption that, to avoid an obstacle, you don’t have to take 100 different measurements and find the average to figure out its exact location in space; instead, you can simply gather enough information to know that the object is in a general area.

“The key difference to previous work is that the researchers created a map consisting of a set of images with their position uncertainty rather than just a set of images and their positions and orientation,” says Sebastian Scherer, a systems scientist at Carnegie Mellon University’s Robotics Institute. “Keeping track of the uncertainty has the advantage of allowing the use of previous images even if the robot doesn’t know exactly where it is, and allows improved planning.”

Florence describes NanoMap as the first system that enables drone flight with 3-D data that is aware of “pose uncertainty,” meaning that the drone takes into consideration that it doesn’t perfectly know its position and orientation as it moves through the world. Future iterations might also incorporate other pieces of information, such as the uncertainty in the drone’s individual depth-sensing measurements.
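In planning terms, awareness of pose uncertainty can enter as an inflated safety margin: the more the drone may have drifted since an obstacle was observed, the more clearance a candidate motion must keep. A hypothetical helper, not NanoMap’s actual check, that could consume the uncertainty returned by a view query like the one sketched above:

```python
def is_motion_safe(clearance_m, drone_radius_m, pose_sigma_m, k=3.0):
    """Conservative collision check: the measured clearance must exceed the
    drone's radius plus k standard deviations of its position uncertainty,
    so the margin widens the more the drone may have drifted since the
    obstacle was observed."""
    return clearance_m > drone_radius_m + k * pose_sigma_m

# e.g. obstacle seen 1.2 m away, 0.3 m drone radius, 0.15 m accumulated drift:
print(is_motion_safe(1.2, 0.3, 0.15))  # True: 1.2 > 0.3 + 3 * 0.15 = 0.75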

NanoMap is particularly effective for smaller drones moving through smaller spaces, and works well in tandem with a second system that is focused on more long-horizon planning. (The researchers tested NanoMap last year in a program tied to the Defense Advanced Research Projects Agency, or DARPA.)

The team says that the system could be used in fields ranging from search and rescue and defense to package delivery and entertainment. It can also be applied to self-driving cars and other forms of autonomous navigation.

“The researchers demonstrated impressive results avoiding obstacles and this work enables robots to quickly check for collisions,” says Scherer. “Fast flight among obstacles is a key capability that will allow better filming of action sequences, more efficient information gathering and other advances in the future.”

This work was supported in part by DARPA’s Fast Lightweight Autonomy program.

Research arrows are up with one common thread: digital data

MasterCard CEO Ajay Banga has said that “data is the new oil.” Masayoshi Son, CEO of SoftBank, says that artificial intelligence combined with data gathered by billions of sensors is bringing on an information revolution.

Manufacturers everywhere are changing – some, with government assistance – because of new technologies, new competitors, new ecosystems, and new ways of doing business. Companies that adopt these new digital-based capabilities are creating value in their businesses and becoming leaders of their industries.

The following 18 recent research reports, grouped into five categories, cover the robotics industry. All indicate double-digit compound annual growth rates (CAGRs), and each connects digitalization and data to growth and economic efficiency.

  1. Industrial & collaborative robots
  2. Service & surgical robotics
  3. Agricultural & food handling robotics
  4. Surveillance robots, drones & ROVs
  5. Autonomous vehicles
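Every forecast below is quoted as a CAGR, which is just (end/start)^(1/years) − 1, so the headline numbers can be sanity-checked. A quick Python sketch, using Tractica’s enterprise-robot figures from the service robotics section below as the worked example:

```python
def cagr(start, end, years):
    """Compound annual growth rate: the constant yearly rate that grows
    `start` into `end` over `years` years."""
    return (end / start) ** (1.0 / years) - 1.0

# Tractica's enterprise-robot forecast (service robotics section below):
# 83,000 units in 2016 growing to 1.2 million units in 2022.
print(f"{cagr(83_000, 1_200_000, 2022 - 2016):.1%}")  # 56.1%, close to the quoted 57%
```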

Industrial & collaborative robots

  1. Feb 2018, 123 pages, QY Research, $3,600
    QY doesn’t provide forecasts in their promotional materials.
  2. Dec 2017, 23 pages, ABI Research, $3,500
To 2025, global revenue from collaborative robot shipments is forecast to grow at a 49.8% CAGR, compared to 12.1% for industrial robots and 23.2% for service robotics.
  3. Dec 2017, 95 pages, TechNavio, $2,500
The global robotic injection molding machine market is expected to grow at a CAGR of 4.94% during the period 2017-2021.
  4. Dec 2017, 93 pages, TechNavio, $2,500
    The global handling, degating, and deflashing robots market is expected to grow at a CAGR of close to 11% from 2017-2021.

Service & surgical robots

  1. Dec 2017, 127 pages, Tractica, $4,200
    Tractica forecasts that worldwide shipments of enterprise robots (ag, construction, warehousing and logistics, remote presence, customer service) will grow from 83,000 units in 2016 to 1.2 million units in 2022, at a CAGR of 57% during that period.  Revenue will increase from $5.9 billion in 2016 to $67.9 billion in 2022.
  2. Feb 2018, BIS Research, $4,599
    Analyzes progress of Intuitive Surgical Inc., Stryker, Mazor Robotics, Hansen Medical, MedRobotics, TransEnterix, Accuray, Renishaw, Think Surgical, Synaptive Medical, Titan Medical and Smith & Nephew.
Global service robot market
    Jan 2018, 124 pages, QY Research, $3,600
    No forecasts available.

Agricultural & food handling robotics

  1. Dec 2017, Progressive Markets, $3,619
    The agricultural robots market is likely to garner $15.3 billion by 2025, growing at a CAGR of 20.95% during the forecast period from 2018 to 2025.
  2. Dec 2017, AlphaBrown, $2,975
Harvesting robotics among early adopters (growers with 500+ acres and greenhouse growers) is projected to reach $5.5bn, according to a study of 1,300 growers whose principal reason for adopting is to offset the cost of labor.
  3. Nov 2017, 90 pages, Grand View Research, $4,950
    The global food robotics market is anticipated to reach USD 3.35 billion by 2025.
  4. Jan 2018, 213 pages, IDTechEx, $4,995
Forecasts based on technology roadmaps suggest that the market will grow to $35Bn by 2038, with the potential to reach higher levels (around $45Bn) in a highly accelerated technology progression and market adoption scenario. The inflection point will arrive from 2024 onwards, at which point sales will grow rapidly.
  5. Dec 2017, 104 pages, QY Research, $2,900
    No forecasts available.
  6. Dec 2017, 99 pages, QY Research, $3,400
    No forecasts available.

Surveillance robots, drones, mobile robots & ROVs

  1. Sep 2017, 70 pages, TechNavio, $3,500
    Forecasts the global surveillance robots market to grow at a CAGR of 12.31% during the period 2017-2021.
  2. Oct 2017, 170 pages, Persistence Market Research, $4,900
Maritime security concerns, coupled with greater offshore oil & gas production, are the key drivers of the autonomous underwater vehicle market, which is anticipated to push past $445 million by end-2022 at a CAGR of 6.1%.
  3. Dec 2017, 193 pages, QY Research, $2,900
    The Autonomous Mobile Robots industry was $158 million in 2016 and is projected to reach $390 million by 2022, at a CAGR of 16.26% between 2016 and 2022.
  4. Sep 2017, 88 pages, Skylogic Research, $1,450
    An analysis of 2,600 drone buyers, providers and business users but no forecasts available.

5G autonomous bus at 2018 Olympics.

Autonomous vehicles

  1. Feb 2018, 93 pages, Tractica, $4,500
    Tractica forecasts that annual unit shipments will increase from approximately 343 vehicles in 2017 to 188,000 units in 2022.

Robots and workers of the world, unite!

Robots in the workforce will give rise to new jobs for humans, including safety engineers, robot specialists and augmented reality experts, according to researchers. Image credit – ‘FANUC robots’, by Mixabest – Own work, CC BY-SA 3.0

Robots are already changing the way we work – particularly in factories – but worries that they will steal our jobs are only part of the picture, as new technologies are also opening up workplace opportunities for workers and are likely to create new jobs in the future.

Last year, the BBC reported that 800 million global workers will lose their jobs to robotic automation by 2030. This statistic, from a McKinsey Global Institute study, led to countless headlines asking, will robots take your job?

The study found that robots will eliminate some jobs, but also create new ones. As the field develops, European roboticists are busy investigating how factory robots could create new opportunities for workers in manufacturing jobs.

The MANUWORK project is collaborating with non-profit group Lantegi Batuak in Spain, which helps to incorporate people with disabilities into the world of work.

The project scientists are testing a number of assistive technologies including augmented reality (AR) displays that can help workers with disabilities in complex tasks such as the assembly and wiring of electrical cabinets. The display shows step-by-step wiring instructions to the worker.

‘We want to introduce a robot to work with them in the assembly process, to indicate cable connections or do some quick quality checks,’ said Dr Kosmas Alexopoulos at the University of Patras in Greece, who coordinates the project. This kind of human-robot collaboration technology could allow greater involvement of people with disabilities in manufacturing roles across Europe.

Predictable

Predictable physical work in factories plays to the strengths of robots, but even the most modern factories are not run entirely by machines.

European car manufacturing is leading the way in adopting robotics. Last year, the number of robots in French car factories rose by 22% to 1,400 units, with around 9 robots for every 100 workers.

Robots are used heavily for the assembly of cars, mainly for body welding and positioning of large metal parts. Later, the skilled human labour comes in to create the interior. ‘Assembling car interiors is complex and it requires the skills of a human to perform,’ explained Dr Sotiris Makris, industrial robotics expert at the University of Patras.

His industrial robots lab in Patras took a leading role in a project called ROBO-PARTNER, which aimed to safely mix people with robots operating in the same workspace to perform a car assembly task. Under normal circumstances, close man-machine teamwork is not possible because robots and humans are kept apart for safety reasons. The human worker brings intelligence and fine skills, while their robot buddy delivers super-human strength and precision.

Allowing robots and humans to work together adds flexibility. – Prof. Björn Hein, Karlsruhe Institute of Technology, Germany

The team took a real-life scenario as a starting point. In an automobile factory, the rear axle of a vehicle – which can weigh over 50 kilograms (kg) – must be brought into position by a worker. They must secure the part and then fit drum brake components, which also involves heavy lifting and manipulation of flexible wire parts.

‘The worker carries the drums, mounts them and must screw them in place. It is quite a stressful process, both physically and cognitively,’ said Dr Makris.

The project developed a robot, capable of lifting 130 kg, to move the axle and brake parts into place for the human worker. Importantly, the worker remains in the vicinity as the robot – essentially a powerful arm that can move around, grip and lift items – manoeuvres the car parts into position.

Strenuous

At present, one particular 14-kg part for the rear wheel must be lifted 500 times in a single shift by car assembly workers, but the robot could shoulder this strenuous work. This would mean physical strength would no longer be a reason for someone not to do this job.

Video cameras, sensors and adjustable safety zones prevent the robot from moving dangerously close to the worker. The worker interacts with the robot using a control panel, but also uses virtual reality (VR) glasses that visualise the next task for the worker. The completion of each task is signalled using a smart watch and the human remains in charge.

Dr Makris is optimistic that such research will allow people and robots to work side-by-side. There will be new jobs too. Factories of the future will create new roles, such as safety engineers, robot specialists and AR experts, Dr Makris predicted.

Robots are also being used to transform warehouses for online businesses. These warehouses are currently divided into two areas: in the centre, wheeled robots shift shelves around and move items to the perimeter, where human workers are waiting. The workers and robots are kept apart for safety. A laser barrier detects any unauthorised entry by a person, which causes the system to shut down. In the case of a robot failure, the whole warehouse has to be stopped so that a service technician can make repairs.

But what if robots and people could safely tango together in warehouses? This is the vision of SafeLog, a project that is developing a flexible warehouse system where humans and automated guided vehicles safely share the same space.

‘With SafeLog, the service technician could just walk in the warehouse and fix the robot, so the warehouse could stay online,’ explained Professor Björn Hein, robotics engineer at the Karlsruhe Institute of Technology, Germany, and coordinator of the project. Right now there are usually only a few dozen robots per warehouse, so downtime is not a significant issue; as the number of robots increases, it could become one.

Safety vest

SafeLog is creating a safety vest to be worn by warehouse workers. This wirelessly transmits the worker’s location and can be detected by robots nearby, so that they will sense if a human is too close and stop.

For coordination, SafeLog scientists are developing algorithms that track and predict the movements of people using the vest and robots. This would be used by a fleet management system, the silicon brain that will safely guide hundreds of robots around a warehouse.
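As a toy illustration, not the SafeLog codebase, such a coordination rule could look like the sketch below: extrapolate each vest-wearer’s last reported position over a short horizon and pause any robot whose planned waypoints come within an assumed safety radius. All parameter values and names here are invented.

```python
import math

SAFETY_RADIUS_M = 2.0  # invented minimum human-robot separation
HORIZON_S = 2.0        # invented prediction horizon
DT_S = 0.5             # prediction timestep


def predict(pos, vel, t):
    """Constant-velocity extrapolation of a vest-wearer's last position fix."""
    return (pos[0] + vel[0] * t, pos[1] + vel[1] * t)


def robots_to_pause(robot_plans, workers):
    """robot_plans: {robot_id: [(x, y) waypoint per timestep]};
    workers: [((x, y), (vx, vy)), ...] from the vests' last reports.
    Returns the ids of robots whose near-term plan intrudes on any
    worker's predicted safety zone."""
    paused = set()
    for rid, plan in robot_plans.items():
        for i in range(int(HORIZON_S / DT_S) + 1):
            t = i * DT_S
            rx, ry = plan[min(i, len(plan) - 1)]
            for pos, vel in workers:
                wx, wy = predict(pos, vel, t)
                if math.hypot(rx - wx, ry - wy) < SAFETY_RADIUS_M:
                    paused.add(rid)
    return paused


# One robot driving toward a worker walking across its path:
plans = {"amr_7": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.5, 0.0), (2.0, 0.0)]}
print(robots_to_pause(plans, [((3.0, 0.5), (-1.0, 0.0))]))  # {'amr_7'}
```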

The workers will wear smart glasses that will warn them about objects close by and help guide them to items in the warehouse.

‘We intend to make it possible for warehouses to become even bigger, because allowing robots and humans to work together adds flexibility and makes it easier to extend,’ said Prof. Hein.

The SafeLog system will be put through its paces in a real warehouse in early 2019.

All research in this article is funded by the EU.

Call for evidence – European Robotics Flagship


The European Commission is launching a new call for Flagships. Existing Flagships include the Human Brain Project and the Graphene Flagship, each funded at the level of EUR 1 billion.

Given the excitement around the field of robotics and its potential to benefit society and the economy, we’ve brought together the robotics community to apply for a Robotics Flagship Preparatory Action. The Robotics Flagship aims to drive developments in European Robotics for the next 10 years. You can read more about it on this website.

As a first step, we’ll be submitting a request for exploratory funds to inform the preparation of the full flagship proposal.

Key to our work will be to understand the robotics community’s and the public’s view on the technology so we can develop it in a way that positively impacts society and the economy.

That’s why we need your help. By filling in a short call for evidence (max 10 minutes) you’ll be providing us with valuable input for the proposal.

If you’re from the public, fill in this call for evidence.

If you’re from the robotics community, fill in this call for evidence.

You can also follow us on Twitter, and help spread the word by retweeting.
