
Robots guarded Buddha’s relics in a legend of ancient India

Two small figures guard the table holding the Buddha’s relics. Are they spearmen, or robots? British Museum, CC BY-NC-SA

By Adrienne Mayor

As early as Homer, more than 2,500 years ago, Greek mythology explored the idea of automatons and self-moving devices. By the third century B.C., engineers in Hellenistic Alexandria, in Egypt, were building real mechanical robots and machines. And such science fictions and historical technologies were not unique to Greco-Roman culture.

In my recent book “Gods and Robots,” I explain that many ancient societies imagined and constructed automatons. Chinese chronicles tell of emperors fooled by realistic androids and describe artificial servants crafted in the second century by the female inventor Huang Yueying. Techno-marvels, such as flying war chariots and animated beings, also appear in Hindu epics. One of the most intriguing stories from India tells how robots once guarded Buddha’s relics. As fanciful as it might sound to modern ears, this tale has a strong basis in links between ancient Greece and ancient India.

The story is set in the time of kings Ajatasatru and Asoka. Ajatasatru, who reigned from 492 to 460 B.C., was recognized for commissioning new military inventions, such as powerful catapults and a mechanized war chariot with whirling blades. When Buddha died, Ajatasatru was entrusted with defending his precious remains. The king hid them in an underground chamber near his capital, Pataliputta (now Patna) in northeastern India.

A sculpture depicting the distribution of the Buddha’s relics.
Los Angeles County Museum of Art/Wikimedia Commons

Traditionally, statues of giant warriors stood on guard near treasures. But in the legend, Ajatasatru’s guards were extraordinary: They were robots. In India, automatons or mechanical beings that could move on their own were called “bhuta vahana yanta,” or “spirit movement machines” in Pali and Sanskrit. According to the story, it was foretold that Ajatasatru’s robots would remain on duty until a future king would distribute Buddha’s relics throughout the realm.

Ancient robots and automatons

A statue of Visvakarman, the engineer of the universe.
Suraj Belbase/Wikimedia Commons, CC BY-SA

Hindu and Buddhist texts describe the automaton warriors whirling like the wind, slashing intruders with swords, recalling Ajatasatru’s war chariots with spinning blades. In some versions the robots are driven by a water wheel or made by Visvakarman, the Hindu engineer god. But the most striking version came by a tangled route to the “Lokapannatti” of Burma – Pali translations of older, lost Sanskrit texts, only known from Chinese translations, each drawing on earlier oral traditions.

In this tale, many “yantakara,” robot makers, lived in the Western land of the “Yavanas,” Greek-speakers, in “Roma-visaya,” the Indian name for the Greco-Roman culture of the Mediterranean world. The Yavanas’ secret technology of robots was closely guarded. The robots of Roma-visaya carried out trade and farming and captured and executed criminals.

Robot makers were forbidden to leave or reveal their secrets – if they did, robotic assassins pursued and killed them. Rumors of the fabulous robots reached India, inspiring a young artisan of Pataliputta, Ajatasatru’s capital, who wished to learn how to make automatons.

In the legend, the young man of Pataliputta finds himself reincarnated in the heart of Roma-visaya. He marries the daughter of the master robot maker and learns his craft. One day he steals plans for making robots, and hatches a plot to get them back to India.

Certain of being slain by killer robots before he could make the trip himself, he slits open his thigh, inserts the drawings under his skin and sews himself back up. Then he tells his son to make sure his body makes it back to Pataliputta, and starts the journey. He’s caught and killed, but his son recovers his body and brings it to Pataliputta.

Once back in India, the son retrieves the plans from his father’s body, and follows their instructions to build the automated soldiers for King Ajatasatru to protect Buddha’s relics in the underground chamber. Well hidden and expertly guarded, the relics – and robots – fell into obscurity.

The sprawling Maurya Empire in about 250 B.C.
Avantiputra7/Wikimedia Commons, CC BY-SA

Two centuries after Ajatasatru, Asoka ruled the powerful Mauryan Empire from Pataliputta, 273-232 B.C. Asoka constructed many stupas to enshrine Buddha’s relics across his vast kingdom. According to the legend, he had heard of the hidden relics and searched until he discovered the underground chamber guarded by the fierce android warriors. Violent battles raged between Asoka and the robots.

In one version, the god Visvakarman helped Asoka to defeat them by shooting arrows into the bolts that held the spinning constructions together; in another tale, the old engineer’s son explained how to disable and control the robots. At any rate, Asoka ended up commanding the army of automatons himself.

Exchange between East and West

Is this legend simply fantasy? Or could the tale have coalesced around early cultural exchanges between East and West? The story clearly connects the mechanical beings defending Buddha’s relics to automatons of Roma-visaya, the Greek-influenced West. How ancient is the tale? Most scholars assume it arose in medieval Islamic and European times.

But I think the story could be much older. The historical setting points to technological exchange between Mauryan and Hellenistic cultures. Contact between India and Greece began in the fifth century B.C., a time when Ajatasatru’s engineers created novel war machines. Greco-Buddhist cultural exchange intensified after Alexander the Great’s campaigns in northern India.

Inscriptions in Greek and Aramaic on a monument originally erected by King Asoka at Kandahar, in what is today Afghanistan.
World Imaging/Wikimedia Commons

In 300 B.C., two Greek ambassadors, Megasthenes and Deimachus, resided in Pataliputta, which boasted Greek-influenced art and architecture and was the home of the legendary artisan who obtained plans for robots in Roma-visaya. Grand pillars erected by Asoka are inscribed in ancient Greek and name Hellenistic kings, demonstrating Asoka’s relationship with the West. Historians know that Asoka corresponded with Hellenistic rulers, including Ptolemy II Philadelphus in Alexandria, whose spectacular procession in 279 B.C. famously displayed complex animated statues and automated devices.

Historians report that Asoka sent envoys to Alexandria, and Ptolemy II sent ambassadors to Asoka in Pataliputta. It was customary for diplomats to present splendid gifts to show off cultural achievements. Did they bring plans or miniature models of automatons and other mechanical devices?

I cannot hope to pinpoint the original date of the legend, but it is plausible that the idea of robots guarding Buddha’s relics melds both real and imagined engineering feats from the time of Ajatasatru and Asoka. This striking legend is evidence that the concept of building automatons was widespread in antiquity, and it reveals the universal and timeless link between imagination and science.

Adrienne Mayor is the author of:

Gods and Robots: Myths, Machines, and Ancient Dreams of Technology

Princeton University Press provides funding as a member of The Conversation US.

Adrienne Mayor, Research Scholar, Classics and History and Philosophy of Science, Stanford University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Laying the ground for robotic strategies in environmental protection

By Benjamin Boettner

Along developed riverbanks, physical barriers can help contain flooding and combat erosion. In arid regions, check dams can help retain soil after rainfall and restore damaged landscapes. In construction projects, metal plates can provide support for excavations, retaining walls on slopes, or permanent foundations. All of these applications can be addressed with the use of sheet piles, elements folded from flat material and driven vertically into the ground to form walls and stabilize soil. Proper soil stabilization is key to sustainable land management in industries such as construction, mining, and agriculture; and land degradation, the loss of ecosystem services from a given terrain, is a driver of climate change and is estimated to cost up to $10 trillion annually.

With this motivation, a team of roboticists at Harvard’s Wyss Institute for Biologically Inspired Engineering has developed a robot that can autonomously drive interlocking steel sheet piles into soil. The structures that it builds could function as retaining walls or check dams for erosion control. The study will be presented at the upcoming 2019 IEEE International Conference on Robotics and Automation.

Researchers at the Wyss Institute have developed a robot designed to drive interlocking sheet piles into the ground to help stabilize soil. Teams of such robots could help combat erosion, restore damaged landscapes, and facilitate sustainable land management in a variety of settings. Credit: Wyss Institute at Harvard University
Romu, the Wyss Institute’s sheet-pile-driving robot, has been extensively tested in a sandbox in the laboratory. Credit: Wyss Institute at Harvard University

Conventional sheet pile driving processes are extremely energy intensive. Only a fraction of the weight of typical heavy machinery is used for applying downward force. The Wyss team’s “Romu” robot, on the other hand, is able to leverage its own weight to drive sheet piles into the ground. This is made possible by each of its four wheels being coupled to a separate linear actuator, which also allows it to adapt to uneven terrain and ensure that piles are driven vertically. From a raised position, Romu grips a sheet pile and then lowers its chassis, pressing the pile into the soil with the help of an on-board vibratory hammer. By gripping the pile again at a higher position and repeating this process, the robot can drive a pile much taller than its own range of vertical motion. After driving a pile to sufficient depth, Romu advances and installs the next pile such that it interlocks with the previous one, thereby forming a continuous wall. Once it has used all of the piles it carries, it may return to a supply cache to restock.
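
Romu’s grip-lower-regrip cycle amounts to a simple control loop. The sketch below illustrates it in Python; the stroke length, function structure, and printed telemetry are assumptions for illustration, not the Wyss team’s actual control software.

```python
# Illustrative sketch of the grip-lower-regrip pile-driving cycle.
# Names and numbers are assumptions, not the Wyss team's software.

STROKE_CM = 25.0  # assumed vertical travel of the chassis per cycle


def drive_pile(target_depth_cm: float) -> None:
    """Drive one sheet pile to depth, one chassis stroke at a time."""
    depth = 0.0
    cycle = 1
    while depth < target_depth_cm:
        # Grip the pile, then lower the chassis with the vibratory
        # hammer running, so the robot's weight presses the pile down.
        push = min(STROKE_CM, target_depth_cm - depth)
        depth += push
        print(f"cycle {cycle}: pile at {depth:.0f} cm")
        # Release, raise the chassis, and regrip higher on the pile;
        # repeating lets total depth exceed one actuator stroke.
        cycle += 1


drive_pile(target_depth_cm=60.0)
```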

The study grew out of previous work at the Wyss Institute on teams or swarms of robots for construction applications. In work inspired by mound-building termites, Core Faculty member Radhika Nagpal and Senior Research Scientist Justin Werfel designed an autonomous robotic construction crew called TERMES, whose members worked together to build complex structures from specialized bricks. Further work by Werfel and researcher Nathan Melenbrink explored strut-climbing robots capable of building cantilevering truss structures, addressing applications like bridges. However, neither of these studies addressed the challenge of anchoring structures to the ground. The Romu project began as an exploration of methods for automated site preparation and installation of foundations for the earlier systems to build on; as it developed, the team determined that such interventions could also be directly applicable to land restoration tasks in remote environments.

The robot is designed to drive interlocking sheet piles into granular soils like sand on a beach. Credit: Wyss Institute at Harvard University

“In addition to tests in the lab, we demonstrated Romu operating on a nearby beach,” said Melenbrink. “This kind of demonstration can be an icebreaker for a broader conversation around opportunities for automation in construction and land management. We’re interested in engaging with experts in related fields who might see potential benefit for the kind of automated interventions we’re developing.”

The researchers envision large numbers of Romu robots working together as a collective or swarm. They demonstrated in computer simulations that teams of Romu robots could make use of environmental cues like slope steepness in order to build walls in effective locations, making efficient use of limited resources. “The swarm approach gives advantages like speedup through parallelism, robustness to the loss of individual robots, and scalability for large teams,” said Werfel. “By responding in real-time to the conditions they actually encounter as they work, the robots can adapt to unexpected or changing situations, without needing to rely on a lot of supporting infrastructure for abilities like site surveying, communication, or localization.”
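
As a rough illustration of how such environmental cues might drive a decentralized building rule, here is a minimal sketch in which each robot builds only where a locally sensed slope exceeds a threshold. The sensor stand-in, threshold value, and rule itself are assumptions, not the team’s simulation code.

```python
import random

SLOPE_THRESHOLD = 0.15  # assumed trigger for wall-building


def local_slope(x: float, y: float) -> float:
    # Stand-in for an onboard slope estimate (e.g., chassis tilt).
    return abs(0.03 * x + random.uniform(-0.01, 0.01))


def decide(positions):
    """Each robot reacts only to local conditions; no central planner."""
    return [
        ((x, y), "drive piles" if local_slope(x, y) > SLOPE_THRESHOLD
         else "keep moving")
        for (x, y) in positions
    ]


print(decide([(1.0, 0.0), (6.0, 2.0), (9.0, 5.0)]))
```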

“The name Terramanus ferromurus (Romu) is a nod to the concept of ‘machine ecology’ in which autonomous systems can be introduced into natural environments as new participants, taking specific actions to complement and promote human environmental stewardship,” said Melenbrink. In the future, the Terramanus “genus” could be extended by additional robots carrying out different tasks to protect or restore ecosystem services. Based on their findings, the team now is interested in investigating interventions ranging from groundwater retention structures for supporting agriculture in arid regions, to responsive flood barrier construction for hurricane preparedness. Future versions of the robot could perform other interventions such as spraying soil-binding agents or installing silt fencing, such that a family of these robots could act to stabilize soil in a wide range of situations.

In many scenarios for environmental protection or restoration, the opportunity for action is limited by the availability of human labor and by site access for heavy machinery. Smaller, more versatile construction machines could provide a solution. “Clearly, the needs of many degraded landscapes are not being met with the currently available tools and techniques,” said Melenbrink. “Now, 100 years after the dawn of the heavy equipment age, we’re asking whether there might be more resilient and responsive ways to approach land management and restoration.”

“This sheet pile driving robot with its demonstrated ability to perform in a natural setting signals a path on which the Wyss Institute’s robotics and swarm robotics capabilities can be brought to bear on both natural and man-made environments where conventional machinery, manpower limitations, or cost is inadequate to prevent often disastrous consequences. This robot also could address disaster situations where walling off dangerous chemical spills or released radioactive fluids makes it difficult or impossible for humans to intervene,” said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School (HMS) and the Vascular Biology Program at Boston Children’s Hospital, as well as Professor of Bioengineering at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS).

Nanoparticles take a fantastic, magnetic voyage

MIT engineers have designed a magnetic microrobot that can help push drug-delivery particles into tumor tissue (left). They also employed swarms of naturally magnetic bacteria to achieve the same effect (right).
Image courtesy of the researchers.

By Anne Trafton

MIT engineers have designed tiny robots that can help drug-delivery nanoparticles push their way out of the bloodstream and into a tumor or another disease site. Like crafts in “Fantastic Voyage” — a 1960s science fiction film in which a submarine crew shrinks in size and roams a body to repair damaged cells — the robots swim through the bloodstream, creating a current that drags nanoparticles along with them.

The magnetic microrobots, inspired by bacterial propulsion, could help to overcome one of the biggest obstacles to delivering drugs with nanoparticles: getting the particles to exit blood vessels and accumulate in the right place.

“When you put nanomaterials in the bloodstream and target them to diseased tissue, the biggest barrier to that kind of payload getting into the tissue is the lining of the blood vessel,” says Sangeeta Bhatia, the John and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, a member of MIT’s Koch Institute for Integrative Cancer Research and its Institute for Medical Engineering and Science, and the senior author of the study.

“Our idea was to see if you can use magnetism to create fluid forces that push nanoparticles into the tissue,” adds Simone Schuerle, a former MIT postdoc and lead author of the paper, which appears in the April 26 issue of Science Advances.

In the same study, the researchers also showed that they could achieve a similar effect using swarms of living bacteria that are naturally magnetic. Each of these approaches could be suited for different types of drug delivery, the researchers say.

Tiny robots

Schuerle, who is now an assistant professor at the Swiss Federal Institute of Technology (ETH Zurich), first began working on tiny magnetic robots as a graduate student in Brad Nelson’s Multiscale Robotics Lab at ETH Zurich. When she came to Bhatia’s lab as a postdoc in 2014, she began investigating whether this kind of bot could help to make nanoparticle drug delivery more efficient.

In most cases, researchers target their nanoparticles to disease sites that are surrounded by “leaky” blood vessels, such as tumors. This makes it easier for the particles to get into the tissue, but the delivery process is still not as effective as it needs to be.

The MIT team decided to explore whether the forces generated by magnetic robots might offer a better way to push the particles out of the bloodstream and into the target site.

The robots that Schuerle used in this study are 35 hundredths of a millimeter long, similar in size to a single cell, and can be controlled by applying an external magnetic field. This bioinspired robot, which the researchers call an “artificial bacterial flagellum,” consists of a tiny helix that resembles the flagella that many bacteria use to propel themselves. These robots are 3-D-printed with a high-resolution 3-D printer and then coated with nickel, which makes them magnetic.

To test a single robot’s ability to control nearby nanoparticles, the researchers created a microfluidic system that mimics the blood vessels that surround tumors. The channel in their system, between 50 and 200 microns wide, is lined with a gel that has holes to simulate the broken blood vessels seen near tumors.

Using external magnets, the researchers applied magnetic fields to the robot, which makes the helix rotate and swim through the channel. Because fluid flows through the channel in the opposite direction, the robot remains stationary and creates a convection current, which pushes 200-nanometer polystyrene particles into the model tissue. These particles penetrated twice as far into the tissue as nanoparticles delivered without the aid of the magnetic robot.
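
The actuation principle is that a field rotating in the plane perpendicular to the desired swimming axis makes the nickel-coated helix spin like a corkscrew. A minimal sketch of generating such a field vector follows; the amplitude, frequency, and function names are illustrative assumptions, not the study’s control code.

```python
import numpy as np


def rotating_field(axis, amplitude_mT, freq_hz, t):
    """Field vector rotating in the plane perpendicular to `axis`.

    A magnetic helix follows this rotation, and its corkscrew
    motion produces thrust along `axis`.
    """
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    # Two unit vectors spanning the plane perpendicular to the axis.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(axis @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    e1 = np.cross(axis, helper)
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(axis, e1)
    phase = 2 * np.pi * freq_hz * t
    return amplitude_mT * (np.cos(phase) * e1 + np.sin(phase) * e2)


# Example: a 10 mT field rotating at 20 Hz, driving the helix along +x.
print(rotating_field(axis=[1, 0, 0], amplitude_mT=10.0, freq_hz=20.0, t=0.005))
```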

This type of system could potentially be incorporated into stents, which are stationary and would be easy to target with an externally applied magnetic field. Such an approach could be useful for delivering drugs to help reduce inflammation at the site of the stent, Bhatia says.

Bacterial swarms

The researchers also developed a variant of this approach that relies on swarms of naturally magnetotactic bacteria instead of microrobots. Bhatia has previously developed bacteria that can be used to deliver cancer-fighting drugs and to diagnose cancer, exploiting bacteria’s natural tendency to accumulate at disease sites.

For this study, the researchers used a type of bacteria called Magnetospirillum magneticum, which naturally produces chains of iron oxide. These magnetic particles, known as magnetosomes, help bacteria orient themselves and find their preferred environments.

The researchers discovered that when they put these bacteria into the microfluidic system and applied rotating magnetic fields in certain orientations, the bacteria began to rotate in synchrony and move in the same direction, pulling along any nanoparticles that were nearby. In this case, the researchers found that nanoparticles were pushed into the model tissue three times faster than when the nanoparticles were delivered without any magnetic assistance.

This bacterial approach could be better suited for drug delivery in situations such as a tumor, where the swarm, controlled externally without the need for visual feedback, could generate fluidic forces in vessels throughout the tumor.  

The particles that the researchers used in this study are big enough to carry large payloads, including the components required for the CRISPR genome-editing system, Bhatia says. She now plans to collaborate with Schuerle to further develop both of these magnetic approaches for testing in animal models.

The research was funded by the Swiss National Science Foundation, the Branco Weiss Fellowship, the National Institutes of Health, the National Science Foundation, and the Howard Hughes Medical Institute.

It’s 2019 – where’s my supersuit?

I loved the "Thundercats" cartoon as a child, watching cat-like humanoids fighting the forces of evil. Whenever their leader was in trouble, he'd unleash the Sword of Omens to gain "sight beyond sight," the ability to see events happening at faraway places, or bellow "Thunder, Thunder, Thunder, Thundercats, Hooo!" to instantaneously summon his allies to his location to join the fight. What kid didn't want those superpowers?

Bucharest was the European capital of robotics for #ERF2019

The European Robotics Forum, the most influential meeting of the robotics and AI community in Europe, held its 10th anniversary edition in Romania. The event was organised under the High Patronage of the President of Romania and under the Patronage of the Romanian Presidency of the Council of the European Union.

The most advanced prototypes and high-end technology projects financed under Horizon 2020 were exhibited for attendees to admire and analyse at the JW Marriott between 20 and 22 March.

Among the robots on display were the famous REEM-C, a humanoid robot that speaks nine languages and costs one million euros; QT, a robot created especially to help children with autism; Trimbot, a gardening robot that helps trim roses and bushes; and other prototypes that take innovation to the next level. The exhibitors also included Romanian companies developing advanced software solutions for international robotics companies.

The European Robotics Forum was also a great opportunity for a multidisciplinary approach to essential topics in the future development of technology in Europe. Some 900 experts gathered in over 50 workshops to discuss the future of robotics and AI in a European landscape increasingly shaped by technology and innovation.

“The European Robotics Forum is THE annual event for robotics in Europe, gathering a vibrant community of more than 900 participants from robotics and neighbouring communities, such as big data or cybersecurity, and offering a unique opportunity for academia and industry to discuss and boost science and innovation together. We count on the robotics community to drive the development of AI in Europe, being at the forefront of the digital transformation for the benefit of our economy and society,” said Lucilla Sioli, Director for “Artificial Intelligence and Digital Industry” at DG Connect, European Commission, during the opening of the forum.

“European integration flourishes when industry is strong. Automation through robots and artificial intelligence increases productivity and therefore brings jobs and prosperity to Europe,” said Peter Dröll, Director for Industrial Technologies, DG Research & Innovation, European Commission.

“The European Robotics Forum 2019 put Romania on the innovation map one more time, highlighting the potential that we hold in the technology field, an extensive area that will dramatically influence the future of us all,” said Ana-Maria Stancu, President of E-Civis, local organizer of the event and member of the euRobotics Board.

The European Robotics Forum 2019 was organised by euRobotics under SPARC, the public-private partnership for robotics in Europe, and hosted by the E-Civis Association. ERF2020 will take place in Malaga, Spain, on 3-5 March 2020.

euRobotics Awards 2019

The winners of these prestigious pan-European awards were:

The Georges Giralt PhD Award 2019, for the best European PhD thesis in robotics, went jointly to Teodor Tomić (Leibniz Universität Hannover) for “Model-Based Control of Flying Robots for Robust Interaction under Wind Influence” and Stanislao Grazioso (Università degli Studi di Napoli Federico II) for “Geometric soft robotics: a finite element approach”.

Sevensense Robotics AG received the Third Prize of the Technology Transfer Award 2019, presented by Victor Negrescu, Vice-Rector of SNSPA, former MEP, former Minister for EU Affairs, and Romanian Robotics Ambassador.

The Entrepreneurship Award 2019 (First Prize) went to Jack Schorsch of IM Systems.

ERF2019 was also the perfect opportunity to announce the European Robotics League awards.


Robotic arms and temporary motorisation – the next generation of wheelchairs

An automated wheelchair with an exoskeleton arm is designed to help people with varying forms of disability carry out daily tasks independently. Image credit – AIDE, Universidad Miguel Hernandez

By Julianna Photopoulos

Next-generation wheelchairs could incorporate brain-controlled robotic arms and rentable add-on motors in order to help people with disabilities more easily carry out daily tasks or get around a city.

Professor Nicolás García-Aracil from the Universidad Miguel Hernández (UMH) in Elche, Spain, has developed an automated wheelchair with an exoskeleton robotic arm to use at home, as part of a project called AIDE.

It uses artificial intelligence to extract relevant information from the user, such as their behaviour, intentions and emotional state, and also analyses its environmental surroundings, he says.

The system, which is based on an arm exoskeleton attached to a robotised wheelchair, is designed to help people living with various degrees and forms of disability carry out daily functions such as eating, drinking and washing up, on their own and at home. While the user sits in the wheelchair, they wear the robotised arm to help them grasp objects and bring them close. And because the whole system is connected to the home automation system, they can ask the wheelchair to move in a specific direction or go into a particular room.

Its mechanical wheels are made to move in narrow spaces, ideal for home-use, and the system can control the environment remotely – for example, switching lights on and off, using the television or making and answering phone calls. What’s more, it can anticipate the person’s needs.

‘We can train artificially intelligent algorithms to predict what the user wants to do,’ said Prof. García-Aracil. ‘Maybe the user is in the kitchen and wants a drink. The system provides their options (on a monitor) so they can control the exoskeleton to raise the glass and drink.’

Multimodal system

The technology isn’t simple. As well as the exoskeleton robotic arm attached to the robotic wheelchair, the chair has a small monitor and uses various sensors, including two cameras to recognise the environment, voice control, eye-tracking glasses to recognise objects, and sensors that capture brain activity, eye movements and signals from muscles.

Depending on each person’s needs and disabilities, the multiple devices are used accordingly. For example, someone with a severe disability such as a cervical spinal cord injury, who wouldn’t otherwise be able to use voice control, could use the brain activity and eye movement sensors combined.

The user wears a cap fitted with electrodes that records the brain activity used to control the exoskeleton hand’s movement, explains Prof. García-Aracil. So when the user imagines closing their hand on an object, for example, the exoskeleton arm actually does it for them. This technology is called brain-neural-computer interaction (BNCI), where brain — as well as muscle — activity can be recorded and used to interact with an electronic device.

But the system can sometimes make mistakes, so there is an abort signal, says Prof. García-Aracil. ‘We use the horizontal movement of the eye, so when you move your eyes to the right you trigger an action, but when you move your eyes to the left you abort that action,’ he explains.
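
In code, that confirm/abort scheme reduces to thresholding a horizontal gaze signal. The sketch below is a schematic illustration; the threshold value and signal source are assumptions, not the AIDE system’s implementation.

```python
def classify_gaze(horizontal_deg: float, threshold_deg: float = 15.0) -> str:
    """Map horizontal gaze angle to a command, per the scheme above:
    look right to trigger the pending action, look left to abort it."""
    if horizontal_deg > threshold_deg:
        return "trigger"
    if horizontal_deg < -threshold_deg:
        return "abort"
    return "none"


for angle in (20.0, -18.0, 3.0):
    print(angle, "->", classify_gaze(angle))
```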

The AIDE prototype was successfully tested last year by 17 people with disabilities including acquired brain injury (ABI), multiple sclerosis (MS), and spinal cord injury (SCI), at the Cedar Foundation in Belfast, Northern Ireland. Its use was also demonstrated at UMH in Elche, with the user asking to be taken to the cafeteria, then asking for a drink, and drinking it with the help of the exoskeletal arm.

Now more work needs to be carried out to make the system easier to use, cheaper and ready for the market, says Prof. García-Aracil.

But it’s not just new high-tech wheelchairs that can increase the functionality for users. Researchers on the FreeWheel project are developing a way of adding motorised units to existing wheelchairs to improve their utility in urban areas.

‘Different settings have different challenges,’ said project coordinator Ilaria Schiavi at IRIS SRL in Torino, Italy. For example, someone with a wheelchair may struggle to go uphill or downhill without any physical assistance whilst outdoors. But this system could allow people using wheelchairs to have an automated wheelchair experience regardless of whether they are indoors or outdoors, she says.

Rentable

The motorised units would attach to manual wheelchairs people already have in order to help them move around more easily and independently, Schiavi explains. These could either be rented for short periods of time and tailored to the location — an indoor or outdoor environment — or bought, in which case they would be completely personalised to the individual.

The researchers are also developing an app for the user which would include services such as ordering a bespoke device to connect the wheelchair and the unit, booking the unit, controlling it, and planning a journey within urban areas for shopping or sightseeing.

‘You have mobility apps that allow you to book cars, for example. Our app will allow the owner of a wheelchair to firstly subscribe to the service, which would include buying a customised interface to use between their own wheelchair and the motorising unit they have booked,’ said Schiavi.

‘A simple customised interface will allow wheelchair users to motorise their exact device, as it is used by them, at a reasonable cost.’

Customisation is made possible through additive manufacturing (AM) technologies, she says. AM technologies build 3D objects by adding materials, such as metal or plastic, layer-by-layer.

Schiavi and her colleagues are exploring various uses for the motorised units, and next year the team plans to test the system with mobility-impaired people in both Greece and Italy. They hope that, once developed, the units will be made available like city bicycles in public spaces such as tourist attractions or shopping centres.

The research in this article was funded by the EU.

Giving robots a better feel for object manipulation


A new “particle simulator” developed by MIT researchers improves robots’ abilities to mold materials into simulated target shapes and interact with solid objects and liquids. This could give robots a refined touch for industrial applications or for personal robotics — such as shaping clay or rolling sticky sushi rice.
Courtesy of the researchers

By Rob Matheson

A new learning system developed by MIT researchers improves robots’ abilities to mold materials into target shapes and make predictions about interacting with solid objects and liquids. The system, known as a learning-based particle simulator, could give industrial robots a more refined touch — and it may have fun applications in personal robotics, such as modelling clay shapes or rolling sticky rice for sushi.

In robotic planning, physical simulators are models that capture how different materials respond to force. Robots are “trained” using the models to predict the outcomes of their interactions with objects, such as pushing a solid box or poking deformable clay. But traditional learning-based simulators mainly focus on rigid objects and are unable to handle fluids or softer objects. Some more accurate physics-based simulators can handle diverse materials, but they rely heavily on approximation techniques that introduce errors when robots interact with objects in the real world.

In a paper being presented at the International Conference on Learning Representations in May, the researchers describe a new model that learns to capture how small portions of different materials — “particles” — interact when they’re poked and prodded. The model directly learns from data in cases where the underlying physics of the movements are uncertain or unknown. Robots can then use the model as a guide to predict how liquids, as well as rigid and deformable materials, will react to the force of its touch. As the robot handles the objects, the model also helps to further refine the robot’s control.

In experiments, a robotic hand with two fingers, called “RiceGrip,” accurately shaped a deformable foam to a desired configuration — such as a “T” shape — that serves as a proxy for sushi rice. In short, the researchers’ model serves as a type of “intuitive physics” brain that robots can leverage to reconstruct three-dimensional objects somewhat similarly to how humans do.

“Humans have an intuitive physics model in our heads, where we can imagine how an object will behave if we push or squeeze it. Based on this intuitive model, humans can accomplish amazing manipulation tasks that are far beyond the reach of current robots,” says first author Yunzhu Li, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). “We want to build this type of intuitive model for robots to enable them to do what humans can do.”

“When children are 5 months old, they already have different expectations for solids and liquids,” adds co-author Jiajun Wu, a CSAIL graduate student. “That’s something we know at an early age, so maybe that’s something we should try to model for robots.”

Joining Li and Wu on the paper are: Russ Tedrake, a CSAIL researcher and a professor in the Department of Electrical Engineering and Computer Science (EECS); Joshua Tenenbaum, a professor in the Department of Brain and Cognitive Sciences; and Antonio Torralba, a professor in EECS and director of the MIT-IBM Watson AI Lab.

Dynamic graphs

A key innovation behind the model, called the “dynamic particle interaction network” (DPI-Nets), was creating dynamic interaction graphs, which consist of thousands of nodes and edges that can capture complex behaviors of so-called particles. In the graphs, each node represents a particle. Neighboring nodes are connected with each other using directed edges, which represent the interaction passing from one particle to the other. In the simulator, particles are hundreds of small spheres combined to make up some liquid or a deformable object.
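
A minimal way to picture such a graph is to connect every pair of particles that sit within an interaction radius, rebuilding the edges as particles move. The sketch below does exactly that; it is a toy illustration of the idea, not the authors’ implementation.

```python
import numpy as np


def build_interaction_graph(positions, radius):
    """Nodes are particles; directed edges link neighbors within `radius`.

    Returns (sender, receiver) index pairs. Rebuilding this each time
    step as particles move is what makes the graph "dynamic".
    """
    positions = np.asarray(positions)
    n = len(positions)
    return [
        (i, j)
        for i in range(n)
        for j in range(n)
        if i != j and np.linalg.norm(positions[i] - positions[j]) < radius
    ]


pts = np.random.rand(6, 3) * 0.1  # six particles in a 10 cm cube
print(build_interaction_graph(pts, radius=0.05))
```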

The graphs are constructed as the basis for a machine-learning system called a graph neural network. In training, the model over time learns how particles in different materials react and reshape. It does so by implicitly calculating various properties for each particle — such as its mass and elasticity — to predict if and where the particle will move in the graph when perturbed.

The model then leverages a “propagation” technique, which instantaneously spreads a signal throughout the graph. The researchers customized the technique for each type of material — rigid, deformable, and liquid — to shoot a signal that predicts particle positions at certain incremental time steps. At each step, it moves and reconnects particles, if needed.
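
A single round of that propagation can be sketched as message passing: each particle sums the messages arriving along its incoming edges and folds them into its state. In the actual model the message and update functions are learned neural networks; the placeholders below are assumptions for illustration.

```python
import numpy as np


def propagation_step(states, edges, message_fn, update_fn):
    """One round of propagation over the particle graph.

    `states` is an (n, d) array of per-particle features. Each edge
    (i, j) sends message_fn(states[i], states[j]) to particle j, and
    update_fn folds the summed messages into the receiver's state.
    """
    n, d = states.shape
    incoming = np.zeros((n, d))
    for i, j in edges:
        incoming[j] += message_fn(states[i], states[j])
    return np.array([update_fn(states[j], incoming[j]) for j in range(n)])


# Toy choice: messages are state differences, updates damped additions.
states = np.random.rand(4, 3)
edges = [(0, 1), (1, 0), (1, 2), (2, 3)]
print(propagation_step(states, edges,
                       message_fn=lambda s, r: s - r,
                       update_fn=lambda s, m: s + 0.1 * m))
```

Stacking several such rounds lets a disturbance at one particle reach distant particles within a single predicted time step.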

For example, if a solid box is pushed, perturbed particles will be moved forward. Because all particles inside the box are rigidly connected with each other, every other particle in the object moves with the same calculated translation and rotation. Particle connections remain intact and the box moves as a single unit. But if an area of deformable foam is indented, the effect will be different. Perturbed particles move forward a lot, surrounding particles move forward only slightly, and particles farther away won’t move at all. With liquids being sloshed around in a cup, particles may completely jump from one end of the graph to the other. The graph must learn to predict where and how much all affected particles move, which is computationally complex.

Shaping and adapting

In their paper, the researchers demonstrate the model by tasking the two-fingered RiceGrip robot with clamping target shapes out of deformable foam. The robot first uses a depth-sensing camera and object-recognition techniques to identify the foam. The researchers randomly select particles inside the perceived shape to initialize the position of the particles. Then, the model adds edges between particles and reconstructs the foam into a dynamic graph customized for deformable materials.

Because of the learned simulations, the robot already has a good idea of how each touch, given a certain amount of force, will affect each of the particles in the graph. As the robot starts indenting the foam, it iteratively matches the real-world position of the particles to the targeted position of the particles. Whenever the particles don’t align, it sends an error signal to the model. That signal tweaks the model to better match the real-world physics of the material.
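
The match-and-correct loop can be sketched as a simple online update: compare predicted particle positions with observed ones and nudge the model’s parameters to shrink the error. The correction rule below is a schematic stand-in for the real training signal, with all names and numbers assumed for illustration.

```python
import numpy as np


def refine_online(params, predict, observe, steps=50, lr=0.1):
    """Schematic online refinement: use the gap between observed and
    predicted particle positions to tweak model parameters."""
    for _ in range(steps):
        error = observe() - predict(params)  # (n, 3) position error
        params = params + lr * error.mean(axis=0)
    return params


# Toy setup: a single bias parameter offsets all predictions.
true_offset = np.array([0.2, -0.1, 0.0])
observe = lambda: np.zeros((5, 3)) + true_offset
predict = lambda p: np.zeros((5, 3)) + p
print(refine_online(np.zeros(3), predict, observe))  # approaches true_offset
```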

Next, the researchers aim to improve the model to help robots better predict interactions with partially observable scenarios, such as knowing how a pile of boxes will move when pushed, even if only the boxes at the surface are visible and most of the other boxes are hidden.

The researchers are also exploring ways to combine the model with an end-to-end perception module by operating directly on images. This will be a joint project with Dan Yamins’s group; Yamins recently completed his postdoc at MIT and is now an assistant professor at Stanford University. “You’re dealing with these cases all the time where there’s only partial information,” Wu says. “We’re extending our model to learn the dynamics of all particles, while only seeing a small portion.”
