World’s First Self-driving, Commercial-class Snow Clearing Robot from Left Hand Robotics Operating at Pilot Customer Locations Across North America
Snake-inspired robot uses kirigami to move
By Leah Burrows
Who needs legs? With their sleek bodies, snakes can slither up to 14 miles per hour, squeeze into tight spaces, scale trees, and swim. How do they do it? It’s all in the scales. As a snake moves, its scales grip the ground and propel the body forward — similar to how crampons help hikers establish footholds in slippery ice. This so-called “friction-assisted locomotion” is possible because of the shape and positioning of a snake’s scales.
Now, a team of researchers from the Wyss Institute at Harvard University and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) has developed a soft robot that uses those same principles of locomotion to crawl without any rigid components. The soft robotic scales are made using kirigami — an ancient Japanese paper craft that relies on cuts, rather than origami folds, to change the properties of a material. As the robot stretches, the flat kirigami surface is transformed into a 3D-textured surface, which grips the ground just like snake skin.
The research is published in Science Robotics.
“There has been a lot of research in recent years into how to fabricate these kinds of morphable, stretchable structures,” said Ahmad Rafsanjani, Ph.D., a postdoctoral fellow at SEAS and first author of the paper. “We have shown that kirigami principles can be integrated into soft robots to achieve locomotion in a way that is simpler, faster, and cheaper than most previous techniques.”
The researchers started with a simple, flat plastic sheet. Using a laser cutter, they embedded an array of centimeter-scale cuts, experimenting with different shapes and sizes. Once the sheet was cut, the researchers wrapped it around a tube-like elastomer actuator, which expands and contracts with air like a balloon.
When the actuator expands, the kirigami cuts pop out, forming a rough surface that grips the ground. When the actuator deflates, the cuts fold flat, propelling the crawler forward.
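The inflate–grip–deflate cycle above is, at heart, anisotropic-friction locomotion: the popped-out scales resist backward slip far more than forward slip, so each cycle nets a forward step. A minimal Python sketch of that principle (the two-phase model and all friction values are illustrative assumptions, not figures from the paper):

```python
def net_displacement(cycles, stroke=1.0, mu_forward=0.2, mu_backward=1.0):
    """Net advance of a two-phase crawler with asymmetric friction.

    Each inflate/deflate cycle extends and then retracts the body by
    `stroke`. With the scales engaged, backward slip is opposed by
    mu_backward and forward slip by mu_forward, so the net advance per
    cycle scales with the friction asymmetry between the two directions.
    """
    per_cycle = stroke * (mu_backward - mu_forward) / (mu_backward + mu_forward)
    return cycles * per_cycle
```

With symmetric friction (`mu_forward == mu_backward`) the body only oscillates in place, which is why a flat, uncut sheet does not crawl.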
The researchers built a fully untethered robot, with its integrated on-board control, sensing, actuation, and power supply all packed into a tiny tail. They tested it crawling throughout Harvard’s campus.
The team experimented with variously shaped cuts, including triangular, circular, and trapezoidal. They found that trapezoidal cuts — which most closely resemble the shape of snake scales — gave the robot a longer stride.
“We show that the locomotive properties of these kirigami-skins can be harnessed by properly balancing the cut geometry and the actuation protocol,” said Rafsanjani. “Moving forward, these components can be further optimized to improve the response of the system.”
“We believe that our kirigami-based strategy opens avenues for the design of a new class of soft crawlers,” said the paper’s senior author Katia Bertoldi, Ph.D., an Associate Faculty member of the Wyss Institute and the William and Ami Kuan Danoff Professor of Applied Mechanics at SEAS. “These all-terrain soft robots could one day travel across difficult environments for exploration, inspection, monitoring, and search and rescue missions, or perform complex, laparoscopic medical procedures.”
This research was co-authored by Yuerou Zhang; Bangyuan Liu, a visiting student in the Bertoldi lab; and Shmuel M. Rubinstein, Ph.D., Associate Professor of Applied Physics at SEAS. It was supported by the National Science Foundation.
Custom carpentry with help from robots
By Adam Conner-Simons and Rachel Gordon
Every year thousands of carpenters injure their hands and fingers doing dangerous tasks such as sawing.
In an effort to minimize injury and let carpenters focus on design and other bigger-picture tasks, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has created AutoSaw, a system that lets nonexperts customize different items that can then be constructed with the help of robots.
Users can choose from a range of carpenter-designed templates for chairs, desks, and other furniture. The team says that AutoSaw could eventually be used for projects as large as a deck or a porch.
“If you’re building a deck, you have to cut large sections of lumber to length, and that’s often done on site,” says CSAIL postdoc Jeffrey Lipton, who was a lead author on a related paper about the system. “Every time you put a hand near a blade, you’re at risk. To avoid that, we’ve largely automated the process using a chop-saw and jigsaw.”
The system also offers flexibility for designing furniture to fit space-constrained houses and apartments. For example, it could allow a user to modify a desk to squeeze into an L-shaped living room, or customize a table to fit in a microkitchen.
“Robots have already enabled mass production, but with artificial intelligence (AI) they have the potential to enable mass customization and personalization in almost everything we produce,” says CSAIL director and co-author Daniela Rus. “AutoSaw shows this potential for easy access and customization in carpentry.”
The paper, which will be presented in May at the International Conference on Robotics and Automation (ICRA) in Brisbane, Australia, was co-written by Lipton, Rus, and PhD student Adriana Schulz. Other co-authors include MIT Professor Wojciech Matusik, PhD student Andrew Spielberg, and undergraduate Luis Trueba.
How it works
Software isn’t a foreign concept for some carpenters. Computer Numerical Control (CNC) converts a design into numerical instructions that specially programmed tools then execute. However, the machines used for CNC fabrication are usually large and cumbersome, and users are limited to the size of the existing CNC tools.
As a result, many carpenters continue to use chop-saws, jigsaws, and other hand tools that are low cost, easy to move, and simple to use. These tools, while useful for customization, still put people at a high risk of injury.
AutoSaw draws on expert knowledge for designing, and robotics for the more risky cutting tasks. Using the existing CAD system OnShape with an interface of design templates, users can customize their furniture for things like size, sturdiness, and aesthetics. Once the design is finalized, it’s sent to the robots to assist in the cutting process using the jigsaw and chop-saw.
To cut lumber the team used motion-tracking software and small mobile robots — an approach that takes up less space and is more cost-effective than large robotic arms.
Specifically, the team used a modified Roomba with a jigsaw attached to cut lumber of any shape on a plank. For the chopping, the team used two Kuka youBots to lift the beam, place it on the chop saw, and cut.
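The design-to-cutting hand-off described above can be pictured as a parametric template flattened into a cut list for the saw robots. A hypothetical Python sketch (the template function, part names, and dimensions are all illustrative; the real system works through OnShape designs, not this code):

```python
def table_template(width, depth, height, leg_stock=0.05):
    """Flatten a parametric table design into a cut list.

    Returns (part name, length in meters, count) tuples that a chop-saw
    robot could work through. `leg_stock` is the cross-section of the
    leg lumber, subtracted so the aprons fit between the legs.
    """
    return [
        ("tabletop plank", width, 3),            # three planks form the top
        ("apron (long)", width - 2 * leg_stock, 2),
        ("apron (short)", depth - 2 * leg_stock, 2),
        ("leg", height, 4),
    ]

def total_stock(cut_list):
    """Total length of lumber to cut, useful for ordering stock."""
    return sum(length * count for _, length, count in cut_list)
```

Resizing the piece for a cramped room then means changing one parameter and regenerating the cut list, rather than redrawing the design.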
“We added soft grippers to the robots to give them more flexibility, like that of a human carpenter,” says Lipton. “This meant we could rely on the accuracy of the power tools instead of the rigid-bodied robots.”
After the robots finish with cutting, the user then assembles the new piece of furniture using step-by-step directions from the system.
Democratizing custom furniture
When testing the system, the team’s simulations showed that they could build a chair, shed, and deck. Using the robots, the team also made a table with an accuracy comparable to that of a human, without a real hand ever getting near a blade.
“There have been many recent AI achievements in virtual environments, like playing Go and composing music,” says Hod Lipson, a professor of mechanical engineering and data science at Columbia University. “Systems that can work in unstructured physical environments, such as this carpentry system, are notoriously difficult to make. This is truly a fascinating step forward.”
While AutoSaw is still a research platform, in the future the team plans to use materials such as wood, and integrate complex tasks such as drilling and gluing.
“Our aim is to democratize furniture-customization,” says Schulz. “We’re trying to open up a realm of opportunities so users aren’t bound to what they’ve bought at Ikea. Instead, they can make what best fits their needs.”
The project was supported in part by the National Science Foundation.
Cafe X Technologies Launches Robotic Coffeebar 2.0
Novel 3-D printing method embeds sensing capabilities within robotic actuators
Modular-based Systems for Defense Drones
U.T.SEC: UAS Will Change Our Lives Fundamentally – Including in the Security Field
Lack of Machine Guarding Again Named to OSHA’S Top 10 Most Cited Violations List
Designers envision robots helping chronically ill children
The Machines Are Taking Over Space
Humanoid robot supports emergency response teams
Robots in Depth with Henrik Christensen
In this episode of Robots in Depth, Per Sjöborg speaks with Henrik Christensen, the Qualcomm Chancellor’s Chair of Robot Systems and a professor in the Department of Computer Science and Engineering at UC San Diego. He is also the director of the Institute for Contextual Robotics. Prior to UC San Diego, he was the founding director of the Institute for Robotics and Intelligent Machines (IRIM) at the Georgia Institute of Technology (2006-2016).
Christensen shares stories from his life in European robotics research, his views on the robot revolution, and experience developing robotics roadmaps.
IDS NXT – Novel Vision app-based sensors and cameras
New brain computer interfaces lead many to ask, is Black Mirror real?
It’s called the “grain,” a small IoT device implanted into the back of people’s skulls to record their memories. Human experiences are simply played back on “redo mode” using a smart button remote. The technology promises to reduce crime and terrorism and to simplify human relationships through greater transparency. While this is a description of Netflix’s Black Mirror episode, “The Entire History of You,” in reality the concept is not as far-fetched as it may seem. This week life came closer to imitating art with a $19 million grant from the US Department of Defense to a group of six universities to begin work on “neurograins.”
In the past, Brain-Computer Interfaces (BCIs) have utilized wearable technologies, such as headbands and helmets, to control robots, machines and various household appliances for people with severe disabilities. This new DARPA grant is focused on developing a “cortical intranet” for uplinks and downlinks directly to the cerebral cortex, potentially taking mind control to the next level. According to lead researcher Arto Nurmikko of Brown University, “What we’re developing is essentially a micro-scale wireless network in the brain enabling us to communicate directly with neurons on a scale that hasn’t previously been possible.”
Nurmikko highlights the research’s potential medical payoffs: “The understanding of the brain we can get from such a system will hopefully lead to new therapeutic strategies involving neural stimulation of the brain, which we can implement with this new neurotechnology.” The technology being developed by Nurmikko’s international team will eventually create a wireless neural communication platform able to record and stimulate brain activity at an unprecedented level of detail and precision. This will be accomplished by implanting a mesh network of tens of thousands of granular micro-devices into a person’s cranium. Surgeons will place this layer of neurograins around the cerebral cortex, where it will be controlled by a nearby electronic patch just below the person’s skin.
In describing how it will work, Nurmikko explains, “We aim to be able to read out from the brain how it processes, for example, the difference between touching a smooth, soft surface and a rough, hard one and then apply microscale electrical stimulation directly to the brain to create proxies of such sensation. Similarly, we aim to advance our understanding of how the brain processes and makes sense of the many complex sounds we listen to every day, which guide our vocal communication in a conversation and stimulate the brain to directly experience such sounds.”
Nurmikko further describes, “We need to make the neurograins small enough to be minimally invasive but with extraordinary technical sophistication, which will require state-of-the-art microscale semiconductor technology. Additionally, we have the challenge of developing the wireless external hub that can process the signals generated by large populations of spatially distributed neurograins at the same time.”
While current BCIs are able to process the activity of 100 neurons at once, Nurmikko’s objective is to work at a level of 100,000 simultaneous inputs. “When you increase the number of neurons tenfold, you increase the amount of data you need to manage by much more than that because the brain operates through nested and interconnected circuits,” Nurmikko remarks. “So this becomes an enormous big data problem for which we’ll need to develop new computational neuroscience tools.” The researchers plan to first test their theory in the sensory and auditory functions of mammals.
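Nurmikko’s scaling remark can be made concrete with a back-of-envelope estimate: raw samples grow linearly with neuron count, but the pairwise couplings a “nested and interconnected circuits” analysis must track grow roughly quadratically. A sketch with illustrative numbers (the 30 kHz sampling rate and 12-bit depth are typical of neural recording generally, not specifications of this project):

```python
def raw_rate_bits(neurons, sample_hz=30_000, bits=12):
    """Raw recording bandwidth in bits per second (linear in neuron count)."""
    return neurons * sample_hz * bits

def pairwise_interactions(neurons):
    """Number of neuron pairs whose coupling an analysis might track."""
    return neurons * (neurons - 1) // 2
```

Going from 100 to 100,000 channels multiplies the raw stream by 1,000, but the space of pairwise interactions by roughly a million, which is the sense in which the data problem grows “much more” than the channel count.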
Brain-computer interfaces are among the fastest-growing areas of healthcare technology; while the market is valued at just under a billion dollars today, it is forecast to grow to $2 billion in the next five years. According to that forecast, the uptick in the market will be driven by an estimated increase in treating aging, fatal diseases, and people with disabilities. The funder of Nurmikko’s project is DARPA’s Neural Engineering System Design program, which was formed to treat injured military personnel by “creating novel hardware and algorithms to understand how various forms of neural sensing and actuation might improve restorative therapeutic outcomes.” While DARPA’s project will provide numerous discoveries that will improve the quality of life for society’s most vulnerable, it also opens a Pandora’s box of ethical issues with the prospect of the US military potentially funding armies of cyborgs.
In response to rising ethical concerns, last month ethicists from the University of Basel in Switzerland drafted a new biosecurity framework for research in neurotechnology. The biggest concern expressed in the report was the implementation of “dual-use” technologies that have both military and medical benefits. The ethicists called for a complete ban on such innovations and strongly recommended fast-tracking regulations to protect “the mental privacy and integrity of humans.”
The ethicists raise important questions about taking grant money from groups like DARPA, as “This military research has raised concern about the risks associated with the weaponization of neurotechnology, sparking a debate about controversial questions: Is it legitimate to conduct military research on brain technology? And how should policy-makers regulate dual-use neurotechnology?” The suggested framework reads like a science fiction novel, “This has resulted in a rapid growth in brain technology prototypes aimed at modulating the emotions, cognition, and behavior of soldiers. These include neurotechnological applications for deception detection and interrogation as well as brain-computer interfaces for military purposes.” However, the development of BCIs is moving more quickly than public policy can debate its merits.
The framework’s lead author Marcello Ienca of Basel’s Institute for Biomedical Ethics understands the tremendous positive benefits of BCIs for a global aging population, especially for people suffering from Alzheimer’s and spinal cord injuries. In fact, the Swiss team calls for increased private investment of these neurotechnologies, not an outright prohibition. At the same time, Ienca stresses that in order to protect against misuse, such as brain manipulation by nefarious global actors, it is critical to raise awareness and debate surrounding the ethical issues of implanting neurograins into populations of humans. In an interview with the Guardian last year, Ienca summed up his concern very succinctly by saying, “The information in our brains should be entitled to special protections in this era of ever-evolving technology. When that goes, everything goes.”
In the spirit of open debate our next RobotLab forum will be on “The Future of Robotic Medicine” with Dr. Joel Stein of Columbia University and Kate Merton of JLabs on March 6th @ 6pm in New York City, RSVP.