For decades, researchers worldwide have been trying to develop robots that can efficiently assist humans and work alongside them on a variety of everyday tasks. To do this effectively, however, robots should be able to interact naturally with humans, including handing objects to them and receiving objects from them.
With e-commerce orders pouring in, a warehouse robot picks mugs off a shelf and places them into boxes for shipping. Everything is humming along until the warehouse changes its processes and the robot must now grasp taller, narrower mugs that are stored upside down.
In a global first, scientists have demonstrated that molecular robots can deliver cargo by swarming, achieving a transport efficiency five times greater than that of single robots.
When artificial intelligence systems encounter scenes where objects are not fully visible, they have to make estimations based only on the visible parts of the objects. This partial information leads to detection errors, and large amounts of training data are required to correctly recognize such scenes. Now, researchers at the Gwangju Institute of Science and Technology have developed a framework that allows robot vision to detect such objects successfully, in the same way that we perceive them.
A team of researchers at Seoul National University has created a stronger and faster hydrogel actuator by combining turgor design and electro-osmosis. In their paper published in the journal Science, the group describes their approach and how well the resulting actuator performed when tested in a real-world experiment. Zhen Jiang and Pingan Song, with the University of Southern Queensland, outline some of the difficulties researchers have faced in trying to create hydrogels that imitate biological organisms and comment on the work done by the team in Korea in a Perspective article published in the same journal issue.
The notion of a large metallic robot that speaks in monotone and moves in lumbering, deliberate steps is somewhat hard to shake. But practitioners in the field of soft robotics have an entirely different image in mind—autonomous devices composed of compliant parts that are gentle to the touch, more closely resembling human fingers than R2-D2 or Robby the Robot.
Can robots adapt their own working methods to solve complex tasks? Researchers at Chalmers University of Technology, Sweden, have developed a new form of AI, which, by observing human behavior, can adapt to perform its tasks in a changeable environment. The hope is that robots that can be flexible in this way will be able to work alongside humans to a much greater degree.
Recent trends in healthcare innovation have reflected a drastic increase in the autonomy levels of surgical robots. Despite the many clear benefits of promoting constant innovation in healthcare robotics, its real-world application presents multiple gaps that can cause harm in ways humans cannot necessarily correct or oversee. While the benefits of autonomous surgical robots are abundant, the interplay between robot manufacturers, healthcare providers, and patients poses new risks, as the outcome of a surgical procedure is no longer limited to the skill of the surgeon. This raises the question: who is responsible when something goes wrong?
MIT engineers have developed a telerobotic system to help surgeons quickly and remotely treat patients experiencing a stroke or aneurysm. With a modified joystick, surgeons in one hospital can control a robotic arm at another location to safely operate on a patient during a critical window of time that could save the patient's life and preserve their brain function.
When it comes to the future of intelligent robots, the first question people ask is often: How many jobs will they make disappear? Whatever the answer, the second question is likely to be: How can I make sure that my job is not among them?
In a lab at the University of Washington, robots are playing air hockey.
In early November 2013, the news wasn't looking great for Tesla. A series of reports had documented instances of Tesla Model S sedans catching on fire, causing the electric carmaker's share price to tumble.
New research from the University of Hertfordshire reveals how humans could develop more natural, social interactions with robots in the future.
Yes, those 7-foot-tall machines at Dallas Love Field are watching you. They want to make sure you're wearing a mask if you're boarding a flight or not parking too long at the curb if you're picking up a returning traveler.
Materials scientists aim to develop biomimetic soft robotic crawlers, including earthworm-like and inchworm-like designs, that realize locomotion via in-plane and out-of-plane contractions for a variety of engineering applications. While such devices can move effectively in confined spaces, the concept is challenging to miniaturize due to complex and limited actuation. In a new report now published in Science Advances, Qiji Ze and a team of scientists in mechanical and aerospace engineering at Stanford University and the Ohio State University, U.S., described a magnetically actuated, small-scale origami crawler exhibiting in-plane contraction. The team achieved the contraction mechanism with a four-unit Kresling origami assembly, giving the untethered robot the capacity to crawl and steer. The crawler overcame large resistances in severely confined spaces due to its magnetically tunable structural stiffness and anisotropy. The setup also provided a mechanism for drug storage and release, with potential to serve as a minimally invasive device in biomedicine.