Engineers at the University of California San Diego have created a four-legged soft robot that doesn't need any electronics to work. The robot needs only a constant source of pressurized air for all its functions, including its control and locomotion systems.
A new study shows that combining fixed acoustic reception stations with underwater robots to study deep-sea species allows for a better understanding of their ecology. These technological advances could aid the recovery of deep-sea demersal populations.
Soft robots are better suited to certain situations than traditional robots. When a robot interacts with its environment, with humans, or with other living things, the inherent softness of a body made of rubber, for example, is safer than metal. Soft robots are also better at handling an unstable or uncertain environment: if a robot contacts an unexpected object, it can simply deform around it rather than crashing into it.
The Army of the future will involve humans and autonomous machines working together to accomplish the mission. According to Army researchers, this vision will only succeed if artificial intelligence is perceived to be ethical.
In the 2012 film "Robot & Frank," the protagonist, a retired cat burglar named Frank, is suffering the early symptoms of dementia. Concerned and guilty, his son buys him a "home robot" that can talk, do household chores like cooking and cleaning, and remind Frank to take his medicine. It's a robot the likes of which we're getting closer to building in the real world.
Imagine you are playing an immersive game in which you are dropped into an unknown landscape with a directive to find a certain location. To advance in the game, you must also map the terrain so that you can share your initial location and your map with another remote player. You have now been given a problem that, in the world of robotics, is called SLAM: you have been asked to simultaneously localize yourself within, and map, an unknown environment.
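To make the idea concrete, here is a minimal sketch of the SLAM problem, assuming a hypothetical 1-D robot, a single landmark, and an extended Kalman filter rather than any particular system described here: the robot's position and the map (the landmark's position) are estimated jointly from noisy motion and noisy range measurements.

```python
import numpy as np

# Minimal 1-D EKF-SLAM sketch (illustrative assumption, not a production system):
# a robot moves along a line and measures its distance to one landmark whose
# position is unknown. State x = [robot position, landmark position].

rng = np.random.default_rng(0)

true_robot, true_landmark = 0.0, 10.0
x = np.array([0.0, 5.0])           # initial guess (landmark badly initialized)
P = np.diag([0.01, 25.0])          # confident about the start, not the landmark
Q = np.diag([0.05, 0.0])           # motion noise (only the robot moves)
R = 0.1                            # range-measurement noise variance

for step in range(20):
    # --- simulate the world ---
    u = 0.5                                        # commanded forward motion
    true_robot += u + rng.normal(0, np.sqrt(0.05))
    z = (true_landmark - true_robot) + rng.normal(0, np.sqrt(R))

    # --- predict: apply the motion model to the robot part of the state ---
    x[0] += u
    P = P + Q

    # --- update: measurement model h(x) = landmark - robot ---
    H = np.array([[-1.0, 1.0]])                    # Jacobian of h w.r.t. state
    y = z - (x[1] - x[0])                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T / S                                # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P

print("estimated robot position:    %.2f (true %.2f)" % (x[0], true_robot))
print("estimated landmark position: %.2f (true %.2f)" % (x[1], true_landmark))
```

Because localization and mapping share one joint state, every measurement tightens both estimates at once, which is the essence of the SLAM problem.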
Reservoir computing is a highly promising computational framework based on artificial recurrent neural networks (RNNs). Over the past few years, this framework has been successfully applied to a variety of tasks, ranging from time-series prediction (e.g., stock market or weather forecasting) to robotic motion planning and speech recognition.
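As an illustration of the general idea, the sketch below is a minimal echo state network, one common flavor of reservoir computing, written under simple assumptions (a fixed random tanh reservoir, a ridge-regression readout, and a one-step-ahead sine-wave prediction task); it is not drawn from any specific system mentioned here.

```python
import numpy as np

# Minimal echo state network sketch: a fixed random reservoir of tanh units
# plus a trained linear readout, used for one-step-ahead prediction.

rng = np.random.default_rng(42)

# --- data: a noisy sine wave; the task is to predict the next sample ---
t = np.arange(0, 60, 0.1)
signal = np.sin(t) + 0.05 * rng.normal(size=t.size)
u, y_target = signal[:-1], signal[1:]

# --- reservoir: random, fixed weights; only the readout is ever trained ---
n_res = 200
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius < 1

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence and return all states."""
    states = np.zeros((len(inputs), n_res))
    x = np.zeros(n_res)
    for i, u_t in enumerate(inputs):
        x = np.tanh(W_in * u_t + W @ x)
        states[i] = x
    return states

# --- training: ridge regression from reservoir states to targets ---
washout = 100                                      # discard initial transient
X = run_reservoir(u)[washout:]
Y = y_target[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

# --- evaluation: one-step-ahead prediction error ---
pred = X @ W_out
print("RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))
```

The design choice that makes this "reservoir" computing is that the recurrent weights are never trained; only the cheap linear readout is fit, which is why the approach maps well onto physical and robotic substrates.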
Researchers at the Max Planck Institute for Intelligent Systems (MPI-IS) and ETH Zürich have recently created HuggieBot 2.0, a robot that can hug users at their request. This robot, set to be presented at the ACM/IEEE International Conference on Human-Robot Interaction (HRI) in March, builds on a previous robotic system created by Alexis E. Block, one of the authors, during her Master's degree.
Autos today warn us of potential collisions, park themselves in tight spots, drive themselves from the parking lot to meet us at the store exit on a rainy day, and steer and change lanes for us as we coast along major highways. Just when it seems there's not much left for a smart car to do for us, someone usually comes along and gets a leg up on the competition.
From "Star Trek" replicators to Richie Rich's wishing machine, popular culture has a long history of parading flashy machines that can instantly output any item. While 3-D printers have now made it possible to produce a range of objects that include product models, jewelry, and novelty toys, we still lack the ability to fabricate more complex devices that are essentially ready-to-go right out of the printer.
Soft robots may not be in touch with human feelings, but they are getting better at feeling human touch.
Even as human interaction with robots and artificial intelligence increases exponentially in areas like healthcare, manufacturing, transportation, space exploration, and defense technologies, information about how humans and autonomous systems work within teams remains scarce.
Boston Dynamics has given Spot, its robotic canine, a leg up over the competition. Or more precisely, an arm.
Today, neuroscience and robotics are developing hand in hand. Mikhail Lebedev, Academic Supervisor at HSE University's Centre for Bioelectric Interfaces, spoke about how studying the brain inspires the development of robots.
Modern-day robots are often required to interact with humans intelligently and efficiently, which can be enabled by giving them the ability to perceive touch. However, previous attempts at mimicking human skin have involved bulky, complex electronics and wiring, along with a risk of damage. In a recent study, researchers from Japan sidestep these difficulties by constructing a 3-D vision-guided artificial skin that enables high-performance tactile sensing, opening doors to innumerable applications in medicine, healthcare, and industry.