
An origami robot for touching virtual reality objects

A group of EPFL researchers have developed a foldable device that can fit in a pocket and can transmit touch stimuli when used in a human-machine interface.

When browsing an e-commerce site on your smartphone, or a music streaming service on your laptop, you can see pictures and hear sound snippets of what you are going to buy. But sometimes it would be great to touch it too – for example to feel the texture of a garment, or the stiffness of a material. The problem is that there are no miniaturized devices that can render touch sensations the way screens and loudspeakers render sight and sound, and that can easily be coupled to a computer or a mobile device.

Researchers in Professor Jamie Paik’s lab at EPFL have made a step towards creating just that – a foldable device that can fit in a pocket and can transmit touch stimuli when used in a human-machine interface. Called Foldaway, this miniature robot is based on origami robotics technology, which makes it easy to miniaturize and manufacture. Because it starts off as a flat structure, it can be printed with a technique similar to the one employed for electronic circuits, and can be easily stored and transported. At the time of deployment, the flat structure folds along a pre-defined pattern of joints to take the desired 3D, button-like shape. The device includes three actuators that generate movements, forces and vibrations in various directions; a moving origami mechanism on the tip that transmits sensations to the user’s finger; sensors that track the movements of the finger; and electronics to control the whole system. In this way the device can render different touch sensations that reproduce the physical interaction with objects or forms.

The Foldaway device, which is described in an article in the December issue of Nature Machine Intelligence and featured on the journal’s cover, comes in two versions, called Delta and Pushbutton. “The first one is more suitable for applications that require large movements of the user’s finger as input,” says Stefano Mintchev, a member of Jamie Paik’s lab and co-author of the paper. “The second one is smaller, pushing portability even further without sacrificing the force sensations transmitted to the user’s finger.”

Education, virtual reality and drone control
The researchers have tested their devices in three situations. In an educational context, they have shown that a portable interface, measuring less than 10 cm in length and width and 3 cm in height, can be used to navigate an atlas of human anatomy. The Foldaway device gives the user a different sensation for each organ the finger passes over: the contrast between soft lungs and hard bones at the rib cage, the up-and-down movement of the heartbeat, and sharp variations of stiffness on the trachea.

As a virtual reality joystick, the Foldaway can give the user the sensation of grasping virtual objects and perceiving their stiffness, modulating the force generated when the interface is pressed.


As a control interface for drones, the device can help resolve the sensory mismatch that arises when users control a drone with their hands but can perceive its response only through visual feedback. Two Pushbuttons can be combined, and their rotation can be mapped into commands for altitude, lateral and forward/backward movements. The interface also provides force feedback to the user’s fingertips, increasing the pilot’s awareness of the drone’s behaviour and of the effects of wind or other environmental factors.
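As a rough illustration of that mapping idea, here is a toy sketch of turning two pushbutton rotations into velocity commands. The `ButtonState` fields, the axis assignments, and the gain are invented for illustration; they are not the actual Foldaway interface.

```python
from dataclasses import dataclass

@dataclass
class ButtonState:
    """Hypothetical reading from one Pushbutton: tilt about two axes, in radians."""
    tilt_x: float
    tilt_y: float

def to_drone_command(left: ButtonState, right: ButtonState, gain: float = 0.5):
    """Map two pushbutton rotations to (forward, lateral, vertical) velocities."""
    forward = gain * right.tilt_y   # right button pitch -> forward/backward
    lateral = gain * right.tilt_x   # right button roll  -> left/right
    vertical = gain * left.tilt_y   # left button pitch  -> altitude
    return forward, lateral, vertical
```

A haptic loop would run the inverse mapping as well, pushing drone state (e.g. wind disturbance) back onto the fingertips as force.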

The Foldaway device was developed by the Reconfigurable Robotics Lab at EPFL and is currently being commercialised by a spin-off, FOLDAWAY Haptics, supported by an NCCR Robotics Spin Fund grant.

“Now that computing devices and robots are more ubiquitous than ever, the quest for human-machine interactions is growing rapidly,” adds Marco Salerno, a member of Jamie Paik’s lab and co-author of the paper. “The miniaturization offered by origami robots can finally allow the integration of rich touch feedback into everyday interfaces, from smartphones to joysticks, or the development of completely new ones such as interactive morphing surfaces.”

Literature
Mintchev, S., Salerno, M., Cherpillod, A. et al., “A portable three-degrees-of-freedom force feedback origami robot for human–robot interactions”, Nature Machine Intelligence 1, 584–593 (2019), doi:10.1038/s42256-019-0125-1

Intelligent Towing Tank propels human-robot-computer research

A researcher’s hand hovers over the water’s surface in the Intelligent Towing Tank (ITT), an automated experimental facility guided by active learning to explore vortex-induced vibrations (VIVs), revealing a path to accelerated scientific discovery.
Image: Dixia Fan and Lily Keyes/MIT Sea Grant

By Lily Keyes/MIT Sea Grant

In its first year of operation, the Intelligent Towing Tank (ITT) conducted about 100,000 total experiments, essentially completing the equivalent of a PhD student’s five years’ worth of experiments in a matter of weeks.

The automated experimental facility, developed in the MIT Sea Grant Hydrodynamics Laboratory, automatically and adaptively performs, analyzes, and designs experiments exploring vortex-induced vibrations (VIVs). Important for engineering offshore ocean structures like marine drilling risers that connect underwater oil wells to the surface, VIVs remain only partially understood due to the high number of parameters involved.

Guided by active learning, the ITT conducts series of experiments wherein the parameters of each next experiment are selected by a computer. Using an “explore-and-exploit” methodology, the system dramatically reduces the number of experiments required to explore and map the complex forces governing VIVs.

What began as then-PhD candidate Dixia Fan’s quest to cut back on conducting a thousand or so laborious experiments — by hand — led to the design of the innovative system and a paper recently published in the journal Science Robotics.

Fan, now a postdoc, and a team of researchers from the MIT Sea Grant College Program and MIT’s Department of Mechanical Engineering, École Normale Supérieure de Rennes, and Brown University, reveal a potential paradigm shift in experimental research, where humans, computers, and robots can collaborate more effectively to accelerate scientific discovery.

The 33-foot whale of a tank comes alive, working without interruption or supervision on the venture at hand — in this case, exploring a canonical problem in the field of fluid-structure interactions. But the researchers envision applications of the active learning and automation approach to experimental research across disciplines, potentially leading to new insights and models in multi-input/multi-output nonlinear systems.

VIVs are inherently nonlinear motions induced on a structure in an oncoming irregular cross-stream, which prove vexing to study. The researchers report that the number of experiments completed by the ITT is already comparable to the total number of experiments done to date worldwide on the subject of VIVs.

The reason for this is the large number of independent parameters, from flow velocity to pressure, involved in studying the complex forces at play. According to Fan, a systematic brute-force approach — blindly conducting 10 measurements per parameter in an eight-dimensional parametric space — would require 100 million experiments.
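The brute-force count follows directly from the combinatorics: ten measurement levels for each of eight independent parameters multiply out to 10^8 runs.

```python
# Ten measurement levels for each of eight independent parameters:
levels_per_parameter = 10
n_parameters = 8
total_experiments = levels_per_parameter ** n_parameters
print(total_experiments)  # 100000000, i.e. 100 million runs
```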

With the ITT, Fan and his collaborators have taken the problem into a wider parametric space than previously practicable to explore. “If we performed traditional techniques on the problem we studied,” he explains, “it would take 950 years to finish the experiment.” That being clearly infeasible, Fan and the team integrated a Gaussian process regression learning algorithm into the ITT, reducing the experimental burden by several orders of magnitude to only a few thousand experiments.

The robotic system automatically conducts an initial sequence of experiments, periodically towing a submerged structure along the length of the tank at a constant velocity. Then, the ITT takes partial control over the parameters of each next experiment by minimizing suitable acquisition functions of quantified uncertainties and adapting to achieve a range of objectives, like reduced drag.
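The paper’s actual algorithm is more involved, but the explore-and-exploit loop described above can be sketched with a minimal Gaussian process over a single parameter: after each run, the next experiment is placed where the model’s predictive uncertainty is largest. The kernel, length scale, and the stand-in `run_experiment` function below are illustrative assumptions, not the ITT’s actual model.

```python
import numpy as np

def rbf_kernel(a, b, length=1.0):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6, length=1.0):
    """GP posterior mean and pointwise variance at x_query given noisy observations."""
    K = rbf_kernel(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_query, x_train, length)
    Kss = rbf_kernel(x_query, x_query, length)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.clip(np.diag(cov), 0.0, None)

def run_experiment(x):
    """Stand-in for one towing-tank run: an unknown force-response curve."""
    return np.sin(3 * x) * np.exp(-0.3 * x)

# Candidate parameter settings (e.g. reduced velocities) to explore.
candidates = np.linspace(0.0, 5.0, 200)

# Seed with two experiments, then let predictive uncertainty pick the rest.
x_done = np.array([0.5, 4.5])
y_done = run_experiment(x_done)

for _ in range(10):
    _, var = gp_posterior(x_done, y_done, candidates)
    x_next = candidates[np.argmax(var)]   # explore where the model is least certain
    x_done = np.append(x_done, x_next)
    y_done = np.append(y_done, run_experiment(x_next))

mean, var = gp_posterior(x_done, y_done, candidates)
print(f"{len(x_done)} experiments, max posterior std: {var.max() ** 0.5:.3f}")
```

Swapping the pure-uncertainty rule for an acquisition function that also weighs an objective (say, measured drag) turns the loop from exploration into the exploit phase.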

Earlier this year, Fan was awarded an MIT Mechanical Engineering de Florez Award for “Outstanding Ingenuity and Creative Judgment” in the development of the ITT. “Dixia’s design of the Intelligent Towing Tank is an outstanding example of using novel methods to reinvigorate mature fields,” says Michael Triantafyllou, Henry L. and Grace Doherty Professor in Ocean Science and Engineering, who acted as Fan’s doctoral advisor.

Triantafyllou, a co-author on this paper and the director of the MIT Sea Grant College Program, says, “MIT Sea Grant has committed resources and funded projects using deep-learning methods in ocean-related problems for several years that are already paying off.” Funded by the National Oceanic and Atmospheric Administration and administered by the National Sea Grant Program, MIT Sea Grant is a federal-Institute partnership that brings the research and engineering core of MIT to bear on ocean-related challenges.

Fan’s research points to a number of others utilizing automation and artificial intelligence in science: At Caltech, a robot scientist named “Adam” generates and tests hypotheses; at the Defense Advanced Research Projects Agency, the Big Mechanism program reads tens of thousands of research papers to generate new models.

Similarly, the ITT applies human-computer-robot collaboration to accelerate experimental efforts. The system demonstrates a potential paradigm shift in conducting research, where automation and uncertainty quantification can considerably accelerate scientific discovery. The researchers assert that the machine learning methodology described in this paper can be adapted and applied in and beyond fluid mechanics, to other experimental fields.

Other contributors to the paper include George Karniadakis from Brown University, who is also affiliated with MIT Sea Grant; Gurvan Jodin from ENS Rennes; MIT PhD candidate in mechanical engineering Yu Ma; and Thomas Consi, Luca Bonfiglio, and Lily Keyes from MIT Sea Grant.

This work was supported by DARPA, Fariba Fahroo, and Jan Vandenbrande through an EQUiPS (Enabling Quantification of Uncertainty in Physical Systems) grant, as well as Shell, Subsea 7, and the MIT Sea Grant College Program.

Industrial Robots market slows in 2019 but long-term forecast is positive – Interact Analysis

New market research from Interact Analysis reveals that the growth of industrial robot revenues slowed in 2019, but is forecast to pick up again towards late 2020 and accelerate in 2021.
- Automotive and smartphone production declines played a significant part in the 2019 slowdown
- New applications, lower prices and wider use cases will lead to a significant upturn by 2023
- China shows its strength, both domestically and in

Machine Automation Taken to the Next Level

Around 20 million cutter blades for hair clippers and beard trimmers leave the Philips plant at Klagenfurt every year. In order to make the complex grinding process as efficient as possible, the company has opted for a pioneering automation concept, where the loading of the machine and the delivery of the parts is fully automated.

How Are Robots Tested for Harsh Conditions?

Advanced robots can spare human workers from dangerous or life-threatening conditions and environments — like the intricate underwater terrain of a search-and-rescue mission or extreme pressures faced by oil and gas workers. Robots aren't invincible, however, and they need to be carefully designed to handle these extreme conditions. Here are some of the extreme environments that robots face — and how designers test them.

Robotics Industry Set for Seismic Change as Growth Shifts from Fixed Automation to Mobile Systems in Enterprise

By 2022, the burgeoning mobile robotics space will start to overtake the traditional industrial robotics market. Currently, mobile autonomy is concentrated in material handling within the supply chain, but mobile robots are set to touch every sector of the global economy for a wide range of use-cases.

#299: On the Novelty Effect in Human-Robot Interaction, with Catharina Vesterager Smedegaard

From Robert the Robot, 1950s toy ad

In this episode, we take a closer look at the effect of novelty in human-robot interaction. Novelty is the quality of being new or unusual.

The typical view is that while something is new, or “a novelty”, it will initially make us behave differently than we would normally. But over time, as the novelty wears off, we will likely return to our regular behaviors. For example, a new robot may cause a person to behave differently at first, as it’s introduced into the person’s life, but after some time, the robot won’t be as exciting, novel and motivating, and the person might return to their previous behavioral patterns, interacting less with the robot.

To find out more about the concept of novelty in human-robot interactions, our interviewer Audrow caught up with Catharina Vesterager Smedegaard, a PhD student in philosophy at Aarhus University in Denmark.

Catharina sees novelty differently from how we typically see it. She thinks of it as projecting what we don’t know onto what we already know, which has implications for how human-robot interactions are designed and researched. She also speaks about her experience in philosophy more generally, and gives us advice on philosophical thinking.

Catharina Vesterager Smedegaard

Catharina Vesterager Smedegaard started as a PhD student on 1 December 2017. She has a BA and MA in philosophy. In autumn 2015, Catharina interned at the research group PENSOR (the present RUR), where she first became interested in social robotics and formed the idea for her MA thesis.

