
Researchers develop virtual-reality testing ground for drones

MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to “see” a rich, virtual environment while flying in an empty physical space.
Image: William Litant
By Jennifer Chu

Training drones to fly fast, around even the simplest obstacles, is a crash-prone exercise that can have engineers repairing or replacing vehicles with frustrating regularity.

Now MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to “see” a rich, virtual environment while flying in an empty physical space.

The system, which the team has dubbed “Flight Goggles,” could significantly reduce the number of crashes that drones experience in actual training sessions. It can also serve as a virtual testbed for any number of environments and conditions in which researchers might want to train fast-flying drones.

“We think this is a game-changer in the development of drone technology, for drones that go fast,” says Sertac Karaman, associate professor of aeronautics and astronautics at MIT. “If anything, the system can make autonomous vehicles more responsive, faster, and more efficient.”

Karaman and his colleagues will present details of their virtual training system at the IEEE International Conference on Robotics and Automation next week. Co-authors include Thomas Sayre-McCord, Winter Guerra, Amado Antonini, Jasper Arneberg, Austin Brown, Guilherme Cavalheiro, Dave McCoy, Sebastian Quilter, Fabian Riether, Ezra Tal, Yunus Terzioglu, and Luca Carlone of MIT’s Laboratory for Information and Decision Systems, along with Yajun Fang of MIT’s Computer Science and Artificial Intelligence Laboratory, and Alex Gorodetsky of Sandia National Laboratories.

Pushing boundaries

Karaman was initially motivated by a new, extreme robo-sport: competitive drone racing, in which remote-controlled drones, driven by human players, attempt to out-fly each other through an intricate maze of windows, doors, and other obstacles. Karaman wondered: Could an autonomous drone be trained to fly just as fast, if not faster, than these human-controlled vehicles, with even better precision and control?

“In the next two or three years, we want to enter a drone racing competition with an autonomous drone, and beat the best human player,” Karaman says. To do so, the team would have to develop an entirely new training regimen.

Currently, training autonomous drones is a physical task: Researchers fly drones in large, enclosed testing grounds, in which they often hang large nets to catch any careening vehicles. They also set up props, such as windows and doors, through which a drone can learn to fly. When vehicles crash, they must be repaired or replaced, which delays development and adds to a project’s cost.

Karaman says testing drones in this way can work for vehicles that are not meant to fly fast, such as drones that are programmed to slowly map their surroundings. But for fast-flying vehicles that need to process visual information quickly as they fly through an environment, a new training system is necessary.

“The moment you want to do high-throughput computing and go fast, even the slightest changes you make to its environment will cause the drone to crash,” Karaman says. “You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”

Flight Goggles

The team’s new virtual training system comprises a motion capture system, an image rendering program, and electronics that enable the team to quickly process images and transmit them to the drone.

The actual test space — a hangar-like gymnasium in MIT’s new drone-testing facility in Building 31 — is lined with motion-capture cameras that track the orientation of the drone as it’s flying.

With the image-rendering system, Karaman and his colleagues can draw up photorealistic scenes, such as a loft apartment or a living room, and beam these virtual images to the drone as it’s flying through the empty facility.    

“The drone will be flying in an empty room, but will be ‘hallucinating’ a completely different environment, and will learn in that environment,” Karaman explains.

The virtual images can be processed by the drone at a rate of about 90 frames per second — around three times as fast as the human eye can see and process images. To enable this, the team custom-built circuit boards that integrate a powerful embedded supercomputer, along with an inertial measurement unit and a camera. They fit all this hardware into a small, 3-D-printed nylon and carbon-fiber-reinforced drone frame. 
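The pipeline described above — read the drone's pose from motion capture, render the virtual scene from that viewpoint, and transmit the frame within the 90 Hz budget — can be sketched as a simple loop. This is a hypothetical illustration, not the team's actual software; the function names and stubs are assumptions.

```python
import time

FRAME_RATE_HZ = 90                    # virtual-image rate reported by the team
FRAME_BUDGET_S = 1.0 / FRAME_RATE_HZ  # roughly 11 ms to render and send each frame

def render_loop(get_pose, render_scene, transmit, n_frames):
    """Hypothetical Flight-Goggles-style loop: track the drone with motion
    capture, render the photorealistic scene from its viewpoint, and beam
    the image back to the vehicle before the next frame is due."""
    for _ in range(n_frames):
        start = time.monotonic()
        pose = get_pose()            # position + orientation from the mocap cameras
        image = render_scene(pose)   # virtual frame from the drone's perspective
        transmit(image)              # send to the drone's onboard computer
        elapsed = time.monotonic() - start
        if elapsed < FRAME_BUDGET_S:
            time.sleep(FRAME_BUDGET_S - elapsed)  # hold the 90 Hz cadence
```

At 90 frames per second, the whole round trip — pose capture, rendering, and transmission — has to fit in about 11 milliseconds, which is why the team built custom electronics for the image processing.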

A crash course

The researchers carried out a set of experiments, including one in which the drone learned to fly through a virtual window about twice its size. The window was set within a virtual living room. As the drone flew in the actual, empty testing facility, the researchers beamed images of the living room scene, from the drone’s perspective, back to the vehicle. As the drone flew through this virtual room, the researchers tuned a navigation algorithm, enabling the drone to learn on the fly.

Over 10 flights, the drone, flying at around 2.3 meters per second (5 miles per hour), successfully flew through the virtual window 361 times, only “crashing” into the window three times, according to positioning information provided by the facility’s motion-capture cameras. Karaman points out that, even if the drone crashed thousands of times, it wouldn’t make much of an impact on the cost or time of development, as it’s crashing in a virtual environment and not making any physical contact with the real world.
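Counting passes versus "crashes" from mocap data amounts to checking where each trajectory crosses the virtual window's plane. A minimal post-hoc check might look like the following sketch — this is not the team's code, and it assumes the window plane is perpendicular to the x-axis.

```python
def passed_through_window(trajectory, window_x, y_range, z_range):
    """Given a list of (x, y, z) mocap positions, decide whether the drone
    crossed the window plane (x = window_x) inside the opening ('pass'),
    hit the virtual frame ('crash'), or never reached the plane."""
    for (x0, y0, z0), (x1, y1, z1) in zip(trajectory, trajectory[1:]):
        # A sign change in (x - window_x) between samples means the plane was crossed.
        if (x0 - window_x) * (x1 - window_x) <= 0 and x0 != x1:
            t = (window_x - x0) / (x1 - x0)   # interpolate the crossing point
            y = y0 + t * (y1 - y0)
            z = z0 + t * (z1 - z0)
            inside = y_range[0] <= y <= y_range[1] and z_range[0] <= z <= z_range[1]
            return "pass" if inside else "crash"
    return "no_crossing"
```

Because the window exists only in the rendered scene, a "crash" here is just a bookkeeping entry rather than a repair job.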

In a final test, the team set up an actual window in the test facility, and turned on the drone’s onboard camera to enable it to see and process its actual surroundings. Using the navigation algorithm that the researchers tuned in the virtual system, the drone, over eight flights, was able to fly through the real window 119 times, only crashing or requiring human intervention six times.

“It does the same thing in reality,” Karaman says. “It’s something we programmed it to do in the virtual environment, by making mistakes, falling apart, and learning. But we didn’t break any actual windows in this process.”

He says the virtual training system is highly malleable. For instance, researchers can pipe in their own scenes or layouts in which to train drones, including detailed, drone-mapped replicas of actual buildings — something the team is considering doing with MIT’s Stata Center. The training system may also be used to test out new sensors, or specifications for existing sensors, to see how they might handle on a fast-flying drone.

“We could try different specs in this virtual environment and say, ‘If you build a sensor with these specs, how would it help a drone in this environment?’” Karaman says.

The system can also be used to train drones to fly safely around humans. For instance, Karaman envisions splitting the actual test facility in two, with a drone flying in one half, while a human, wearing a motion-capture suit, walks in the other half. The drone would “see” the human in virtual reality as it flies around its own space. If it crashes into the person, the result is virtual, and harmless.

“One day, when you’re really confident, you can do it in reality, and have a drone flying around a person as they’re running, in a safe way,” Karaman says. “There are a lot of mind-bending experiments you can do in this whole virtual reality thing. Over time, we will showcase all the things you can do.”

This research was supported, in part, by the U.S. Office of Naval Research, MIT Lincoln Laboratory, and the NVIDIA Corporation.

Albatross robot takes flight

An albatross glider, designed by MIT engineers, skims the Charles River.
Photo: Gabriel Bousquet

By Jennifer Chu

MIT engineers have designed a robotic glider that can skim along the water’s surface, riding the wind like an albatross while also surfing the waves like a sailboat.

In regions of high wind, the robot is designed to stay aloft, much like its avian counterpart. Where there are calmer winds, the robot can dip a keel into the water to ride like a highly efficient sailboat instead.

The robotic system, which borrows from both nautical and biological designs, can cover a given distance using one-third as much wind as an albatross and traveling 10 times faster than a typical sailboat. The glider is also relatively lightweight, weighing about 6 pounds. The researchers hope that in the near future, such compact, speedy robotic water-skimmers may be deployed in teams to survey large swaths of the ocean.

“The oceans remain vastly undermonitored,” says Gabriel Bousquet, a former postdoc in MIT’s Department of Aeronautics and Astronautics, who led the design of the robot as part of his graduate thesis. “In particular, it’s very important to understand the Southern Ocean and how it is interacting with climate change. But it’s very hard to get there. We can now use the energy from the environment in an efficient way to do this long-distance travel, with a system that remains small-scale.”

Bousquet will present details of the robotic system this week at IEEE’s International Conference on Robotics and Automation, in Brisbane, Australia. His collaborators on the project are Jean-Jacques Slotine, professor of mechanical engineering and information sciences and of brain sciences; and Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering.

The physics of speed

Last year, Bousquet, Slotine, and Triantafyllou published a study on the dynamics of albatross flight, in which they identified the mechanics that enable the tireless traveler to cover vast distances while expending minimal energy. The key to the bird’s marathon voyages is its ability to ride in and out of high- and low-speed layers of air.

Specifically, the researchers found the bird is able to perform a mechanical process called a “transfer of momentum,” in which it takes momentum from higher, faster layers of air, and by diving down transfers that momentum to lower, slower layers, propelling itself without having to continuously flap its wings.

Interestingly, Bousquet observed that the physics of albatross flight is very similar to that of sailboat travel. Both the albatross and the sailboat transfer momentum in order to keep moving. But in the case of the sailboat, that transfer occurs not between layers of air, but between the air and water.

“Sailboats take momentum from the wind with their sail, and inject it into the water by pushing back with their keel,” Bousquet explains. “That’s how energy is extracted for sailboats.”

Bousquet also realized that the speed at which both an albatross and a sailboat can travel depends upon the same general equation, related to the transfer of momentum. Essentially, both the bird and the boat can travel faster if they can either stay aloft easily or interact with two layers, or mediums, of very different speeds.

The albatross does well with the former, as its wings provide natural lift, though it flies between air layers with a relatively small difference in windspeeds. Meanwhile, the sailboat excels at the latter, traveling between two mediums of very different speeds — air versus water — though its hull creates a lot of friction and prevents it from getting much speed.  Bousquet wondered: What if a vehicle could be designed to perform well in both metrics, marrying the high-speed qualities of both the albatross and the sailboat?

“We thought, how could we take the best from both worlds?” Bousquet says.

Out on the water

The team drafted a design for such a hybrid vehicle, which ultimately resembled an autonomous glider with a 3-meter wingspan, similar to that of a typical albatross. They added a tall, triangular sail, as well as a slender, wing-like keel. They then performed some mathematical modeling to predict how such a design would travel.

According to their calculations, the wind-powered vehicle would only need relatively calm winds of about 5 knots to zip across waters at a velocity of about 20 knots, or 23 miles per hour.

“We found that in light winds you can travel about three to 10 times faster than a traditional sailboat, and you need about half as much wind as an albatross, to reach 20 knots,” Bousquet says. “It’s very efficient, and you can travel very fast, even if there is not too much wind.”
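The figures above are easy to check with a couple of lines. This is just a unit-conversion sketch using the standard knot-to-mph factor, not anything from the team's model:

```python
KNOT_TO_MPH = 1.15078  # 1 knot = 1 nautical mile per hour ≈ 1.15 statute mph

def knots_to_mph(knots):
    """Convert a speed in knots to statute miles per hour."""
    return knots * KNOT_TO_MPH

wind_kn, craft_kn = 5.0, 20.0     # winds and predicted craft speed from the calculations
speed_ratio = craft_kn / wind_kn  # the craft travels about 4x the wind speed
```

Twenty knots works out to roughly 23 miles per hour, consistent with the numbers the researchers report.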

The team built a prototype of their design, using a glider airframe designed by Mark Drela, professor of aeronautics and astronautics at MIT. To the bottom of the glider they added a keel, along with various instruments, such as GPS, inertial measurement sensors, auto-pilot instrumentation, and ultrasound, to track the height of the glider above the water.

“The goal here was to show we can control very precisely how high we are above the water, and that we can have the robot fly above the water, then down to where the keel can go under the water to generate a force, and the plane can still fly,” Bousquet says.

The researchers decided to test this “critical maneuver” — the act of transitioning between flying in the air and dipping the keel down to sail in the water. Accomplishing this move doesn’t necessarily require a sail, so Bousquet and his colleagues decided not to include one in order to simplify preliminary experiments.

In the fall of 2016, the team put its design to the test, launching the robot from the MIT Sailing Pavilion out onto the Charles River. As the robot lacked a sail and any mechanism to get it started, the team hung it from a fishing rod attached to a whaler boat. With this setup, the boat towed the robot along the river until it reached about 20 miles per hour, at which point the robot autonomously “took off,” riding the wind on its own.

Once it was flying autonomously, Bousquet used a remote control to give the robot a “down” command, prompting it to dip low enough to submerge its keel in the river. Next, he adjusted the direction of the keel, and observed that the robot was able to steer away from the boat as expected. He then gave a command for the robot to fly back up, lifting the keel out of the water.

“We were flying very close to the surface, and there was very little margin for error — everything had to be in place,” Bousquet says. “So it was very high stress, but very exciting.”

The experiments, he says, prove that the team’s conceptual device can travel successfully, powered by the wind and the water. Eventually, he envisions fleets of such vehicles autonomously and efficiently monitoring large expanses of the ocean.

“Imagine you could fly like an albatross when it’s really windy, and then when there’s not enough wind, the keel allows you to sail like a sailboat,” Bousquet says. “This dramatically expands the kinds of regions where you can go.”

This research was supported, in part, by the Link Ocean Instrumentation fellowship.

#260: Hyundai’s Exoskeletons, with Sangin Park

In this interview, Audrow Nash speaks with Sangin Park, Senior Research Engineer at Hyundai, about exoskeletons. Park describes three exoskeleton prototypes: one for helping workers reduce back pain, one for assisting a person with paraplegia, and an exoskeleton for soldiers. Park discusses the sensors and actuators of each exoskeleton, as well as Hyundai’s exoskeleton ambitions.

Sangin Park

Sangin Park is a Senior Research Engineer at Hyundai.

Robot transitions from soft to rigid

As the vacuum is applied to the flexible material, it becomes stiff and able to support the weight of the drone. Credit: Yashraj Narang

By Leah Burrows

Even octopuses understand the importance of elbows. When these squishy, loose-limbed cephalopods need to make a precise movement — such as guiding food into their mouth — the muscles in their tentacles contract to create a temporary revolute joint. These joints limit the wobbliness of the arm, enabling more controlled movements.

Now, researchers from the Wyss Institute for Biologically Inspired Engineering at Harvard University and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have shown how a multi-layered structure can allow robots to mimic the octopus’ kinematics, creating and eliminating joints on command. The structure can also allow robots to rapidly change their stiffness, damping, and dynamics.

The research is published in two papers in Advanced Functional Materials and IEEE Robotics and Automation Letters.

“This research helps bridge the gap between soft robotics and traditional rigid robotics,” said Yashraj Narang, first author of both studies and graduate student at SEAS. “We believe that this class of technology may foster a new generation of machines and structures that cannot simply be classified as soft or rigid.”

When a vacuum is applied, the layers of flexible material become stiff, can hold arbitrary shapes, and can be molded into additional forms. Credit: Yashraj Narang/Harvard SEAS

The structure is surprisingly simple, consisting of multiple layers of flexible material wrapped in a plastic envelope and connected to a vacuum source. When the vacuum is off, the structure behaves exactly as you would expect, bending, twisting and flopping without holding shape. But when a vacuum is applied, it becomes stiff and can hold arbitrary shapes, and it can be molded into additional forms.

This transition is the result of a phenomenon called laminar jamming, in which the application of pressure creates friction that strongly couples a group of flexible materials.

“The frictional forces generated by the pressure act like glue,” said Narang. “We can control the stiffness, damping, kinematics, and dynamics of the structure by changing the number of layers, tuning the pressure applied to it, and adjusting the spacing between multiple stacks of layers.”
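A back-of-the-envelope beam-theory estimate (not the authors' model) shows why coupling the layers matters so much: n unjammed layers slide past each other and bend independently, so the stack's bending stiffness scales as n·t³ for layer thickness t, while a fully jammed stack bends as one solid beam of thickness n·t, scaling as (n·t)³ — a factor-of-n² gain.

```python
def bending_stiffness_ratio(n_layers):
    """Rough beam-theory estimate of the stiffness gain from jamming.
    Unjammed: layers slide freely, so stiffness ~ n * t^3 (layers add linearly).
    Jammed:   friction couples the layers into one beam, so stiffness ~ (n * t)^3.
    Assumes identical layers and perfect frictional coupling."""
    t = 1.0                              # layer thickness (arbitrary units)
    ei_unjammed = n_layers * t**3
    ei_jammed = (n_layers * t) ** 3
    return ei_jammed / ei_unjammed       # simplifies to n_layers**2
```

Under these idealized assumptions, a ten-layer stack would stiffen by a factor of about 100 when the vacuum is switched on, which is why a simple envelope-and-pump design can swing between floppy and load-bearing states.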

The research team, which included Wyss Associate Faculty member Robert Howe, Ph.D., the Abbott and James Lawrence Professor of Engineering at SEAS; Joost Vlassak, Ph.D., the Abbott and James Lawrence Professor of Materials Engineering at SEAS; and Alperen Degirmenci, a SEAS graduate student, extensively modeled the mechanical behavior of laminar jamming to better control its capabilities.

Next, they built real-world devices using the structures, including a two-fingered gripper that, without a vacuum, could wrap around and hold onto large objects and, with a vacuum, could pinch and hold onto small objects about the size of a marble.

The researchers also demonstrated the structure’s capabilities as shock absorbers by attaching them to a drone as a landing gear. The team tuned the stiffness and damping of the structures to absorb the impact of landing.

The structure is a proof-of-concept that could have many applications in the future, from surgical robots to wearable devices and flexible speakers.

“Our work has explained the phenomenon of laminar jamming and shown how it can provide robots with highly versatile mechanical behavior,” said Howe, who is the senior author of the paper. “We believe that this technology will eventually lead to robots that can change state between soft, continuous devices that can safely interact with humans, and rigid, discrete devices that can meet the demands of industrial automation.”

This research was supported in part by the National Science Foundation.
