Archive 04.02.2021

Page 53 of 58

Wielding a laser beam deep inside the body

A laser projected on a white surface
The laser steering device is able to trace complex trajectories such as an exposed wire as well as a word within geometrical shapes. Credit: Wyss Institute at Harvard University

A microrobotic opto-electro-mechanical device able to steer a laser beam with high speed and a large range of motion could enhance the possibilities of minimally invasive surgeries

By Benjamin Boettner

Minimally invasive surgeries, in which surgeons gain access to internal tissues through natural orifices or small external incisions, are common practice in medicine. They are performed for problems as diverse as delivering stents through catheters, treating abdominal complications, and performing transnasal operations at the skull base in patients with neurological conditions.

The ends of devices for such surgeries are highly flexible (or “articulated”) to enable visualization and precise manipulation of the surgical site in the target tissue. In the case of energy-delivering devices that allow surgeons to cut or dry (desiccate) tissue, and stop internal bleeding (coagulate) deep inside the body, a heat-generating energy source is added to the end of the device. However, presently available energy sources, such as radio frequency currents delivered via a fiber or electrode, have to be brought close to the target site, which limits surgical precision and can cause unwanted burns in adjacent tissue and smoke formation.

Laser technology, which is already widely used in a number of external surgeries, such as those performed on the eye or skin, would be an attractive solution. For internal surgeries, the laser beam needs to be precisely steered, positioned, and quickly repositioned at the distal end of an endoscope, which cannot be accomplished with currently available, relatively bulky technology.


Responding to an unmet need for a robotic surgical device flexible enough to access hard-to-reach areas of the GI tract while causing minimal peripheral tissue damage, researchers at the Wyss Institute and Harvard SEAS have developed a laser-steering device that has the potential to improve surgical outcomes for patients. Credit: Wyss Institute at Harvard University

Now, robotic engineers led by Wyss Associate Faculty member Robert Wood, Ph.D., and postdoctoral fellow Peter York, Ph.D., at Harvard University’s Wyss Institute for Biologically Inspired Engineering and John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a laser-steering microrobot in a miniaturized 6×16 millimeter package that operates with high speed and precision, and can be integrated with existing endoscopic tools. Their approach, reported in Science Robotics, could help significantly enhance the capabilities of numerous minimally invasive surgeries.

A prototype of the laser steering device creating a star trajectory
This collage shows a prototype of the laser steering device creating a star trajectory at 5000 mm/s. Credit: Wyss Institute at Harvard University

In this multi-disciplinary approach, we managed to harness our ability to rapidly prototype complex microrobotic mechanisms…provide clinicians with a non-disruptive solution that could allow them to advance the possibilities of minimally invasive surgeries in the human body with life-altering or potentially life-saving impact.

Robert Wood

“To enable minimally invasive laser surgery inside the body, we devised a microrobotic approach that allows us to precisely direct a laser beam at small target sites in complex patterns within an anatomical area of interest,” said York, the first and corresponding author on the study and a postdoctoral fellow on Wood’s microrobotics team. “With its large range of articulation, minimal footprint, and fast and precise action, this laser-steering end-effector has great potential to enhance surgical capabilities simply by being added to existing endoscopic devices in a plug-and-play fashion.”

The team needed to overcome the basic challenges in design, actuation, and microfabrication of the optical steering mechanism that enables tight control over the laser beam after it has exited from an optical fiber. These challenges, along with the need for speed and precision, were exacerbated by the size constraints – the entire mechanism had to be housed in a cylindrical structure with roughly the diameter of a drinking straw to be useful for endoscopic procedures.

“We found that for steering and re-directing the laser beam, a configuration of three small mirrors that can rapidly rotate with respect to one another in a small ‘galvanometer’ design provided a sweet spot for our miniaturization effort,” said second author Rut Peña, a mechanical engineer with micro-manufacturing expertise in Wood’s group. “To get there, we leveraged methods from our microfabrication arsenal in which modular components are laminated step-wise onto a superstructure on the millimeter scale – a highly effective fabrication process when it comes to iterating on designs quickly in search of an optimum, and delivering a robust strategy for mass-manufacturing a successful product.”
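The leverage a small rotating mirror provides can be illustrated with basic optics. The sketch below is ours, not the authors’ model: by the law of reflection, rotating a mirror by an angle θ deflects the reflected beam by 2θ, so tiny, fast mirror rotations translate into large, fast sweeps of the laser spot at the target. The working distance chosen here is an arbitrary illustrative value.

```python
import math

# Illustrative sketch only, not the published device model. A mirror rotated by
# theta deflects the reflected beam by 2*theta (law of reflection), so the spot
# displacement at the target grows with the working distance.

def beam_displacement(mirror_rotation_rad: float, working_distance_m: float) -> float:
    """Lateral displacement of the laser spot for a single rotating mirror."""
    return working_distance_m * math.tan(2.0 * mirror_rotation_rad)

# A 1-degree mirror rotation at a 20 mm working distance moves the spot ~0.7 mm.
dx = beam_displacement(math.radians(1.0), 0.020)
```

This angle-doubling is why galvanometer-style mirror assemblies can trace fine trajectories far faster than physically repositioning the whole instrument tip.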

An endoscope with laser as end-effector
The microrobotic laser-steering end-effector (on the right) can be used as a fitted add-on accessory for existing endoscopic systems (on the left) for use in minimally invasive surgery. Credit: Wyss Institute at Harvard University

The team demonstrated that their laser-steering end-effector, miniaturized to a cylinder measuring merely 6 mm in diameter and 16 mm in length, was able to map out and follow complex trajectories in which multiple laser ablations could be performed with high speed, over a large range, and be repeated with high accuracy.

To further show that the device, when attached to the end of a common colonoscope, could be applied to a life-like endoscopic task, York and Peña, advised by Wyss Clinical Fellow Daniel Kent, M.D., successfully simulated the resection of polyps by navigating their device via tele-operation in a benchtop phantom tissue made of rubber. Kent also is a resident physician in general surgery at the Beth Israel Deaconess Medical Center.

“In this multi-disciplinary approach, we managed to harness our ability to rapidly prototype complex microrobotic mechanisms that we have developed over the past decade to provide clinicians with a non-disruptive solution that could allow them to advance the possibilities of minimally invasive surgeries in the human body with life-altering or potentially life-saving impact,” said senior author Wood, Ph.D., who also is the Charles River Professor of Engineering and Applied Sciences at SEAS.

Laser inside a colon
The laser steering device performing a colonoscope demo in a life-size model of the colon. Credit: Wyss Institute at Harvard University

Wood’s microrobotics team together with technology translation experts at the Wyss Institute have patented their approach and are now further de-risking their medical technology (MedTech) as an add-on for surgical endoscopes.

“The Wyss Institute’s focus on microrobotic devices and this new laser-steering device developed by Robert Wood’s team working across disciplines with clinicians and experts in translation will hopefully revolutionize how minimally invasive surgical procedures are carried out in a number of disease areas,” said Wyss Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children’s Hospital, and Professor of Bioengineering at SEAS.

The study was funded by the National Science Foundation under award #CMMI-1830291, and the Wyss Institute for Biologically Inspired Engineering.

Artificial skin brings robots closer to ‘touching’ human lives

Modern-day robots are often required to interact with humans intelligently and efficiently, which can be enabled by providing them the ability to perceive touch. However, previous attempts at mimicking human skin have involved bulky and complex electronics, wiring, and a risk of damage. In a recent study, researchers from Japan sidestep these difficulties by constructing a 3-D vision-guided artificial skin that enables tactile sensing with high performance, opening doors to innumerable applications in medicine, healthcare, and industry.

A new bio-inspired joint model to design robotic exoskeletons

Recent advances in the field of robotics have enabled the fabrication of increasingly sophisticated robotic limbs and exoskeletons. Robotic exoskeletons are essentially wearable 'shells' made of different robotic parts. Exoskeletons can improve the strength, capabilities and stability of users, helping them to tackle heavy physical tasks with less effort or aiding their rehabilitation after accidents.

We’re teaching robots to evolve autonomously so they can adapt to life alone on distant planets 

It's been suggested that an advance party of robots will be needed if humans are ever to settle on other planets. Sent ahead to create conditions favorable for humankind, these robots will need to be tough, adaptable and recyclable if they're to survive within the inhospitable cosmic climates that await them.

Women in Robotics Update: Andra Keay, Nguyen Sao Mai and Selin Alara Örnek

Here’s a Women in Robotics Spotlight, where we share stories from women who are working on all sorts of interesting projects who haven’t yet been featured in our Annual Showcase. We hope these stories provide inspiration to everyone to join us working in the field of robotics. And if you’re a woman working in robotics, why not contribute your story too!

“I love robots however I do find it frustrating when the code that was working the day before doesn’t work. I also find it hard supplying my robots with power. I learn online although I do have a few mentors that help me but it’s really not easy learning on my own. My favourite thing about robotics is making them, and when they work like they should. My robots make people really happy so I love that. I also love succeeding – the feeling when my robots come to life is unbelievable.” says Selin Alara Örnek a high school student who has built five robots, including a robot guide dog for the blind.

Andra Keay – Managing Director at Silicon Valley Robotics, Visiting Scholar at CITRIS People and Robots Lab, and Founder at Women in Robotics

Why robots?

“My background is Human-Robot Interaction, Design, and Communications Technology – which seems a long way from robotics, but our mass communication technologies (including the internet) were the most powerful and creative technologies of the 20th century.

I’ve always been interested in robots, firstly as a philosophical thing, then as interesting robots became possible, fascinated by the way in which this latest evolution of technology is spreading into society.

The sheer scope of the technology is my favorite thing, but historically, the incredible homogeneity or lack of diversity in robotics is my least favorite thing. Fortunately, we’re changing that!

We have technology that can solve the world’s greatest challenges, if we can continue to find ways to market and avoid frittering away our advances on novelty devices, games or advertising.”

What suggestions do you have for other Women in Robotics?

“Every time I was trusted to lead a project I grew a lot and gained confidence. Until then, I hadn’t even realized that I was lacking in confidence. At the time, it just seemed expected of young women to follow other people, not go my own way.

I’d like to call out people on being too self-deprecating, putting themselves down, or apologizing for themselves. If I had a dollar for every time a woman in robotics has said “but I’m not really a…,” then I’d be able to fund a great robotics company! I also call on people to stop blaming women for not speaking up or ‘leaning in’ when no one in industry is listening to them!”

Nguyen Sao Mai – Assistant Professor at ENSTA-Paris, IP-Paris, also affiliated with IMT Atlantique

Why robots?

Nguyen Sao Mai has enabled a robot to coach physical rehabilitation in the project RoKInter and the experiment KERAAL, which she coordinated, funded by the European Union through the FP7 project ECHORD++. She has participated in the project AMUSAAL, for analysing human activities of daily living through cameras, and CPER VITAAL, for developing assistive technologies for the elderly and disabled. She has developed machine learning algorithms combining reinforcement learning and active imitation learning for interactive and multi-task learning. She is currently an associate editor of the journal IEEE TCDS and co-chair of the task force “Action and Perception” of the IEEE Technical Committee on Cognitive and Developmental Systems.

“Cognitive Developmental Robotics is a wonderful field that allows us to build new assistive robots that can evolve in interaction with humans and adapt to the needs of their users through continual learning. It is also an amazing tool to understand and model biological cognition and learning processes.

Robotics is unleashing its potential little by little as a tool to address humankind’s challenges, such as medical advances, environmental issues, or social assistance for the elderly and the disabled. This year’s situation has, for instance, shown the usefulness of robotics in medical environments and nursing homes.”

What suggestions do you have for other Women in Robotics?

“I would be happy to mentor or build a network to support younger women in robotics.”

Selin Alara Örnek – High School Student and Inventor

Why robots?

“I am a 14-year-old high school student. I have been coding since I was 8 and building robots since I was 10. I have built five robots so far. One is a guide dog for blind people; I have built two versions, ic4u and ic4u2. I have another robot, BB4All, which is a school aid robot to help students, teachers and staff; its main aim is to prevent bullying. I have also built an Android robot and a Star Wars droid, D-O, as I love both of them.

When I was 9 we lost our family dog. I was really upset, as I don’t have any brothers or sisters and he was like my brother. I wanted to bring him back to life; when I was little I had dreamed of bringing one of my soft toys to life. Whilst on holiday with my family I saw a guide dog with its blind owner. I love dogs and was really happy to see a dog helping in such a way. But then I remembered how sad I was, and I thought that if the blind person’s dog were to die, they would not only lose their best friend but their eyes again too. So I decided to build my robot guide dog, ic4u.

I am currently rebuilding BB4All, as the first one I built was not very strong and was made of cardboard; now I am printing the pieces with a 3D printer and adding some more features. In my robots, I use image recognition, object detection, face recognition, voice commands, Dialogflow, omnidirectional movement, the Google Maps API, Google Assistant integration, and various sensors.

I believe that robots will be part of our everyday life and that we will need them more and more. I love robots, so that makes me really happy. My dream is to build a humanoid in the future and send it to space so that it can do research on black holes.”

What suggestions do you have for other Women in Robotics?

“I used to love playing games like Minecraft. Then in English class my teacher started making games for us to play whilst learning, and I really wanted to do the same. I asked him how he did it, and he told me to have a look at MIT Scratch and encouraged me to code my own games. He told me I could do it and that I should try. That is how I started to learn how to code, and I am very happy to have a teacher like him.

A lot of girls are not very interested in coding and robotics and find what I do very boring. I spend a lot of time taking my robots to events to talk about them and show how coding and robotics can be fun, and also how technology can be used for good. I recently did a TEDx talk and have given presentations at plenty of local and international events. A lot of other kids get in touch with me to ask questions, which I also like to answer. I especially love it when little girls get in touch, as it makes me happy to see them so interested and excited. I also try to send messages to parents in my presentations and interviews, pointing out that they should respect their children’s choices and that they need to give equal opportunities to both their daughters and sons.”

We encourage #womeninrobotics and women who’d like to work in robotics to join our professional network at http://womeninrobotics.org

Quest Industrial – Robotic Packaging Solutions

Quest is a leading manufacturer of industrial automation equipment focusing on robotics and vision guidance. With expertise in the food, beverage, and dairy industries, Quest optimizes floor space for customers experiencing growing demand and helps improve their overall production line flexibility and efficiency. Quest offers application-specific software on its robotic products, including pick-and-place, case packing, and palletizing systems, to simplify system setup and streamline configurability. Quest is a product brand of ProMach, a global leader in packaging line solutions.

Brenton – Case Packing and Palletizing Using Both Robotics and Other Automation

Brenton is a leader in robotic solutions, with standard product offerings including palletizing and depalletizing systems for efficiently handling hard-to-handle products, larger-scale systems with an optimized footprint, and case and carton loading solutions for numerous industries. In addition, Brenton works with customers on integrated solutions, offering a broad spectrum of robotics to fully integrate end-of-line systems. Brenton helps packaging customers protect their reputation and grow the trust of their consumers.

Artificial intelligence must not be allowed to replace the imperfection of human empathy

At the heart of the development of AI appears to be a search for perfection. And it could be just as dangerous to humanity as the one that came from philosophical and pseudoscientific ideas of the 19th and early 20th centuries and led to the horrors of colonialism, world war and the Holocaust. Instead of a human ruling "master race", we could end up with a machine one.

Robotic swarm swims like a school of fish

A Bluebot
Bluebots are fish-shaped robots that can coordinate their movements in three dimensions underwater, rather than the two dimensions previously achieved by Kilobots. Credit: Harvard SEAS

By Leah Burrows / SEAS Communications

Schools of fish exhibit complex, synchronized behaviors that help them find food, migrate, and evade predators. No one fish or sub-group of fish coordinates these movements, nor do fish communicate with each other about what to do next. Rather, these collective behaviors emerge from so-called implicit coordination — individual fish making decisions based on what they see their neighbors doing.

This type of decentralized, autonomous self-organization and coordination has long fascinated scientists, especially in the field of robotics.

Now, a team of researchers at Harvard’s Wyss Institute and John A. Paulson School of Engineering and Applied Sciences (SEAS) has developed fish-inspired robots that can synchronize their movements like a real school of fish, without any external control. It is the first time researchers have demonstrated complex 3D collective behaviors with implicit coordination in underwater robots.

“Robots are often deployed in areas that are inaccessible or dangerous to humans, areas where human intervention might not even be possible,” said Florian Berlinger, a Ph.D. Candidate at the Wyss Institute and SEAS and first author of the paper. “In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient. By using implicit rules and 3D visual perception, we were able to create a system that has a high degree of autonomy and flexibility underwater where things like GPS and WiFi are not accessible.”

The research is published in Science Robotics.

The fish-inspired robotic swarm, dubbed Blueswarm, was created in the lab of Wyss Associate Faculty member Radhika Nagpal, Ph.D., who is also the Fred Kavli Professor of Computer Science at SEAS. Nagpal’s lab is a pioneer in self-organizing systems, from their 1,000 robot Kilobot swarm to their termite-inspired robotic construction crew.

However, most previous robotic swarms operated in two-dimensional space. Three-dimensional spaces, like air and water, pose significant challenges to sensing and locomotion.

To overcome these challenges, the researchers developed a vision-based coordination system in their fish robots based on blue LED lights. Each underwater robot, called a Bluebot, is equipped with two cameras and three LED lights. The on-board, fisheye-lens cameras detect the LEDs of neighboring Bluebots and use a custom algorithm to determine their distance, direction and heading. Based on the simple production and detection of LED light, the researchers demonstrated that the Blueswarm could exhibit complex self-organized behaviors, including aggregation, dispersion, and circle formation.
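One way such a camera-and-LED setup can yield range information is from the angle a neighbor’s LED pair subtends in the image. The sketch below is an illustration of that general principle, not the paper’s actual algorithm; the LED baseline value is a hypothetical spacing chosen for the example.

```python
import math

# Illustrative sketch only: estimating the distance to a neighbor robot from
# the angular separation of two of its LEDs, assuming a known LED spacing.

LED_BASELINE_M = 0.086  # hypothetical spacing between two LEDs on a neighbor (meters)

def estimate_range(angular_separation_rad: float) -> float:
    """Distance to a neighbor from the angle subtended by its LED pair."""
    # Triangulation from a known baseline: the pair subtends a smaller angle
    # the farther away the neighbor is.
    return LED_BASELINE_M / (2.0 * math.tan(angular_separation_rad / 2.0))

# LEDs subtending 0.1 rad put the neighbor a bit under a meter away.
r = estimate_range(0.1)
```

The bearing to the neighbor comes directly from where the LEDs appear in the fisheye image, so distance plus image position together give the relative 3D position used by the coordination rules.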

A Blueswarm robot flashing the LEDs
These fish-inspired robots can synchronize their movements without any outside control. Based on the simple production and detection of LED light, the robotic collective exhibits complex self-organized behaviors, including aggregation, dispersion, and circle formation. Credit: Harvard University’s Self-organizing Systems Research Group

“Each Bluebot implicitly reacts to its neighbors’ positions,” said Berlinger. “So, if we want the robots to aggregate, then each Bluebot will calculate the position of each of its neighbors and move towards the center. If we want the robots to disperse, the Bluebots do the opposite. If we want them to swim as a school in a circle, they are programmed to follow lights directly in front of them in a clockwise direction.”
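The aggregation and dispersion rules Berlinger describes can be sketched in a few lines. This is a minimal illustration of the implicit-coordination idea, not the authors’ code: each robot computes the centroid of its detected neighbors and moves toward it (aggregate) or away from it (disperse).

```python
# Minimal sketch of implicit coordination: no robot is told where to go;
# each one reacts only to the positions of the neighbors it can see.

def centroid(neighbors):
    """Mean position of detected neighbors, each an (x, y, z) tuple."""
    n = len(neighbors)
    return tuple(sum(p[i] for p in neighbors) / n for i in range(3))

def step(position, neighbors, mode="aggregate", gain=0.1):
    """Return an updated position after one control step."""
    cx, cy, cz = centroid(neighbors)
    sign = 1.0 if mode == "aggregate" else -1.0  # disperse = move away
    return tuple(p + sign * gain * (c - p)
                 for p, c in zip(position, (cx, cy, cz)))

# One aggregation step pulls a robot at the origin toward two neighbors.
new_pos = step((0.0, 0.0, 0.0), [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)], "aggregate")
```

Because every robot applies the same local rule, the group-level behavior (a tight cluster, a spread-out swarm, a circling school) emerges without any central controller, mirroring how real fish schools coordinate.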

The researchers also simulated a simple search mission with a red light in the tank. Using the dispersion algorithm, the Bluebots spread out across the tank until one comes close enough to the light source to detect it. Once the robot detects the light, its LEDs begin to flash, which triggers the aggregation algorithm in the rest of the school. From there, all the Bluebots aggregate around the signaling robot.
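The search mission described above amounts to a simple per-robot behavior switch. The sketch below captures that two-phase logic as we read it from the description; the function and mode names are our own, not from the paper.

```python
# Hedged sketch of the simulated search mission: disperse until some robot
# detects the red light, then let its flashing LEDs trigger aggregation.

def choose_mode(sees_target: bool, sees_flashing_neighbor: bool) -> str:
    """Per-robot behavior selection for the search scenario."""
    if sees_target:
        return "flash_and_hold"   # signal the rest of the school
    if sees_flashing_neighbor:
        return "aggregate"        # converge on the signaling robot
    return "disperse"             # keep spreading out to search the tank
```

Note that the "signal" here is just more LED light, the same channel the robots already use for coordination, which is what keeps the whole system free of external communication like GPS or WiFi.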


Blueswarm, a Harvard Wyss- and SEAS-developed underwater robot collective, uses a 3D vision-based coordination system and 3D locomotion to coordinate the movements of its individual Bluebots autonomously, mimicking the behavior of schools of fish. Credit: Harvard SEAS

“Our results with Blueswarm represent a significant milestone in the investigation of underwater self-organized collective behaviors,” said Nagpal. “Insights from this research will help us develop future miniature underwater swarms that can perform environmental monitoring and search in visually rich but fragile environments like coral reefs. This research also paves the way to a better understanding of fish schools by synthetically recreating their behavior.”

The research was co-authored by Melvin Gauci, Ph.D., a former Wyss Technology Development Fellow. It was supported in part by the Office of Naval Research, the Wyss Institute for Biologically Inspired Engineering, and an Amazon AWS Research Award.
