Archive 28.03.2022


Handheld surgical robot can help stem fatal blood loss

Matt Johnson (right) and Laura Brattain (left) test a new medical device on an artificial model of human tissue and blood vessels. The device helps users to insert a needle and guidewire quickly and accurately into a vessel, a crucial first step to halting rapid blood loss. Photo: Nicole Fandel.

By Anne McGovern | MIT Lincoln Laboratory

After a traumatic accident, there is a small window of time when medical professionals can apply lifesaving treatment to victims with severe internal bleeding. Delivering this type of care is complex, and key interventions require inserting a needle and catheter into a central blood vessel, through which fluids, medications, or other aids can be given. First responders, such as ambulance emergency medical technicians, are not trained to perform this procedure, so treatment can only be given after the victim is transported to a hospital. In some instances, by the time the victim arrives to receive care, it may already be too late.

A team of researchers at MIT Lincoln Laboratory, led by Laura Brattain and Brian Telfer from the Human Health and Performance Systems Group, together with physicians from the Center for Ultrasound Research and Translation (CURT) at Massachusetts General Hospital, led by Anthony Samir, has developed a solution to this problem. The Artificial Intelligence–Guided Ultrasound Intervention Device (AI-GUIDE) is a handheld platform technology that could help personnel with simple training to quickly install a catheter into a common femoral vessel, enabling rapid treatment at the point of injury.

“Simplistically, it’s like a highly intelligent stud-finder married to a precision nail gun,” says Matt Johnson, a research team member from the laboratory’s Human Health and Performance Systems Group.

AI-GUIDE is a platform device made of custom-built algorithms and integrated robotics that could pair with most commercial portable ultrasound devices. To operate AI-GUIDE, a user first places it on the patient’s body, near where the thigh meets the abdomen. A simple targeting display guides the user to the correct location and then instructs them to pull a trigger, which precisely inserts the needle into the vessel. The device verifies that the needle has penetrated the blood vessel, and then prompts the user to advance an integrated guidewire, a thin wire inserted into the body to guide a larger instrument, such as a catheter, into a vessel. The user then manually advances a catheter. Once the catheter is securely in the blood vessel, the device withdraws the needle and the user can remove the device.
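The sequence above amounts to a fixed protocol with two hard checkpoints: the vessel must be centered before the trigger fires, and blood must be detected before the guidewire advances. A minimal sketch of that ordering in Python (the step names and checks are illustrative, not the device's actual control logic):

```python
from enum import Enum, auto

class Step(Enum):
    TARGETED = auto()          # vessel rectangle centered on the display
    NEEDLE_INSERTED = auto()   # user pulls the trigger
    BLOOD_CONFIRMED = auto()   # suction + optics verify vessel entry
    GUIDEWIRE_ADVANCED = auto()
    CATHETER_PLACED = auto()   # user manually advances the catheter
    NEEDLE_RETRACTED = auto()  # device withdraws the needle

def insertion_sequence(vessel_centered: bool, blood_detected: bool):
    """Return the steps completed, stopping at the first failed check."""
    done = []
    if not vessel_centered:
        return done                      # keep guiding the user instead
    done.append(Step.TARGETED)
    done.append(Step.NEEDLE_INSERTED)
    if not blood_detected:
        return done                      # e.g., the vein "tented" inward
    done += [Step.BLOOD_CONFIRMED, Step.GUIDEWIRE_ADVANCED,
             Step.CATHETER_PLACED, Step.NEEDLE_RETRACTED]
    return done
```

A successful run ends with the needle retracted and the catheter in place; a failed blood check stops the sequence before the guidewire ever moves.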

With the catheter safely inside the vessel, responders can then deliver fluid, medicine, or other interventions.

AI-GUIDE automates nearly every step of the process to locate and insert a needle, guidewire, and catheter into a blood vessel to facilitate lifesaving treatment. The version of the device shown here is optimized to locate the femoral blood vessels, which are in the upper thigh. Image courtesy of the researchers.

As easy as pressing a button

The Lincoln Laboratory team developed the AI in the device by leveraging technology used for real-time object detection in images.

“Using transfer learning, we trained the algorithms on a large dataset of ultrasound scans acquired by our clinical collaborators at MGH,” says Lars Gjesteby, a member of the laboratory’s research team. “The images contain key landmarks of the vascular anatomy, including the common femoral artery and vein.”

These algorithms interpret the visual data coming in from the ultrasound that is paired with AI-GUIDE and then indicate the correct blood vessel location to the user on the display.

“The beauty of the on-device display is that the user never needs to interpret, or even see, the ultrasound imagery,” says Mohit Joshi, the team member who designed the display. “They are simply directed to move the device until a rectangle, representing the target vessel, is in the center of the screen.”
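The guidance loop Joshi describes reduces to comparing the detected vessel's bounding-box center against the center of the screen. A toy version in Python, assuming box coordinates in screen pixels (the function name and tolerance are hypothetical, not taken from the actual device):

```python
def targeting_cue(box, screen_w, screen_h, tol=10):
    """Given a detected vessel bounding box (x, y, w, h) in pixels,
    return which way the target rectangle must move to reach the
    center of the screen, or "on target" when it is close enough."""
    x, y, w, h = box
    dx = (x + w / 2) - screen_w / 2   # positive: rectangle right of center
    dy = (y + h / 2) - screen_h / 2   # positive: rectangle below center
    cues = []
    if dx > tol:
        cues.append("left")
    elif dx < -tol:
        cues.append("right")
    if dy > tol:
        cues.append("up")
    elif dy < -tol:
        cues.append("down")
    return "move " + " and ".join(cues) if cues else "on target"
```

In the real device the rectangle comes from the detection algorithms running on the live ultrasound stream; here it is simply passed in.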

For the user, the device may seem as easy to use as pressing a button to advance a needle, but to ensure rapid and reliable success, a lot is happening behind the scenes. For example, when a patient has lost a large volume of blood and becomes hypotensive, veins that would typically be round and full of blood become flat. When the needle tip reaches the center of the vein, the wall of the vein is likely to “tent” inward, rather than being punctured by the needle. As a result, though the needle was advanced to the proper location, it fails to enter the vessel.

To ensure that the needle reliably punctures the vessel, the team engineered the device to be able to check its own work.

“When AI-GUIDE injects the needle toward the center of the vessel, it searches for the presence of blood by creating suction,” says Josh Werblin, the program’s mechanical engineer. “Optics in the device’s handle trigger when blood is present, indicating that the insertion was successful.” This technique is part of why AI-GUIDE has shown very high injection success rates, even in hypotensive scenarios where veins are likely to tent.

Lincoln Laboratory researchers and physicians from the Massachusetts General Hospital Center for Ultrasound Research and Translation collaborated to build the AI-GUIDE system. Photo courtesy of Massachusetts General Hospital.

Recently, the team published a paper in the journal Biosensors that reports on AI-GUIDE’s needle insertion success rates. Users with medical experience ranging from zero to greater than 15 years tested AI-GUIDE on an artificial model of human tissue and blood vessels and one expert user tested it on a series of live, sedated pigs. The team reported that after only two minutes of verbal training, all users of the device on the artificial human tissue were successful in placing a needle, with all but one completing the task in less than one minute. The expert user was also successful in quickly placing both the needle and the integrated guidewire and catheter in about a minute. The needle insertion speed and accuracy were comparable to that of experienced clinicians operating in hospital environments on human patients. 

Theodore Pierce, a radiologist and collaborator from MGH, says AI-GUIDE’s design, which makes it stable and easy to use, directly translates to low training requirements and effective performance. “AI-GUIDE has the potential to be faster, more precise, safer, and require less training than current manual image-guided needle placement procedures,” he says. “The modular design also permits easy adaptation to a variety of clinical scenarios beyond vascular access, including minimally invasive surgery, image-guided biopsy, and imaging-directed cancer therapy.”

In 2021, the team received an R&D 100 Award for AI-GUIDE, recognizing it among the year’s most innovative new technologies available for license or on the market. 

What’s next?

Right now, the team is continuing to test the device and to work on fully automating every step of its operation. In particular, they want to automate the guidewire and catheter insertion steps to further reduce the risk of user error and of infection.

“Retraction of the needle after catheter placement reduces the chance of an inadvertent needle injury, a serious complication in practice which can result in the transmission of diseases such as HIV and hepatitis,” says Pierce. “We hope that a reduction in manual manipulation of procedural components, resulting from complete needle, guidewire, and catheter integration, will reduce the risk of central line infection.”

AI-GUIDE was built and tested within Lincoln Laboratory’s new Virtual Integration Technology Lab (VITL), which was created to bring a medical-device prototyping capability to the laboratory.

“Our vision is to rapidly prototype intelligent medical devices that integrate AI, sensing — particularly portable ultrasound — and miniature robotics to address critical unmet needs for both military and civilian care,” says Laura Brattain, who is the AI-GUIDE project co-lead and also holds a visiting scientist position at MGH. “In working closely with our clinical collaborators, we aim to develop capabilities that can be quickly translated to the clinical setting. We expect that VITL’s role will continue to grow.”

AutonomUS, a startup company founded by AI-GUIDE’s MGH co-inventors, recently secured an option for the intellectual property rights for the device. AutonomUS is actively seeking investors and strategic partners.

“We see the AI-GUIDE platform technology becoming ubiquitous throughout the health-care system,” says Johnson, “enabling faster and more accurate treatment by users with a broad range of expertise, for both pre-hospital emergency interventions and routine image-guided procedures.”

This work was supported by the U.S. Army Combat Casualty Care Research Program and Joint Program Committee – 6. Nancy DeLosa, Forrest Kuhlmann, Jay Gupta, Brian Telfer, David Maurer, Wes Hill, Andres Chamorro, and Allison Cheng provided technical contributions, and Arinc Ozturk, Xiaohong Wang, and Qian Li provided guidance on clinical use.

Careers in robotics: What is a robotics PhD?

This relatively general post focuses on robotics-related PhD programs in the American educational system. Much of this will not apply to universities in other countries, or to other departments in American universities. This post will take you through the overall life cycle of a PhD and is intended as a basic overview for anyone unfamiliar with the process, whether they are considering a PhD or have a loved one who is currently in a PhD program and just want to learn more about what they are doing. 

The basics

A PhD (doctoral degree) in engineering or a DEng (Doctorate of Engineering) is the highest degree you can earn in engineering. It is generally a degree people earn only once, if at all. Unlike a bachelor’s or master’s degree, a PhD in a robotics-related topic should be free, and students should receive a modest stipend for their living expenses. There are very few stand-alone robotics PhD programs, so people generally join robotics labs through PhD programs in electrical engineering, computer science, or mechanical engineering.

Joining a lab

In some programs, students are matched with a lab when they are accepted to the university. This matching is not random: If a university works this way, a professor has to have a space in their lab, see the application, and decide that the student would be a good fit for their lab. Essentially, the professor “hires” the student to join their lab. 

Other programs accept cohorts of students who take courses in the first few years and pick professors to work with by some deadline in the program. The mechanism through which students and professors pair up is usually rotations: Students perform a small research project in each of several labs and then join one of the labs they rotated in. 

The advisor

Regardless of how a student gets matched up with their advisor, the advisor has a lot of power to make their graduate school experience a positive one or a negative one. Someone who is a great advisor for one student may not be a great advisor for another. If you are choosing an advisor, it pays to pay attention to the culture in a lab, and whether you personally feel supported by that environment and the type of mentorship that your advisor offers. In almost every case, this is more important for your success in the PhD program than the specifics of the project you will work on or the prestige of the project, collaborators, or lab. 

Qualifiers

PhD programs typically have qualifiers at some point in the first three years. Some programs use a test-based qualifier system, either creating a specific qualifier test or using the final exams of required courses. In other programs, the student is questioned by a panel of faculty on course material they are expected to have learned by that point. In still others, the student performs a research project and then presents it to a panel of faculty.

Some universities view the qualifiers as a hurdle that almost all of the admitted PhD students should be able to pass, and some universities view them as a method to weed out students from the PhD program. If you are considering applying to PhD programs, it is worth paying attention to this cultural difference between programs, and not taking it too personally if you do not pass the qualifiers at a school that weeds out half of their students. After all, you were qualified enough to be accepted. It is also important to remember, if you join either kind of program, that if you do not pass your qualifiers, usually what happens is that you leave the program with a free master’s degree. Your time in the program will not be wasted!

The author testing a robot on a steep dune face on a research field trip at Oceano Dunes.

Research

Some advisors will start students on a research project as soon as they join the lab, typically by attaching them to an existing project so that they can get a little mentorship before starting their own project. Some advisors will wait until the student is finished with qualifiers. Either way, it is worth knowing that a PhD student’s relationship to their PhD project is likely different from any project they have ever been involved with before. 

For any other research project, there is another person – the advisor, an older graduate student, a postdoc – who has designed the project or at least worked with the student to define parameters for success. Previous research projects typically span one semester or one summer, resulting in one or two papers at most. In contrast, a PhD student’s research project is expected to span multiple years (at least three), should result in multiple publications, and is designed primarily by the student. The student does not just have ownership of their project; they are responsible for it in a way they have never been responsible for a large, open-ended project before. It is also their primary responsibility – not one project alongside many others. This can be overwhelming for a lot of students, which is why it is impolite to ask a PhD student how much longer they expect their PhD to take.

The committee

The “committee” is a group of professors who work in areas related to the student’s. The student’s advisor is on the committee, but it must include other professors as well. Typically, the committee mixes professors from the student’s school with at least one from another institution. These professors provide ongoing mentorship on the thesis project. They are the primary audience for the thesis proposal and defense, and will ultimately decide what the student must do in order to graduate. If you are a student choosing your committee, keep in mind that you will benefit greatly from having supportive professors on it, just as you benefit from having a supportive advisor.

Proposing and defending the thesis

When students are expected to propose a thesis project varies widely by program. In some programs, students propose a topic as part of their qualifier process. In others, students have years after finishing their qualifiers to propose a topic – and can propose as little as a semester before they defend! 

The proposal and defense both typically take the form of a presentation followed by questions from the committee and the audience. In the proposal, the student outlines the project they plan to do, and presents criteria that they and their committee should agree on as the required minimum for them to do in order to graduate. The defense makes the case that the student has hit those requirements. 

After the student presents, the committee will ask some questions, confer, and then tell the student whether they passed or failed. It is very uncommon for a PhD student to fail their defense, and when it happens it is generally considered a failure on the part of the advisor rather than the student, because the advisor should not have let the student present an unfinished thesis. After the defense, there may be some corrections to the written thesis document or even a few extra experiments, but typically the student does not need to present the thesis again in order to graduate.

The bottom line

A PhD is a long training process to teach students how to become independent researchers. Students will take classes and perform research, and will also likely teach or develop coursework. If this is something you’re thinking about, it’s important to learn about what you might be getting yourself into – and if it’s a journey one of your loved ones is starting on, you should know that it’s not just more school!

How to investigate when a robot causes an accident, and why it’s important that we do

Robots are featuring more and more in our daily lives. They can be incredibly useful (bionic limbs, robotic lawnmowers, or robots which deliver meals to people in quarantine), or merely entertaining (robotic dogs, dancing toys, and acrobatic drones). Imagination is perhaps the only limit to what robots will be able to do in the future.

Mimicking the Five Senses, On Chip

Machine learning at the edge is gaining steam. BrainChip is accelerating this with its Akida architecture, which mimics the human brain by incorporating the five human senses on a machine-learning-enabled chip.

Their chips will let roboticists and IoT developers run ML on-device, enabling low-latency, low-power, low-cost machine-learning products. This opens up a new product category in which everyday devices can affordably become smart devices.

Rob Telson

Rob is an AI thought-leader and Vice President of Worldwide Sales at BrainChip, a global tech company that has developed artificial intelligence that learns like a brain, whilst prioritizing efficiency, ultra-low power consumption, and continuous learning. Rob has over 20 years of sales expertise in licensing intellectual property and selling EDA technology and attended Harvard Business School.


Robotic exoskeleton uses machine learning to help users with mobility impairments

Researchers from the RIKEN Guardian Robot Project and collaborators have used a combination of lightweight material engineering and artificial intelligence to create an exoskeleton robot that could help people with mobility impairments. An important element of the new device is technology that allows the skeleton to effectively guess the intentions of the user.

Social robots for elder care

Robots have come a long way. For years, they have been supporting human activity—enabling exploration in dangerous and unreachable environments like out in space and deep in the oceans. A new generation of robots are being designed to stay closer to home—caring for aging adults and young children.

Using Rubik’s cube to improve and evaluate robot manipulation

Researchers at the University of Washington have recently developed a new protocol to train robots and test their performance on tasks that involve object manipulation. This protocol, presented in a paper published in IEEE Robotics and Automation Letters, is based on the Rubik’s Cube, the well-known 3D combination puzzle invented by the Hungarian sculptor and architect Ernő Rubik.

BirdBot is energy-efficient thanks to nature as a model

By Alexander Badri-Sprowitz, Alborz Aghamaleki Sarvestani, Metin Sitti and Linda Behringer

If a Tyrannosaurus rex living 66 million years ago featured a leg structure similar to that of an ostrich running in the savanna today, then we can assume bird legs have stood the test of time – a good example of evolutionary selection.

Graceful, elegant, powerful – flightless birds like the ostrich are a mechanical wonder. Ostriches, some of which weigh over 100kg, run through the savanna at up to 55km/h. The ostrich’s outstanding locomotor performance is thought to be enabled by the animal’s leg structure. Unlike humans, birds fold their feet back when pulling their legs up towards their bodies. Why do the animals do this? Why is this foot movement pattern energy-efficient for walking and running? And can the bird’s leg structure with all its bones, muscles, and tendons be transferred to walking robots?

Alexander Badri-Spröwitz has spent more than five years on these questions. At the Max Planck Institute for Intelligent Systems (MPI-IS), he leads the Dynamic Locomotion Group. His team works at the interface between biology and robotics in the field of biomechanics and neurocontrol. The dynamic locomotion of animals and robots is the group’s main focus.

Together with his doctoral student Alborz Aghamaleki Sarvestani, Badri-Spröwitz has constructed a robot leg that, like its natural model, is energy-efficient: BirdBot needs fewer motors than other machines and could, theoretically, scale to large size. On March 16th, Badri-Spröwitz, Aghamaleki Sarvestani, the roboticist Metin Sitti, a director at MPI-IS, and biology professor Monica A. Daley of the University of California, Irvine, published their research in the renowned journal Science Robotics.

Compliant spring-tendon network made of muscles and tendons

When walking, humans pull their feet up and bend their knees, but their feet and toes point forward almost unchanged. Birds are different: in the swing phase, they fold their feet backward. But what is the function of this motion? Badri-Spröwitz and his team attribute this movement to a mechanical coupling. “It’s not the nervous system, it’s not electrical impulses, it’s not muscle activity,” Badri-Spröwitz explains. “We hypothesized a new function of the foot-leg coupling through a network of muscles and tendons that extends across multiple joints. These multi-joint muscle-tendons coordinate foot folding in the swing phase. In our robot, we have implemented the coupled mechanics in the leg and foot, which enables energy-efficient and robust robot walking. Our results demonstrating this mechanism in a robot lead us to believe that similar efficiency benefits also hold true for birds,” he says.

The coupling of the leg and foot joints, and the forces and movements involved, could be the reason why a large animal like an ostrich can not only run fast but also stand without tiring, the researchers speculate. A person weighing over 100 kg can also stand well and for a long time, but only with the knees ‘locked’ in an extended position. If the person squats slightly, it becomes strenuous after a few minutes. The bird, however, does not seem to mind its bent leg structure; many birds even stand upright while sleeping. A robotic bird’s leg should be able to do the same: no motor power should be needed to keep the structure standing upright.

Robot walks on treadmill

To test their hypothesis, the researchers built a robotic leg modeled after that of a flightless bird. They constructed the artificial bird leg so that its foot has no motor, but instead a joint equipped with a spring-and-cable mechanism. The foot is mechanically coupled to the rest of the leg’s joints through cables and pulleys. Each leg contains only two motors: a hip-joint motor, which swings the leg back and forth, and a small motor that flexes the knee joint to pull the leg up. After assembly, the researchers walked BirdBot on a treadmill to observe the robot’s foot folding and unfolding. “The foot and leg joints don’t need actuation in the stance phase,” says Aghamaleki Sarvestani. “Springs power these joints, and the multi-joint spring-tendon mechanism coordinates joint movements. When the leg is pulled into the swing phase, the foot disengages the leg’s spring – or the muscle-tendon spring, as we believe it happens in animals,” Badri-Spröwitz adds. A video shows BirdBot walking in the research group’s laboratory.
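The cable-and-pulley clutch can be caricatured as a single threshold condition: while the foot is flat on the ground, the cable is taut and the leg spring carries the load; once the foot folds backward past some angle in swing, the cable goes slack and the spring disengages. A toy model in Python (the threshold angle is an assumption for illustration, not a parameter from the paper):

```python
import math

# Assumed fold angle at which the cable slackens (illustrative value only)
FOLD_THRESHOLD = math.radians(30)

def spring_engaged(foot_fold_angle: float) -> bool:
    """True while the foot is unfolded (stance): the cable is taut and
    the multi-joint spring-tendon supports the leg without motor power."""
    return foot_fold_angle < FOLD_THRESHOLD

def phase(foot_fold_angle: float) -> str:
    """The foot angle alone switches the leg between stance and swing,
    replacing the sensor-controller-motor loop of a conventional robot."""
    return "stance" if spring_engaged(foot_fold_angle) else "swing"
```

Standing corresponds to a near-zero fold angle, so the spring stays engaged and, as the researchers note, no motor power is needed to hold the leg upright.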

Zero effort when standing, and when flexing the leg and knee

When standing, the leg expends zero energy. “Previously, our robots had to work against the spring, or with a motor, either when standing or when pulling the leg up, to prevent the leg from colliding with the ground during leg swing. This energy input is not necessary in BirdBot’s legs,” says Badri-Spröwitz. Aghamaleki Sarvestani adds: “Overall, the new robot requires a mere quarter of the energy of its predecessor.”

The treadmill is now switched back on; the robot starts running, and with each leg swing, the foot disengages the leg’s spring. To disengage, the large foot movement slackens the cable and the remaining leg joints swing loosely. In most robots, this transition between standing and leg swing is handled by a motor at the joint: a sensor sends a signal to a controller, which turns the robot’s motors on and off. “Previously, motors were switched depending on whether the leg was in the swing or stance phase. Now the foot takes over this function in the walking machine, mechanically switching between stance and swing. We only need one motor at the hip joint and one motor to bend the knee in the swing phase. We leave leg-spring engagement and disengagement to the bird-inspired mechanics. This is robust, fast, and energy-efficient,” says Badri-Spröwitz.

Motion sequence of BirdBot’s leg; left is touch-down, then stance (the three first snapshots on the left), then mid-swing with the leg’s characteristic leg flexing posture, and back to touch-down on the right. Springs and spring-tendons are shown on top.

Monica Daley observed in several of her earlier biology studies that the bird’s leg structure not only saves energy during walking and standing but is also adapted by nature so that the animal hardly stumbles and injures itself. In experiments with guineafowl running over hidden potholes, she quantified the birds’ remarkable locomotion robustness. A morphological intelligence is built into the system that allows the animal to act quickly – without having to think about it. Daley had shown that the animals control their legs during locomotion not only with the help of the nervous system. If an obstacle unexpectedly lies in the way, it is not always the animal’s sense of touch or sight that comes into play.

“The structure with its multi-jointed muscle-tendons and its unique foot movement can explain why even heavy, large birds run so quickly, robustly, and energy-efficiently. If I assume that everything in the bird is based on sensing and action, and the animal steps onto an unexpected obstacle, it might not be able to react quickly enough. Perception and sensing, even the transmission of the stimuli, and the reaction cost time,” Daley says.

Yet Daley’s work on running birds over 20 years demonstrates that birds respond more rapidly than the nervous system allows, indicating mechanical contributions to control. Now that the team has developed BirdBot, a physical model that directly demonstrates how these mechanisms work, it all makes more sense: the leg switches mechanically if there is a bump in the ground. The switch happens immediately, without time delay. Like birds, the robot features high locomotion robustness.

The principle holds at the scale of a Tyrannosaurus rex or a small quail, and for small or large robotic legs alike. Theoretically, meter-high legs could now be built to carry robots weighing several tons that walk around with little power input.

The knowledge gained through BirdBot, developed at the Dynamic Locomotion Group and the University of California, Irvine, leads to new insights about animals as adapted by evolution. Robots allow researchers to test, and sometimes confirm, hypotheses from biology, advancing both fields.

