Using tiny combustion engines to power very tiny robots
How do robots collaborate to achieve consensus?
Why Rotary Encoders Might Be the Best Fit for Your Industrial Robotics Design
High-tech microscope with ML software for detecting malaria in returning travellers
By Deborah Pirchner
Malaria is an infectious disease claiming more than half a million lives each year. Because traditional diagnosis requires expertise and the workload is high, an international team of researchers investigated whether diagnosis using a new system, combining an automatic scanning microscope and AI, is feasible in clinical settings. They found that the system identified malaria parasites almost as accurately as experts staffing microscopes used in standard diagnostic procedures. This may help reduce the burden on microscopists and increase the feasible patient load.
Each year, more than 200 million people fall sick with malaria and more than half a million of these infections lead to death. The World Health Organization recommends parasite-based diagnosis before starting treatment for the disease caused by Plasmodium parasites. There are various diagnostic methods, including conventional light microscopy, rapid diagnostic tests and PCR.
The standard for malaria diagnosis, however, remains manual light microscopy, during which a specialist examines blood films with a microscope to confirm the presence of malaria parasites. Yet the accuracy of the results depends critically on the skills of the microscopist and can be hampered by fatigue caused by excessive workloads.
Now, writing in Frontiers in Malaria, an international team of researchers has assessed whether a fully automated system, combining AI detection software and an automated microscope, can diagnose malaria with clinically useful accuracy.
“At an 88% diagnostic accuracy rate relative to microscopists, the AI system identified malaria parasites almost, though not quite, as well as experts,” said Dr Roxanne Rees-Channer, a researcher at The Hospital for Tropical Diseases at UCLH in the UK, where the study was performed. “This level of performance in a clinical setting is a major achievement for AI algorithms targeting malaria. It indicates that the system can indeed be a clinically useful tool for malaria diagnosis in appropriate settings.”
AI delivers accurate diagnosis
The researchers analyzed more than 1,200 blood samples from travelers who had returned to the UK from malaria-endemic countries. The study tested the accuracy of the AI and automated microscope system in a true clinical setting under ideal conditions.
They evaluated samples using both manual light microscopy and the AI-microscope system. Manual microscopy diagnosed 113 samples as positive for malaria parasites, whereas the AI system correctly identified 99 of these as positive, corresponding to an 88% accuracy rate.
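As a quick sanity check on that figure (a calculation from the counts reported above, not one performed in the paper), the 88% corresponds to the fraction of manually confirmed positives that the AI system also flagged:

```python
# Counts reported in the study: 113 samples positive by manual microscopy,
# 99 of which the AI system also identified as positive.
manual_positives = 113
ai_true_positives = 99

# Fraction of confirmed positives the AI caught (its sensitivity).
detection_rate = ai_true_positives / manual_positives
print(f"{detection_rate:.1%}")  # 87.6%, rounded to 88% in the article
```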
“AI for medicine often posts rosy preliminary results on internal datasets, but then falls flat in real clinical settings. This study independently assessed whether the AI system could succeed in a true clinical use case,” said Rees-Channer, who is also the lead author of the study.
Automated vs manual
The fully automated malaria diagnostic system the researchers put to the test includes both hardware and software. An automated microscopy platform scans blood films, and malaria detection algorithms process the images to detect parasites and quantify how many are present.
Automated malaria diagnosis has several potential benefits, the scientists pointed out. “Even expert microscopists can become fatigued and make mistakes, especially under a heavy workload,” Rees-Channer explained. “Automated diagnosis of malaria using AI could reduce this burden for microscopists and thus increase the feasible patient load.” Furthermore, these systems deliver reproducible results and can be widely deployed, the scientists wrote.
Despite the 88% accuracy rate, the automated system also falsely identified 122 samples as positive, which could lead to patients receiving unnecessary anti-malarial drugs. “The AI software is still not as accurate as an expert microscopist. This study represents a promising datapoint rather than a decisive proof of fitness,” Rees-Channer concluded.
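To put those 122 false positives in context, one can compute the system’s positive predictive value from the counts reported in this article; this derived figure is illustrative and is not stated in the study:

```python
# Of all samples the AI flagged as positive, how many were truly positive?
true_positives = 99    # AI-flagged samples confirmed positive by microscopy
false_positives = 122  # AI-flagged samples that were actually negative

ppv = true_positives / (true_positives + false_positives)
print(f"{ppv:.1%}")  # 44.8%: under half of the AI's positive calls were correct
```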
Read the research in full
Evaluation of an automated microscope using machine learning for the detection of malaria in travelers returned to the UK, Roxanne R. Rees-Channer, Christine M. Bachman, Lynn Grignard, Michelle L. Gatton, Stephen Burkot, Matthew P. Horning, Charles B. Delahunt, Liming Hu, Courosh Mehanian, Clay M. Thompson, Katherine Woods, Paul Lansdell, Sonal Shah, Peter L. Chiodini, Frontiers in Malaria (2023).
Battery-free robots use origami to change shape in mid-air
ABB to invest $280 million in its European Robotics hub in Sweden
An embodied conversational agent that merges large language models and domain-specific assistance
How drones are used during earthquakes
In the realm of disaster response, technology plays a pivotal role in aiding communities during challenging times. In this exploration, we turn our attention to drones and their application in earthquake response, especially how they were used in the recent Morocco earthquake. This concise video offers valuable insights into the practical uses of drones and the considerations surrounding their deployment during earthquake-related crises.
Humans can feel empathic embarrassment towards robots, finds virtual reality study
Case Study: Delta Line Motion System Solutions for Robotics
Using technological innovation for environmental benefits
Making life friendlier with personal robots
By Dorothy Hanna | Department of Mechanical Engineering
“As a child, I wished for a robot that would explain others’ emotions to me,” says Sharifa Alghowinem, a research scientist in the Media Lab’s Personal Robots Group (PRG). Growing up in Saudi Arabia, Alghowinem says she dreamed of coming to MIT one day to develop Arabic-based technologies, and of creating a robot that could help herself and others navigate a complex world.
In her early life, Alghowinem faced difficulties with understanding social cues and never scored well on standardized tests, but her dreams carried her through. She earned an undergraduate degree in computing before leaving home to pursue graduate education in Australia. At the Australian National University, she discovered affective computing for the first time and began working to help AI detect human emotions and moods, but it wasn’t until she came to MIT as a postdoc with the Ibn Khaldun Fellowship for Saudi Arabian Women, which is housed in the MIT Department of Mechanical Engineering, that she was finally able to work on a technology with the potential to explain others’ emotions in English and Arabic. Today, she says her work is so fun that she calls the lab “my playground.”
Alghowinem can’t say no to an exciting project. She found one with great potential to make robots more helpful to people by working with Jibo, a friendly robot companion developed by MIT Professor and Dean for Digital Learning Cynthia Breazeal, founder of the Personal Robots Group and of the social robot startup Jibo Inc. Breazeal’s research explores the potential for companion robots to go far beyond assistants that obey transactional commands, like requests for the daily weather, adding items to shopping lists, or controlling lighting. At the MIT Media Lab, the PRG team designs Jibo to be an insightful coach and companion that advances social robotics technologies and research. Visitors to the MIT Museum can experience Jibo’s charming personality.
Alghowinem’s research has focused on mental health care and education, often in collaboration with other graduate students and Undergraduate Research Opportunity Program students in the group. In one study, Jibo coached young and older adults via positive psychology, adapting his interventions based on the verbal and non-verbal responses he observed in the participants. For example, Jibo takes in the verbal content of a participant’s speech and combines it with non-verbal information like prolonged pauses and self-hugs. If he concludes that deep emotions have been disclosed, Jibo responds with empathy. When the participant doesn’t disclose, Jibo asks a gentle follow-up question like, “Can you tell me more?”
Another project studied how a robot can effectively support high-quality parent and child interactions while reading a storybook together. Multiple PRG studies work together to learn what types of data are needed for a robot to understand people’s social and emotional states.
“I would like to see Jibo become a companion for the whole household,” says Alghowinem. Jibo can take on different roles with different family members: a companion that reminds elders to take medication, for example, or a playmate for children. Alghowinem is especially motivated by the unique role Jibo could play in emotional wellness, potentially helping to prevent depression or even suicide. Integrating Jibo into daily life would give him the opportunity to detect emerging concerns and intervene, acting as a confidential resource or mental health coach.
Alghowinem is also passionate about teaching and mentoring others, and not only via robots. She makes sure to meet individually with the students she mentors every week and she was instrumental earlier this year in bringing two visiting undergraduate students from Prince Sultan University in Saudi Arabia. Mindful of their social-emotional experience, she worked hard to create the opportunity for the two students, together, to visit MIT so they could support each other. One of the visiting students, Tasneem Burghleh, says she was curious to meet the person who went out of her way to make opportunities for strangers and discovered in her an “endless passion that makes her want to pass it on and share it with everyone else.”
Next, Alghowinem is working to create opportunities for children who are refugees from Syria. Still in the fundraising stage, the plan is to equip social robots to teach the children English and social-emotional skills, and to provide activities that preserve their cultural heritage and Arabic abilities.
“We’ve laid the groundwork by making sure Jibo can speak Arabic as well as several other languages,” says Alghowinem. “Now I hope we can learn how to make Jibo really useful to kids like me who need some support as they learn how to interact with the world around them.”
Robot fried chicken: entrepreneur seeks to improve S. Korea’s favorite food
Fiber-infused ink enables 3D-printed heart muscle to beat
By Kat J. McAlpine / SEAS Communications
Over the last decade, advances in 3D printing have unlocked new possibilities for bioengineers to build heart tissues and structures. Heart disease is the leading cause of death in the United States, responsible for about one in every five deaths nationally. Researchers’ goals include creating better in vitro platforms for discovering new therapeutics for heart disease, and using 3D-printed cardiac tissues to evaluate which treatments might work best in individual patients. A more distant aim is to fabricate implantable tissues that can heal or replace faulty or diseased structures inside a patient’s heart.
In a paper published in Nature Materials, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard University report the development of a new hydrogel ink infused with gelatin fibers that enables 3D printing of a functional heart ventricle that mimics the beating of a human heart. They discovered that the fiber-infused gel (FIG) ink allows heart muscle cells printed in the shape of a ventricle to align and beat in coordination like a human heart chamber.
“People have been trying to replicate organ structures and functions to test drug safety and efficacy as a way of predicting what might happen in the clinical setting,” says Suji Choi, research associate at SEAS and first author on the paper. But until now, 3D printing techniques alone have not been able to achieve physiologically relevant alignment of cardiomyocytes, the cells responsible for transmitting electrical signals in a coordinated fashion to contract heart muscle.
“We started this project to address some of the inadequacies in 3D printing of biological tissues.”
– Kevin “Kit” Parker
The innovation lies in the addition of fibers within a printable ink. “FIG ink is capable of flowing through the printing nozzle but, once the structure is printed, it maintains its 3D shape,” says Choi. “Because of those properties, I found it’s possible to print a ventricle-like structure and other complex 3D shapes without using extra support materials or scaffolds.”
This video shows the spontaneous beating of a 3D-printed heart muscle. Credit: Harvard SEAS.
To create the FIG ink, Choi leveraged a rotary jet spinning technique developed in the lab of Kevin “Kit” Parker, Ph.D. that fabricates microfiber materials using an approach similar to the way cotton candy is spun. Postdoctoral researcher and Wyss Lumineer Luke MacQueen, a co-author on the paper, proposed the idea that fibers created by the rotary jet spinning technique could be added to an ink and 3D printed. Parker is a Wyss Associate Faculty member and the Tarr Family Professor of Bioengineering and Applied Physics at SEAS.
“When Luke developed this concept, the vision was to broaden the range of spatial scales that could be printed with 3D printers by dropping the bottom out of the lower limits, taking it down to the nanometer scale,” Parker says. “The advantage of producing the fibers with rotary jet spinning rather than electrospinning” – a more conventional method for generating ultrathin fibers – “is that we can use proteins that would otherwise be degraded by the electrical fields in electrospinning.”
Using the rotary jet to spin gelatin fibers, Choi produced a sheet of material with a similar appearance to cotton. Next, she used sonication – sound waves – to break that sheet into fibers about 80 to 100 micrometers long and about 5 to 10 micrometers in diameter. Then, she dispersed those fibers into a hydrogel ink.
“This concept is broadly applicable – we can use our fiber-spinning technique to reliably produce fibers in the lengths and shapes we want.”
– Suji Choi
The most difficult aspect was troubleshooting the desired ratio between fibers and hydrogel in the ink to maintain fiber alignment and the overall integrity of the 3D-printed structure.
As Choi printed 2D and 3D structures using FIG ink, the cardiomyocytes lined up in tandem with the direction of the fibers inside the ink. By controlling the printing direction, Choi could therefore control how the heart muscle cells would align.
When she applied electrical stimulation to 3D-printed structures made with FIG ink, she found it triggered a coordinated wave of contractions in alignment with the direction of those fibers. In a ventricle-shaped structure, “it was very exciting to see the chamber actually pumping in a similar way to how real heart ventricles pump,” Choi says.
As she experimented with more printing directions and ink formulas, she found she could generate even stronger contractions within ventricle-like shapes.
“Compared to the real heart, our ventricle model is simplified and miniaturized,” she says. The team is now working toward building more life-like heart tissues with thicker muscle walls that can pump fluid more strongly. Despite not being as strong as real heart tissue, the 3D-printed ventricle could pump 5-20 times more fluid volume than previous 3D-printed heart chambers.
The team says the technique can also be used to build heart valves, dual-chambered miniature hearts, and more.
“FIGs are but one tool we have developed for additive manufacturing,” Parker says. “We have other methods in development as we continue our quest to build human tissues for regenerative therapeutics. The goal is not to be tool driven – we are tool agnostic in our search for a better way to build biology.”
Additional authors include Keel Yong Lee, Sean L. Kim, Huibin Chang, John F. Zimmerman, Qianru Jin, Michael M. Peters, Herdeline Ann M. Ardoña, Xujie Liu, Ann-Caroline Heiler, Rudy Gabardi, Collin Richardson, William T. Pu, and Andreas Bausch.
This work was sponsored by SEAS; the National Science Foundation through the Harvard University Materials Research Science and Engineering Center (DMR-1420570, DMR-2011754); the National Institutes of Health and National Center for Advancing Translational Sciences (UH3HL141798, UG3TR003279); the Harvard University Center for Nanoscale Systems (CNS), a member of the National Nanotechnology Coordinated Infrastructure Network (NNCI), which is supported by the National Science Foundation (ECCS-2025158, S10OD023519); and the American Chemical Society’s Irving S. Sigal Postdoctoral Fellowships.