Page 106 of 431

Research team develops soft valve technology to enable sensing and control integration in soft robots

Soft inflatable robots have emerged as a promising paradigm for applications that require inherent safety and adaptability. However, integrating sensing and control systems into these robots without compromising their softness, form factor, or capabilities has posed significant challenges.

Using tiny combustion engines to power very tiny robots

A team of mechanical engineers at Cornell University, working with a colleague from Technion-Israel Institute of Technology, has designed and built a tiny robot that is powered by a combustion engine. In their paper published in the journal Science, the group describes how they built their tiny engine and possible uses for it. Ryan Truby, of Northwestern University, has published a Perspective piece in the same journal issue outlining the work done by the team on this new effort.

How do robots collaborate to achieve consensus?

Making group decisions is no easy task, especially when the decision makers are a swarm of robots. To increase swarm autonomy in collective perception, a research team at the IRIDIA artificial intelligence research laboratory at the Université Libre de Bruxelles proposed an innovative self-organizing approach in which one robot at a time works temporarily as the "brain" to consolidate information on behalf of the group.
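The rotating-"brain" idea described above can be illustrated with a toy sketch. This is my own simplification, not the IRIDIA implementation: here one randomly chosen robot per round consolidates the swarm's opinions by majority vote (using its own opinion as a tiebreaker) and broadcasts the result back.

```python
# Toy sketch of self-organized collective perception with a temporary "brain".
# Hypothetical simplification for illustration; the actual IRIDIA mechanism
# is more sophisticated.
import random

def collective_perception(opinions, rounds=5, seed=0):
    """Each round, one robot is picked as the temporary 'brain'; it
    consolidates everyone's current opinion by majority vote (ties broken
    by the brain's own opinion) and broadcasts the result to the swarm."""
    rng = random.Random(seed)
    opinions = list(opinions)
    for _ in range(rounds):
        brain = rng.randrange(len(opinions))       # robot acting as the brain
        counts = {o: opinions.count(o) for o in set(opinions)}
        best = max(counts.values())
        leaders = sorted(o for o, c in counts.items() if c == best)
        consensus = opinions[brain] if opinions[brain] in leaders else leaders[0]
        opinions = [consensus] * len(opinions)     # brain broadcasts its decision
    return opinions[0]

# e.g. 7 robots estimating whether a surface is mostly black or white
print(collective_perception(["black", "black", "white", "black",
                             "white", "white", "black"]))  # -> black
```

In a real swarm the "brain" role would migrate over unreliable local communication; the point of the approach is that no single robot is a permanent central controller.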

High-tech microscope with ML software for detecting malaria in returning travelers

Image: a suitcase with a passport tucked under the handle.

By Deborah Pirchner

Malaria is an infectious disease that claims more than half a million lives each year. Because traditional diagnosis requires expertise and the workload is high, an international team of researchers investigated whether diagnosis using a new system, combining an automatic scanning microscope with AI, is feasible in clinical settings. They found that the system identified malaria parasites almost as accurately as expert microscopists using standard diagnostic procedures. This could help reduce the burden on microscopists and increase the feasible patient load.

Each year, more than 200 million people fall sick with malaria and more than half a million of these infections lead to death. The World Health Organization recommends parasite-based diagnosis before starting treatment for the disease caused by Plasmodium parasites. There are various diagnostic methods, including conventional light microscopy, rapid diagnostic tests and PCR.

The standard for malaria diagnosis, however, remains manual light microscopy, during which a specialist examines blood films with a microscope to confirm the presence of malaria parasites. Yet, the accuracy of the results depends critically on the skills of the microscopist and can be hampered by fatigue caused by excessive workloads of the professionals doing the testing.

Now, writing in Frontiers in Malaria, an international team of researchers has assessed whether a fully automated system, combining AI detection software and an automated microscope, can diagnose malaria with clinically useful accuracy.

“At an 88% diagnostic accuracy rate relative to microscopists, the AI system identified malaria parasites almost, though not quite, as well as experts,” said Dr Roxanne Rees-Channer, a researcher at The Hospital for Tropical Diseases at UCLH in the UK, where the study was performed. “This level of performance in a clinical setting is a major achievement for AI algorithms targeting malaria. It indicates that the system can indeed be a clinically useful tool for malaria diagnosis in appropriate settings.”

AI delivers accurate diagnosis

The researchers analyzed more than 1,200 blood samples from travelers who had returned to the UK from malaria-endemic countries. The study tested the accuracy of the AI and automated microscope system in a true clinical setting, albeit under ideal conditions.

They evaluated samples using both manual light microscopy and the AI-microscope system. Manual microscopy diagnosed 113 samples as positive for malaria parasites, whereas the AI system correctly identified 99 of these, corresponding to an 88% accuracy rate.

“AI for medicine often posts rosy preliminary results on internal datasets, but then falls flat in real clinical settings. This study independently assessed whether the AI system could succeed in a true clinical use case,” said Rees-Channer, who is also the lead author of the study.

Automated vs manual

The fully automated malaria diagnostic system the researchers put to the test includes both hardware and software: an automated microscopy platform scans blood films, and malaria detection algorithms process the images to detect parasites and quantify how many are present.

Automated malaria diagnosis has several potential benefits, the scientists pointed out. “Even expert microscopists can become fatigued and make mistakes, especially under a heavy workload,” Rees-Channer explained. “Automated diagnosis of malaria using AI could reduce this burden for microscopists and thus increase the feasible patient load.” Furthermore, these systems deliver reproducible results and can be widely deployed, the scientists wrote.

Despite the 88% accuracy rate, the automated system also falsely identified 122 samples as positive, which could lead to patients receiving unnecessary anti-malarial drugs. “The AI software is still not as accurate as an expert microscopist. This study represents a promising datapoint rather than a decisive proof of fitness,” Rees-Channer concluded.
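The reported figures can be checked directly from the counts in the article. Note the sample total is given only as "more than 1,200", so any figure derived from it is approximate:

```python
# Worked arithmetic from the study's reported counts. The exact sample
# total is only reported as "more than 1,200", so specificity here is a
# rough lower-bound estimate, not a figure from the paper.
manual_positives = 113    # samples positive by expert microscopy
ai_true_positives = 99    # of those, also flagged positive by the AI system
ai_false_positives = 122  # samples the AI wrongly flagged as positive
total_samples = 1200      # lower bound reported in the article

# Sensitivity relative to manual microscopy: the "88%" quoted in the text
sensitivity = ai_true_positives / manual_positives
print(f"Sensitivity vs. manual microscopy: {sensitivity:.1%}")

# Positive predictive value: chance a positive AI call is a true positive
ppv = ai_true_positives / (ai_true_positives + ai_false_positives)
print(f"Positive predictive value: {ppv:.1%}")

# Approximate specificity, using the 1,200-sample lower bound
negatives = total_samples - manual_positives
specificity = (negatives - ai_false_positives) / negatives
print(f"Approximate specificity: {specificity:.1%}")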

Read the research in full

Evaluation of an automated microscope using machine learning for the detection of malaria in travelers returned to the UK, Roxanne R. Rees-Channer, Christine M. Bachman, Lynn Grignard, Michelle L. Gatton, Stephen Burkot, Matthew P. Horning, Charles B. Delahunt, Liming Hu, Courosh Mehanian, Clay M. Thompson, Katherine Woods, Paul Lansdell, Sonal Shah, Peter L. Chiodini, Frontiers in Malaria (2023).

An embodied conversational agent that merges large language models and domain-specific assistance

Large language models (LLMs) are advanced deep learning models that can interact with humans in real time and respond to prompts about a wide range of topics. These models gained much popularity after the release of ChatGPT, a model created by OpenAI that surprised many users with its ability to generate human-like answers to their questions.

How drones are used during earthquakes

In the realm of disaster response, technology plays a pivotal role in aiding communities during challenging times. In this exploration, we turn our attention to drones and their application in earthquake response, especially how they were used after the recent Morocco earthquake. This concise video offers valuable insights into the practical uses of drones and the considerations surrounding their deployment during earthquake-related crises.

Humans can feel empathic embarrassment towards robots, finds virtual reality study

In a virtual reality study that sheds light on the intricacies of human-robot interactions, researchers have discovered that humans are capable of experiencing empathic embarrassment when witnessing robots go through embarrassing situations.

Making life friendlier with personal robots

Sharifa Alghowinem, a research scientist in the Media Lab’s Personal Robots Group, poses with Jibo, a friendly robot companion developed by Professor Cynthia Breazeal. Credits: Gretchen Ertl

By Dorothy Hanna | Department of Mechanical Engineering

“As a child, I wished for a robot that would explain others’ emotions to me,” says Sharifa Alghowinem, a research scientist in the Media Lab’s Personal Robots Group (PRG). Growing up in Saudi Arabia, Alghowinem says she dreamed of coming to MIT one day to develop Arabic-based technologies, and of creating a robot that could help herself and others navigate a complex world.

In her early life, Alghowinem faced difficulties with understanding social cues and never scored well on standardized tests, but her dreams carried her through. She earned an undergraduate degree in computing before leaving home to pursue graduate education in Australia. At the Australian National University, she discovered affective computing for the first time and began working to help AI detect human emotions and moods. But it wasn’t until she came to MIT as a postdoc with the Ibn Khaldun Fellowship for Saudi Arabian Women, housed in the MIT Department of Mechanical Engineering, that she was finally able to work on a technology with the potential to explain others’ emotions in English and Arabic. Today, she says her work is so fun that she calls the lab “my playground.”

Alghowinem can’t say no to an exciting project. She found one with great potential to make robots more helpful to people by working with Jibo, a friendly robot companion developed by MIT Professor and Dean for Digital Learning Cynthia Breazeal, founder of the Personal Robots Group (PRG) and the social robot startup Jibo Inc. Breazeal’s research explores the potential for companion robots to go far beyond assistants that obey transactional commands, like reporting the daily weather, adding items to shopping lists, or controlling lighting. At the MIT Media Lab, the PRG team designs Jibo to make him an insightful coach and companion to advance social robotics technologies and research. Visitors to the MIT Museum can experience Jibo’s charming personality.

Alghowinem’s research has focused on mental health care and education, often working with other graduate students and Undergraduate Research Opportunity Program students in the group. In one study, Jibo coached young and older adults via positive psychology, adapting his interventions based on the verbal and non-verbal responses he observed in the participants. For example, Jibo takes in the verbal content of a participant’s speech and combines it with non-verbal information like prolonged pauses and self-hugs. If he concludes that deep emotions have been disclosed, Jibo responds with empathy. When the participant doesn’t disclose, Jibo asks a gentle follow-up question, like, “Can you tell me more?”
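The decision flow described above can be sketched in a few lines. This is purely illustrative: the function name, cue list, weights, and threshold are all hypothetical, since the article does not describe the PRG system's actual models.

```python
# Illustrative sketch of combining verbal and non-verbal signals to choose
# a coaching response. All names, weights, and thresholds are hypothetical;
# the real PRG system is not described at this level in the article.
def choose_response(verbal_disclosure: float, nonverbal_cues: list) -> str:
    """verbal_disclosure: score in [0, 1] from the speech content;
    nonverbal_cues: observed cues such as 'prolonged_pause' or 'self_hug'."""
    cue_weight = {"prolonged_pause": 0.2, "self_hug": 0.3}
    score = verbal_disclosure + sum(cue_weight.get(c, 0.0) for c in nonverbal_cues)
    if score >= 0.8:                 # deep emotions appear to be disclosed
        return "empathic_reflection"
    return "gentle_follow_up"        # e.g. "Can you tell me more?"

print(choose_response(0.6, ["self_hug"]))         # -> empathic_reflection
print(choose_response(0.3, ["prolonged_pause"]))  # -> gentle_follow_up
```

The design point is the fusion step: neither channel alone decides; speech content and body language jointly push the robot toward empathy or toward a gentle probe.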

Another project studied how a robot can effectively support high-quality parent and child interactions while reading a storybook together. Multiple PRG studies work together to learn what types of data are needed for a robot to understand people’s social and emotional states.

Research Scientist Sharifa Alghowinem (left) and visiting students Deim Alfozan and Tasneem Burghleh from Saudi Arabia’s Prince Sultan University, interact with Jibo. Credits: Gretchen Ertl

“I would like to see Jibo become a companion for the whole household,” says Alghowinem. Jibo can take on different roles with different family members, for example as a companion reminding elders to take medication, or as a playmate for children. Alghowinem is especially motivated by the unique role Jibo could play in emotional wellness, including a preventive role in depression or even suicide. Integrating Jibo into daily life would give him the opportunity to detect emerging concerns and intervene, acting as a confidential resource or mental health coach.

Alghowinem is also passionate about teaching and mentoring others, and not only via robots. She makes sure to meet individually with the students she mentors every week, and earlier this year she was instrumental in bringing two visiting undergraduate students from Saudi Arabia’s Prince Sultan University to MIT. Mindful of their social-emotional experience, she worked hard to ensure the two students could visit together so they could support each other. One of the visiting students, Tasneem Burghleh, says she was curious to meet the person who went out of her way to create opportunities for strangers, and discovered in her an “endless passion that makes her want to pass it on and share it with everyone else.”

Next, Alghowinem is working to create opportunities for children who are refugees from Syria. Still in the fundraising stage, the plan is to equip social robots to teach the children English and social-emotional skills, and to provide activities that preserve their cultural heritage and Arabic language abilities.

“We’ve laid the groundwork by making sure Jibo can speak Arabic as well as several other languages,” says Alghowinem. “Now I hope we can learn how to make Jibo really useful to kids like me who need some support as they learn how to interact with the world around them.”
