Archive 16.09.2023


Virtual-reality tech is fast becoming more real

Virtual-reality technology could help cure people of phobias, including the fear of spiders. © Leena Robinson, Shutterstock.com

By Helen Massy-Beresford

Imagine a single technology that could help a robot perform safety checks at a nuclear plant, cure a person’s arachnophobia and simulate the feeling of a hug from a distant relative.

Welcome to the world of “extended reality”. Researchers funded by the EU have sought to demonstrate its enormous potential.

Relevant research

Their goal was to make augmented reality, in which the real world is digitally enhanced, and virtual reality – a fully computer-generated environment – more immersive for users.

One of the researchers, Erik Hernandez Jimenez, never imagined how quickly a project he began leading in mid-2019 would become relevant. Within a year, the Covid-19 pandemic had triggered countless lockdowns that left people working and socialising through video connections from home.

‘We thought about how to apply this technology, how to feel human touch even at a distance, when we were all locked at home and contact with others was through a computer,’ said Hernandez Jimenez. 

He coordinated the EU research initiative, which was named TACTILITY and ran from July 2019 until the end of September 2022. 

The TACTILITY team developed a glove that simulates the sense of touch. Users have the sensation of touching virtual objects through electrical pulses delivered by electrodes embedded in the glove.

The sensations range from pushing a button and feeling pressure on the finger to handling a solid object and feeling its shape, dimensions and texture. 
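To make that concrete, here is a minimal sketch of how such electro-tactile feedback could be driven in software: a virtual contact event is mapped to the amplitude, frequency and width of stimulation pulses. The mapping, ranges and names below are illustrative assumptions, not TACTILITY’s actual parameters.

```python
# Hypothetical sketch: mapping a virtual contact event to electro-tactile
# pulse parameters. Names and ranges are illustrative, not from TACTILITY.

from dataclasses import dataclass

@dataclass
class Pulse:
    amplitude_ma: float   # pulse amplitude in milliamps
    frequency_hz: float   # pulse repetition rate
    width_us: float       # pulse width in microseconds

def contact_to_pulse(pressure: float, roughness: float) -> Pulse:
    """Map normalised contact pressure (0-1) and surface roughness (0-1)
    to a stimulation pulse. Firmer contact -> stronger, faster pulses;
    rougher textures -> wider pulses (purely illustrative choices)."""
    pressure = min(max(pressure, 0.0), 1.0)
    roughness = min(max(roughness, 0.0), 1.0)
    return Pulse(
        amplitude_ma=0.5 + 2.5 * pressure,   # stay well below discomfort
        frequency_hz=20 + 180 * pressure,    # 20-200 Hz covers common tactile cues
        width_us=100 + 200 * roughness,
    )

# Example: a light touch on a rough virtual surface
print(contact_to_pulse(pressure=0.3, roughness=0.8))
```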

Glove and suit

‘TACTILITY is about including tactile feedback in a virtual-reality scenario,’ said Hernandez Jimenez, who is a project manager at Spanish research institute TECNALIA.

He said the principle could be extended from the glove to a whole body suit. 

Compared with past attempts to simulate touch sensations with motors, the electro-tactile feedback technique produces a more realistic result at a lower cost, according to Hernandez Jimenez. 

This opens up the possibility of making the technology more widely accessible. 

The research bolsters European Commission efforts to develop the virtual-worlds domain, which could provide 860 000 new jobs in Europe this decade as the worldwide sector, valued at €27 billion in 2022, continues to grow.

The EU has around 3 700 companies, research organisations and governmental bodies that operate in this sphere, according to the Commission.

Phobias to factories

The TACTILITY researchers looked at potential healthcare applications. 

“We thought about how to apply this technology, how to feel human touch even at a distance.”

– Erik Hernandez Jimenez, TACTILITY

That’s where spiders come into the picture. They were among the objects in the project’s experiments to mimic touch.

‘One that was quite impressive – although I didn’t like it at all – was feeling a spider or a cockroach crawling over your hand,’ Hernandez Jimenez said.  

A potential use for the technology is treating phobias through exposure therapy in which patients are gradually desensitised to the source of their fear. That could start by virtually “touching” cartoon-like creepy crawlies before progressing to more lifelike versions.  

The tactile glove can also be used in the manufacturing industry, helping the likes of car manufacturers train their workers to perform tricky manoeuvres on the factory floor.

Furthermore, it can help people collaborate more effectively with remotely controlled robots in hazardous environments. An example is a nuclear power plant, where a person in a control room can virtually “feel” what a robot is touching. 

‘They get another sense and another kind of feedback, with more information to perform better checks,’ Hernandez Jimenez said. 

Joyful and playful

Wearables for virtual reality. © Oğuz ‘Oz’ Buruk, 2021

Wearable technologies for virtual-reality environments are also being inspired by the gaming industry. 

Researchers in a second EU-funded project sought to expand the prospects for technologies already widely used for professional purposes. The initiative, called WEARTUAL, ran from May 2019 until late 2021.

“Wearables are fashion items – they’re part of the way we construct our identity.”

– Oğuz ‘Oz’ Buruk, WEARTUAL

‘Our project focused on the more experiential side – joyful and playful activities,’ said Oğuz ‘Oz’ Buruk, who coordinated WEARTUAL and is assistant professor of gameful experience at Tampere University in Finland. 

Until recently, experiencing a virtual-reality environment involved a hand-held controller or head-mounted display. 

The WEARTUAL researchers looked at ways of incorporating wearables worn, for example, on the wrist or ankle into virtual reality to give people a sense of greater immersion. 

That could mean having their avatar – a representative icon or figure in the virtual world – blush when nervous or excited to enhance their ability to express themselves.

On the cusp

The team developed a prototype that could integrate varying physical sensations into the virtual world by transferring to it real-life data such as heart rate.  
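As a rough illustration of that idea, the sketch below maps a wearable’s heart-rate reading to an avatar blush intensity. The resting and maximum thresholds are assumptions for illustration, not values from WEARTUAL.

```python
# Illustrative sketch (not WEARTUAL's actual code): turning a wearable's
# heart-rate reading into an avatar "blush" intensity between 0 and 1.

def blush_intensity(heart_rate_bpm: float,
                    resting_bpm: float = 60.0,
                    max_bpm: float = 140.0) -> float:
    """Linearly map heart rate above the resting level to a 0-1 blush
    value. The threshold values are assumptions for illustration."""
    if heart_rate_bpm <= resting_bpm:
        return 0.0
    fraction = (heart_rate_bpm - resting_bpm) / (max_bpm - resting_bpm)
    return min(fraction, 1.0)

# A reading of 100 bpm against a 60 bpm resting rate -> blush of 0.5
print(blush_intensity(100.0))
```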

Buruk is interested in how games will look in the “posthuman” era, when people and machines increasingly converge through bodily implants, robotics and direct communication between the human brain and computers.  

He suggests it is hard to overestimate the eventual impact of advances in this area on everyday life, albeit over varying timescales: wearables are likely to be much more widely used in virtual reality within the next decade, while widespread use of bodily implants is more likely to take 50 to 100 years.

As technology and human bodies become ever more closely linked, the experience of transferring those bodies into a virtual world will be enhanced, encouraging people to spend increasing amounts of time there, according to Buruk.

Virtual-reality technologies are already being used for practical purposes such as gamifying vital information including fire-safety procedures, making it more interactive and easier to learn. This type of use could expand to many areas.

On a very different front, several fashion houses already sell clothes that can be worn in virtual environments, allowing people to express their identity and creativity.  

‘Wearables are fashion items – they’re part of the way we construct our identity,’ Buruk said. ‘Investments in virtual reality, extended reality and augmented reality are increasing every day.’

Research in this article was funded by the EU via the Marie Skłodowska-Curie Actions (MSCA).


This article was originally published in Horizon, the EU Research and Innovation magazine.

Research team develops soft valve technology to enable sensing and control integration in soft robots

Soft inflatable robots have emerged as a promising paradigm for applications that require inherent safety and adaptability. However, integrating sensing and control systems into these robots without compromising their softness, form factor, or capabilities has posed significant challenges.

Using tiny combustion engines to power very tiny robots

A team of mechanical engineers at Cornell University, working with a colleague from the Technion-Israel Institute of Technology, has designed and built a tiny robot powered by a combustion engine. In their paper published in the journal Science, the group describes how they built the tiny engine and its possible uses. Ryan Truby of Northwestern University has published a Perspective piece in the same journal issue outlining the team’s work.

How do robots collaborate to achieve consensus?

Making group decisions is no easy task, especially when the decision makers are a swarm of robots. To increase swarm autonomy in collective perception, a research team at the IRIDIA artificial intelligence research laboratory at the Université Libre de Bruxelles proposed an innovative self-organizing approach in which one robot at a time works temporarily as the "brain" to consolidate information on behalf of the group.

High-tech microscope with ML software for detecting malaria in returning travellers


By Deborah Pirchner

Malaria is an infectious disease claiming more than half a million lives each year. Because traditional diagnosis requires expertise and the workload is high, an international team of researchers investigated whether diagnosis using a new system combining an automatic scanning microscope and AI is feasible in clinical settings. They found that the system identified malaria parasites almost as accurately as experts using microscopes in standard diagnostic procedures. This may help reduce the burden on microscopists and increase the feasible patient load.

Each year, more than 200 million people fall sick with malaria and more than half a million of these infections lead to death. The World Health Organization recommends parasite-based diagnosis before starting treatment for the disease caused by Plasmodium parasites. There are various diagnostic methods, including conventional light microscopy, rapid diagnostic tests and PCR.

The standard for malaria diagnosis, however, remains manual light microscopy, in which a specialist examines blood films under a microscope to confirm the presence of malaria parasites. Yet the accuracy of the results depends critically on the skills of the microscopist and can be hampered by the fatigue that heavy workloads cause.

Now, writing in Frontiers in Malaria, an international team of researchers has assessed whether a fully automated system, combining AI detection software and an automated microscope, can diagnose malaria with clinically useful accuracy.

“At an 88% diagnostic accuracy rate relative to microscopists, the AI system identified malaria parasites almost, though not quite, as well as experts,” said Dr Roxanne Rees-Channer, a researcher at The Hospital for Tropical Diseases at UCLH in the UK, where the study was performed. “This level of performance in a clinical setting is a major achievement for AI algorithms targeting malaria. It indicates that the system can indeed be a clinically useful tool for malaria diagnosis in appropriate settings.”

AI delivers accurate diagnosis

The researchers analysed more than 1,200 blood samples from travellers who had returned to the UK from malaria-endemic countries. The study tested the accuracy of the AI and automated microscope system in a true clinical setting under ideal conditions.

They evaluated samples using both manual light microscopy and the AI-microscope system. Manual microscopy diagnosed 113 samples as positive for malaria parasites; the AI system correctly identified 99 of these as positive, which corresponds to an 88% detection rate.
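That figure follows directly from the counts reported in the study, as this quick check shows:

```python
# The 88% figure is the fraction of microscopy-positive samples
# that the AI system also flagged as positive.
manual_positives = 113   # samples positive by manual light microscopy
ai_true_positives = 99   # of those, correctly flagged by the AI system

detection_rate = ai_true_positives / manual_positives
print(f"{detection_rate:.1%}")  # -> 87.6%, reported as 88%
```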

“AI for medicine often posts rosy preliminary results on internal datasets, but then falls flat in real clinical settings. This study independently assessed whether the AI system could succeed in a true clinical use case,” said Rees-Channer, who is also the lead author of the study.

Automated vs manual

The fully automated malaria diagnostic system the researchers put to the test includes both hardware and software. An automated microscopy platform scans blood films, and malaria detection algorithms process the images to detect parasites and quantify how many are present.
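In outline, the pipeline has two stages. The sketch below shows the shape of such a system, with hypothetical function names and stub bodies standing in for the real microscope hardware and detection software.

```python
# Minimal two-stage sketch of the pipeline described above. All names and
# bodies are hypothetical placeholders, not the actual system's API.

from typing import List

Image = bytes  # placeholder type for a captured field of view

def scan_blood_film(slide_id: str) -> List[Image]:
    """Stage 1 (stub): the motorised microscope captures fields of view."""
    return []  # real hardware would return captured images

def detect_parasites(images: List[Image]) -> int:
    """Stage 2 (stub): detection algorithms locate parasites in each
    field of view and return the total count."""
    return 0  # real software would run a trained detector here

def diagnose(slide_id: str, threshold: int = 1) -> bool:
    """Call a sample positive if at least `threshold` parasites are
    found; the threshold is an assumption for illustration."""
    return detect_parasites(scan_blood_film(slide_id)) >= threshold
```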

Automated malaria diagnosis has several potential benefits, the scientists pointed out. “Even expert microscopists can become fatigued and make mistakes, especially under a heavy workload,” Rees-Channer explained. “Automated diagnosis of malaria using AI could reduce this burden for microscopists and thus increase the feasible patient load.” Furthermore, these systems deliver reproducible results and can be widely deployed, the scientists wrote.

Despite the 88% detection rate, the automated system also falsely identified 122 samples as positive, which could lead to patients receiving unnecessary anti-malarial drugs. “The AI software is still not as accurate as an expert microscopist. This study represents a promising datapoint rather than a decisive proof of fitness,” Rees-Channer concluded.

Read the research in full

Evaluation of an automated microscope using machine learning for the detection of malaria in travelers returned to the UK, Roxanne R. Rees-Channer, Christine M. Bachman, Lynn Grignard, Michelle L. Gatton, Stephen Burkot, Matthew P. Horning, Charles B. Delahunt, Liming Hu, Courosh Mehanian, Clay M. Thompson, Katherine Woods, Paul Lansdell, Sonal Shah, Peter L. Chiodini, Frontiers in Malaria (2023).

An embodied conversational agent that merges large language models and domain-specific assistance

Large language models (LLMs) are advanced deep learning techniques that can interact with humans in real-time and respond to prompts about a wide range of topics. These models have gained much popularity after the release of ChatGPT, a model created by OpenAI that surprised many users for its ability to generate human-like answers to their questions.

How drones are used during earthquakes

In the realm of disaster response, technology plays a pivotal role in aiding communities during challenging times. In this exploration, we turn our attention to drones and their application in earthquake response, especially how they were used in the recent Morocco earthquake. This concise video offers valuable insights into the practical uses of drones and the considerations surrounding their deployment during earthquake-related crises.

Humans can feel empathic embarrassment towards robots, finds virtual reality study

In a virtual reality study that sheds light on the intricacies of human-robot interactions, researchers have discovered that humans are capable of experiencing empathic embarrassment when witnessing robots go through embarrassing situations.

Making life friendlier with personal robots

Sharifa Alghowinem, a research scientist in the Media Lab’s Personal Robots Group, poses with Jibo, a friendly robot companion developed by Professor Cynthia Breazeal. Credits: Gretchen Ertl

By Dorothy Hanna | Department of Mechanical Engineering

“As a child, I wished for a robot that would explain others’ emotions to me,” says Sharifa Alghowinem, a research scientist in the Media Lab’s Personal Robots Group (PRG). Growing up in Saudi Arabia, Alghowinem says she dreamed of coming to MIT one day to develop Arabic-based technologies, and of creating a robot that could help herself and others navigate a complex world.

In her early life, Alghowinem faced difficulties with understanding social cues and never scored well on standardized tests, but her dreams carried her through. She earned an undergraduate degree in computing before leaving home to pursue graduate education in Australia. At the Australian National University, she discovered affective computing for the first time and began working to help AI detect human emotions and moods, but it wasn’t until she came to MIT as a postdoc with the Ibn Khaldun Fellowship for Saudi Arabian Women, which is housed in the MIT Department of Mechanical Engineering, that she was finally able to work on a technology with the potential to explain others’ emotions in English and Arabic. Today, she says her work is so fun that she calls the lab “my playground.” 

Alghowinem can’t say no to an exciting project. She found one with great potential to make robots more helpful to people by working with Jibo, a friendly robot companion developed by MIT Professor and Dean for Digital Learning Cynthia Breazeal, founder of the Personal Robots Group (PRG) and the social robot startup Jibo Inc. Breazeal’s research explores the potential for companion robots to go far beyond assistants that obey transactional commands, like reporting the daily weather, adding items to shopping lists, or controlling lighting. At the MIT Media Lab, the PRG team designs Jibo to be an insightful coach and companion that advances social robotics technologies and research. Visitors to the MIT Museum can experience Jibo’s charming personality.

Alghowinem’s research has focused on mental health care and education, often in collaboration with other graduate students and Undergraduate Research Opportunity Program students in the group. In one study, Jibo coached young and older adults using positive psychology, adapting his interventions based on the verbal and non-verbal responses he observed in the participants. For example, Jibo takes in the verbal content of a participant’s speech and combines it with non-verbal information like prolonged pauses and self-hugs. If he concludes that deep emotions have been disclosed, Jibo responds with empathy. When the participant doesn’t disclose, Jibo asks a gentle follow-up question like, “Can you tell me more?”
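As a rough sketch of that decision logic, the snippet below combines a verbal disclosure estimate with non-verbal cues to pick a response. The scores, weights and threshold are illustrative assumptions, not the PRG team’s actual model.

```python
# Hypothetical sketch of the multimodal decision described above:
# combine verbal content with non-verbal cues, then either respond
# with empathy or gently probe further. Illustrative only.

def choose_response(disclosure_score: float,
                    long_pauses: int,
                    self_hugs: int) -> str:
    """disclosure_score: 0-1 estimate that deep emotion was verbally
    disclosed (e.g. from a speech classifier); non-verbal cues such as
    prolonged pauses and self-hugs raise the combined score."""
    score = disclosure_score + 0.1 * long_pauses + 0.2 * self_hugs
    if score >= 0.5:
        return "empathic reflection"   # e.g. "That sounds really hard."
    return "gentle follow-up"          # e.g. "Can you tell me more?"

print(choose_response(disclosure_score=0.3, long_pauses=1, self_hugs=1))
```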

Another project studied how a robot can effectively support high-quality parent and child interactions while reading a storybook together. Multiple PRG studies work together to learn what types of data are needed for a robot to understand people’s social and emotional states.

Research Scientist Sharifa Alghowinem (left) and visiting students Deim Alfozan and Tasneem Burghleh from Saudi Arabia’s Prince Sultan University interact with Jibo. Credits: Gretchen Ertl

“I would like to see Jibo become a companion for the whole household,” says Alghowinem. Jibo can take on different roles with different family members, such as a companion that reminds elders to take medication or a playmate for children. Alghowinem is especially motivated by the unique role Jibo could play in emotional wellness, helping to prevent depression or even suicide. Integrating Jibo into daily life would give him the opportunity to detect emerging concerns and intervene, acting as a confidential resource or mental health coach.

Alghowinem is also passionate about teaching and mentoring others, and not only via robots. She makes sure to meet individually with the students she mentors every week, and she was instrumental earlier this year in bringing two visiting undergraduate students from Prince Sultan University in Saudi Arabia to MIT. Mindful of their social-emotional experience, she worked hard to create the opportunity for the two students to visit together so they could support each other. One of the visiting students, Tasneem Burghleh, says she was curious to meet the person who went out of her way to make opportunities for strangers and discovered in her an “endless passion that makes her want to pass it on and share it with everyone else.”

Next, Alghowinem is working to create opportunities for children who are refugees from Syria. The project, still in the fundraising stage, plans to equip social robots to teach the children English and social-emotional skills, and to provide activities that preserve their cultural heritage and Arabic abilities.

“We’ve laid the groundwork by making sure Jibo can speak Arabic as well as several other languages,” says Alghowinem. “Now I hope we can learn how to make Jibo really useful to kids like me who need some support as they learn how to interact with the world around them.”
