All posts by Artificial Intelligence News -- ScienceDaily

Page 2 of 7

Like human brains, large language models reason about diverse data in a general way

Researchers find that large language models process diverse types of data, such as different languages, audio inputs, and images, similarly to how humans reason about complex problems. Like humans, LLMs integrate inputs across modalities in a central hub that processes data in an input-type-agnostic fashion.

Groundbreaking study reveals how topology drives complexity in brain, climate, and AI

Researchers have unveiled a transformative framework for understanding complex systems. This pioneering study establishes the new field of higher-order topological dynamics, revealing how the hidden geometry of networks shapes everything from brain activity to the climate and artificial intelligence (AI).

Automatic speech recognition on par with humans in noisy conditions

Are humans or machines better at recognizing speech? A new study shows that in noisy conditions, current automatic speech recognition (ASR) systems achieve remarkable accuracy and sometimes even surpass human performance. However, the systems must be trained on vast amounts of data, while humans acquire comparable skills in far less time.

Brain-inspired nanotech points to a new era in electronics

Imagine a future where your phone, computer or even a tiny wearable device can think and learn like the human brain -- processing information faster, smarter and using less energy. A breakthrough approach brings this vision closer to reality by electrically 'twisting' a single nanoscale ferroelectric domain wall.

Crossing the Uncanny Valley: Breakthrough in technology for lifelike facial expressions in androids

Even highly realistic androids can cause unease when their facial expressions lack emotional consistency. Traditionally, a 'patchwork method' has been used for facial movements, but it comes with practical limitations. A team developed a new technology using 'waveform movements' to create real-time, complex expressions without unnatural transitions. This system reflects internal states, enhancing emotional communication between robots and humans, potentially making androids feel more humanlike.

Breaking barriers: Study uses AI to interpret American Sign Language in real-time

A first-of-its-kind study recognizes American Sign Language (ASL) alphabet gestures using computer vision. Researchers developed a custom dataset of 29,820 static images of ASL hand gestures. Each image was annotated with 21 key landmarks on the hand, providing detailed spatial information about its structure and position. By combining MediaPipe for landmark extraction with YOLOv8, a deep learning model they trained and fine-tuned for the best accuracy, the team took an approach that had not been explored in previous research.
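The 21-landmark annotation described above can be sketched as a simple data structure. This is an illustrative assumption, not the study's actual code: the class name, fields, and wrist-relative normalization are hypothetical, and the real pipeline relies on MediaPipe for landmark extraction and a fine-tuned YOLOv8 model for classification.

```python
# Hypothetical sketch of a 21-landmark hand annotation (not the study's code).
from dataclasses import dataclass
from typing import List, Tuple

NUM_LANDMARKS = 21  # MediaPipe's standard hand-landmark count


@dataclass
class HandAnnotation:
    label: str                            # ASL letter, e.g. "A"
    landmarks: List[Tuple[float, float]]  # 21 (x, y) image coordinates

    def normalized(self) -> List[Tuple[float, float]]:
        """Translate landmarks so the wrist (landmark 0) sits at the origin,
        then scale by the maximum distance from the wrist, making the feature
        vector invariant to the hand's position and size in the image."""
        wx, wy = self.landmarks[0]
        shifted = [(x - wx, y - wy) for x, y in self.landmarks]
        scale = max((x * x + y * y) ** 0.5 for x, y in shifted) or 1.0
        return [(x / scale, y / scale) for x, y in shifted]


# Example: a dummy annotation with 21 placeholder points
pts = [(float(i), float(2 * i)) for i in range(NUM_LANDMARKS)]
ann = HandAnnotation(label="A", landmarks=pts)
norm = ann.normalized()
```

Normalizing landmarks this way is a common preprocessing step before feeding coordinates to a classifier, since raw pixel positions vary with where the hand appears in the frame.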

Empowering older adults with home-care robots

The rapidly aging population will lead to a future shortage of care providers. While robotic technologies are a potential alternative, their widespread use is limited by poor acceptance. In a new study, researchers took a user-centric approach to understand the factors influencing willingness to use home-care robots among caregivers and care recipients in Japan, Ireland, and Finland. Users' perspectives can guide the development of home-care robots toward better acceptance.