Researchers are blurring the line between robotics and materials with a proof-of-concept collective of robots that behaves like a material, drawing on behaviors inspired by biology.
Can artificial intelligence help us understand what animals feel? A groundbreaking study suggests the answer is yes, and it could be a game-changer for animal welfare. Researchers successfully trained a machine-learning model to distinguish between positive and negative emotions in seven ungulate species, including cows, pigs, and wild boars. By analyzing the acoustic patterns of their vocalizations, the model achieved an impressive accuracy of 89.49%, in the first cross-species study to detect emotional valence using AI.
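The study's pipeline is not described in detail here, but the general idea of classifying valence from acoustic features can be sketched in a few lines. The sketch below is a toy illustration, not the authors' method: it uses two stand-in features (RMS energy and a zero-crossing-rate pitch proxy) and a nearest-centroid classifier; real systems would use far richer acoustic parameters and models.

```python
import math

def acoustic_features(samples, rate):
    """Toy features from a mono waveform: RMS energy and a zero-crossing-based
    pitch proxy. Stand-ins for the richer acoustic parameters such studies use."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / len(samples)
    return (rms, zcr * rate)  # (energy, rough frequency estimate in Hz)

def train_centroids(labeled_features):
    """Average the feature vectors per valence label ('positive'/'negative')."""
    sums, counts = {}, {}
    for label, feats in labeled_features:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in acc) for lab, acc in sums.items()}

def classify(feats, centroids):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(feats, centroids[lab])))
```

A call might label a loud, high-pitched vocalization differently from a quiet, low-pitched one once centroids for each valence have been fit on labeled recordings.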
Engineers have developed a versatile swimming robot that nimbly navigates cluttered water surfaces. Inspired by marine flatworms, the innovative device offers new possibilities for environmental monitoring and ecological research.
Researchers find that large language models process diverse types of data, such as different languages, audio inputs, and images, similarly to how humans reason about complex problems. Like humans, LLMs integrate data inputs across modalities in a central hub that processes data in an input-type-agnostic fashion.
Researchers have unveiled a transformative framework for understanding complex systems. This pioneering study establishes the new field of higher-order topological dynamics, revealing how the hidden geometry of networks shapes everything from brain activity to the climate and artificial intelligence (AI).
A study showed that chatbots alone outperformed unassisted doctors at making nuanced clinical decisions, but that doctors supported by artificial intelligence performed as well as the chatbots.
Researchers developed an automated system to help programmers increase the efficiency of their deep learning algorithms by simultaneously leveraging two types of redundancy in complex data structures: sparsity and symmetry.
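The two redundancies named above can be made concrete with a small sketch. This is an illustrative example of the general idea, not the researchers' system: a symmetric matrix is stored as only its nonzero upper-triangular entries (symmetry makes the lower triangle implicit, sparsity skips the zeros), and a matrix-vector product works directly on the compressed form.

```python
def compress_symmetric(dense):
    """Keep only nonzero entries on or above the diagonal of a symmetric matrix.
    Sparsity: zeros are skipped.  Symmetry: the lower triangle is implied."""
    return {(i, j): v
            for i, row in enumerate(dense)
            for j, v in enumerate(row)
            if j >= i and v != 0}

def symmetric_matvec(entries, x):
    """Compute y = A @ x from the compressed upper triangle of symmetric A."""
    y = [0.0] * len(x)
    for (i, j), v in entries.items():
        y[i] += v * x[j]
        if i != j:          # mirror each off-diagonal entry to the lower triangle
            y[j] += v * x[i]
    return y
```

For a 3x3 symmetric matrix with four distinct nonzeros, this stores 4 values instead of 9, and the saving grows quickly with dimension; exploiting both redundancies at once is what yields compound speedups.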
AI agents trained in simulations that differ from the environments where they are deployed sometimes perform better than agents trained and deployed in the same environment, research shows.
In the same way that terrestrial life evolved from ocean swimmers to land walkers, soft robots are progressing, too, thanks to recent research in battery development and design.
Scientists have developed a computing chip that can learn, correct errors, and process AI tasks.
Are humans or machines better at recognizing speech? A new study shows that in noisy conditions, current automatic speech recognition (ASR) systems achieve remarkable accuracy and sometimes even surpass human performance. However, these systems must be trained on an enormous amount of data, whereas humans acquire comparable skills in far less time.
Imagine a future where your phone, computer, or even a tiny wearable device can think and learn like the human brain, processing information faster and smarter while using less energy. A breakthrough approach brings this vision closer to reality by electrically 'twisting' a single nanoscale ferroelectric domain wall.
An AI-powered algorithm can analyze video recordings of clinical sleep tests and diagnose REM sleep behavior disorder more accurately.
Researchers have harnessed artificial intelligence to take a key step toward slashing the time and cost of designing new wireless chips, and toward discovering new functionalities to meet expanding demands for greater wireless speed and performance.
Even highly realistic androids can cause unease when their facial expressions lack emotional consistency. Traditionally, a 'patchwork method' has been used for facial movements, but it comes with practical limitations. A team developed a new technology using 'waveform movements' to create real-time, complex expressions without unnatural transitions. This system reflects internal states, enhancing emotional communication between robots and humans, potentially making androids feel more humanlike.
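The 'waveform movements' idea described above can be sketched in miniature. The code below is a hypothetical illustration, not the team's implementation: each internal state contributes a sinusoid per facial actuator, and the actuator command is a weighted sum of those waves, so expressions shift continuously as state weights change rather than jumping between patched-together clips. The state names and parameters are invented for the example.

```python
import math

# Hypothetical per-state waveform parameters:
# actuator name -> (amplitude, frequency in Hz, phase in radians)
STATE_WAVEFORMS = {
    "calm":    {"brow": (0.1, 0.2, 0.0), "mouth": (0.05, 0.25, 0.0)},
    "excited": {"brow": (0.4, 1.0, 0.0), "mouth": (0.3, 1.2, 0.5)},
}

def actuator_command(actuator, t, state_weights):
    """Blend the sinusoids of all active internal states for one actuator.
    Because the output is a weighted sum of continuous waves, the expression
    morphs smoothly as the weights change -- no abrupt patch transitions."""
    total = 0.0
    for state, weight in state_weights.items():
        amp, freq, phase = STATE_WAVEFORMS[state][actuator]
        total += weight * amp * math.sin(2 * math.pi * freq * t + phase)
    return total
```

Cross-fading from "calm" to "excited" then just means ramping one weight down while the other ramps up, which is the smoothness advantage over switching between prerecorded motion patches.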