A research team has demonstrated the 'world's smallest shooting game,' a nanoscale game inspired by classic arcade titles. The achievement was made possible by real-time control of the force fields between nanoparticles using focused electron beams. The work has practical implications, as this kind of nanoscale manipulation could revolutionize biomedical engineering and nanotechnology.
A team of scientists has now created a computer model that can represent and generate human-like goals by learning from how people create games. The work could lead to AI systems that better understand human intentions and more faithfully model and align with our goals. It may also lead to AI systems that can help us design more human-like games.
Researchers are blurring the line between robotics and materials with a proof-of-concept collective of robots that behaves like a material, its behaviors inspired by biology.
A study shows machine learning can decode emotions in seven ungulate species, a potential game-changer for animal welfare. Can artificial intelligence help us understand what animals feel? The findings suggest the answer is yes. Researchers trained a machine-learning model to distinguish between positive and negative emotions in seven ungulate species, including cows, pigs, and wild boars. By analyzing the acoustic patterns of the animals' vocalizations, the model reached 89.49% accuracy, making this the first cross-species study to detect emotional valence using AI.
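The article does not describe the model's internals, but bioacoustic classification pipelines of this kind typically extract acoustic features from each vocalization and train a supervised classifier on them. The sketch below is a minimal, hypothetical Python example using librosa and scikit-learn; the MFCC features, random-forest model, and synthetic stand-in data are illustrative assumptions, not the study's actual method.

```python
# Hypothetical sketch of a cross-species valence classifier.
# The feature set (MFCCs) and model (random forest) are assumptions,
# not the method used in the study described above.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

SR = 22050  # sample rate in Hz

def call_features(waveform, sr=SR):
    """Summarize one vocalization as the mean and std of its MFCCs."""
    mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Stand-in data: random waveforms with random valence labels
# (0 = negative, 1 = positive); real work would load recorded calls.
rng = np.random.default_rng(0)
calls = [rng.standard_normal(SR) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.stack([call_features(c) for c in calls])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())  # chance-level on random data
```

With real labeled recordings in place of the synthetic arrays, cross-validated accuracy would indicate whether the acoustic features carry valence information across species.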
Engineers have developed a versatile swimming robot that nimbly navigates cluttered water surfaces. Inspired by marine flatworms, the innovative device offers new possibilities for environmental monitoring and ecological research.
Researchers find that large language models process diverse types of data, such as different languages, audio inputs, and images, in a way that resembles how humans reason about complex problems. Like humans, LLMs integrate inputs from different modalities in a central hub that processes data in an input-type-agnostic fashion.
Researchers have unveiled a transformative framework for understanding complex systems. This pioneering study establishes the new field of higher-order topological dynamics, revealing how the hidden geometry of networks shapes everything from brain activity to the climate and artificial intelligence (AI).
A study showed that chatbots working alone outperformed doctors at making nuanced clinical decisions, but doctors supported by artificial intelligence performed as well as the chatbots.
Researchers developed an automated system to help programmers increase the efficiency of their deep learning algorithms by simultaneously leveraging two types of redundancy in complex data structures: sparsity and symmetry.
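The summary does not detail how the automated system works, but the core idea of exploiting sparsity and symmetry together can be illustrated by hand: store only the nonzero upper triangle of a symmetric matrix and reuse it for both halves of a matrix-vector product. The Python/SciPy snippet below is a hand-written illustration of that concept, not the team's automated system.

```python
# Illustration of exploiting sparsity + symmetry in a matrix-vector product.
# A hand-written example of the idea only; the research describes an automated
# system that generates such optimizations for the programmer.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
n = 1000
M = sp.random(n, n, density=0.01, random_state=0, format="csr")
A = (M + M.T).tocsr()            # a matrix that is both sparse and symmetric
x = rng.standard_normal(n)

# Naive approach: treat A as a full dense matrix (ignores both redundancies).
y_dense = A.toarray() @ x

# Exploit both: keep only the upper triangle (roughly half the nonzeros) and
# reuse it for both halves: A @ x = U @ x + U.T @ x - diag(A) * x.
U = sp.triu(A, k=0, format="csr")
y_sym = U @ x + U.T @ x - A.diagonal() * x

assert np.allclose(y_dense, y_sym)
print("nonzeros stored:", U.nnz, "vs", A.nnz)
```

The symmetric-sparse version touches about half as many stored values while producing the same result, which is the kind of redundancy the automated system is described as finding on the programmer's behalf.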
AI agents trained in simulations that differ from the environments where they are deployed sometimes perform better than agents trained and deployed in the same environment, research shows.
Just as terrestrial life evolved from ocean swimmers to land walkers, soft robots are making a similar transition, thanks to recent research in battery development and design.
Scientists have developed a computing chip that can learn, correct errors, and process AI tasks.
Are humans or machines better at recognizing speech? A new study shows that in noisy conditions, current automatic speech recognition (ASR) systems achieve remarkable accuracy and sometimes even surpass human performance. However, the systems must be trained on enormous amounts of data, whereas humans acquire comparable skills in far less time.
Imagine a future where your phone, computer, or even a tiny wearable device can think and learn like the human brain, processing information faster and more intelligently while using less energy. A breakthrough approach brings this vision closer to reality by electrically 'twisting' a single nanoscale ferroelectric domain wall.
An AI-powered algorithm can analyze video recordings of clinical sleep tests to diagnose REM sleep behavior disorder more accurately.