As artificial intelligence and robotics advance together, developing technologies that enable robots to perceive and respond to their surroundings as efficiently as humans has become a crucial task. In this context, Korean researchers are drawing attention for implementing an artificial sensory nervous system that mimics that of living organisms without requiring separate complex software or circuitry. This breakthrough technology is expected to find applications in fields such as ultra-small robots and robotic prosthetics, where intelligent, energy-efficient responses to external stimuli are essential.
How do you develop an AI system that perfectly mimics the way humans speak? Researchers at Nagoya University in Japan have taken a significant step toward that goal. They have created J-Moshi, the first publicly available AI system specifically designed for Japanese conversational patterns.
The slimy, segmented, bottom-dwelling California blackworm is about as unappealing as it gets. But get a few dozen, or a few thousand, together, and they form a massive, entangled blob that seems to take on a life of its own.
When ChatGPT or Gemini gives what seems to be an expert response to your burning questions, you may not realize how much information it relies on to produce that reply. Like other popular generative artificial intelligence (AI) models, these chatbots rely on backbone systems called foundation models that train on billions, or even trillions, of data points.
Researchers have developed an artificial intelligence (AI) system that enables a four-legged robot to adapt its gait to different, unfamiliar terrain, just like a real animal, in what is believed to be a world first. The work has been published in Nature Machine Intelligence.
Since early January 2025, residents of Birmingham in the UK have been caught in the dispute between the city council and the Unite union over pay, terms and conditions for waste and recycling collectors. The latest attempt at talks broke down in acrimony.
When successful artist Ai-Da unveiled a new portrait of King Charles this week, the humanoid robot described what inspired the layered and complex piece, and insisted it had no plans to "replace" humans.
Imagine a physician attempting to reach a cancerous nodule deep within a patient's lung—a target the size of a pea, hidden behind a maze of critical blood vessels and airways that shift with every breath. Straying one millimeter off course could puncture a major artery, and falling short could mean missing the cancer entirely, allowing it to spread untreated.
A robot trained on videos of surgeries performed a lengthy phase of a gallbladder removal without human help. The robot operated on a lifelike patient for the first time and, during the operation, responded to and learned from the team's voice commands, like a novice surgeon working with a mentor.
As waiting rooms fill up, doctors grow increasingly burned out, and surgeries take longer to schedule or get canceled altogether, humanoid surgical robots could offer a solution. That is the argument UC San Diego robotics expert Michael Yip makes in a perspective piece in Science Robotics.
Conventional robots, like those used in industry and hazardous environments, are easy to model and control, but they are too rigid to operate in confined spaces or on uneven terrain. Soft, bio-inspired robots are far better at adapting to their environments and maneuvering in otherwise inaccessible places.
Oblivious to the punishing midday heat, a wheeled robot powered by the sun and infused with artificial intelligence carefully combs a cotton field in California, plucking out weeds.
Scientists are striving to discover new semiconductor materials that could boost the efficiency of solar cells and other electronics. But the pace of innovation is bottlenecked by the speed at which researchers can manually measure important material properties.
The more we interact with robots, the more human we perceive them to become—according to new research from the University of East Anglia, published in the Journal of Experimental Psychology: Human Perception and Performance.
The future of moon exploration may be rolling around a nondescript office on the CU Boulder campus. Here, a robot about as wide as a large pizza scoots forward on three wheels. It uses an arm with a claw at one end to pick up a plastic block from the floor, then set it back down.