A research team has addressed the long-standing challenge of creating artificial olfactory sensors with arrays of diverse high-performance gas sensors. Their newly developed biomimetic olfactory chips (BOC) integrate nanotube sensor arrays on nanoporous substrates, with up to 10,000 individually addressable gas sensors per chip, a configuration similar to how olfaction works in humans and other animals.
The new algorithm and technology could cut complex fiber and photonics alignment procedure times by several orders of magnitude, outperforming existing techniques for automated fiber-optic alignment on the market.
What would you do if you walked up to a robot with a human-like head and it smiled at you first? You'd likely smile back and perhaps feel the two of you were genuinely interacting. But how does a robot know how to do this? Or, a better question: how does it know how to get you to smile back?
Umwelt is a new system that enables blind and low-vision users to author accessible, interactive charts representing data in three modalities: visualization, textual description, and sonification.
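The article does not describe Umwelt's interface, but the idea of deriving all three modalities from one shared description of the data can be sketched as follows. This is an illustrative Python snippet, not Umwelt's actual API; the dataset, field names, and pitch range are made up.

```python
# Illustrative sketch only (not Umwelt's API): derive a visual spec, a textual
# description, and a sonification from the same underlying data.
data = [("Jan", 12), ("Feb", 18), ("Mar", 9), ("Apr", 25)]

# 1. Visualization: a declarative spec a chart renderer could consume.
vis_spec = {"mark": "bar", "x": "month", "y": "sales", "data": data}

# 2. Textual description: a summary a screen reader can present.
values = [v for _, v in data]
highest = max(data, key=lambda d: d[1])
description = (f"Bar chart of sales by month, {len(data)} bars, "
               f"values ranging from {min(values)} to {max(values)}; "
               f"highest in {highest[0]} ({highest[1]}).")

# 3. Sonification: map each value to a pitch so the trend can be heard.
lo_hz, hi_hz = 220.0, 880.0
span = max(values) - min(values)
tones = [lo_hz + (v - min(values)) / span * (hi_hz - lo_hz) for v in values]

print(description)
print([round(t, 1) for t in tones])
```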
A new study has shown that people modify their behavior to accommodate autonomous delivery robots, and it is this invisible "human work" that allows robots to run smoothly on the streets and needs to be considered when designing their routes.
The integration of moral robots in customer service does more than just streamline operations; it fundamentally alters the relationship between businesses and their customers.
Robots equipped with AI vision systems, such as MIRAI, can perceive and adapt to their surroundings. This allows them to handle variance while performing complex tasks, making it possible to automate operations previously considered too complex to automate.
Engineers aim to give robots a bit of common sense when faced with situations that push them off their trained path, so they can self-correct after missteps and carry on with their chores. The team's method connects robot motion data with the common sense knowledge of large language models, or LLMs.
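The article does not spell out how motion data and the LLM are wired together, but the general pattern of checking each subgoal and asking an LLM for a corrective step can be sketched like this. It is a minimal Python illustration with a hypothetical query_llm() stub and a toy "wipe the table" task, not the team's implementation.

```python
# Minimal sketch: break a task into LLM-suggested subgoals, and when the robot's
# observed progress doesn't match a subgoal, ask the LLM for a recovery action.

def query_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (hypothetical); returns canned replies."""
    if "recovery" in prompt:
        return "pick the sponge back up"
    return "grasp sponge\nmove to table\nwipe surface\nreturn sponge"

def plan_subgoals(task: str) -> list[str]:
    """Ask the LLM to break a household task into ordered subgoals."""
    return query_llm(f"List the steps needed to: {task}").splitlines()

def execute_with_recovery(task, execute_step, subgoal_achieved):
    """Run each subgoal; after a misstep, ask the LLM for a corrective action
    instead of blindly replaying the rest of the routine."""
    for goal in plan_subgoals(task):
        execute_step(goal)
        while not subgoal_achieved(goal):
            fix = query_llm(f"The robot failed '{goal}'. Suggest a recovery action.")
            execute_step(fix)

if __name__ == "__main__":
    attempts = {}
    def execute_step(step):        # pretend to move the arm
        print("executing:", step)
    def subgoal_achieved(goal):    # simulate failing 'grasp sponge' once
        attempts[goal] = attempts.get(goal, 0) + 1
        return not (goal == "grasp sponge" and attempts[goal] == 1)
    execute_with_recovery("wipe the table", execute_step, subgoal_achieved)
```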
Bushfires can move at astonishing speeds. The land, amount of vegetation, and the weather all have a big impact on how a fire spreads. Staying one step ahead is no easy task, but our bushfire researchers are working on it.
From wiping up spills to serving up food, robots are being taught to carry out increasingly complicated household tasks. Many such home-bot trainees are learning through imitation; they are programmed to copy the motions that a human physically guides them through.
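Imitation of this kind is often implemented as behavior cloning: (state, action) pairs recorded while a person physically guides the arm become supervised training data. The sketch below is a minimal illustration under that assumption, with invented data and dimensions, and a least-squares linear policy standing in for whatever model a real system would train.

```python
# Minimal behavior-cloning sketch (not the pipeline from the article): fit a
# policy that maps observed joint states to the actions a demonstrator took.
import numpy as np

rng = np.random.default_rng(0)

# Toy demonstration data: 200 guided timesteps of a 6-D joint state paired
# with the 6-D velocity command implied by the human's guidance.
states = rng.normal(size=(200, 6))
actions = states @ rng.normal(size=(6, 6)) + 0.01 * rng.normal(size=(200, 6))

# Fit policy weights W so that action ≈ state @ W (ordinary least squares).
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

def policy(state: np.ndarray) -> np.ndarray:
    """Imitation policy: reproduce the action a demonstrator would have taken."""
    return state @ W

print(policy(states[0]))  # action the robot would replay for the first state
```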
Researchers at McMaster University and Stanford University have invented a new generative artificial intelligence model that can design billions of new antibiotic molecules that are inexpensive and easy to build in the laboratory.
Isaac Robotics Platform Now Provides Developers New Robot Training Simulator, Jetson Thor Robot Computer, Generative AI Foundation Models, and CUDA-Accelerated Perception and Manipulation Libraries
Unmanned aerial vehicles (UAVs), also known as drones, have already proved to be valuable tools for tackling a wide range of real-world problems, ranging from the monitoring of natural environments and agricultural plots to search and rescue missions and the filming of movie scenes from above. So far, most of these problems have been tackled using one drone at a time, rather than teams of multiple autonomous or semi-autonomous UAVs.
Researchers at the Indian Institute of Science (IISc) are using multiple swarms of drones to tackle natural disasters like forest fires. Forest fires are becoming increasingly catastrophic across the world, accelerated by climate change.