
Vultures and artificial intelligence(s) as death detectors: High-tech approach for wildlife research and conservation

To record and assess the behavior of wildlife and environmental conditions in remote locations, the GAIA Initiative developed an artificial intelligence (AI) algorithm that reliably and automatically classifies the behaviors of white-backed vultures from animal tag data. As scavengers, vultures are always on the lookout for the next carcass. With the help of tagged animals and a second AI algorithm, the scientists can now automatically locate carcasses across vast landscapes.
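The GAIA algorithms themselves are not reproduced in this digest; the sketch below only illustrates the general idea, assuming tag data is summarized into fixed-length feature windows and that a supervised classifier (here scikit-learn's RandomForestClassifier, an assumed stand-in) labels each window with a behavior such as "feeding", which a downstream step could group into candidate carcass locations.

```python
# Illustrative sketch only: features, labels, and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: one row per tag-data window, e.g. summary
# statistics of tri-axial acceleration and movement speed (placeholders).
X_train = np.random.rand(500, 6)
y_train = np.random.choice(["soaring", "resting", "feeding"], size=500)

behavior_model = RandomForestClassifier(n_estimators=200, random_state=0)
behavior_model.fit(X_train, y_train)

def candidate_carcass_positions(windows, gps_positions):
    """Return GPS positions of windows classified as feeding behavior,
    i.e. the input a second carcass-locating step could work from."""
    labels = behavior_model.predict(windows)
    return gps_positions[labels == "feeding"]
```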

Robot identifies plants by ‘touching’ their leaves

Researchers have developed a robot that identifies different plant species at various stages of growth by 'touching' their leaves with an electrode. The robot can measure properties such as surface texture and water content that cannot be determined using existing visual approaches. The robot identified ten different plant species with an average accuracy of 97.7% and identified leaves of the flowering bauhinia plant with 100% accuracy at various growth stages.
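The robot's electrode features and classifier are not detailed in this summary; the following is a minimal sketch under the assumption that each leaf "touch" yields a handful of electrical readings (e.g. proxies for surface texture and water content) and that a standard classifier is evaluated by cross-validation. All names and data here are placeholders.

```python
# Illustrative sketch, not the authors' pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical dataset: each row is one electrode "touch" on a leaf,
# reduced to a few electrical features; labels are plant species.
X = np.random.rand(300, 4)
y = np.random.choice([f"species_{i}" for i in range(10)], size=300)

clf = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold accuracy estimate
print(f"mean cross-validated accuracy: {scores.mean():.1%}")
```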

Robot learns how to clean a washbasin

Scientists have created a robot that can learn tasks like cleaning a washbasin just by watching humans. A special sponge with sensors is used to show the robot how to clean. Using an advanced machine learning system, the robot learns how it is supposed to behave and can apply this knowledge to cleaning different washbasins.
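As a rough illustration of learning from demonstration (not the authors' method), the sketch below assumes the sensorized sponge logs state-action pairs while a human cleans, and that a small regression network is trained to imitate them; every variable is a placeholder.

```python
# Hedged sketch of imitation learning from a sensorized demonstration tool.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical demonstrations: states (sponge pose plus local basin shape)
# paired with the actions (next pose and applied force) a human produced.
states = np.random.rand(1000, 7)
actions = np.random.rand(1000, 4)

policy = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
policy.fit(states, actions)   # imitate the demonstrated wiping motion

def next_action(current_state):
    """Predict a wiping action for an unseen washbasin state."""
    return policy.predict(current_state.reshape(1, -1))[0]
```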

AI-driven mobile robots team up to tackle chemical synthesis

Researchers have developed AI-driven mobile robots that can carry out chemical synthesis research with extraordinary efficiency. The team shows how mobile robots that use AI logic to make decisions performed exploratory chemistry tasks to the same standard as human researchers, but much faster.

Researchers develop cat’s eye-inspired vision system for autonomous robotics

Researchers have unveiled a vision system inspired by feline eyes to enhance object detection in various lighting conditions. Featuring a unique shape and reflective surface, the system reduces glare in bright environments and boosts sensitivity in low-light scenarios. By filtering unnecessary details, this technology significantly improves the performance of single-lens cameras, representing a notable advancement in robotic vision capabilities.

Language model ‘UroBot’ surpasses the accuracy of experienced urologists

Scientists have developed and successfully tested a new chatbot based on artificial intelligence: 'UroBot' was able to answer questions from the urology specialist examination with a high degree of accuracy, surpassing both other language models and experienced urologists. The model justifies its answers in detail based on the guidelines.
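UroBot's model, prompts, and guideline corpus are not given in this summary; the sketch below shows one generic way a guideline-grounded exam assistant could be wired up, using the OpenAI chat-completions client purely as an assumed backend and a placeholder model name.

```python
# Generic retrieval-augmented sketch; not UroBot's actual implementation.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def answer_exam_question(question: str, guideline_passages: list[str]) -> str:
    """Answer a urology exam question, justified from guideline excerpts."""
    context = "\n\n".join(guideline_passages)   # retrieved by a separate step
    response = client.chat.completions.create(
        model="gpt-4o",                         # placeholder model name
        messages=[
            {"role": "system",
             "content": ("Answer the multiple-choice question and justify "
                         "the choice strictly from the guideline excerpts.")},
            {"role": "user",
             "content": f"Guidelines:\n{context}\n\nQuestion:\n{question}"},
        ],
    )
    return response.choices[0].message.content
```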

Shrinking AR displays into eyeglasses to expand their use

Augmented reality (AR) takes digital images and superimposes them onto real-world views. But AR is more than a new way to play video games; it could transform surgery and self-driving cars. To make the technology easier to integrate into common personal devices, researchers report how to combine two optical technologies into a single, high-resolution AR display. In an eyeglasses prototype, the researchers enhanced image quality with a computer algorithm that removed distortions.
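The distortion-removal algorithm used in the eyeglasses prototype is not described beyond the summary above; as a generic stand-in, the sketch below applies standard lens-distortion correction with OpenCV, with placeholder calibration values.

```python
# Generic lens-distortion correction; calibration values are placeholders.
import cv2
import numpy as np

frame = cv2.imread("ar_frame.png")                   # hypothetical capture
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])          # placeholder intrinsics
dist_coeffs = np.array([-0.2, 0.05, 0.0, 0.0, 0.0])  # placeholder distortion

undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("ar_frame_corrected.png", undistorted)
```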

Autonomous vehicles could understand their passengers better with ChatGPT

Imagine simply telling your vehicle, 'I'm in a hurry,' and it automatically takes you on the most efficient route to where you need to be. Engineers found that an autonomous vehicle (AV) can do this with the help of ChatGPT or other chatbots made possible by artificial intelligence algorithms called large language models.
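The engineers' exact integration is not detailed here; as a hedged sketch of the idea, the snippet below uses a large language model (via the OpenAI client, an assumed choice) to turn a free-form passenger request into structured preferences a route planner could consume.

```python
# Hedged sketch, not the authors' system: an LLM maps a passenger's
# free-form request to structured driving preferences.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

def interpret_passenger_request(utterance: str) -> dict:
    """Map e.g. "I'm in a hurry" to parameters a route planner could use."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",                    # placeholder model choice
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": ('Return a JSON object with keys "route_objective" '
                         '("fastest" or "most_comfortable") and '
                         '"driving_style" ("assertive" or "relaxed").')},
            {"role": "user", "content": utterance},
        ],
    )
    return json.loads(response.choices[0].message.content)

# interpret_passenger_request("I'm in a hurry") might yield
# {"route_objective": "fastest", "driving_style": "assertive"}
```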