Meet the AI-powered robotic dog ready to help with emergency response

Prototype robotic dogs built by Texas A&M University engineering students and powered by artificial intelligence demonstrate their advanced navigation capabilities. Photo credit: Logan Jinks/Texas A&M University College of Engineering.

By Jennifer Nichols

Meet the robotic dog with a memory like an elephant and the instincts of a seasoned first responder.

Developed by Texas A&M University engineering students, this AI-powered robotic dog doesn’t just follow commands; it is designed to navigate chaos with precision, and it could help revolutionize search-and-rescue missions, disaster response and many other emergency operations.

Sandun Vitharana, an engineering technology master’s student, and Sanjaya Mallikarachchi, an interdisciplinary engineering doctoral student, spearheaded the invention of the robotic dog. It can process voice commands and uses AI and camera input to perform path planning and identify objects.

A roboticist would describe it as a terrestrial robot that uses a memory-driven navigation system powered by a multimodal large language model (MLLM). The system interprets visual inputs and generates routing decisions, integrating environmental image capture, high-level reasoning and path optimization. A hybrid control architecture lets it combine strategic planning with real-time adjustments.
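To make that architecture concrete, here is a minimal Python sketch of how such a hybrid loop could be organized: a fast reactive layer that handles collisions on every cycle, and a slower MLLM planner consulted periodically. The object and method names (robot, mllm, capture_image and so on) and the numeric thresholds are illustrative assumptions, not the team’s actual code.

    import time

    SAFE_DISTANCE_M = 0.5      # reactive layer intervenes below this obstacle range (assumed value)
    REPLAN_INTERVAL_S = 2.0    # how often the slower MLLM planner is consulted (assumed value)

    def control_loop(robot, mllm, goal_description):
        """Hypothetical hybrid loop: reactive safety every cycle, MLLM planning occasionally."""
        last_plan_time = 0.0
        waypoints = []

        while not robot.reached(goal_description):
            # Fast path: real-time adjustment runs on every iteration.
            if robot.min_obstacle_distance() < SAFE_DISTANCE_M:
                robot.stop_and_steer_away()
                continue

            # Slow path: ask the MLLM to reason over the latest camera view.
            now = time.time()
            if now - last_plan_time > REPLAN_INTERVAL_S or not waypoints:
                image = robot.capture_image()
                waypoints = mllm.plan(image=image, goal=goal_description)  # ordered list of waypoints
                last_plan_time = now

            # Execute the next step of the current high-level plan.
            if waypoints:
                robot.drive_toward(waypoints.pop(0))

The split matters because an MLLM call is slow compared with the control cycle, so safety-critical avoidance never waits on the planner.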

A pair of robotic dogs with the ability to navigate through artificial intelligence climb concrete obstacles during a demonstration of their capabilities. Photo credit: Logan Jinks/Texas A&M University College of Engineering.

Robot navigation has evolved from simple landmark-based methods to complex computational systems that integrate many sensory sources. However, autonomous navigation in unpredictable, unstructured environments like disaster zones or remote areas has remained difficult, precisely where efficiency and adaptability matter most.

While robot dogs and large language model-based navigation each exist in other contexts, combining a custom MLLM with a visual memory-based system, particularly in a general-purpose, modular framework, is a new concept.

“Some academic and commercial systems have integrated language or vision models into robotics,” said Vitharana. “However, we haven’t seen an approach that leverages MLLM-based memory navigation in the structured way we describe, especially with custom pseudocode guiding decision logic.”

Mallikarachchi and Vitharana began by exploring how an MLLM could interpret visual data from a camera on a robotic system. With support from the National Science Foundation, they combined this idea with voice commands to build a natural, intuitive system that shows how vision, memory and language can work together interactively. The robot reacts quickly to avoid collisions and handles high-level planning by using the custom MLLM to analyze its current view and decide how best to proceed.
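A simple way to picture that interaction is the query the robot might send the MLLM each time it needs a decision: the transcribed voice command plus the current camera frame, packaged into one multimodal request. The sketch below is an assumption about how such a request could be built; the prompt wording and the mllm_client call are hypothetical, not the published interface.

    import base64

    def build_navigation_query(voice_text: str, image_bytes: bytes) -> dict:
        """Package the operator's spoken command and the robot's current view for the MLLM."""
        return {
            "instruction": (
                "You are the navigation planner for a quadruped robot. "
                f"The operator said: '{voice_text}'. "
                "Describe the obstacles in the image and return the next three "
                "waypoints as (distance_m, heading_deg) pairs."
            ),
            "image_b64": base64.b64encode(image_bytes).decode("ascii"),
        }

    # Example usage with a transcribed command and a captured frame (both hypothetical):
    # query = build_navigation_query("check the collapsed hallway on the left", camera.capture())
    # plan = mllm_client.generate(query)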

“Moving forward, this kind of control structure will likely become a common standard for human-like robots,” Mallikarachchi explained.

The robot’s memory-based system allows it to recall and reuse previously traveled paths, making navigation more efficient by reducing repeated exploration. This ability is critical in search-and-rescue missions, especially in unmapped areas and GPS-denied environments.
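One way to imagine that memory, purely as an illustration, is a store that pairs camera snapshots with the routes that worked from those viewpoints, so a matching view can return a known path instead of triggering fresh exploration. The data layout and matching function below are assumptions, not the published design.

    from dataclasses import dataclass, field

    @dataclass
    class PathMemory:
        # Each entry pairs stored image features with the route taken from that viewpoint.
        entries: list = field(default_factory=list)

        def remember(self, image_features, route):
            """Record a successfully traveled route from this viewpoint."""
            self.entries.append((image_features, route))

        def recall(self, image_features, match_fn, threshold=0.8):
            """Return a previously used route if the current view matches a stored one."""
            for stored_features, route in self.entries:
                if match_fn(image_features, stored_features) >= threshold:
                    return route        # reuse the known path, skipping re-exploration
            return None                 # unfamiliar place: fall back to MLLM planning

A hit means the robot repeats a route it already trusts; a miss hands the decision back to the MLLM planner.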

Potential applications extend well beyond emergency response. Hospitals, warehouses and other large facilities could use the robots to improve efficiency, and the advanced navigation system might also assist people with visual impairments, explore minefields or perform reconnaissance in hazardous areas.

Nuralem Abizov, Amanzhol Bektemessov and Aidos Ibrayev from Kazakhstan’s International Engineering and Technological University developed the ROS2 infrastructure for the project. HG Chamika Wijayagrahi from the UK’s Coventry University supported the map design and the analysis of experimental results.

Vitharana and Mallikarachchi presented the robot and demonstrated its capabilities at the recent 22nd International Conference on Ubiquitous Robots. The research was published as “A Walk to Remember: MLLM Memory-Driven Visual Navigation.”

From sea to space, this robot is on a roll

Rishi Jangale and Derek Pravecek with RoboBall III. Image credit: Emily Oswald/Texas A&M Engineering.

By Alyssa Schaechinger

While working at NASA in 2003, Dr. Robert Ambrose, director of the Robotics and Automation Design Lab (RAD Lab), designed a robot with no fixed top or bottom. A perfect sphere, the RoboBall could not flip over, and its shape promised access to places wheeled or legged machines could not reach — from the deepest lunar crater to the uneven sands of a beach. Two of his students built the first prototype, but then Ambrose shelved the idea to focus on drivable rovers for astronauts.

When Ambrose arrived at Texas A&M University in 2021, he saw a chance to reignite his idea. With funding from the Chancellor’s Research Initiative and Governor’s University Research Initiative, Ambrose brought RoboBall back to life.

Now, two decades after the original idea, RoboBall is rolling across Texas A&M University.

Driven by graduate students Rishi Jangale and Derek Pravecek, the RAD Lab is intent on sending RoboBall, a novel spherical robot, into uncharted terrain.

Jangale and Pravecek, both Ph.D. students in the J. Mike Walker ’66 Department of Mechanical Engineering, have played a significant part in getting the ball rolling once again.

“Dr. Ambrose has given us such a cool opportunity. He gives us the chance to work on RoboBall however we want,” said Jangale, who began work on RoboBall in 2022. “We manage ourselves, and we get to take RoboBall in any direction we want.”

Pravecek echoed that sense of freedom. “We get to work as actual engineers doing engineering tasks. This research teaches us things beyond what we read in textbooks,” he said. “It really is the best of both worlds.”

Robot in an airbag

At the heart of the project is the simple concept of a “robot in an airbag.” Two versions now exist in tandem. RoboBall II, a 2-foot-diameter prototype, is tuned for trial runs that monitor power output and refine control algorithms. RoboBall III, 6 feet in diameter, is built to carry payloads such as sensors, cameras or sampling tools on real-world missions.

Upcoming tests will continue to take RoboBall into outdoor environments. RAD Lab researchers are planning field trials on the beaches of Galveston to demonstrate a water-to-land transition, testing the robot’s buoyancy and terrain adaptability in a real-world setting.

“Traditional vehicles stall or tip over in abrupt transitions,” Jangale explained. “This robot can roll out of water onto sand without worrying about orientation. It’s going where other robots can’t.”

The same features that make RoboBall versatile also create some of its challenges. Once sealed inside its protective shell, the robot can only be accessed electronically. Any mechanical failure means disassembly and digging through layers of wiring and actuators.

“Diagnostics can be a headache,” said Pravecek. “If a motor fails or a sensor disconnects, you can’t just pop open a panel. You have to take apart the whole robot and rebuild. It’s like open-heart surgery on a rolling ball.”

RoboBall’s novelty means the team often operates without a blueprint.

“Every task is new,” Jangale said. “We’re very much on our own. There’s no literature on soft-shelled spherical robots of this size that roll themselves.”

Despite those hurdles, the students find themselves surprised every time the robot outperforms expectations.

“When it does something we didn’t think was possible, I’m always surprised,” Pravecek said. “It still feels like magic.”

Student-led innovation

The team set a new record when RoboBall II reached 20 miles per hour while drawing roughly half its theoretical power output. “We didn’t anticipate hitting that speed so soon,” Pravecek said. “It was thrilling, and it opened up new targets. Now we’re pushing even further.”

Ambrose sees these reactions as proof that student-led innovation thrives when engineers have room to explore.

“The autonomy Rishi and Derek have is exactly what a project like this needs,” he said. “They’re not just following instructions — they’re inventing the next generation of exploration tools.”

Long-term goals include autonomous navigation and remote deployment. The team hopes to see RoboBall dispatched from a lunar lander to chart steep crater walls or launched from an unmanned drone to survey post-disaster landscapes on Earth. Each ball could map terrain, transmit data back to operators and even deploy instruments in hard-to-reach spots.

“Imagine a swarm of these balls deployed after a hurricane,” Jangale said. “They could map flooded areas, find survivors and bring back essential data — all without risking human lives.”

As the RoboBall project rolls on, student-driven research is on full display.

“Engineering is problem solving at its purest,” Ambrose said. “Give creative minds a challenge and the freedom to explore, and you’ll see innovation roll into reality.”