Robot Talk Episode 146 – Embodied AI on the ISS, with Jamie Palmer
Claire chatted to Jamie Palmer from Icarus Robotics about building a robotic labour force to perform routine and risky tasks in orbit.
Jamie Palmer is co-founder and CTO of Icarus Robotics. He earned a Master’s in Robotics from Columbia University on a full scholarship, researching intelligent, dexterous manipulation in the ROAM lab. Jamie developed and deployed autonomous hospital robots during the pandemic and worked as a race-winning engineer for the Mercedes-AMG Petronas Formula One team.
From Cobots to Decision Makers: How Agentic AI Is Rewiring Industrial Robotics
Snake-like robot unveiled for Fukushima debris removal
Nano Banana 2: Combining Pro capabilities with lightning-fast speed
I developed an app that uses drone footage to track plastic litter on beaches
By Gerard Dooly, University of Limerick
Plastic pollution is one of those problems everyone can see, yet few know how to tackle it effectively. I grew up walking the beaches around Tramore in County Waterford, Ireland, where plastic debris has always been part of the coastline, including bottles, fragments of fishing gear and food packaging.
According to the UN, 19-23 million tonnes of plastic end up in lakes, rivers and seas every year, polluting ecosystems and damaging animal habitats.
Community groups do tremendous work cleaning these beaches, but they’re essentially walking blind, guessing where plastic accumulates, missing hot spots, repeating the same stretches while problem areas may go untouched.
Years later, working in marine robotics at the University of Limerick, I began developing tools to support marine clean-up and help communities find plastic pollution along our coastline.
The question seemed straightforward: could we use drones to show people exactly where the plastic is? And could we turn finding plastic littered on beaches and cleaning it up into something people enjoy – in other words, “gamify” it? Could we also build on other ways drones have already been used, such as tracking wildfires or identifying shipwrecks?
Building the technology
At the University of Limerick’s Centre for Robotics and Intelligent Systems, my team combined drone-based aerial surveillance with machine-learning algorithms (a type of artificial intelligence) to map where plastic was being littered. This was paired with a free mobile app that provides volunteers with precise GPS coordinates for targeted clean-ups.
The technical challenge was more complex than it appeared. Training computer vision models to detect a bottle cap from 30 metres altitude, while distinguishing it from similar-looking objects such as seaweed, driftwood, shells and weathered rocks, required extensive field testing and accuracy checks of the detection system.
The development hasn’t been straightforward. Early versions of the algorithm struggled with shadows and mistook driftwood for plastic bottles. We spent months refining the system through trial and error on beaches around Clare and Galway; it can now spot plastic as small as 1cm.
We conducted hundreds of test flights across Irish coastlines under varying environmental conditions (different lighting, tidal states and weather patterns), building a robust training dataset.
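A key step in a pipeline like this is turning a detection's pixel position in a drone image into GPS coordinates. As a rough illustration (not the project's actual code: the straight-down camera assumption, the field of view and all names here are illustrative), the geometry can be sketched in Python:

```python
import math

def pixel_to_gps(det_px, det_py, img_w, img_h, drone_lat, drone_lon,
                 altitude_m, hfov_deg=78.8):
    """Convert a detection's pixel position in a nadir (straight-down)
    drone image to approximate GPS coordinates.

    Assumes the camera points straight down with the image top facing
    true north; hfov_deg is the camera's horizontal field of view
    (78.8 degrees is typical of consumer mapping drones).
    """
    # Ground footprint covered by the image at this altitude
    ground_w = 2 * altitude_m * math.tan(math.radians(hfov_deg / 2))
    ground_h = ground_w * img_h / img_w  # same ground sampling per pixel

    # Offset of the detection from the image centre, in metres
    dx = (det_px - img_w / 2) / img_w * ground_w   # +east
    dy = (img_h / 2 - det_py) / img_h * ground_h   # +north (pixel y grows downward)

    # Metres-to-degrees conversion (adequate for small offsets)
    dlat = dy / 111_320
    dlon = dx / (111_320 * math.cos(math.radians(drone_lat)))
    return drone_lat + dlat, drone_lon + dlon
```

A detection at the image centre maps back to the drone's own position; detections toward the right edge shift east. Real systems also correct for gimbal tilt and drone heading, which this sketch omits.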
Ireland’s plastic problem
The urgency of this work becomes clear when you look at the Marine Institute’s work. Ireland’s 3,172 kilometres of coastline, the longest per capita in Europe, faces a deepening crisis.
A 2018 study found that 73% of deep-sea fish in Irish waters had ingested plastic particles. More than 250 species, including seabirds, fish, marine turtles and mammals, have been reported to ingest large plastic items.
The costs go beyond harming wildlife, and the economic impact can be significant.
Our drone surveys revealed that some stretches of coast accumulate plastic at rates five to ten times higher than neighbouring areas, driven by ocean currents and river mouths. Without systematic monitoring, these hotspots go unaddressed.
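To make "hotspot" concrete: once every detection has a GPS fix, finding accumulation zones is essentially a density calculation. A minimal sketch, in which the grid size, threshold factor and function name are illustrative assumptions rather than the project's actual settings:

```python
from collections import Counter
from statistics import median

def find_hotspots(detections, cell_deg=0.0005, factor=5.0):
    """Bin plastic detections (lat, lon pairs) into a grid and flag cells
    whose count is at least `factor` times the median non-empty cell.

    cell_deg of 0.0005 degrees is roughly a 50 m cell at Irish latitudes.
    """
    # Count detections per grid cell
    cells = Counter(
        (round(lat / cell_deg), round(lon / cell_deg))
        for lat, lon in detections
    )
    threshold = factor * median(cells.values())
    # Return flagged cells keyed by their approximate (lat, lon)
    return {
        (key[0] * cell_deg, key[1] * cell_deg): count
        for key, count in cells.items()
        if count >= threshold
    }
```

Comparing each cell against the median rather than the mean keeps one extreme hotspot from masking the others.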
Making the technology accessible
The plastic detection platform accepts drone imagery from any source, including footage captured by ordinary people flying their own drones.
Processing requires only standard laptop software. Users upload footage and receive GPS coordinates showing detected plastic locations. The mobile app, available free on iOS and Android, displays these locations as an interactive map.

Community groups, schools and individuals can see where nearby plastic pollution is and go straight to it, saving considerable time.
It has already been tested with five community groups around Ireland with positive results: an average of 30 plastic items spotted per ten-minute drone flight, varying by location.
Working through the EU-funded BluePoint project, which is tackling plastic pollution of coastlines around Europe, we’ve distributed over 30 drones to partners across Ireland and Europe, including county councils and environmental organisations.
The technology has been deployed in areas including Spanish Point in County Clare, where the local Tidy Towns group (litter-picking volunteers) was named joint Clean Coast Community Group of the Year 2024.
Organising a litter pick. Video by Propeller BIC (Waterford).
The wider waste story
This is part of a broader European effort to address plastic pollution. Partners such as the sports store Decathlon are exploring how to transform recovered beach plastics into new consumer products – sports equipment, textiles and components.
The challenge isn’t just collection. Beach plastics arrive contaminated with sand and salt, in mixed types and grades. Our ongoing research characterises what’s actually found on Irish coastlines, providing manufacturers with data to design appropriate sorting and recycling processes.
The open source software platforms and the drone technology have already been used in nine countries, engaging more than 2,000 people. Pilot programmes are running in France, Spain, Portugal, Brazil and the UK. What began as a question about making beach clean-ups more effective has evolved into a practical system connecting citizen action to environmental outcomes.
Community feedback from pilots has been overwhelmingly positive. Groups report that the drone-derived GPS coordinates transform clean-up work. One participating Tidy Towns group said that volunteers now head straight to flagged locations.
Groups have also reported increased participation: the gamification aspect appeals to families and participants who might not otherwise volunteer. Additionally, the data we’ve gathered so far is being used by local authorities to understand litter patterns and inform policy decisions around waste management and coastal protection.
Gerard Dooly, Assistant Professor in Engineering, University of Limerick
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Accelerating Digital Transformation in Automotive Parts Manufacturing with Autonomous Forklifts
Soft-robotic glove uses 37 actuators to cut hand swelling by up to 25%
Researchers expose critical security vulnerability in autonomous drones
Translating music into light and motion with robots
Image taken from the YouTube video created by the authors (see below).
A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music.
The new technology features multiple wheeled robots about the size of soccer balls that trail coloured light as they move within a fixed area on the floor in response to key features of music including tempo and chord progression.
A camera records the co-ordinated light trails as they snake within that area, which serves as the canvas for the creation of a “painting,” or visual representation of the emotional content of a particular piece of music.
“Basically, we programmed a swarm of robots to paint based on musical input,” said Dr Gennaro Notomista, a professor of electrical and computer engineering at Waterloo.
“The result is a cohesive system that not only processes musical input, but also co-ordinates multiple painting robots to create adaptive, expressive art that reflects the emotional essence of the music being played.”
As they “listen” to music, the robots represent emotion through the colours, intensity and width of their light trails, as well as their position on the canvas and the speed at which they move within it.
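The article doesn't spell out the exact mapping, but one illustrative version of the idea (every name, range and musical convention below is an assumption, not the Waterloo system's actual design) might look like:

```python
def trail_parameters(tempo_bpm, is_minor, loudness):
    """Map coarse musical features to a robot's light-trail parameters.

    Illustrative mapping only: fast tempo -> fast robots, minor mode ->
    cooler hues, louder passages -> wider, brighter trails.
    loudness is normalised to [0, 1].
    """
    # 60-180 bpm maps linearly onto a 0-1 speed command
    speed = min(1.0, max(0.0, (tempo_bpm - 60) / 120))
    hue = 240 if is_minor else 30      # blue for minor, warm orange for major
    width = 1 + 9 * loudness           # trail width between 1 and 10 units
    intensity = 0.3 + 0.7 * loudness   # never fully dark, so trails stay visible
    return {"speed": speed, "hue": hue, "width": width, "intensity": intensity}
```

The point of such a mapping is that every feature the robots respond to becomes one continuous control channel, which is what lets a human's overrides blend smoothly with the music-driven behaviour.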
People can simultaneously influence a painting in progress using controls to change the width of light trails and their location on the virtual canvas.
“We included the human control input to allow people and robots to work together,” said Notomista, whose interests include the intersection of art and technology. “The human painter should complement and be complemented by what the robots do.”
The first challenge for researchers was developing an algorithm to control multiple robots within a given area. They tested the system with up to 12 robots, but it can be scaled to handle any number.
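The researchers' controller isn't reproduced here, but the basic shape of such an algorithm (steer each robot toward a goal, keep robots apart, keep everyone on the canvas) can be sketched as follows, with all names and gains being illustrative assumptions:

```python
def step_swarm(positions, targets, bounds, gain=0.2,
               repel_radius=0.5, repel_gain=0.1):
    """One control step for a swarm confined to a rectangular canvas.

    Each robot moves a fraction `gain` toward its target, is pushed away
    from neighbours closer than `repel_radius`, and is clamped to
    `bounds` given as (xmin, ymin, xmax, ymax).
    """
    xmin, ymin, xmax, ymax = bounds
    new_positions = []
    for i, (x, y) in enumerate(positions):
        tx, ty = targets[i]
        vx, vy = gain * (tx - x), gain * (ty - y)   # attraction to target
        for j, (ox, oy) in enumerate(positions):    # repulsion from neighbours
            if j == i:
                continue
            dx, dy = x - ox, y - oy
            dist = (dx * dx + dy * dy) ** 0.5
            if 0 < dist < repel_radius:
                vx += repel_gain * dx / dist
                vy += repel_gain * dy / dist
        # keep every robot inside the canvas
        new_positions.append((min(xmax, max(xmin, x + vx)),
                              min(ymax, max(ymin, y + vy))))
    return new_positions
```

Because the update loops over an arbitrary list, the same step works for 2 robots or 200, echoing the scalability the researchers describe.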
Step two involved creating technology to extract and analyze musical features that express emotion so they can then be translated into light trails that appropriately represent them.
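As a toy illustration of that second step: tempo, one of the features the system responds to, can be estimated from the timing of note onsets. This is a hypothetical stand-in for the general approach, not the team's implementation, which works on the audio signal itself:

```python
from statistics import median

def estimate_tempo(onset_times):
    """Estimate tempo in beats per minute from a list of note-onset
    times (in seconds), using the median inter-onset interval so that
    occasional missed or doubled onsets don't skew the result."""
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return 60.0 / median(intervals)
```

Onsets arriving every half second, for instance, correspond to 120 beats per minute.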
Lessons learned during the project have potential applications in other areas requiring the control and co-ordination of multiple robots working in unison, such as environmental monitoring, precision agriculture, search and rescue missions, and planetary exploration.
The research also reflects the University of Waterloo’s Global Futures initiative, which advances interdisciplinary work that considers how emerging technologies can shape society, culture and the human experience.
Notomista next plans to enlist professional painters and musicians to explore the tool’s possibilities in user studies, and to stage public exhibitions.
A paper on the system, Music-driven Robot Swarm Painting, by Notomista and Jingde Cheng, a former Waterloo graduate student, was presented at the 2025 IEEE International Conference on Advanced Robotics and its Social Impacts.