#RoboCup2024 – daily digest: 20 July
The Standard Platform Soccer League in action.
This is the second of our daily digests from RoboCup2024 in Eindhoven, The Netherlands. If you missed the first digest, which gives some background to RoboCup, you can find it here.
Competitions continued across all the leagues today, with participants vying for a place in Sunday’s finals.
The RoboCup@Work league focusses on robots in work-related scenarios, utilizing ideas and concepts from other RoboCup competitions to tackle open research challenges in industrial and service robotics.
I arrived at the arena in time to catch the advanced navigation test, in which robots have to autonomously navigate, picking up and placing objects at different work stations. In this advanced test, caution tape is added to the arena floor, which the robots should avoid travelling over. There is also a complex placing element where teams have to put an object that they’ve collected into a slot – get the orientation or placement of the object slightly wrong and it won’t fall into the slot.
The RoboCup@Work arena just before competition start.
Eight teams are taking part in the league this year. Executive Committee member Asad Norouzi said that there are plans to introduce a sub-league which would provide an entry point for new teams or juniors to get into the league proper.
I caught up with Harrison Burns, Mitchell Torok and Jasper Arnold from Team MiRobot. They are based at the University of New South Wales and are attending RoboCup for the first time.
Team MiRobot from UNSW.
The team actually only started six months ago, so final preparations have been a bit stressful. However, the experience has been great fun, and the competition has gone well so far. Like most teams, they’ve had to make many refinements as the competition has progressed, leading to some late nights.
One notable feature of the team’s robot is the bespoke, in-house-designed grasping mechanism on the end of the arm. The team note that “it has good flexible jaws, so when it grabs round objects it actually pulls the object directly into it. Because it uses a linear motion, compared to a lot of other rotating jaws, it has a lot better reliability for picking up objects”.
Here is some footage from the task, featuring Team b-it-bots and Team Singapore.
Team b-it-bots take on the RoboCup@Work advanced navigation test, picking up a drill bit #RoboCup2024 pic.twitter.com/QfijZpxaOK
— AIhub (@aihuborg) July 20, 2024
Team Singapore placing an object in the RoboCup@Work advanced navigation test pic.twitter.com/SvgLBOaVo7
— AIhub (@aihuborg) July 20, 2024
In the Middle Size League (MSL), teams of five fully autonomous robots play soccer with a regular-size FIFA ball. Teams are free to design their own hardware, but all sensors have to be on-board, and the robots are subject to a maximum size and a weight limit of 40kg. The research focus is on mechatronics design, control, and multi-agent cooperation at the plan and perception levels. Nine teams are competing this year.
Action from the Middle Size League at #RoboCup2024.
Falcons vs Robot Club Toulon pic.twitter.com/GHcNLOx2nV
— AIhub (@aihuborg) July 20, 2024
I spoke to António Ribeiro, who is a member of the technical committee and part of Team LAR@MSL from the University of Minho, Portugal. The team started in 1998, but António and most of his colleagues on the current team have only been involved in the MSL since September 2022. The robots have evolved as the competition has progressed, and further improvements are in progress. Refinements so far have included communication, the detection system, and the control system. They are pleased with the improvements from the previous RoboCup. “Last year we had a lot of hardware issues, but this year the hardware seems pretty stable. We also changed our coding architecture and it is now much easier and faster for us to develop code because we can all work on the code at the same time on different modules”.
António cited versatility and cost-effective solutions as strengths of the team. “Our robot is actually very cheap compared to other teams. We use a lot of old chassis, and our solutions always go to the lowest cost possible. Some teams have multiple thousand dollar robots, but, for example, our vision system is around $70-80. It works pretty well – we need to improve the way we handle it, but it seems stable”.
Team LAR@MSL
The RoboCup@Home league aims to develop service and assistive robot technology with high relevance for future personal domestic applications. A set of benchmark tests is used to evaluate the robots’ abilities and performance in a realistic non-standardized home environment setting. These tests include helping to prepare breakfast, clearing the table, and storing groceries.
I arrived in time to watch the “stickler for the rules” challenge, where robots have to navigate different rooms and make sure that the people inside (“guests” at a party) are sticking to four rules: 1) there is one forbidden room – if a guest is in there, the robot must alert them and ask them to follow it into another room; 2) everyone must have a drink in their hand – if not, the robot directs them to a shelf with drinks; 3) no shoes are to be worn in the house; 4) there should be no rubbish left on the floor.
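The four checks above amount to a perceive-then-act loop: detect a violation, issue the matching corrective action. A rough sketch of that logic (all names and detection fields here are hypothetical placeholders, not the actual team code – real entries would come from the robot’s perception stack):

```python
# Hypothetical sketch of the "stickler for the rules" logic.
# Each rule pairs a violation test with the corrective action the
# robot should take; perception results are stubbed as dict fields.

RULES = [
    (lambda g: g["room"] == "forbidden",
     "ask the guest to follow the robot into another room"),
    (lambda g: not g["has_drink"],
     "direct the guest to the drinks shelf"),
    (lambda g: g["wearing_shoes"],
     "ask the guest to remove their shoes"),
    (lambda g: g["litter_nearby"],
     "ask the guest to pick up the rubbish"),
]

def check_guest(guest):
    """Return the corrective actions triggered by this guest."""
    return [action for violated, action in RULES if violated(guest)]

guest = {"room": "kitchen", "has_drink": False,
         "wearing_shoes": True, "litter_nearby": False}
print(check_guest(guest))
# → ['direct the guest to the drinks shelf', 'ask the guest to remove their shoes']
```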
After watching an attempt from the LAR@Home robot, Tiago from the team told me a bit about the robot. “The goal is to develop a robot capable of multi general-purpose tasks in home and healthcare environments.” With the exception of the robotic arm, all of the hardware was built by the team. The robot has two RGBD cameras, two LIDARs, a tray (where the robot can store items that it needs to carry), and two emergency stop buttons that deactivate all moving parts. Four omnidirectional wheels allow the robot to move in any direction at any time. The wheels have independent suspension systems, which guarantee that they can all be on the ground at all times, even if there are bumps and cables on the venue floor. There is a tablet that acts as a visual interface, and a microphone and speakers enable communication between humans and the robot, which is all done via speaking and listening.
Tiago told me that the team have talked to a lot of healthcare practitioners to find out the main problems faced by elderly people, and this inspired one of their robot’s features. “They said that the two main injury sources are from when people are trying to sit down or stand up, and when they are trying to pick something up from the floor. We developed a torso that can pick objects from the floor one metre away from the robot”.
The LAR@Home team.
You can keep up with the latest news direct from RoboCup here.
Click here to see all of our content pertaining to RoboCup.
Drones could revolutionize the construction industry, supporting a new UK housing boom
Are We Ready for Multi-Image Reasoning? Launching VHs: The Visual Haystacks Benchmark!
Humans excel at processing vast arrays of visual information, a skill that is crucial for achieving artificial general intelligence (AGI). Over the decades, AI researchers have developed Visual Question Answering (VQA) systems to interpret scenes within single images and answer related questions. While recent advancements in foundation models have significantly closed the gap between human and machine visual processing, conventional VQA has been restricted to reasoning about single images at a time rather than whole collections of visual data.
This limitation poses challenges in more complex scenarios. Take, for example, the challenges of discerning patterns in collections of medical images, monitoring deforestation through satellite imagery, mapping urban changes using autonomous navigation data, analyzing thematic elements across large art collections, or understanding consumer behavior from retail surveillance footage. Each of these scenarios entails not only visual processing across hundreds or thousands of images but also necessitates cross-image processing of these findings. To address this gap, this project focuses on the “Multi-Image Question Answering” (MIQA) task, which exceeds the reach of traditional VQA systems.
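The “needle-in-a-haystack” framing can be made concrete with a toy evaluation loop: plant one relevant image in a large collection, ask the model to find and reason over it, and score whether it succeeded. The sketch below stands in captions for images and keyword matching for an actual multimodal model – it illustrates the evaluation shape, not the real Visual Haystacks protocol:

```python
# Toy sketch of a needle-in-a-haystack evaluation for multi-image QA.
# Captions stand in for images; a real benchmark would run a Large
# Multimodal Model over actual pixels.

def find_needle(haystack, keyword):
    """Return indices of 'images' (captions) relevant to the query keyword."""
    return [i for i, caption in enumerate(haystack) if keyword in caption]

def recall_at_1(haystack, keyword, needle_index):
    """Score 1.0 if the first retrieved image is the planted needle, else 0.0."""
    hits = find_needle(haystack, keyword)
    return 1.0 if hits and hits[0] == needle_index else 0.0

haystack = ["a cat on a sofa", "a city skyline", "a red truck", "a cat on a sofa"]
# Plant the needle: only one image in the collection contains a truck.
print(recall_at_1(haystack, "truck", 2))  # → 1.0
```

The interesting behaviour the benchmark probes is how this score degrades as the haystack grows from a handful of images to thousands.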
Visual Haystacks: the first "visual-centric" Needle-In-A-Haystack (NIAH) benchmark designed to rigorously evaluate Large Multimodal Models (LMMs) in processing long-context visual information.
Can consciousness exist in a computer simulation?
#RoboCup2024 – daily digest: 19 July
The main soccer arena.
RoboCup is an international scientific initiative with the goal of advancing the state of the art of intelligent robots. As part of this initiative, a series of competitions and events are held throughout the year. The main showcase event is an international affair, with teams travelling from far and wide to put their machines through their paces.
This year, RoboCup is being held in three arenas in the Genneper Parken, Eindhoven, The Netherlands. The organisers are expecting over 2,000 participants, from 45 different countries, with around 300 teams signed up to take part in the various competitions.
Although RoboCup started out as a football (or soccer) playing competition, other leagues have since been introduced, focussing on robots in industrial, rescue, and home settings. There is even a dedicated league for young roboticists – RoboCupJunior – where participants can take part in either football, rescue, or artistic events.
I am lucky enough to be able to attend this year, and, for the next three days, I’ll be bringing you a daily digest of some of the exciting happenings from Eindhoven.
Today, 19 July, sees the competition in full swing. The main soccer arena, boasting multiple pitches, hosts a number of the different leagues which form RoboCupSoccer.
Some of the pitches in the main soccer arena.
My first port of call was the Standard Platform League, where the round 5 champions cup match between SPQR Team and rUNSWift was taking place. SPQR ran out winners and advanced to round 6. In this league, all teams compete with identical robots (currently the humanoid NAO by Aldebaran). The robots operate fully autonomously, meaning that there is no external control from either humans or computers.
Standard platform league. Round 5 champions cup match between SPQR Team vs rUNSWift.
Goal! pic.twitter.com/dMfNDUKNZc
— AIhub (@aihuborg) July 19, 2024
The Humanoid AdultSize league is arguably the most challenging of the leagues, with many constraints placed on the robots to make them as human-like as possible. For example, they must have roughly human-like body proportions, they need to walk on two legs, and they are only allowed to use human-like sensors (up to two cameras to sense the environment). In this AdultSize competition, two robots from each team compete, and the team members walk behind the robots to catch them in case of a fall. Such a mishap could prove costly in terms of potential hardware damage.
Action from the Humanoid AdultSize League.
The RoboCup Rescue Robot League sees teams developing robotic systems with the goal of enabling emergency responders to perform extremely hazardous tasks from safer stand-off distances. During the competition, teams compete in a round-robin, putting their robots through their paces on a number of different challenges. The leading teams following this initial phase progress to the finals on Sunday. The tasks include navigating in complex environments, opening doors, and sensing. Teams may run the machines completely autonomously, or with some assistive control. More points are awarded for completely autonomous operation.
RoboCup Rescue arena from above.
Some action from the @robocup_org #RoboCup2024 Rescue league, where teams compete in a variety of challenges.
Team Hector Darmstadt in the "Obstacles: pallets with pipes" challenge pic.twitter.com/4Ll75uENjM
— AIhub (@aihuborg) July 19, 2024
KMUTNB navigate rough terrain, including gravel and sand pic.twitter.com/rsI7NliEwd
— AIhub (@aihuborg) July 19, 2024
You can keep up with more RoboCup2024 news here.
Researchers use light to control ferrofluid droplet movements in water
IBM Uses AI for the Next Generation of Support
This month, IBM had an interesting briefing on how AI could be used to improve the customer support experience, drive deeper engagement with customers, and effectively improve customer loyalty while dramatically reducing support costs. This sounds unusually good given how […]
The post IBM Uses AI for the Next Generation of Support appeared first on TechSpective.
How a S.E.A of Data Unlocks AI’s Potential in Recruitment
The emergence of artificial intelligence is elevating operational efficiency for organizations worldwide. Conference keynotes and media headlines are dominated by the latest AI advancements, generating intrigue and excitement. While the full potential of AI remains unknown, organizations across all industries […]
The post How a S.E.A of Data Unlocks AI’s Potential in Recruitment appeared first on TechSpective.
New framework allows robots to learn via online human demonstration videos
Google DeepMind at ICML 2024
Can Robotics Truly Drive Down Energy Costs?
Analyzing internal world models of humans, animals and AI
Target’s AI Gamble: Empowering Staff or Replacing Them?
Target’s recent announcement of a chain-wide rollout for their Store Companion chatbot, a GenAI (Generative Artificial Intelligence) tool, has sparked a wave of interest. While the press release (Target Corporation, June 20, 2024) touts benefits like increased efficiency and improved support for team members, some industry watchers are raising questions about the potential impact on...
The post Target’s AI Gamble: Empowering Staff or Replacing Them? appeared first on 1redDrop.