Welcome to The Robotics World

Robotics is an interdisciplinary field at the interface of computer science and engineering, covering the design, construction, operation, and use of robots. Its goal is to build intelligent machines that can assist humans in their day-to-day lives and keep them safe.

What Do We Offer?

We invite any company connected with robotics to contact us about cooperation on mutually beneficial terms.

Promotion & Advertisement

Promote and advertise your technology with us if you're a robotics company.

Searching for Robotics

We help you find technologies for integrating robots into your working processes.

Aggregation Of Information

We collect and aggregate news and other robotics information so you can use it in the most efficient way.

Robotics News

Latest headlines and updates on news from around the world. Find breaking stories, upcoming events and expert opinion.

Graphene-based sensor to improve robot touch

Schematic showing the materials used in the sensor and the sensing array on a robotic manipulator. Figure from Multiscale-structured miniaturized 3D force sensors. Reproduced under a CC BY 4.0 licence.

Robots are becoming increasingly capable in vision and movement, yet touch remains one of their major weaknesses. Now, researchers have developed a miniature tactile sensor that could give robots something much closer to a human sense of touch.

The technology, developed by researchers at the University of Cambridge, is based on liquid metal composites and graphene – a two-dimensional form of carbon. The ‘skin’ allows robots to detect not just how hard they are pressing on an object, but also the direction of applied forces, whether an object is slipping, and even how rough a surface is, at a scale small enough to rival the spatial resolution of human fingertips. Their results are reported in the journal Nature Materials.

Human fingers rely on multiple types of mechanoreceptors to sense pressure, force, vibration, and texture simultaneously. Reproducing this level of multidimensional tactile perception in artificial systems is a significant challenge, especially in devices that are both small and durable enough for practical use.

“Most existing tactile sensors are either too bulky, too fragile, too complex to manufacture or unable to accurately distinguish between normal and tangential forces,” said Professor Tawfique Hasan from the Cambridge Graphene Centre, who led the research. “This has been a major barrier to achieving truly dexterous robotic manipulation.”

To overcome this, the research team developed a soft, flexible composite material, combining graphene sheets, deformable metal microdroplets, and nickel particles, embedded in a silicone matrix.

Inspired by the microstructures found in human skin, the researchers shaped the material into tiny pyramids, some as small as 200 micrometres across. These pyramid structures concentrate stress at their tips, enabling the sensor to detect extremely small forces while maintaining a wide measurement range.

The result is a tactile sensor sensitive enough to detect a grain of sand. Compared with existing flexible tactile sensors, the new device improves size and detection limits by roughly an order of magnitude.

The sensor can also distinguish shear forces from normal pressure, a capability that allows it to detect when an object begins to slip. By measuring signals from four electrodes beneath each pyramid, the sensor can mathematically reconstruct the full three-dimensional force vector in real time.
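
The article does not give the reconstruction math, but the general idea of recovering a 3D force vector from four per-pyramid electrode signals can be illustrated with a simple calibrated linear model. In the sketch below, the calibration matrix and the readings are hypothetical placeholders, not values from the paper; a real sensor would be calibrated against known applied forces.

```python
import numpy as np

# Hypothetical 3x4 calibration matrix mapping the four electrode readings
# under one pyramid to a force vector (Fx, Fy, Fz). In practice it would be
# fitted from measurements taken under known applied forces.
C = np.array([
    [ 0.8, -0.8,  0.0,  0.0],   # Fx: left/right electrode asymmetry
    [ 0.0,  0.0,  0.9, -0.9],   # Fy: front/back electrode asymmetry
    [ 0.5,  0.5,  0.5,  0.5],   # Fz: common-mode signal on all electrodes
])

def reconstruct_force(readings):
    """Map four electrode signals (e.g. relative resistance changes)
    to an estimated 3D force vector via the linear calibration model."""
    r = np.asarray(readings, dtype=float)
    return C @ r  # (Fx, Fy, Fz)

# Example: a mostly normal press with a small sideways (shear) component.
fx, fy, fz = reconstruct_force([0.12, 0.08, 0.10, 0.10])
print(f"Fx={fx:.3f}, Fy={fy:.3f}, Fz={fz:.3f}")
```

The linear model is only meant to show why four independent signals are enough to separate normal pressure from the two shear components; the actual device likely requires per-pyramid calibration and a nonlinear mapping.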

In demonstrations, the team integrated the sensors into robotic grippers. The robots were able to grasp fragile objects, such as thin paper tubes, without crushing them. Unlike conventional force sensors, which rely on prior information about an object’s properties, the new system adapts in real time through slip detection.
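
The article does not describe the grip-control strategy itself, but a common way to use slip detection is to tighten the grip when the measured shear load approaches the friction limit implied by the normal force. The sketch below is a generic illustration of that rule; the friction coefficient, margin, and step size are made-up constants, not values from the Cambridge work.

```python
def adjust_grip(normal_force, shear_force, grip_force, mu=0.5, margin=0.8, step=0.1):
    """Generic slip-avoidance rule: if the shear load approaches the friction
    limit (mu * normal_force), increase the commanded grip force slightly.
    mu, margin, and step are illustrative constants only."""
    friction_limit = mu * normal_force
    if shear_force > margin * friction_limit:
        return grip_force + step   # tighten before the object starts to slip
    return grip_force

print(adjust_grip(normal_force=1.0, shear_force=0.45, grip_force=2.0))  # 2.1 (tightens)
print(adjust_grip(normal_force=1.0, shear_force=0.10, grip_force=2.0))  # 2.0 (holds)
```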

At even smaller scales, microsensor arrays could identify the mass, geometry, and material density of tiny metal spheres by analysing force magnitude and direction. This opens the door to applications in minimally invasive surgery or microrobotics, where conventional force sensors are far too large.

Beyond robotics, the technology could have significant implications for prosthetics. Advanced artificial limbs increasingly rely on tactile feedback to provide users with a sense of touch. Highly sensitive, miniaturised 3D force sensors could enable more natural interactions with objects, improving control, safety, and user confidence.

“Our approach shows that bulky mechanical structures or complex optics are not required to achieve high-resolution 3D tactile sensing,” said lead author Dr Guolin Yun, a former Royal Society Newton International Fellow at Cambridge, and now Professor at the University of Science and Technology of China. “By combining smart materials with skin-inspired structures, we achieve performance that comes remarkably close to human touch.”

Looking ahead, the researchers believe the sensors could be miniaturised even further, potentially below 50 micrometres, approaching the density of mechanoreceptors in human skin. Future versions may also integrate temperature and humidity sensing, moving closer to a fully multimodal artificial skin.

As robots increasingly move out of controlled factory environments and into homes, hospitals, and unpredictable real-world settings, such advances in touch could be transformative — allowing machines not just to see and act, but to truly feel.

A patent application has been filed through Cambridge Enterprise, the University’s innovation arm. The research was supported by the Royal Society, the Henry Royce Institute, and the Advanced Research and Invention Agency (ARIA). Tawfique Hasan is a Fellow of Churchill College, Cambridge.

Reference

Multiscale-structured miniaturized 3D force sensors, Guolin Yun, Zesheng Chen, Zhuo Chen, Jinrui Chen, Binghan Zhou, Mingfei Xiao, Michael Stevens, Manish Chhowalla & Tawfique Hasan, Nature Materials (2026).

SAP AI Agents: How Enterprises Are Deploying Agentic AI on SAP

The Problem That Brought You Here

Your SAP environment runs the core of the business — procurement, inventory, production planning, finance. And now leadership is asking what AI can actually do on top of it. Not a demo. Not a proof of concept. Something that runs in production and solves a real bottleneck.

SAP AI agents are the answer a growing number of enterprise IT and operations teams are landing on. This article explains what they are, where they are being deployed today, and what it takes to put one into a live SAP environment.

USM Business Systems is a specialized SAP AI delivery partner based in Ashburn, VA. We place SAP BTP AI developers, AI Core engineers, and enterprise LLM integration specialists inside enterprises and system integrators executing SAP AI programs.

What Is a SAP AI Agent?

An AI agent is software that perceives its environment, reasons about a goal, takes actions, and checks results — without a human directing each step. When that environment is SAP, the agent reads SAP data, calls SAP APIs or workflows, interprets the output, and acts again.

SAP has built AI agent infrastructure directly into its platform. SAP Joule, the AI copilot embedded across S/4HANA, BTP, and SAP Analytics Cloud, uses an agentic architecture under the hood. Developers can extend it using SAP AI Core, the managed AI runtime where custom models and agents are deployed and governed at enterprise scale.

The practical result is an agent that can, for example, monitor a supplier’s delivery performance in SAP, flag an anomaly, cross-reference historical data, draft a purchase order adjustment, and route it for approval — without a procurement analyst touching it.
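
As a rough illustration of that perceive-reason-act loop, here is a minimal Python sketch of a supplier-monitoring agent. The endpoint URL, field names, and the request_approval helper are hypothetical placeholders; a real deployment would call the relevant SAP OData or BTP APIs and use SAP AI Core or an LLM for the reasoning step.

```python
import statistics
import requests

SAP_API = "https://example-sap-host/api/deliveries"  # hypothetical endpoint

def fetch_recent_deliveries(supplier_id):
    """Perceive: read recent delivery records for one supplier from SAP."""
    resp = requests.get(SAP_API, params={"supplier": supplier_id, "limit": 50})
    resp.raise_for_status()
    return resp.json()["deliveries"]  # assumed shape: [{"delay_days": int, ...}]

def assess(deliveries, threshold_days=3):
    """Reason: decide whether the lateness pattern signals supplier risk."""
    delays = [d["delay_days"] for d in deliveries]
    avg_delay = statistics.mean(delays) if delays else 0.0
    return avg_delay, avg_delay > threshold_days

def request_approval(supplier_id, reason):
    """Act: placeholder for routing a drafted adjustment into an approval workflow."""
    print(f"Approval requested for {supplier_id}: {reason}")

def run_agent(supplier_id):
    """One pass of the loop: perceive, reason, act, report."""
    deliveries = fetch_recent_deliveries(supplier_id)
    avg_delay, at_risk = assess(deliveries)
    if at_risk:
        # Draft an adjustment and route it for human approval rather than
        # changing the purchase order directly.
        request_approval(supplier_id, reason=f"average delay {avg_delay:.1f} days")
    return {"supplier": supplier_id, "avg_delay": avg_delay, "flagged": at_risk}
```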

Where Enterprises Are Deploying SAP AI Agents Today

  • Procurement and Supplier Intelligence

Agents monitor supplier delivery windows, contract compliance, and pricing variances inside SAP Ariba and S/4HANA. When a pattern signals risk — a supplier consistently shipping 4 days late on a specific SKU category — the agent flags it, pulls the relevant contract terms, and surfaces a recommended action. Procurement teams report 60-70% reductions in manual monitoring time after deploying these agents [Gartner, 2024 Supply Chain AI Survey].

  • Production Scheduling and Capacity Planning

In manufacturing environments, agents integrated with SAP PP (Production Planning) adjust schedules dynamically based on real-time inventory levels, machine availability, and demand signals from SAP IBP. The agent doesn’t replace the planner — it does the 45 minutes of data gathering and cross-referencing that used to happen before every planning decision.

  • Finance and Accounts Payable Automation

Agents working in SAP Finance match invoices against purchase orders, flag discrepancies above a defined threshold, and route exceptions to the right reviewer. Companies using this pattern report 80%+ straight-through processing rates on standard invoices within 90 days of deployment [McKinsey, 2024 Finance AI Report]. A simplified sketch of this matching pattern appears at the end of this section.

  • Inventory and Demand Signal Processing

Agents read point-of-sale signals, seasonal demand patterns, and supplier lead times from SAP, then recommend reorder quantities and safety stock adjustments. This is particularly high-value in food production and retail distribution where demand volatility is high and the cost of stockouts is immediate.

What is the difference between SAP Joule and a custom SAP AI agent?

SAP Joule is SAP’s native AI copilot — it works within SAP’s defined interaction patterns and covers general tasks across S/4HANA, SAP SuccessFactors, and other SAP applications. A custom SAP AI agent is built to solve a specific workflow problem in your environment, using SAP AI Core or SAP BTP as the infrastructure. Custom agents handle tasks Joule does not cover natively and can integrate with non-SAP data sources inside the same workflow.

Do SAP AI agents require a full BTP implementation to deploy?

Not necessarily. Agents that work purely within S/4HANA APIs can be deployed with targeted BTP services rather than a full BTP platform rollout. The right architecture depends on where your data lives, what your agent needs to access, and your existing SAP landscape. A scoping conversation typically takes 30 minutes to map this out.
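
To make the finance pattern above concrete, here is a minimal sketch of invoice-to-purchase-order matching with a discrepancy threshold and exception routing. The record fields, the 2% tolerance, and the route_to_reviewer helper are hypothetical; in production these would map onto the relevant S/4HANA documents and workflow APIs.

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    po_number: str
    amount: float

@dataclass
class PurchaseOrder:
    po_number: str
    amount: float

def route_to_reviewer(invoice, reason):
    """Placeholder: hand the exception to a human reviewer queue."""
    print(f"Exception on {invoice.po_number}: {reason}")

def match_invoice(invoice, purchase_orders, tolerance=0.02):
    """Match an invoice against its PO; post it automatically if the amount
    difference is within tolerance, otherwise route it as an exception."""
    po = purchase_orders.get(invoice.po_number)
    if po is None:
        route_to_reviewer(invoice, "no matching purchase order")
        return "exception"
    diff = abs(invoice.amount - po.amount) / po.amount
    if diff <= tolerance:
        return "posted"  # straight-through processing
    route_to_reviewer(invoice, f"amount variance {diff:.1%} exceeds {tolerance:.0%}")
    return "exception"

pos = {"PO-1001": PurchaseOrder("PO-1001", 5000.0)}
print(match_invoice(Invoice("PO-1001", 5040.0), pos))  # 0.8% variance -> posted
print(match_invoice(Invoice("PO-1001", 5600.0), pos))  # 12% variance -> exception
```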

What Makes SAP AI Agent Deployments Fail?

Most SAP AI agent projects that stall do so for one of three reasons:

  • The agent was built without a clean data feed. Agents that read SAP master data often encounter inconsistent coding, missing fields, or legacy data structures that were never cleaned because no one needed them to be. The agent surfaces the problem immediately.
  • The workflow boundary was too broad at the start. ‘Automate procurement’ is not an agent design. ‘Monitor supplier on-time delivery for the top 50 SKUs and flag variance above 10%’ is. Scoping matters more here than in almost any other AI project type (a minimal sketch of such a narrowly scoped check follows this list).
  • The team building it did not have SAP AI Core experience. Standard ML engineering skills do not transfer cleanly to SAP’s AI infrastructure. SAP AI Core has its own API patterns, lifecycle management approach, and governance requirements. Engineers who have not worked inside it add 4-8 weeks of ramp time to every deployment.
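
To show what a narrowly scoped agent design looks like in code, the sketch below implements the 'top 50 SKUs, flag variance above 10%' check mentioned above. The data shapes are hypothetical; a real agent would read delivery records from SAP and write the flags back through an API.

```python
def flag_late_skus(delivery_log, top_skus, variance_threshold=0.10):
    """delivery_log: list of dicts like {"sku": str, "on_time": bool}.
    Returns the monitored SKUs whose late-delivery rate exceeds the threshold."""
    stats = {}
    for record in delivery_log:
        sku = record["sku"]
        if sku not in top_skus:
            continue  # scope: only the tracked top-50 SKUs are monitored
        total, late = stats.get(sku, (0, 0))
        stats[sku] = (total + 1, late + (0 if record["on_time"] else 1))

    flagged = {}
    for sku, (total, late) in stats.items():
        rate = late / total
        if rate > variance_threshold:
            flagged[sku] = rate
    return flagged

log = [
    {"sku": "SKU-7", "on_time": False},
    {"sku": "SKU-7", "on_time": False},
    {"sku": "SKU-7", "on_time": True},
    {"sku": "SKU-9", "on_time": True},
]
print(flag_late_skus(log, top_skus={"SKU-7", "SKU-9"}))  # {'SKU-7': 0.666...}
```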

What a SAP AI Agent Deployment Actually Looks Like

A typical first agent deployment for a mid-to-large SAP environment follows this sequence:

  • Week 1-2: Workflow scoping. Identify the specific process, the SAP modules involved, the data fields the agent needs to read, and the action it will take on completion.
  • Week 3-4: Data readiness assessment. Confirm that the relevant SAP master data and transactional data are clean enough for the agent to reason accurately. Identify gaps.
  • Week 5-8: Build and test in SAP AI Core. Deploy the agent model, connect to SAP APIs, build the agentic loop, run on historical data.
  • Week 9-10: Controlled live run. Agent runs in parallel with the existing manual process. Outputs are compared. Confidence thresholds are tuned (a minimal comparison harness is sketched after this list).
  • Week 11-12: Production deployment with monitoring. Agent goes live. A dashboard tracks decision volume, exception rate, and accuracy. A human review loop handles edge cases.
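
As a rough illustration of the week 9-10 step, the sketch below compares agent decisions against the existing manual decisions during a parallel run and reports agreement overall and above a candidate confidence cutoff. The decision records and the 0.8 cutoff are hypothetical.

```python
def parallel_run_report(cases, cutoff=0.8):
    """cases: list of dicts like
    {"agent": "approve" | "exception", "manual": ..., "confidence": float}.
    Reports agreement overall and for decisions above the confidence cutoff."""
    agree = sum(1 for c in cases if c["agent"] == c["manual"])
    overall = agree / len(cases)

    confident = [c for c in cases if c["confidence"] >= cutoff]
    confident_agree = (
        sum(1 for c in confident if c["agent"] == c["manual"]) / len(confident)
        if confident else 0.0
    )
    return {
        "overall_agreement": overall,
        "cases_above_cutoff": len(confident),
        "agreement_above_cutoff": confident_agree,
    }

cases = [
    {"agent": "approve", "manual": "approve", "confidence": 0.95},
    {"agent": "approve", "manual": "exception", "confidence": 0.55},
    {"agent": "exception", "manual": "exception", "confidence": 0.90},
]
print(parallel_run_report(cases))
# Overall agreement is 2/3; both decisions above the 0.8 cutoff match the reviewer.
```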

Why USM Business Systems?

USM Business Systems is a CMMI Level 3, Oracle Gold Partner AI and IT services firm headquartered in Ashburn, VA. With 1,000+ engineers, 2,000+ delivered applications, and 27 years of enterprise delivery experience, USM specializes in AI implementation for supply chain, pharma, manufacturing, and SAP environments. Our SAP AI practice places specialized engineers inside enterprise programs within days — on contract, as dedicated delivery pods, or on a project basis.

Ready to put SAP AI into production? Book a 30-minute scoping call with our SAP AI team at usmsystems.com.

FAQ

What SAP modules are most commonly used with AI agents?

SAP S/4HANA, SAP Ariba, SAP IBP, SAP PP, SAP Finance, and SAP Datasphere are the most active areas. The agent infrastructure runs on SAP AI Core and BTP regardless of which module the agent is reading or acting on.

How long does a first SAP AI agent deployment take?

A well-scoped first agent typically reaches production in 10-14 weeks. Projects that try to automate too broad a workflow or that start with messy master data take longer.

Do we need to train a model from scratch?

Most SAP AI agent deployments use pre-trained LLMs or SAP’s foundation models as the reasoning layer, fine-tuned or prompted for the specific workflow. Training from scratch is rarely necessary and significantly extends timelines.

Can SAP AI agents work with non-SAP systems in the same workflow?

Yes. SAP AI Core supports external API connections, so an agent can read a SAP data source, call a third-party logistics API, and write a result back to SAP in the same workflow loop.

What governance controls exist for SAP AI agents?

SAP AI Core includes lifecycle management, model versioning, audit logging, and role-based access. Agents deployed in regulated industries like pharma can be configured to require human approval above defined thresholds before taking action.
