Physical AI Matters More Than Humanoid Robots
Writing about last year’s Hannover Messe, I made a point of calling out the small number of humanoid robots I saw at this huge industrial trade show. Fast-forward to 2026, and I’m about to jump on a plane to Germany for this year’s event. I know that I’m going to see a lot of humanoid robots. I know that most of them will be Chinese, a few will be European, and many of them will be impressive. And, as Charlie Dai and I argue in our new report, I am confident that exhibitors’ and attendees’ apparent obsession with legs and arms misses the real story.
Our new report, “Physical AI Perceives, Reasons, And Acts In The Real World,” argues that the more important story is the growing capability of physical AI. Humanoid robots like those that my colleague Charlie Dai recently wrote about benefit from that, for sure, but so do many other types of physical automation, most of which are cheaper, more durable, and more useful than the bundle of compromises required to squeeze batteries, computers, sensors, actuators, and more into a vaguely humanoid shape.
I’ll be discussing the background to the report’s findings and answering questions in a client webinar on 24 June. Sign up now to participate live, or to receive the recording after we’re done.
Physical AI Touches The Real World
Physical AI is all about bringing AI into the real world, making AI aware of what’s happening around it, and giving AI the ability to affect – touch – that world. As we describe in the report, physical AI comprises four broad capabilities. Each is a huge field of fast-moving research in its own right, but something special happens when all four are brought together to deliver physical AI, which:
- Models and simulates the real world. Modern world models are neural networks trained on lots of data. Unlike an LLM, which generates text, these world models create simulated physical spaces and generate interactions within them. While a world model might not really understand Newton’s law of universal gravitation or the quadratic drag equation used to account for air resistance, it has observed their effects and can replicate them to simulate a falling object (see the short sketch after this list).
- Perceives the real world. Physical AI systems are enriched by a wide range of relevant sensor inputs from the real world, including sound, light, temperature, tactile feedback, and more.
- Reasons about the real world. In the context of physical AI, conceptual models and sensor-based observations are a means to an end: they give a physical AI system the facts it needs to reason about how best to achieve its objective.
- Acts upon the real world. Conceptual models, sensor-based observation, and AI reasoning combine with physical systems like a robot’s hands or an autonomous vehicle’s steering controls to effect change in the real world: the robot picks up a banana without squeezing too hard, and the autonomous vehicle steers around an obstacle in its path. These abilities may be instantiated within a physical machine (like a robot or autonomous vehicle), in which case this capability is often referred to as embodied AI.
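To make the world-model point concrete, here is a minimal sketch in Python of the explicit physics (gravity plus quadratic drag) that a classical simulator would run for a falling object. It is illustrative only and is not from the report; every parameter value is an arbitrary assumption. A world model never evaluates these formulas directly – it learns to reproduce their outcome from the examples it has observed.

```python
# Illustrative only: the explicit physics a classical simulator would run for a
# falling object. A learned world model reproduces the same behavior from
# observation rather than from these equations. All parameter values are
# arbitrary assumptions for this sketch.

G = 9.81           # gravitational acceleration near Earth's surface (m/s^2)
DRAG_COEFF = 0.47  # drag coefficient for a sphere (dimensionless)
AIR_DENSITY = 1.2  # air density at sea level (kg/m^3)

def simulate_fall(mass_kg: float, cross_section_m2: float, height_m: float, dt: float = 0.01):
    """Drop an object from rest and integrate gravity plus quadratic drag."""
    velocity, height, elapsed = 0.0, height_m, 0.0
    while height > 0.0:
        drag = 0.5 * AIR_DENSITY * DRAG_COEFF * cross_section_m2 * velocity ** 2
        acceleration = G - drag / mass_kg  # drag opposes the fall
        velocity += acceleration * dt
        height -= velocity * dt
        elapsed += dt
    return elapsed, velocity

# Example: a 0.1 kg ball with a 0.01 m^2 cross-section dropped from 20 m.
time_to_ground, impact_speed = simulate_fall(0.1, 0.01, 20.0)
print(f"Hits the ground after ~{time_to_ground:.2f} s at ~{impact_speed:.1f} m/s")
```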
I’ll be looking for evidence of physical AI throughout my week in Hannover, and I’ll blog about that (and other highlights of the show) once I’ve had time to digest everything I see.
As always, if you have your own perspectives to share, please schedule a briefing and tell us all about them. If you’re a Forrester client and want to discuss (or challenge) our thinking on these topics, please schedule an inquiry or guidance session.