What will it take to get robots out of controlled demonstrations and into homes, factories and warehouses, without dragging a server rack on board?

Qualcomm's robotics push is a bet on an edge-computing thesis the company knows well: the most useful machines are mobile, constrained by batteries, and have to make decisions in real time. At CES 2026, the company unveiled its Dragonwing IQ10 Series, which it describes as a complete robotics stack integrating hardware, software and AI, designed to scale from miniature service robots all the way to full-size humanoids.
The most striking evidence came through a partner presentation: the Motion 2 humanoid from Vietnamese robotics company VinMotion. A short video showed the robot running through a series of motions chosen for their visual impact: striking a board, crouching to pick up a teddy bear, and a deep back-and-forth spinal bend showing off an unusually flexible back. The demonstration was more argument than product launch: perception, balance, and manipulation presented as one integrated system rather than three separate lab modules.
For Qualcomm, that integration is the point. For years, the company has shipped silicon for phones, vehicles and other edge devices where power budgets are tight and latency matters. Robotics sharpens those constraints: compute, sensing and control have to fit on a moving platform that cannot afford excess heat, weight or wasted energy. Qualcomm CEO Cristiano Amon summed up the fit in an earlier interview: “Whether from enterprise to consumer, I think the type of silicon that we develop for phones and for the edge is the perfect silicon for robots.”
The technical direction Qualcomm is pointing to rests on a shift in AI models that robotics teams have long anticipated: models that bridge perception and action. Large language models made it easier for machines to interpret instructions and express intent, but robots still have to cross the gap between “understand” and “do.” That is where vision-language-action models become central: models that consume images (and other sensor outputs), relate them to the task context, and produce motor plans. Qualcomm's pitch for Dragonwing is that these models, and the control stack around them, can run close to the sensors, keeping the robot's “brain” on the device rather than round-tripping every decision to the cloud.
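To make the on-device pattern concrete, here is a minimal, self-contained Python sketch of what a vision-language-action control loop can look like in principle. Everything in it is a stand-in: capture_frame, vla_policy, and send_joint_targets are hypothetical placeholders rather than Qualcomm APIs, and a real deployment would replace them with the platform's camera driver, NPU runtime, and motor controllers.

```python
import random
import time

CONTROL_HZ = 30   # run the policy at roughly camera rate
TASK = "pick up the teddy bear"

def capture_frame():
    """Stand-in for grabbing the latest RGB frame from the head camera."""
    return [[0] * 224 for _ in range(224)]

def read_proprioception():
    """Stand-in for reading joint angles, IMU, and gripper state."""
    return {"joints": [0.0] * 28, "imu": (0.0, 0.0, 9.8)}

def vla_policy(image, instruction, state):
    """Stand-in for an on-device VLA model: (image, text, state) -> joint targets.

    On real hardware this would be a quantized model executing on the NPU,
    returning a short horizon of actions rather than one random step.
    """
    return [random.uniform(-0.05, 0.05) for _ in state["joints"]]

def send_joint_targets(targets):
    """Stand-in for handing targets to the low-level joint controllers."""
    pass

def run_policy_loop(steps=90):
    period = 1.0 / CONTROL_HZ
    for _ in range(steps):
        t0 = time.monotonic()
        image = capture_frame()
        state = read_proprioception()
        # The key property of the on-device pattern: perception, the model,
        # and the action output all stay on the robot; no cloud round trip.
        targets = vla_policy(image, TASK, state)
        send_joint_targets(targets)
        # Hold a fixed rate so downstream controllers see steady inputs.
        time.sleep(max(0.0, period - (time.monotonic() - t0)))

if __name__ == "__main__":
    run_policy_loop()
```

The design point is simple: when the model and the actuation path both live on the robot, network variance never enters the control loop, which is exactly what the cloud round trip would reintroduce.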
The company's message kept circling back to what it has learned in automotive. A robot, like a modern car, is a moving bundle of safety requirements, sensor fusion, and decision-making under uncertainty. Automotive-grade thinking, as Qualcomm frames it, translates directly: you cannot “put a server inside” a car, and the same holds for a robot expected to run untethered for long stretches. Qualcomm EVP of automotive and robotics Nakul Duggal cast the move as an expansion of what the company already does: “By building on our strong foundational technologies and expanding portfolio of developer tools, we’re redefining what’s possible with physical AI by moving intelligent machines out of the labs and into real-world environments,” he said.
The partner list clarifies which “real-world environments” Qualcomm is targeting first. Alongside VinMotion, the company cited relationships with Figure and with industrial and embedded players including Kuka Robotics, Advantech, APLUX, Autocore, Booster Robotics, and Robotech.ai. The mix suggests a two-track strategy: enterprise robots whose cost can be justified by measurable use, and consumer-oriented machines that face far stricter demands on noise, safety, and continuous operation. Amon has pointed to the enterprise side arriving before the consumer side, consistent with how robotics adoption has historically spread outward from factories and warehouses.
VinMotion, meanwhile, offers a case study in why the platform question matters right now. Backed by Vingroup, the company has shown humanoids performing synchronized routines and has emphasized in-house development of its mechanical, electronic, and software systems. In remarks attributed to VinMotion chairman Nguyen Trung Quan, the company described the work of tuning real-time computation and network coordination: “Besides stable hardware and motion-control software, we had to optimize real-time computing and the network infrastructure connecting the robots. This ensures their ‘communication’ is almost perfectly synchronized, with algorithms executed in real time.” It described the demo as a step toward deploying multi-robot systems in the field.
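One common way to keep a troupe of robots in lockstep, and a plausible reading of the “almost perfectly synchronized” claim, is to stamp every command with a future execution time and let each robot wait on its own synchronized clock before acting, so network jitter does not skew the routine. The sketch below illustrates that idea only; the names are invented for illustration and this is not VinMotion's actual protocol.

```python
import threading
import time

LEAD_TIME = 0.050  # schedule 50 ms in the future to absorb network jitter

class Robot:
    def __init__(self, name):
        self.name = name

    def execute_at(self, command, deadline):
        # Each robot waits on its own clock until the shared deadline, then acts.
        # With clocks synchronized across machines (e.g. via NTP or PTP), all
        # robots move together even if the command arrived at slightly
        # different times on each of them.
        time.sleep(max(0.0, deadline - time.monotonic()))
        print(f"{self.name}: '{command}' at t={time.monotonic():.4f}")

def broadcast(robots, command):
    deadline = time.monotonic() + LEAD_TIME
    threads = [
        threading.Thread(target=r.execute_at, args=(command, deadline))
        for r in robots
    ]
    for t in threads:   # stand-in for sending the message over the network
        t.start()
    for t in threads:
        t.join()

if __name__ == "__main__":
    broadcast([Robot(f"motion2-{i}") for i in range(3)], "raise right arm")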
The focus on synchronization and timing points to an unglamorous engineering fact: robotics is not only about large models. It is also about deterministic control loops, power management, sensor sampling, and safety interlocks, where milliseconds can be the difference between stable motion and a fall. Qualcomm's platform language, covering perception, motion planning, manipulation, and human-robot interaction, reads like a systems checklist, suggesting the firm views Dragonwing not as a single accelerator but as a predictable substrate onto which developers can fit those loops.
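As a rough illustration of why those milliseconds matter, the sketch below shows the skeleton of a fixed-rate control loop with a safety interlock and deadline monitoring. The rates, thresholds, and function names are illustrative assumptions, not anything drawn from Qualcomm's or VinMotion's software.

```python
import time

LOOP_HZ = 500                 # joint-level control typically runs at hundreds of Hz
PERIOD = 1.0 / LOOP_HZ
MAX_TILT_RAD = 0.35           # illustrative interlock: stop if the torso tilts too far

def read_imu_tilt():
    return 0.01               # stand-in for a real IMU driver

def compute_torques(tilt):
    return [0.0] * 28         # stand-in for the balance controller

def apply_torques(torques):
    pass                      # stand-in for the motor bus

def emergency_stop():
    print("interlock tripped: damping motors and halting")

def control_loop(steps=2000):
    next_tick = time.monotonic()
    for _ in range(steps):
        tilt = read_imu_tilt()
        if abs(tilt) > MAX_TILT_RAD:      # safety interlock checked every cycle
            emergency_stop()
            return
        apply_torques(compute_torques(tilt))
        next_tick += PERIOD
        slack = next_tick - time.monotonic()
        if slack < 0:
            # A missed deadline means the controller acted on stale state;
            # on a real robot this is logged and, if repeated, escalated.
            print(f"deadline missed by {-slack * 1000:.2f} ms")
            next_tick = time.monotonic()
        else:
            time.sleep(slack)

if __name__ == "__main__":
    control_loop()
```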
Competition is intensifying because the prize is not any single robot design but the de facto compute and software layer shared across many designs. Nvidia, for example, has been marketing open physical-AI models and a robotics training workflow, including an open vision-language-action model for humanoids and Jetson compute modules, with economics aimed at wide deployment. Qualcomm counters with a platform argument of its own: power-efficient AI plus developer tools, meant to make robots useful without constant reliance on the cloud. Even the market framing has swelled: Qualcomm has pointed to a $1 trillion physical AI market by 2040, a sign the firm views robotics as a long-duration category in the mold of mobile or automotive.
For readers watching robotics shift from spectacle to infrastructure, the most telling detail is not the humanoid's backbend. It is the attempt to standardize the “robot brain” around edge constraints, power, thermals, safety, and latency, on a single architecture that partners can ship in volume. If that standardization holds, the next generation of robots will look less like individual, disconnected marvels and more like an ecosystem: machines differentiated by mechanics and design for their use cases, but sharing a common compute core that makes autonomy feasible where it is needed: on the move, in environments that are far from clean.

