News/Press
CES 2026: The ChatGPT moment of robotics?

“The ChatGPT moment for Physical AI has arrived.”
If there is one sentence to remember from CES 2026, it is this one. It was spoken by Jensen Huang, CEO of NVIDIA, during his opening keynote. And he is right. Artificial intelligence is no longer limited to generating text or images. It is now taking physical form in the real world.
No surprise, CES was filled with robotics companies. Boston Dynamics, Hugging Face, NVIDIA, Neura, Agibot and many others. And once again, humanoids were the stars of the show. But among this crowd of machines, one stood out clearly. No suspense needed. Let us talk about Atlas.

Best of CES 2026: The rebirth of Atlas by Boston Dynamics
This was the revelation of the show. Boston Dynamics finally unveiled the commercial version of its humanoid robot Atlas.
Forget the viral parkour videos from a few years ago. The previous Atlas was a hydraulic research platform. Heavy, complex, full of fluids and exposed cables. The 2026 version is a complete break.
It is fully electric. No oil leaks. No deafening noise. Boston Dynamics had already shown an electric prototype over a year ago, but the commercial version is cleaner, safer and clearly production-ready.
Maintenance is fast. This was one of the strongest messages at the booth. Thanks to a modular design, Boston Dynamics promises that a full limb can be replaced in under 15 minutes. This matters for industrial environments where downtime is simply not an option.
The form factor is impressive. Atlas stands around 1.9 meters tall and weighs roughly 90 kilograms. Compact and dense, yet capable of lifting heavy loads thanks to powerful actuators. It can even swap its own batteries, enabling true 24/7 operation.
The brain: A reunion with Google
Alongside the hardware announcement, Boston Dynamics confirmed a major strategic move. A renewed collaboration with Google DeepMind.
The irony is hard to miss. Google sold Boston Dynamics to SoftBank in 2017. Nearly ten years later, they are back through the front door.
The goal of this partnership is to integrate Google DeepMind’s Gemini Robotics foundation models into Atlas. The ambition is clear. Give the robot cognitive reasoning abilities so it can adapt to unscripted tasks instead of following rigid instructions.
This rapprochement is also driven by people. Aaron Saunders, former CTO of Boston Dynamics, joined Google DeepMind in late 2025, strengthening the ties between the two organizations.

A selective industrial strategy
Unlike competitors targeting consumers or broad markets, Boston Dynamics is taking a highly selective approach.
First comes Hyundai. As the parent company, Hyundai Motor Group will take the initial production capacity in 2026. Atlas robots will be deployed at the Robot Metaplant Application Center to validate the technology at scale.
Then comes Google. A dedicated fleet is planned for Google DeepMind to accelerate learning and real world experimentation.
Everyone else will have to wait. Broader commercial availability is announced for early 2027.
Hyundai’s ambition is explicit. Industrialize not only the robot’s usage but its manufacturing. With actuators supplied by Hyundai Mobis, the group is targeting production capacity of 30,000 units per year by 2028, with complex assembly tasks in sight by 2030.

Integration into Orbit
Atlas does not operate alone. It integrates natively with Orbit, Boston Dynamics’ fleet management software that already controls the Spot robot.
This enables direct connections with industrial systems such as warehouse management systems (WMS) and manufacturing execution systems (MES). Spot can detect an anomaly. Atlas can then physically intervene to resolve it.
This level of technological and industrial maturity earned Boston Dynamics the Best of CES 2026 title without much debate.

NVIDIA and Hugging Face: A major step for open source robotics
If Boston Dynamics dominated the hardware conversation, the intelligence battle was happening elsewhere.
One of the most important announcements came from NVIDIA and Hugging Face.
Since the release of Reachy Mini 2 in late 2025, Hugging Face has been pushing open source robotics forward by allowing developers to share knowledge freely. When NVIDIA chooses to align with this vision, the signal to the robotics community is strong.

What was announced
Jensen Huang, CEO of NVIDIA, and Clem Delangue, CEO of Hugging Face, presented a shared vision: democratizing access to robotic intelligence. Two points stand out:
NVIDIA’s advanced models are now accessible to everyone on Hugging Face.
It is like a top chef sharing their secret recipes with the world. NVIDIA announced the immediate availability of its most advanced Physical AI models, such as NVIDIA Cosmos. This model goes beyond predicting patterns from data. It has a deep understanding of physical laws, such as gravity and collisions. Previous models simply inferred that an object falls because they had seen thousands of examples, without actually grasping the concept of gravity.
Alongside it, NVIDIA Isaac GR00T, a Vision-Language-Action model, unlocks full humanoid control. As the name suggests, it analyzes what it sees (Vision), understands the instructions it receives (Language), and instantly converts this into coordinated movement (Action).
The LeRobot framework now integrates professional-grade simulation.
Hugging Face offers LeRobot as a standard tool to train robots. The major breakthrough is that it now integrates NVIDIA simulation technologies through Isaac Lab Arena. No more complex setup or installation nightmares. Developers can now run photorealistic simulations and train their robots with just a few lines of code.

The era of the generalist specialist
Until now, robots were specialists. With Cosmos and GR00T becoming accessible through open source, NVIDIA and Hugging Face are pushing robotics into a new era.
As NVIDIA’s Rev Lebaredian explains, robots now have the equivalent of a doctoral-level general education. They finally have the broader understanding required to adapt to an unpredictable world.
Three notable innovations beyond the giants
Beyond the major players, CES 2026 was full of quieter but equally transformative ideas. Here are three that stood out.

The smart home becomes embodied: LG and Samsung
LG introduced a new version of its AI Agent, a small two wheeled domestic robot. Unlike voice assistants such as Alexa or Siri, this robot exists physically in the home.
It manages smart devices, monitors pets, detects user emotions through facial recognition and patrols the house. The evolution feels natural. The voice assistant becomes a mobile butler.

Exoskeletons become invisible: German Bionic
In professional environments, the trend is moving away from bulky suits toward discretion.
German Bionic presented a textile-based active suit that looks like a simple work vest. No loud motors. Instead, active fibers reduce back strain by up to 30 kilograms per lift.
This is robotics worn on the body, and likely one of the fastest to be adopted in logistics and warehousing.

Agibot enters the US market
Agibot is not here to observe from the sidelines.
On the software side, the company unveiled Genie Sim 3. Rather than building everything from scratch, they created a unified layer on top of NVIDIA Isaac Sim. This allows rapid generation of synthetic data and near-instant validation of sim-to-real transfer for fine manipulation tasks.
On the hardware side, the message is aggressive. With new humanoid models such as the Expedition A2 series, Agibot is attacking on price. The promise is a capable, modular humanoid at a fraction of the cost of American competitors.
Conclusion
CES 2026 marks a historic inflection point for robotics.
With mature hardware like Atlas and widely accessible robotic intelligence through the NVIDIA and Hugging Face alliance, the industry is moving beyond prototypes toward real world deployment.
Robots are no longer viral curiosities. They are becoming operational assets, designed to integrate with existing systems, adapt to complex environments and deliver measurable impact on the ground.
At Osedea, this shift is already tangible. We work with organizations to design, build and deploy intelligent robotic systems, from perception and decision-making to real-world integration. Whether the challenge is autonomy, human-robot interaction or scaling from proof of concept to production, our teams focus on turning Physical AI into reliable, usable solutions.
The era of Physical AI has arrived. The question is no longer if robotics will transform operations, but how fast organizations are ready to adopt it. Let’s talk about how we can bring your vision to life.

Did this article start to give you some ideas? We’d love to work with you! Get in touch and let’s discover what we can do together.


