Robotics dominated CES 2026 in a way we haven't seen before.
More robots appeared at this year's show than ever, but what's notable isn't the quantity—it's the shift in conversation. Companies weren't just showing impressive demos. They were talking real shipping numbers and production timelines.
As Nvidia CEO Jensen Huang put it in his keynote: "The ChatGPT moment for physical AI is here—when machines begin to understand, reason, and act in the real world."
Boston Dynamics Goes Live
Boston Dynamics—the robotics company famous for those viral videos of robots doing backflips and dancing—did something rare at CES: a live demo of their Atlas humanoid robot.
This matters because companies typically only release edited videos, which allow them to hide failures. Going live signals real confidence.
"For the first time ever in public, please welcome Atlas to the stage," announced Zack Jackowski, Boston Dynamics' general manager for humanoid robots.
What followed was both impressive and slightly unsettling.
Atlas walked across the stage for several minutes, waved to the crowd, and moved with remarkable fluidity. Its head can rotate 360 degrees. When it stands up from the ground, it does so in ways that look nothing like how humans move—efficient, but alien.
An engineer piloted the robot remotely for the demo, but in production Atlas will operate autonomously. The production version has 56 degrees of freedom, human-scale hands with tactile sensing, and can lift up to 110 pounds.
The Google DeepMind Partnership
Perhaps the bigger news was Boston Dynamics' partnership with Google DeepMind to integrate Gemini Robotics AI models into Atlas.
The goal: robots that can learn new behaviors far faster than current training methods allow.
"The robot can learn almost anything you can consistently demonstrate through teleoperation," explained Carolina Parada, senior director of robotics at Google DeepMind. The partnership aims to develop what they're calling "the world's most advanced robot foundation model."
This represents something of a reunion—Google actually owned Boston Dynamics about a decade ago before selling to SoftBank, which then sold to Hyundai. Now Google is back in the picture, bringing AI capabilities to the hardware it once owned.
The production targets are ambitious: Hyundai is targeting 30,000 humanoid robots produced annually by 2028. The first Atlas deployments are already committed for 2026.
Nvidia Wants to Be the Android of Robotics
Nvidia is making an aggressive move to become the default platform for robotics development. They're releasing open foundation models that any robotics company can build on:
Cosmos handles world simulation and synthetic data generation. It understands physics—how objects move, collide, fall, or persist when out of view. Real-world robotics data is expensive and slow to collect; Cosmos can generate physically plausible training scenarios at scale.
GR00T (yes, that's how they spell it) focuses on humanoid control, handling articulation, mobility, and locomotion.
The partner list is impressive: Boston Dynamics, Caterpillar, Franka Robotics, LG, NEURA Robotics, and others are already building on Nvidia's stack.
"The physical world is diverse and unpredictable," Huang explained. "Collecting real-world training data is slow and costly and it's never enough."
The solution: synthetic data grounded in physics, created in simulation and scaled using generative models. As Huang put it, "Cosmos turns compute into data."
By open-sourcing these models, Nvidia is betting that widespread adoption will drive demand for their hardware—the same playbook that made their CUDA software platform dominant in AI development.
The Numbers Are Getting Real
Several announcements signaled that robotics is moving past the prototype phase.
AG Bot, a Chinese company, announced its U.S. market entry with 5,000 robots already shipped.
Qualcomm unveiled Dragonwing IQ10, a processor specifically designed for humanoid robots, partnering with Figure and KUKA Robotics.
LG showed CLOiD, an AI-powered home robot that can start laundry cycles, fold and stack clothes, retrieve items from the refrigerator, place food in the oven, and unload the dishwasher. It demonstrated these capabilities live at CES.
CLOiD is more concept than product right now—LG hasn't announced pricing or availability—but it signals where consumer robotics is heading. LG's stated goal: a "Zero Labor Home."
NEURA Robotics launched a Porsche-designed Gen 3 humanoid.
Goldman Sachs projects the humanoid robot market will reach $38 billion by 2035. Manufacturing costs have already dropped significantly, from $50,000–$250,000 down to $30,000–$150,000, making commercial deployment much more realistic.
The Workforce Question
Physical AI raises real questions about workforce impact that associations across every sector will need to consider.
There's an obvious case for robots taking on dangerous jobs, such as positions near toxic materials, hot equipment, or physically hazardous conditions. There are equally strong arguments for robots handling backbreaking labor that takes a toll on workers' bodies over time.
But those jobs pay bills and put food on tables.
The question isn't whether automation is coming—it clearly is. The questions are: How quickly? Which roles first? And how do we retrain workers whose positions change?
Nvidia pointed to labor shortages as a driving force, arguing that automation powered by physical AI is increasingly necessary. Whether that framing resonates depends on your perspective and your industry.
For associations, the calculus varies by sector. Manufacturing associations are already deeply engaged with these questions. Healthcare associations are watching surgical robotics and care robots closely. But even associations in knowledge-work fields should be paying attention—the same AI advances powering physical robots are powering the digital agents that will reshape office work.
The Compounding Effect
The broader pattern matters: AI is leaving the screen and entering the physical world.
The companies building this future are moving faster than most projections suggest. And each advancement compounds.
Better chips enable better AI models. Better models enable better robots. Better robots generate more real-world data. More data improves the models.
We're watching a flywheel spin up. The question for every organization is how to position itself as that flywheel gains speed.
January 20, 2026