What’s next after agentic AI? Physical AI, Nvidia says
Nvidia, a major force behind the rise of generative AI (genAI) in recent years, now sees physical AI as the next step in the technology’s evolution.
“The next wave is physical AI,” Kari Briski, vice president for generative AI software for enterprises at Nvidia, said during a briefing ahead of the company’s GTC trade show in Washington, DC.
Today, agentic AI can help computers take action. But that technology will manifest in the physical world through data from cameras, sensors, lidar and other data-gathering instruments. “Physical AI perceives the world, reasons about its environment, and outputs actions,” Briski said.
With physical AI, Nvidia envisions the inclusion of AI in robotics, machines, autonomous vehicles and physical devices. “The modern factory is robotic, with humans and robots working alongside each other. Robots will do the dangerous jobs, and workers will do the skilled jobs,” Briski said.
The US is investing about $1.2 trillion to advance high-tech manufacturing and production, driven by electronics. That effort aims to counter a labor shortage of 50 million workers worldwide (including 4 million in the US). Physical AI and robotics can fill that gap, Nvidia believes.
“We just don’t have enough people…,” Briski said. “Robots and physical AI are our answer. We’re not replacing jobs, we’re filling the ones that don’t have enough people to fill.”
Nvidia CEO Jensen Huang has taken a long-term view of genAI, which is already changing the way people work, much as the industrial revolutions and the introduction of the internet did.
Some researchers have taken a doom-and-gloom approach to AI, saying it will take jobs away. A January study by the World Economic Forum noted that 41% of businesses surveyed planned to reduce their workforce as AI automates certain tasks. (Amazon, for instance, on Tuesday laid off 14,000 employees, determining it could be leaner with the advent of genAI; and tech layoffs have surged as AI changes the workplace.)
But other experts have argued the technology will create more new jobs than it eliminates, although those workers will require new skillsets.
Nvidia sees new jobs coming with the rise of high-tech factories. But a big challenge there will be integrating buildings, production lines and robots from hundreds of suppliers, a process that can take close to five years, said Rev Lebaredian, vice president of Omniverse and simulation technology at Nvidia.
“We don’t have that kind of time,” Lebaredian said.
That, as far as Nvidia is concerned, is where physical AI comes in: the company’s models feed real-world input into large language models (LLMs) that can reason. The output can help robots navigate the real world and make sound decisions about their actions. (The company recently announced the Cosmos Reason model, which helps robots make decisions based on information from video and graphics inputs.)
The company announced that some of the top robot makers, including FANUC, Skild AI and Foxconn, are now using its “Mega” blueprint to connect robots to factories. The blueprint combines Omniverse software with the company’s RTX Pro hardware; the robots are linked to Mega’s controller, which lets them follow robot AI models in real time.
“Robot AI models and factory control systems operate together, synchronizing every movement, every process, across the digital and physical world,” Lebaredian said.
Nvidia also announced a partnership with Uber to bring fully autonomous robotaxis to the streets by 2027. “Together, we’ll deploy more than 100,000 robotaxis worldwide in the next few years,” Briski said, adding that the fleet will be 20 times larger than today’s robotaxi fleets and could benefit local industry. The autonomous vehicles will use the company’s Hyperion Level 4 platform.
“New local jobs are already being created for today’s robotaxi fleets, and we’ll scale even more with this new partnership … manufacturing and vehicle integration, fleet operations, depot logistics, cleaning, charging, remote assistance, customer support, compliance, and more,” Briski said.
Finally, Nvidia announced IGX Thor, an AI computer for industrial and medical applications. It will use the latest Blackwell GPU and include AI models to support those verticals. IGX Thor will come in two versions: the integrated IGX T7000 system for direct purchase, and the IGX T5000, which can be customized by system makers.
“It’s industrial-grade performance, so it has a much higher extended temperature range and vibration tolerance,” Lebaredian said, adding, “We have seen many customers in that space demand those features.”