
Nvidia CEO Jensen Huang said the company's recent $30 billion investment in OpenAI "might be the last time" it invests in the artificial intelligence startup before OpenAI's potential public listing toward the end of the year.
Huang said the opportunity to invest $100 billion in OpenAI, which was the figure that the two companies touted as part of a massive infrastructure deal in September, is probably "not in the cards."
"The reason for that is because they're going to go public," Huang said during the Morgan Stanley Technology, Media & Telecom Conference on Wednesday.
He also mentioned that Nvidia's $10 billion investment in OpenAI rival Anthropic would likely be its last. Nvidia previously shared its plans to invest in Anthropic in an announcement alongside Microsoft in November.
Huang's comments come after months of speculation about the scope of Nvidia's relationship with OpenAI. The chipmaker disclosed in a quarterly filing in November that the previously announced $100 billion deal may not come to fruition, and the Wall Street Journal reported in January that the agreement was "on ice."
Nvidia included similar language in its quarterly filing in February, which noted that there was "no assurance" that the company will enter into an "investment and partnership agreement with OpenAI or that a transaction will be completed."
The chipmaker's $30 billion investment in OpenAI was unveiled as part of a $110 billion funding round that the startup announced on Friday. The round also included a $50 billion commitment from Amazon and a $30 billion commitment from SoftBank.
As part of the agreement, OpenAI secured 3 gigawatts of dedicated inference capacity and 2 gigawatts of training capacity on Nvidia's Vera Rubin systems for AI data centers, OpenAI said Friday.
The companies' September deal, which rocked the tech sector and sparked a flurry of subsequent infrastructure agreements, outlined a structure where Nvidia would invest in OpenAI over several years as it brought new supercomputing facilities online. Nvidia's $30 billion investment, by contrast, is not tied to any deployment milestones.
The chipmaker has been one of the biggest winners of the AI boom because it makes the graphics processing units, or GPUs, that AI companies need to train models and run large workloads.
Even so, the needs of AI companies have been shifting from training to inference, the stage in which trained models respond to user queries, which has put some pressure on the company. Nvidia is reportedly developing a new chip specifically for inference, and OpenAI is expected to be one of the largest customers of the new processor.
In February, OpenAI announced a major purchase of "dedicated inference capacity" from Nvidia. OpenAI has also invested heavily in inference-optimized chips from Amazon, and it uses Google's Tensor Processing Units as well.
OpenAI CEO Sam Altman will be speaking at the Morgan Stanley conference on Thursday, according to a source familiar with the schedule who asked not to be named because the details are private.
— CNBC's Katie Tarasov and Kate Rooney contributed to this report.