Nvidia reported strong fiscal first-quarter earnings on Wednesday.
Wall Street was pleased with Nvidia's continued sales growth, which hit 69% in the quarter. The company's data center division continues to surge as companies, governments and cloud providers snap up Nvidia graphics processing units, or GPUs, for AI software.
"The team continues to lead the lead by 1-2 steps through its silicon/hardware/software platform and strong ecosystem, and over time, the team's new product launches and more product segments have further alienated."
Here are three major takeaways from the company's earnings report:
Nvidia expects about $45 billion in sales in the July quarter, but it revealed on Wednesday that the figure would have been roughly $8 billion higher if not for U.S. restrictions on exports of its H20 chips.
Nvidia also said it missed out on $2.5 billion in sales in the April quarter because of the H20 export restrictions.
Nvidia CEO Jensen Huang said on the company's earnings call that China represents a $50 billion market that is now effectively closed to Nvidia.
He also said the export controls are misguided and will only push Chinese artificial intelligence developers toward homegrown chips, rather than making the American platform the world's choice for AI software.
Huang said: "The U.S. policy is based on the assumption that China cannot make AI chips. This assumption is always questionable and is obviously wrong now."
He said export control measures are pushing AI talents to use chips from local competitors, such as Huawei.
"We want every developer in the world to like the American technology stack," Huang told CNBC's Jim Cramer on Wednesday night.
Nvidia said it does not yet have a replacement chip ready for China, but it is weighing options for "interesting products" that could be sold in that market.
The strength of the company's Blackwell business offset some of the concerns about the China impact.
"NVIDIA is taking full rest to digestion, which shows the acceleration of businesses around growth drivers, which seems to be durable. Everything will be better from here," said Joseph Moore, an analyst at Morgan Stanley.
Nvidia said it has a wide range of customers, from sovereign nations to universities to businesses, that want to work on AI.
But the company confirmed again on Wednesday that cloud providers such as Microsoft Azure, Google Cloud, Oracle Cloud Infrastructure and Amazon Web Services still account for about half of its data center revenue, which came in at $39.1 billion for the quarter.
CFO Colette Kress said on the earnings call that those companies tend to buy Nvidia's latest and greatest chips, including Blackwell, which accounted for about 70% of Nvidia's data center sales in the quarter.
The company said Microsoft has deployed "tens of thousands" of Blackwell GPUs and processed "100 trillion tokens" in the first quarter. Tokens are a measure of AI output.
Those customers will also be first in line for Blackwell Ultra, an updated version of the chip with additional memory and performance. Nvidia said shipments of those systems will begin this quarter.
Stacy Rasgon of Bernstein said that "the overall outlook and environment seem encouraging" as the company ramps up its Blackwell rollout and computing requirements keep rising.
"In a messy quarter, Nvidia came out looking very good," he said.
Over the past few years, many Nvidia GPUs have been used for a resource-intensive process called training, in which data is run through AI models until they gain new capabilities.
Now Huang is talking up the potential of Nvidia's GPUs to serve AI models to millions of customers, a process the industry calls inference. On the earnings call, he said this is a new source of demand.
"Overall, we think NVDA's technical leadership is still strong, and Blackwell's growth in shipments benefited from exponential growth in reasoning AI and the realization of economies of scale," said Ross Seymore of Deutsche Bank.
Huang said the latest AI models need to generate many more tokens, meaning more output, to do the "reasoning" that improves AI answers. Conveniently, Nvidia's latest Blackwell chips were designed for exactly that, Huang said.
"We're witnessing a sharp jump in inference demand," Huang said. "OpenAI, Microsoft and Google are seeing a step-function leap in token generation."
Huang contrasted modern AI models with the "one-shot" approach ChatGPT used when it debuted in 2022, saying the new models require "a hundred times, a thousand times" more computation.
"It's essentially thinking to itself, breaking the problem down step by step," Huang said. "It might be planning multiple paths to an answer. It could be using tools, reading PDFs, reading web pages, watching videos, and then producing a result."
Huang struck a noticeably darker tone on the call, focusing on the impact of export controls rather than his usual expansive talk about AI's potential to change the world.
He spoke at length about the U.S. chip restrictions and laid out their impact on Nvidia's current and future business.
"The AI race is not just about chips," he said. "It's about which stack the world runs on. As that stack grows to include 6G and quantum, U.S. global infrastructure leadership is at stake."
- CNBC's Kristina Partsinevelos contributed to this article.
Correction: Stacy Rasgon is an analyst with Bernstein. The earlier version misspelled his name.