Chip giant Nvidia, which dominates the training of artificial intelligence, sold thirty-one billion dollars' worth of its AI computing systems last quarter, the company reported Wednesday evening.
Jensen Huang, who co-founded Nvidia and serves as its chief executive, is also the chief evangelist of all things AI these days. As such, he spent Wednesday evening telling Wall Street analysts about the important applications being made possible with the technology.
Huang emphasized one of his favorite applications, Google's NotebookLM, saying he has "used the living daylights" out of the AI tool.
The NotebookLM program, which is free to use with a Google account, lets users upload documents and create a summary of their contents. It suggests questions to ask about the documents, engages in a chat about them, and will even create an audio conversation about the contents between two speakers, in the style of a podcast.
"I put every PDF, every archived paper into it just to listen to it as well as scanning through it," he said.
Huang talked about NotebookLM in response to a Wall Street analyst's question about trends in "inference," the practice of making predictions with an AI model after the model has been trained.
"Our hope and dream is that someday, the world does a ton of inference," replied Huang. "That's when AI has really succeeded," he continued, "is when every single company is doing inference inside their companies, for the marketing department and forecasting department and supply chain group and their legal department and engineering, of course, and coding."
Huang cited other examples of AI inference, such as "physical AI," meaning AI models that "understand the meaning of the structure" of the physical world and how to "reason" about it. Such models can "not only understand but can predict, roll out a short future."
Huang is betting that inference tasks will be the new growth engine for his company's data-center GPU sales, after years in which Nvidia has dominated the market for chips used to train AI models.
Huang told Wall Street that his company's latest GPU, "Blackwell," unveiled in March, is "in great shape" and will generate billions of dollars of revenue in the fiscal fourth quarter ending in January.
"Blackwell production is in full steam," said Huang. "We will deliver this quarter more Blackwells than we had previously estimated. And so, the supply chain team is doing an incredible job of working with our supply partners to increase Blackwell, and we're going to continue to work hard to increase Blackwell through next year. It is the case that demand exceeds our supply."