
How Nvidia (NASDAQ:NVDA) Stayed One Step Ahead in the AI Race | Markets Insider

Published 1 day ago · 3 minute read

The world of artificial intelligence has been evolving fast, but Nvidia (NVDA) has remained agile and ready to change with market demands. Last year it faced a big potential threat as AI took off and triggered a marked shift in what was required: running the underlying large language models (LLMs), a process known as “inference,” became far more important than providing the tools to train them.

Nvidia’s new Blackwell chips, however, were designed with this shift in mind and appear tailored to inference-scaling demands. On the company’s Q4 earnings call last month, CEO Jensen Huang noted that inference demand is accelerating, driven by test-time scaling and new reasoning models like OpenAI’s o3, DeepSeek-R1, and Grok 3.

“The vast majority of our compute today is actually inference and Blackwell takes all of that to a new level. We designed Blackwell with the idea of reasoning models in mind,” he said. 

According to Huang, the industry is only at the beginning of reasoning AI and inference-time scaling. But Blackwell chips, which are larger and have more memory, are well suited to navigating this trend. “We defined Blackwell for this moment,” he said.

A year ago, CFO Colette Kress revealed that more than 40% of Nvidia’s data center business was for inference, that is, the deployment of AI systems rather than their training. It was the first significant sign of a shift in the tectonic plates of AI. This stoked fears that NVDA could start to lose its advantage, since inference can be done on less powerful, cheaper chips than Nvidia’s most advanced offerings.

Then-Intel (INTC) CEO Pat Gelsinger was confident that the shift to inference would change the market. “If I can run those models on standard [Intel chips], it’s a no-brainer,” he told the Wall Street Journal in an interview last year.

Competition is rising. Besides Intel, Advanced Micro Devices (AMD), whose AI chips are aimed mainly at the inference market, has stepped up its efforts, while a number of startups have emerged in the space. Meanwhile, tech companies are developing their own AI inference chips to compete with NVDA’s. Some believe NVDA’s chips may ultimately be limited by their origins as graphics processing units. Last week it was reported that Meta Platforms (META) is testing its first in-house AI chip, though this one is aimed at training models.

But the distinction between training and inference is becoming less clear. “We’re clearly seeing an increasing blurring of the lines between training and inference,” noted Cantor Fitzgerald five-star analyst CJ Muse. Or as Huang pointed out recently, AI is going to become more human-like by “learning and inferencing all the time.”

On Wall Street, NVDA has a Strong Buy consensus rating, based on 39 Buys and three Holds. The average NVDA price target of $177.23 implies over 48% upside from current levels.

Questions or Comments about the article? Write to [email protected]

Origin: markets.businessinsider.com
