The stock market is sensitive to everything from the path of interest rates to ominous OpenAI commentary - and now investors are digging deeper into Nvidia's earnings
Nvidia's fiscal third-quarter earnings report came out ahead of Wall Street's expectations, but it failed to give AI stocks a boost.
The artificial-intelligence trade was already starting to crack before Nvidia Corp. reported earnings, and upbeat results from the world's most valuable company weren't enough to flip sentiment.
That dynamic highlights growing jitters about factors far beyond Nvidia's (NVDA) control, and sets the market on a bumpy path for the balance of 2025.
Nvidia's October-quarter earnings results "were extraordinarily good, probably the best quarter they've reported this year," Gil Luria, D.A. Davidson's head of technology research, told MarketWatch. And initially, it seemed like those numbers would be a sufficient spark for AI stocks that had seen a comedown in recent weeks.
That wasn't to be. Shares of Nvidia and other AI plays opened the next session higher but closed meaningfully lower. For momentum-oriented names like Sandisk Corp. (SNDK), which had run up big on optimism for AI-related storage needs, the declines were especially dramatic.
"If really strong Nvidia results and a positive tone didn't send the group to new highs ... not much else is going to be a positive catalyst through the end of the year," Jamie Zakalik, a senior research analyst at Neuberger Berman, told MarketWatch.
What gives?
There's a laundry list of issues ailing the AI trade - from big-picture macroeconomic questions affecting the rate-cut outlook, to highly specific comments from key players like OpenAI that have rattled the market. Or as Ken Mahoney, the chief executive of Mahoney Asset Management, put it in emailed comments, markets are now facing a "perfect storm."
Thinking on a micro level, the market downturn following Nvidia's earnings could have been driven by a fund or portfolio liquidating, Luria said, and wasn't necessarily a reaction to Nvidia's report itself. By sharing that the company has visibility into $500 billion in Blackwell and Rubin revenue from now through 2026, Nvidia gave "the longest-term outlook they've given in the last three years," according to Luria.
That's not to say Nvidia's report was deemed to be flawless. Ipek Ozkardeskaya, a senior analyst at Swiss online bank Swissquote, said investors may be taking issue with "swelling inventories and unusual patterns in deferred revenue" at the chip giant.
Nvidia "has been taking in hefty prepayments from customers and then recognizing those payments as revenue ... before chips are delivered," Ozkardeskaya noted in emailed comments. "This is not illegal," she added, but it "could leave a gap if future orders slow."
The fact that investors are looking closely at these elements is perhaps indicative of how the attitude on Wall Street has shifted from euphoria to vigilance.
"When you dig deep enough, you're sure to find dirt," Ozkardeskaya wrote. "And people only start digging when they begin to feel uncomfortable - and that level of discomfort is rising."
In the weeks before Nvidia's earnings report, Luria said there was "a lot of good discussion about what constitutes a bubble" and what market behaviors are healthy or unhealthy.
That said, Bloomberg Intelligence technology analyst Mandeep Singh isn't worried about Nvidia's high inventory levels. In his view, the company is trying to ramp up the number of graphics processing units it can make each year, and that means keeping the same allocation from Taiwan Semiconductor Manufacturing Co. Ltd.
"They don't want that allocation to go to AMD $(AMD)$ or even TPUs for that matter," Singh said, referring to tensor processing units designed by Google $(GOOGL)$ $(GOOG)$. "From that perspective, I think Nvidia cares about how many units it can make and I think they are managing a very complex supply chain."
With more than $330 billion in revenue expected next year, "that will involve an inventory buildup," Singh added.
As for concerns that Nvidia reported a high level of accounts receivable on its earnings call, Neuberger Berman's Zakalik said she isn't concerned because the company's major customers are hyperscalers with large cash flows.
Watching the Fed
Wall Street's more discerning earnings reactions also relate to evolving expectations around the Federal Reserve's next interest-rate decision due out in December.
"It seemed like the market was saying it's extremely unlikely" that there will be a cut, Zakalik said, following delayed September unemployment data released on Thursday that "was not perfectly clean." Because of the government shutdown, the Fed won't be receiving October and November data until after the rate-cut decision.
Fed uncertainty has been "the largest single sentiment driver in the past few weeks," according to emailed comments from Mark Malek, the chief investment officer at Siebert Financial.
"Does a 25-basis-point cut in December or January really have anything to do with how well Nvidia or Meta is going to perform over the next five years?" Malek asked. "Please tell me you answered with a quick 'no!' But, alas, investors have gotten their minds so wrapped up in [Fed] policy that it has distorted many investors' ability to make reasonable, facts-based decisions."
The interest-rate landscape, however, has implications for high-growth technology companies sensitive to borrowing costs.
The OpenAI issue
There are questions about the AI trade beyond Nvidia's control, Zakalik said, including the debt financing by major customers like Oracle Corp. (ORCL) and longer-term worries that there will be an overbuilding of AI infrastructure.
Everything increasingly comes back to OpenAI, which has announced big-money deals with cloud and chip players whose commitments far exceed its current revenue.
The market is "sensitive" to anything related to OpenAI and its ability to pay off and build the commitments it has made over the next few years, Zakalik said.
Investors are increasingly worried that OpenAI could be making promises it can't live up to as it tries to raise massive amounts of capital, Luria said. He pointed to neocloud CoreWeave Inc. (CRWV), which is borrowing money to build data centers for OpenAI. But the ChatGPT creator doesn't exactly have the money to pay for them.
"People are having bubble PTSD over that kind of behavior, and that's been an impact on the stocks for the last couple of weeks," Luria noted.
OpenAI hasn't necessarily reassured investors recently. Sarah Friar, the company's chief financial officer, mused about a "backstop" from the government during an appearance at the Wall Street Journal's Tech Live event earlier this month - comments she later walked back on LinkedIn.
And while ChatGPT was once the face of chatbots, Alphabet has been gaining ground with its own efforts, both casting doubt on OpenAI's future dominance and underscoring Google's success with its in-house hardware.
Singh told MarketWatch that Google's release of Gemini 3 this week could have been among the reasons for the negative market action.
The AI model, which Google said outperforms OpenAI's GPT-5.1 and Anthropic's Claude Sonnet 4.5 in areas such as scientific reasoning and agentic tasks, was trained on the company's tensor processing units. In Singh's view, that shows that Google could be getting further with its capital expenditures than other hyperscalers that depend more on Nvidia for chips.
While Singh expects Google to continue being an Nvidia customer, the chip maker will likely have to rely more on other large customers such as Microsoft Corp. (MSFT) and Meta Platforms Inc. (META), and also on OpenAI, to get the $330 billion in revenue it is modeling for next year.
"I think now, with the numbers getting as big as they are for Nvidia, that question is looming again in terms of how much is going towards training [and] how much is going towards inferencing, because inferencing spend shows up in the revenue of the hyperscalers, OpenAI and Anthropic," Singh said.
Inferencing refers to the process of running AI models. Training, by contrast, is a sunk cost, Singh noted: while it is essential for improving models, it is inferencing that ties directly to the monetization of AI models.