The winners of hyperscalers’ big spending plans will continue to be chip companies like Nvidia
Amazon plans heavy spending on AI, though its AWS cloud-computing unit expects uneven business in the coming years.
Investors still worried about chip stocks amid fears of lower AI spending can breathe a sigh of relief now that Amazon has joined three other Big Tech companies in committing to massive capital expenditures.
In its fourth-quarter earnings call Thursday, Amazon.com Inc. said it spent $26.3 billion on capital expenditures in the quarter, and would continue to spend at that rate in 2025. The vast majority was on AI for its Amazon Web Services business, Chief Executive Andy Jassy told analysts.
Amazon’s forecast now brings the total funding that four Big Tech companies have so far promised to spend on AI data centers in 2025 to about $280 billion. Microsoft Corp. said it will spend $80 billion in its fiscal 2025, which ends June 30; Facebook parent Meta Platforms Inc. said it would spend between $60 billion and $65 billion; and Google parent Alphabet Inc. committed to $75 billion. Some of Amazon’s capital spending will also go to its retail distribution system.
Amazon, Microsoft and Alphabet all mentioned on their earnings calls with Wall Street analysts that they did not have enough cloud-computing capacity to meet customer demand. The cloud-services businesses at all three companies fell short of analysts’ expectations this quarter, and their stocks have suffered. Amazon also warned that its cloud growth would be “lumpy” over the next few years, even as it gushed with enthusiasm about AI.
“AI represents for sure the biggest opportunity since cloud, and probably the biggest technology shift and opportunity in business since the internet,” Jassy told analysts Thursday.
The biggest beneficiary of this spending will likely be Nvidia Corp., of course, which has seen its stock recover somewhat from its $600 billion plunge last week on the news of China’s DeepSeek, which said it used older, lower-cost Nvidia GPUs to train its AI models. Nvidia’s shares, though, are still down about 13% from their trading price right before the DeepSeek revelations made the rounds on Wall Street. Investors will continue to be nagged by the question of whether Nvidia will be able to sell its most advanced Blackwell family of AI chips, or whether hyperscalers and cloud-services companies will start to look for ways, a la DeepSeek, to run AI models at a lower cost. That would mean using lower-end, less pricey chips, but Nvidia would still be a beneficiary.
Amazon’s earnings call held one clue. The company said the industry, in general, is exploring ways to lower its computing costs.
“What you heard the last couple of weeks out of DeepSeek is a piece of it, but everybody is working on this,” Jassy told analysts. “I believe the cost of inference will meaningfully come down.” Inference is the element of AI that makes predictions based on the data that it has already trained on.
Other chip beneficiaries of continued AI data-center spending will likely be Broadcom; Marvell Technology, which has worked with Amazon on its Trainium chip family; and, to a lesser extent, Advanced Micro Devices Inc., which gave analysts pause this week when it stopped forecasting separate revenue for its MI300 AI chip line.
The proposed spending plans will also likely benefit computer hardware companies like Dell Technologies, Super Micro Computer Inc. and Hewlett Packard Enterprise Co. But those companies also typically have razor-thin profit margins in servers. They will start reporting earnings next week, with Super Micro on Feb. 11.
Of course, these tech giants could still change their massive spending plans, for reasons ranging from tariffs to interest rates to a stronger dollar. But for now, investors got plenty of reassurance in earnings reports over the past 10 days that the current boom is not over yet.