
After a Year of Blistering Growth, AI Chip Makers Get Ready for Bigger 2026

Dow Jones | 2025-12-30

Driven by the explosive growth of artificial intelligence, the largest semiconductor companies in the world recorded more than $400 billion in combined sales in 2025, by far the biggest year for chips on record. Next year promises to be even bigger.

Yet the blistering pace of growth, fed by what CEOs and analysts describe as "insatiable demand" for computing power, has created a host of challenges, from shortages of vital components to questions about how and when AI companies will be able to generate reliable enough profits to keep buying chips.

Hardware designers such as Nvidia, which more than doubled its revenue year-over-year, are the main suppliers of the picks and shovels behind this new digital gold rush. But Nvidia faces growing competition from the likes of Alphabet's Google and Amazon.com, while the battleground shifts under its feet.

Last week, Nvidia signed a $20 billion licensing deal with the chip startup Groq, which designs chips and software that help accelerate AI inference, the process whereby trained AI models serve up answers to prompts. Where the last leg of the AI race was defined by training, tech giants are now competing to deliver the fastest and most cost-efficient inference.

"Inference workloads are more diversified and may open up new areas for competition," analysts at Bernstein wrote after Nvidia's recent deal was announced.

Data-center operators, AI labs and business customers have clamored for Nvidia's advanced H200 and B200 graphics processing units. Google's increasingly sophisticated custom chips, known as TPUs, and Amazon's Trainium and Inferentia chips, both of which compete with Nvidia's GPUs, are also scooping up customers, while software developers such as OpenAI are joining with custom designers such as Broadcom to design their own chips.

Advanced Micro Devices, a half-century-old maker of gaming, personal-computer and data-center chips, is launching a GPU in 2026 that represents its first major challenge to Nvidia's AI processors.

And Microsoft said in October that it would double its data-center footprint in the next two years, which means chip makers are likely to see added revenues in 2026, according to analysts, with expectations even higher than this past year.

All of it points to another record year coming for chips. Goldman Sachs estimates that Nvidia alone will sell $383 billion in GPUs and other hardware in the 2026 calendar year, an increase of 78% over the prior year. Analysts polled by FactSet estimate that the combined sales from Nvidia, Intel, Broadcom, AMD and Qualcomm will top $538 billion. That doesn't include revenues from Google's TPU business or Amazon's custom chip sales, neither of which is broken out by its parent company.

Yet 2026 could also bring unprecedented challenges. Shortages of components such as electrical transformers and gas turbines are hampering data-center construction, and operators are struggling to secure the immense amounts of electrical power required to run computing clusters.

Another major challenge: a global shortage of components that go into AI data-center servers. Items in short supply include the ultrathin layers of silicon substrate some chips require and memory chips, the semiconductors that feed data to AI processors and help store the results of computations. As data-center construction has ramped up and demand has risen for inference, the need for more high-bandwidth memory chips has surged.

That is in part because AI inference workloads are more likely to be "memory-bound," or constrained by having enough accessible memory capacity, than are training workloads, which tend to be limited by the power of the processors used.
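The memory-bound distinction can be made concrete with a roofline-style back-of-envelope calculation: a workload is memory-bound when its arithmetic intensity (operations performed per byte moved to and from memory) falls below the chip's ratio of peak compute to memory bandwidth. The sketch below illustrates the idea; all hardware and workload numbers are illustrative placeholders, not specifications for any real chip.

```python
# Roofline-style sketch of memory-bound vs. compute-bound workloads.
# All numbers are hypothetical, chosen only to illustrate the contrast
# between training-like and inference-like arithmetic intensity.

def arithmetic_intensity(flops: float, bytes_moved: float) -> float:
    """Operations performed per byte transferred to/from memory."""
    return flops / bytes_moved

def is_memory_bound(intensity: float, peak_flops: float, bandwidth: float) -> bool:
    """True when memory bandwidth, not compute, caps throughput."""
    ridge_point = peak_flops / bandwidth  # FLOPs/byte where the two limits cross
    return intensity < ridge_point

# Hypothetical accelerator: 1e15 FLOP/s peak compute, 4e12 B/s memory bandwidth,
# giving a ridge point of 250 FLOPs per byte.
PEAK, BW = 1e15, 4e12

# Training-like workload: large batched matrix multiplies reuse each weight
# many times, so intensity is high (here, 2e12 FLOPs over 4e9 bytes = 500).
train_ai = arithmetic_intensity(flops=2e12, bytes_moved=4e9)

# Inference-like decoding: each generated token re-reads a model's weights
# for comparatively little math, so intensity is low (2e10 / 2e10 = 1).
infer_ai = arithmetic_intensity(flops=2e10, bytes_moved=2e10)

print(is_memory_bound(train_ai, PEAK, BW))  # False: compute-limited
print(is_memory_bound(infer_ai, PEAK, BW))  # True: memory-limited
```

Under these assumed numbers, the inference-like workload sits far below the ridge point, which is why adding memory capacity and bandwidth (e.g., more high-bandwidth memory) helps it more than adding raw compute.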

"We're significantly short of our customers' needs and it's going to persist for a while," said Sumit Sadana, chief business officer of Micron Technology, one of the largest makers of the high-bandwidth memory chips used in AI.

Micron, which has seen its share price rise 229% so far this year, and memory-chip competitors such as South Korea's Samsung and SK Hynix have been major beneficiaries of this supply crunch. It has allowed them to raise prices and increase capital expenditures to expand manufacturing. But it takes time to build the massive clean rooms and fabrication plants required to expand capacity enough to meet the big chip companies' needs.

There are also serious questions surrounding the sustainability of the financing behind the data-center build-out, and whether major customers such as OpenAI can raise enough money quickly to keep up their breakneck pace of chip purchases. Meanwhile, investors have grown used to outsize growth in revenues from quarter to quarter and are easily spooked by any sign of a slowdown.

This fall, investors broadly sold out of AI stocks, including the large chip designers, over concerns that the financing driving the purchase of AI infrastructure products might not be as solid as they once believed.

A large portion of the massive data-center build-out has been fueled by OpenAI, which has multibillion-dollar agreements with Amazon, Microsoft, Oracle and others for computing power. Hyperscale companies such as Microsoft have committed to ratcheting up data-center build-outs in 2026, but some analysts think that the boom could slow down in 2027.

"There's a chance that 2026 is a peak," said Gil Luria, an analyst with DA Davidson. "If it's the end of March and we don't hear that OpenAI has raised a hundred billion dollars, then the market may start pumping the brakes."

As more chip companies launch AI products, there is also some concern about pressure on profit margins. Broadcom's stock sank even after the company reported record quarterly revenue in December, in part on investor worries that sales growth for its higher-margin product lines will slow going forward.

Others in the industry have taken a more optimistic view that demand will be long-lasting, with consistent growth.

"I don't think this will be the top," said Brad Gastwirth, global head of research for Circular Technologies, a Massachusetts-based distributor of computing hardware, including the GPUs, server racks and networking technology used for AI. "The race to artificial general intelligence is still powering a huge appetite for compute, across the spectrum of customers."
