All three major memory makers reported strong results! WSJ: AI ignites the memory market, ushering in a "super boom cycle"

Tech     7:47am, 4 November 2025

Thanks to AI partnerships involving companies such as NVIDIA and OpenAI, the memory chip industry is entering a period of long-term prosperity. Samsung, SK Hynix, and Micron have all benefited, seeing rapidly growing demand for their products.

Research firm TrendForce predicts that total DRAM industry revenue will likely hit a new high of approximately US$231 billion next year, more than four times the cycle trough of 2023.
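As a quick sanity check on that comparison (a minimal sketch; the trough figure below is implied by the article's numbers, not stated in the source):

```python
# TrendForce's projected DRAM industry revenue for next year (from the article), in USD.
projected_revenue = 231e9

# "More than four times the cycle trough" implies the 2023 trough
# was below this ceiling:
implied_trough_ceiling = projected_revenue / 4
print(f"Implied 2023 trough: under ${implied_trough_ceiling / 1e9:.2f} billion")
```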

The Wall Street Journal reported that the latest financial results of the major memory makers paint an optimistic picture. Samsung's third-quarter net profit rose 21% year over year, with its memory chip division posting a quarterly record. SK Hynix also set a third-quarter record, with net profit doubling from a year earlier; it stated that the memory market has entered a super boom cycle and that its entire production capacity for next year is already sold out. Micron's net profit in its latest quarter more than tripled to US$3.2 billion.

According to World Semiconductor Trade Statistics (WSTS), memory chips account for about a quarter of global chip sales. The other large category is logic chips, which include AI chips such as NVIDIA's as well as the CPUs used in computers and smartphones.

Reports indicate that OpenAI signed letters of intent with Samsung and SK Hynix to bring them in as partners in the Stargate project. SK Hynix expects that OpenAI's DRAM demand could reach as much as 900,000 wafers per month, more than double the current production capacity of the entire HBM industry.

Beyond the strong demand for HBM, demand for conventional memory is also running hot. Major data center operators such as Amazon, Google, and Meta are buying up large quantities of the memory needed for conventional servers. Because the major memory makers had previously focused their capacity expansion on HBM, the supply of conventional memory has become tight.

In addition, the conventional memory chips used in general-purpose servers can also be applied to AI inference. Peter Lee, a semiconductor analyst at Citi in Seoul, said that for certain inference tasks, such as storing and retrieving the large amounts of data generated by large language models, it may be more cost-effective to deploy conventional servers.

SK Hynix expects the HBM market alone to grow by an average of more than 30% per year over the next five years, and calls that a conservative estimate. Avril Wu, senior vice president of research at TrendForce, said the current memory supply crunch will continue through 2026 and may even extend into early 2027.
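Compounding SK Hynix's "more than 30% average annual growth" over five years gives a sense of scale (a minimal sketch using the article's figure):

```python
# SK Hynix's "more than 30%" average annual HBM market growth (from the article).
annual_growth = 0.30
years = 5

# Compound growth: 30% per year for five years nearly quadruples the market.
multiple = (1 + annual_growth) ** years
print(f"Overall growth over {years} years: {multiple:.2f}x")  # ~3.71x
```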

However, Sanjeev Rana, a semiconductor analyst at Hong Kong brokerage CLSA, believes the figures put forward by OpenAI are exaggerated. He has reservations about whether existing and planned production capacity is truly unable to meet demand, and whether capital expenditure really needs to be expanded.

Memory-Chip Makers Are Enjoying a Boom to Remember, Thanks to AI

Further reading:
Q3 results better than expected! Samsung to begin HBM4 mass production next year and 2nm mass production this quarter
NVIDIA's market capitalization tops $5 trillion! Rising demand for AI chips sends Lasertec's stock price soaring 21%