As artificial intelligence demand surges, memory chip prices have soared, reshaping the semiconductor industry's profit structure. According to The Korea Economic Daily, Samsung Electronics' memory division and SK Hynix are projected to surpass TSMC in gross profit margin by the fourth quarter of 2025. This would mark the first time since Q4 2018 that the memory sector's profitability exceeds that of the foundry industry.
The report indicates that gross margins at Samsung Electronics and SK Hynix are projected to range between 63% and 67%, exceeding TSMC's estimated 60%. Meanwhile, Micron, the world's third-largest memory chip manufacturer, posted a 56% gross margin in the first quarter of its fiscal 2026 (September to November 2025) and expects it to rise to 67% in the second quarter (December 2025 to February 2026), suggesting Micron could also surpass TSMC in profitability during the first quarter of calendar year 2026.
The rapid surge in memory chip prices is the primary driver of the memory industry's profit growth. The three major memory manufacturers currently allocate roughly 18% to 28% of their DRAM production capacity to High Bandwidth Memory (HBM). Because each HBM device stacks 8 to 16 DRAM dies, this allocation sharply compresses the supply of general-purpose DRAM, pushing its quarterly price increases above 30%.
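The supply squeeze described above can be sketched with back-of-envelope arithmetic. The capacity shares (18%–28%) and the >30% quarterly price rise come from the report; the function name and the assumption that diverted capacity reduces general-purpose output one-for-one are illustrative, not figures from the article.

```python
def remaining_supply(hbm_share: float) -> float:
    """Fraction of DRAM wafer capacity left for general-purpose DRAM
    once a share is diverted to HBM (simplifying assumption: one-for-one)."""
    return 1.0 - hbm_share

# The report cites 18% to 28% of DRAM capacity allocated to HBM.
for share in (0.18, 0.28):
    print(f"HBM share {share:.0%}: "
          f"general-purpose DRAM capacity {remaining_supply(share):.0%}")

# A price increase of 30% per quarter compounds quickly over a year:
quarterly_rise = 0.30
annual_multiple = (1 + quarterly_rise) ** 4
print(f"Four quarters of +30% is about {annual_multiple:.2f}x the starting price")
```

Even at the low end of the cited range, nearly a fifth of wafer output leaves the general-purpose pool, and compounding a 30% quarterly rise implies prices nearly tripling over a year.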
The report notes that as the AI industry shifts from “training” to “inference,” rapid data storage and retrieval become critical. Inference applies knowledge gained during training to solve problems, which requires memory such as HBM to store data and continuously feed it to GPUs. This growing demand is what is driving memory gross margins past those of wafer foundries.
Additionally, although they lag HBM in performance, general-purpose DRAM variants such as GDDR7 and LPDDR5X typically handle early AI inference workloads, reserving HBM for more intensive tasks. NVIDIA's use of GDDR7 in inference-focused AI accelerators exemplifies this trend.
Meanwhile, memory chip manufacturers plan to sustain the memory-centric era by developing high-performance products tailored for AI. One example is Processor-in-Memory (PIM), which enables memory to handle portions of computational workloads traditionally executed by GPUs. The report adds that technologies like Vertical Channel Transistor (VCT) DRAM and 3D DRAM are also poised to enter the market, enhancing data density by storing more information in smaller areas.