The AI boom is not just about powerful GPUs. While chips like NVIDIA’s H100 and upcoming Blackwell B200 are at the forefront of the generative AI movement, high-speed memory and storage are equally vital for building the massive data centers and supercomputers that train complex AI models, such as Meta’s recent Llama 3.1. This is why the DRAM and NAND Flash industry is poised for significant growth in 2024.
Market analysis predicts DRAM revenue will skyrocket to $90.7 billion in 2024, increasing another 51% to $136.5 billion in 2025. This surge is driven by several factors, including the demand for High-Bandwidth Memory (HBM), increased DRAM prices, and the emergence of more complex (and expensive) DRAM products like DDR5 memory. While HBM is projected to account for only 5% of total DRAM shipments this year, it will contribute a significant 20% to overall revenue.
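The forecast figures above can be sanity-checked with a few lines of arithmetic. The revenue numbers and the 5%/20% HBM shares come from the article; the HBM per-unit revenue premium is derived here as an illustration, not a figure stated in the source:

```python
# Sanity-check the cited DRAM forecast (all inputs from the article).
dram_2024 = 90.7   # forecast DRAM revenue, $B, 2024
dram_2025 = 136.5  # forecast DRAM revenue, $B, 2025

growth = dram_2025 / dram_2024 - 1
print(f"2024 -> 2025 DRAM revenue growth: {growth:.1%}")  # ~50.5%, i.e. the ~51% cited

# HBM at 5% of shipments but 20% of revenue implies a large revenue-per-unit
# premium over the rest of the DRAM mix (a derived estimate, not a source figure):
hbm_ship, hbm_rev = 0.05, 0.20
premium = (hbm_rev / hbm_ship) / ((1 - hbm_rev) / (1 - hbm_ship))
print(f"Implied HBM revenue per unit vs. other DRAM: {premium:.2f}x")
```

In other words, each HBM unit shipped would bring in roughly 4.75 times the revenue of an average non-HBM DRAM unit, which is why such a small slice of shipments moves the revenue needle so much.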
Meanwhile, NAND Flash revenue, which includes SSDs, is expected to reach $67.4 billion in 2024 and $87 billion in 2025. This growth will be fueled by the adoption of QLC enterprise SSDs in AI servers, workstations, and data centers. Beyond AI, Apple’s plans to adopt QLC storage in its iPhones by 2026 will further contribute to NAND Flash demand.
This explosive growth presents both opportunities and challenges. The increased revenue provides a valuable influx of cash that manufacturers can invest in new technologies and capacity. However, the surge in demand also strains the supply chain, raising concerns about potential shortages of the raw materials needed for manufacturing. Either way, the AI boom is driving a significant shift in the memory and storage industry.