Samsung’s bid to supply high-bandwidth memory (HBM3E) to NVIDIA this year has hit a major snag. According to reports, Samsung is facing significant performance hurdles, leaving SK hynix as the dominant player in the HBM market.
NVIDIA CEO Jensen Huang confirms the company is rapidly working to certify Samsung’s new HBM3E memory chips, a move vital for sustaining the explosive growth of its AI GPUs. This collaboration marks a significant shift in the HBM market, potentially disrupting SK hynix’s dominance.
SK hynix has showcased its cutting-edge HBM3E 12-Hi memory at the OCP Global Summit, demonstrating its commitment to leading the future of semiconductor technology. The company bills the memory as the world’s fastest, highest-capacity, and most stable HBM to date, setting a new benchmark for AI performance.
SK hynix has announced mass production of its new 12-layer HBM3E memory, boasting a record 36GB of capacity per stack and per-pin speeds of 9.6Gbps. This breakthrough memory is set to significantly enhance the performance of AI systems, with NVIDIA leading the charge as the first customer to receive the high-performance chips.
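For context, those headline figures follow from simple per-stack arithmetic; the sketch below assumes the standard 1024-bit HBM stack interface and 24Gb (3GB) DRAM dies, which is how a 12-Hi stack reaches 36GB and roughly 1.2TB/s.

```python
# Back-of-the-envelope check of the 12-layer HBM3E figures quoted above.
# Assumptions (not from the article): a 1024-bit interface per HBM stack
# and 24Gb (3GB) DRAM dies, both standard for HBM3E-class parts.

layers = 12                  # 12-Hi stack
die_capacity_gb = 3          # 24Gb die = 3GB
pin_speed_gbps = 9.6         # per-pin data rate, Gb/s
interface_width_bits = 1024  # I/O width of one HBM stack

stack_capacity_gb = layers * die_capacity_gb                     # 36 GB
stack_bandwidth_gbs = pin_speed_gbps * interface_width_bits / 8  # 1228.8 GB/s

print(f"Per-stack capacity : {stack_capacity_gb} GB")
print(f"Per-stack bandwidth: {stack_bandwidth_gbs:.1f} GB/s (~1.2 TB/s)")
```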
SK hynix shares surged over 9% on Thursday after the company announced the start of mass production for its new HBM3E memory chips. These advanced chips offer 50% more capacity than previous versions, putting SK hynix at the forefront of the AI memory market. This positive news comes amidst recent stock volatility and signals a strong future for the company.
Micron has announced the release of its HBM3E 12-Hi memory, boasting an impressive 36GB capacity and a bandwidth of 1.2TB/s. The memory is designed to feed the demands of artificial intelligence (AI) GPUs, allowing larger AI models to be processed and insights to be delivered faster.
SK hynix is set to begin mass production of its advanced 12-layer HBM3E memory chips for AI applications by the end of September, with shipments expected to start in Q4 2024. The company is also pushing forward with its next-generation HBM4 memory, slated for release in the second half of 2025, designed to be compatible with NVIDIA’s upcoming Rubin R100 AI GPU.
NVIDIA’s Blackwell Ultra AI GPU is getting a spec bump with up to 288GB of HBM3E memory and a 50% performance increase. However, the rollout has been hit with delays, and a redesigned B200A version is in the works. The news comes from SemiAnalysis, which has also revealed that Blackwell Ultra will come in both standard CoWoS-L and a new MGX NVL 36 form factor.
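For a sense of scale, the 288GB total is consistent with eight 12-Hi (36GB) stacks per package; the stack count in the sketch below is an assumption for illustration, not a confirmed Blackwell Ultra specification.

```python
# Rough capacity breakdown for the 288GB figure quoted above.
# The eight-stack layout is an assumption for illustration only.

capacity_per_stack_gb = 36   # 12-Hi HBM3E stack, as described earlier
stacks_per_package = 8       # assumed number of HBM sites per GPU package

total_hbm_gb = stacks_per_package * capacity_per_stack_gb
print(f"HBM3E per GPU: {total_hbm_gb} GB")  # 288 GB
```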
SK hynix will be attending FMS 2024, a global semiconductor memory event, where it will showcase advancements in its memory technologies and products, including its next-generation AI memory products like the 12-layer HBM3E and 321-high NAND. The company will also present its vision for the AI space and highlight its leadership in the industry.