Micron has announced the release of its HBM3E 12-Hi memory, offering 36GB of capacity and 1.2TB/s of bandwidth. The memory is designed to meet the demands of artificial intelligence (AI) GPUs, enabling the processing of larger AI models and faster time to insight.
SK hynix is set to begin mass production of its advanced 12-layer HBM3E memory chips for AI applications by the end of September, with shipments expected to start in Q4 2024. The company is also pushing forward with its next-generation HBM4 memory, slated for release in the second half of 2025, designed to be compatible with NVIDIA’s upcoming Rubin R100 AI GPU.
NVIDIA’s Blackwell Ultra AI GPU is getting a spec bump with up to 288GB of HBM3E memory and a 50% performance increase. However, the rollout has been hit with delays, and a redesigned B200A version is in the works. The news comes from SemiAnalysis, which has also revealed that Blackwell Ultra will come in both standard CoWoS-L and a new MGX NVL 36 form factor.
SK hynix will be attending FMS 2024, a global semiconductor memory event, where it will showcase advancements in its memory technologies and products, including next-generation AI memory such as its 12-layer HBM3E and 321-layer NAND. The company will also present its vision for the AI space and highlight its leadership in the industry.