But 36 GB stacks are on the way
Micron has unveiled what it says is the fastest and most powerful HBM memory in the industry.
The HBM3 Gen2 memory offers a capacity of 24 GB per stack and a bandwidth of more than 1.2 TB/s. According to Micron, the new memory sets records for the performance, capacity and energy efficiency critical to AI data centers.
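As a rough sanity check (not a figure from the announcement itself), the quoted per-stack bandwidth follows from the standard 1024-bit HBM interface and Micron's stated per-pin data rate of more than 9.2 Gb/s:

```python
# Back-of-the-envelope check of the >1.2 TB/s per-stack figure,
# assuming the standard 1024-bit HBM interface; 9.2 Gb/s is Micron's
# stated floor for the per-pin rate, not an exact number.
PIN_RATE_GBPS = 9.2          # Gb/s per pin (Micron quotes ">9.2")
INTERFACE_WIDTH_BITS = 1024  # standard HBM stack interface width

bandwidth_gbits = PIN_RATE_GBPS * INTERFACE_WIDTH_BITS  # gigabits per second
bandwidth_tbs = bandwidth_gbits / 8 / 1000              # terabytes per second

print(f"~{bandwidth_tbs:.2f} TB/s per stack")  # ~1.18 TB/s at the 9.2 Gb/s floor
```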
Micron Introduces Industry’s Fastest HBM3 Gen2 with Up to 24GB Per Stack
The memory is manufactured on Micron's 1β process, which makes it possible to fit a 24 GB stack within the standard package size for such chips. This is an eight-layer stack; in the first quarter of 2024 the company promises 12-layer stacks with a capacity of 36 GB.
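For context, both stack heights line up with a 3 GB (24 Gb) DRAM die per layer; the per-die figure is an inference from the capacities above rather than a quoted specification:

```python
# Per-die arithmetic behind the two stack capacities, assuming each
# layer is a 24 Gb DRAM die (an inference, not a quoted spec).
DIE_CAPACITY_GBIT = 24                      # assumed capacity per DRAM die
die_capacity_gbyte = DIE_CAPACITY_GBIT / 8  # = 3 GB per die

for layers in (8, 12):
    print(f"{layers}-layer stack: {layers * die_capacity_gbyte:.0f} GB")
# 8-layer stack: 24 GB
# 12-layer stack: 36 GB
```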
Of course, this memory is not aimed at graphics cards. As Micron itself says, it is a product for data centers and high-performance computing, and it may well find its way into next-generation Nvidia and AMD accelerators.
According to the company's roadmap, HBM Next (probably HBM4) will arrive in 2026 with capacities of up to 64 GB per stack and bandwidth of 1.5-2 TB/s or higher.