As early as next year
HBM memory is becoming increasingly important to the market because it is used in AI accelerators. Samsung and SK Hynix have now said when to expect the new HBM4 memory.
Samsung should be first, with HBM4 ready for release next year. These will be 16-layer stacks, likely up to 32 GB per stack. Keeping the eight stacks commonly used in today's AI accelerators, that works out to 256 GB per card, versus the current maximum of 192 GB.
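The capacity math above can be sketched as a quick calculation. The die capacity and stack heights below are illustrative assumptions chosen to match the article's 192 GB and 256 GB figures, not confirmed specifications:

```python
# Rough per-accelerator capacity math for HBM (illustrative figures).
GB_PER_DIE = 2          # assumed capacity of one DRAM die in the stack
LAYERS_CURRENT = 12     # assumed stack height of current-generation HBM
LAYERS_HBM4 = 16        # 16-layer stacks, per the announcement
STACKS_PER_CARD = 8     # stack count commonly used on AI accelerators

def card_capacity_gb(layers, gb_per_die=GB_PER_DIE, stacks=STACKS_PER_CARD):
    """Total memory per accelerator: die capacity x stack height x stack count."""
    return gb_per_die * layers * stacks

print(card_capacity_gb(LAYERS_CURRENT))  # 192 GB, today's maximum
print(card_capacity_gb(LAYERS_HBM4))     # 256 GB with 16-layer HBM4
```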
SK Hynix, for its part, plans to release HBM4 only in 2026. To that end, it has announced a collaboration with TSMC to use CoWoS 2 packaging technology. There are no technical details on SK Hynix's memory yet, but it can be expected to be roughly in line with what its competitor will offer.