Samsung HBM4 launches soon for Nvidia AI chips, with limits
Industry reports bring good news for Samsung and Nvidia fans: the world's first HBM4 production is set to commence this February for next-gen AI chips.
According to Yonhap, Samsung will begin producing HBM4 for Nvidia AI chips after the Lunar New Year holidays. With this milestone, the Korean tech giant will become the first in the world to commercialize the new HBM.
Samsung's HBM4 reportedly exceeds the speeds required by Nvidia, and the company is using a more advanced fabrication process than its rivals. Nvidia is set to use HBM4 memory chips in its Vera Rubin AI accelerators.
Despite these advantages, Samsung holds only a mid-20% share of Nvidia's HBM4 supply. A larger portion, exceeding 50 percent, goes to SK Hynix, with Samsung as the runner-up, followed by the US-based memory maker Micron.
Korean outlet Hankyung reports that Samsung has secured a mid-20% share for HBM4. That is significantly higher than its share of the current HBM3E supply, where the company trailed its rivals, positioning it well for future partnerships.
An industry insider stated:
"Samsung, which has the world's largest production capacity and the broadest product lineup, has demonstrated a recovery in its technological competitiveness by becoming the first to mass-produce the highest-performing HBM4."