Samsung Electronics is facing challenges with its latest high bandwidth memory (HBM) chips, which have yet to pass Nvidia's rigorous tests due to issues with heat and power consumption, according to sources familiar with the matter. These problems are affecting Samsung’s fourth-generation HBM3 chips as well as the upcoming fifth-generation HBM3E chips.
Samsung’s HBM3 and HBM3E chips are designed for high-performance graphics processing units (GPUs) used in artificial intelligence applications. Despite being the world's largest memory chipmaker, Samsung has been trying to pass Nvidia's tests since last year, and results from tests in April showed that both its 8-layer and 12-layer HBM3E chips fell short of Nvidia's requirements.
Samsung said it is working closely with customers to optimize its HBM products but declined to comment on specific customers. After the initial Reuters report, Samsung disputed the claims, saying that testing was proceeding smoothly and that reports of heat and power consumption problems were untrue. Nvidia declined to comment on the matter.
HBM is crucial for processing the vast amounts of data generated by AI applications. With Nvidia holding roughly 80% of the global AI GPU market, winning Nvidia's approval for HBM chips is vital to manufacturers' growth and reputation.
Failing to meet Nvidia's standards risks leaving Samsung behind competitors SK Hynix and Micron Technology, both of which have already secured supply agreements with Nvidia. SK Hynix, which has supplied HBM3 chips to Nvidia since June 2022 and began shipping HBM3E chips in March, is widely seen as the technological leader in the HBM market.
Amid these challenges, Samsung has recently replaced the head of its semiconductor unit, signaling the company’s urgency in addressing its lag in the HBM sector. Despite these setbacks, Samsung continues to supply HBM3 to customers like Advanced Micro Devices (AMD) and plans to begin mass production of HBM3E chips in the second quarter.
Jeff Kim, head of research at KB Securities, noted that although market expectations were high for Samsung to quickly pass Nvidia's tests, it is natural for specialized products like HBM to undergo extensive performance evaluations. Investors have taken note of Samsung's slower progress relative to SK Hynix and Micron: Samsung's shares are down about 2% year-to-date, while SK Hynix's and Micron's stocks have surged 42% and 48%, respectively.
Nvidia and AMD are eager for Samsung to perfect its HBM chips, which would introduce more competition and reduce SK Hynix's pricing leverage. Nvidia CEO Jensen Huang's public endorsement of Samsung's 12-layer HBM3E at a recent AI conference underscored the potential for collaboration, provided Samsung can meet the required standards.
Research firm TrendForce anticipates that HBM3E chips will become the mainstream HBM product this year, with significant shipments expected in the second half of 2024. SK Hynix forecasts that demand for HBM chips will grow roughly 82% annually through 2027, underscoring how important it is for Samsung to resolve these issues promptly if it is to capitalize on the growing market.
As the competition intensifies, Samsung’s ability to refine its HBM technology and secure Nvidia's approval will be critical in maintaining its position in the high-stakes AI memory chip market.