Samsung Electronics (KS:005930) has initiated mass production of 8-layer High Bandwidth Memory 3 (HBM3) chips after they passed Nvidia's (NASDAQ:NVDA) qualification tests, according to a Seoul Economic Daily report citing unnamed industry sources.
The development boosts expectations that Samsung's more advanced HBM3E chips, which the company is eager to supply to Nvidia, will also meet the required standards.
HBM chips are playing a key role in the ongoing AI boom: their high memory bandwidth and efficiency enable faster data processing and better performance in AI applications. By giving processors rapid, low-latency access to large datasets, they support demanding workloads such as deep learning and neural network training.
If accurate, the report represents a significant step forward for Samsung. Earlier this year, Reuters reported that Samsung's HBM chips had run into heat and power consumption issues that prevented them from passing Nvidia's tests for use in the U.S. company's AI processors.
Samsung's domestic competitor SK Hynix has been supplying Nvidia with HBM3 chips since June 2022 and began shipping HBM3E chips in late March to an undisclosed customer that sources confirm is Nvidia.
Meanwhile, Micron (NASDAQ:MU), another major HBM manufacturer, has also announced plans to provide Nvidia with HBM3E chips.
Meeting Nvidia's standards is critical for HBM manufacturers, given that Nvidia holds about 80% of the global GPU market for AI applications. Qualification matters not only for manufacturers' reputations but also for driving their profit growth.