Samsung has scored a major breakthrough with its HBM3E technology, gaining significant adoption from AMD in its latest AI accelerator, the Instinct MI350X. The development comes as a relief to Samsung's struggling HBM business, especially after its disappointing attempts to win NVIDIA's approval. While the company's foundry division has also been in a slump, AMD's decision to use Samsung's HBM3E 12-Hi stacks in its new AI accelerators could turn the tide for Samsung in the high-performance memory segment.
For months, there had been talk that NVIDIA might adopt Samsung's HBM, but after several rounds of qualification testing, Samsung failed to meet Team Green's rigorous standards.
That setback pushed the Korean giant further behind in the AI memory race. A recent announcement from AMD, however, has given Samsung new hope. AMD revealed that its latest AI GPUs, the Instinct MI350X and MI355X, will be powered by Samsung's HBM3E alongside memory from Micron. The GPUs carry 288 GB of HBM3E apiece, with Samsung's share expected to come from its 12-Hi stacks. This marks the first time AMD has officially confirmed working with Samsung on such a project, and the industry expects it to pay off.
Furthermore, AMD's plans to scale its AI offerings into rack-scale solutions should boost demand for HBM3E, benefiting Samsung even more in the long run. Samsung also plans to ramp up HBM4 production as the second half of the year approaches, further strengthening its position in the market, and it hopes to partner with AMD on the upcoming Instinct MI400 accelerator lineup, which is expected to use Samsung's HBM4 memory. The company aims to regain lost ground in the HBM market, where competitors SK hynix and Micron have made significant gains through their collaborations with NVIDIA. With the AMD partnership and its upcoming HBM4 solution, Samsung has a clear opportunity to win back its share of the market.