NVIDIA Plans Up to 800,000 SOCAMM Memory Modules for AI Products in 2025

NVIDIA is planning to roll out up to 800,000 units of its new modular SOCAMM memory in 2025 for use in its AI products. The modular design is meant to make upgrading memory on devices much simpler while promising improved performance and power efficiency.

The SOCAMM memory, first introduced at the NVIDIA GTC event, is part of the company’s effort to boost performance and reduce power consumption in its AI-focused products.

At the GTC event, NVIDIA showcased its GB300 platform using SOCAMM memory developed by Micron. The new memory contrasts with the HBM and LPDDR5X commonly used in AI products such as servers and mobile devices. SOCAMM is based on LPDDR DRAM, which is typically found in mobile devices, but with a key difference: it is upgradeable. Unlike HBM and LPDDR5X, SOCAMM isn’t soldered to the PCB; it is securely attached with just three screws.

According to reports from Korean outlet ETNews, NVIDIA plans to produce between 600,000 and 800,000 units of SOCAMM memory this year, marking a significant push for its deployment across various AI products. The first product to use SOCAMM memory is the GB300 Blackwell platform, likely signaling NVIDIA’s move toward adopting this modular memory solution in many future AI devices.

While the production goal of 800,000 units in 2025 is relatively small compared to the volume of HBM memory expected to be shipped to NVIDIA over the same period, SOCAMM production is expected to ramp up next year, especially with the upcoming SOCAMM 2 memory. The modular format is not only compact but also significantly more energy-efficient than RDIMM solutions. Exact power-efficiency gains have not been detailed, but reports suggest SOCAMM will also deliver higher bandwidth than RDIMM, LPDDR5X, and LPCAMM, which are among the leading memory types in today’s server and mobile platforms.

SOCAMM memory is expected to deliver bandwidth in the range of 150 to 250 GB/s, offering a flexible and upgradeable option for AI PCs and servers, and it is anticipated to become the standard for low-power AI devices in the near future. Micron is the current manufacturer of these modules, but sources indicate that NVIDIA is in discussions with Samsung and SK Hynix to potentially expand production capacity.
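For context, peak memory bandwidth figures like these are usually derived from the interface’s data rate and bus width (bandwidth ≈ data rate × bus width ÷ 8). The sketch below illustrates that arithmetic; the 128-bit bus width and LPDDR5X data rates in it are assumptions for illustration, not figures from the report.

```python
# Illustrative sketch only: peak theoretical bandwidth of an LPDDR-based
# interface is data rate (MT/s) x bus width (bits) / 8, converted to GB/s.
# The 128-bit bus width and data rates below are assumptions, not specs
# reported for SOCAMM.

def peak_bandwidth_gbps(data_rate_mtps: int, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given data rate and bus width."""
    return data_rate_mtps * bus_width_bits / 8 / 1000

# Example: a hypothetical 128-bit LPDDR5X interface at two common data rates.
for rate in (8533, 9600):
    print(f"LPDDR5X-{rate}, 128-bit bus: {peak_bandwidth_gbps(rate, 128):.1f} GB/s")
# LPDDR5X-8533, 128-bit bus: 136.5 GB/s
# LPDDR5X-9600, 128-bit bus: 153.6 GB/s
```

On those assumed figures, a single 128-bit LPDDR5X interface lands just under the low end of the reported 150 to 250 GB/s range.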
