
SK hynix Begins Mass Production of SOCAMM2 Optimized for Nvidia's Vera Rubin

Published: 2026-04-20 09:23:44
Updated: 2026-04-20 09:23:44
SK hynix SOCAMM2 192GB product. Provided by SK hynix.
[The Financial News] SK hynix said on the 20th that it will begin full-scale mass production of its SOCAMM2 192GB product, a next-generation memory module built on its 6th-generation (1c) 10-nanometer-class LPDDR5X low-power DRAM (1 nm equals one-billionth of a meter).
SK hynix has stabilized its mass-production setup ahead of schedule to meet demand from global cloud service provider customers. The new SOCAMM2 product is designed and optimized for Nvidia's next-generation Graphics Processing Unit (GPU) platform, Vera Rubin.
SOCAMM2 is a module that adapts low-power memory, which has mainly been used in mobile devices such as smartphones, for server environments. It is primarily used in next-generation Artificial Intelligence (AI) servers.
SK hynix explained that "the SOCAMM2 product built on the 1c process delivers more than twice the bandwidth of existing RDIMM and improves energy efficiency by more than 75%, making it an optimized solution for high-performance AI computing." RDIMM (Registered Dual In-line Memory Module) is a DRAM module for servers and workstations that adds a register component to relay address and command signals between the memory controller and the DRAM chips on the module.
SK hynix expects the product to fundamentally ease memory bottlenecks during the training and inference of massive AI models, significantly boosting overall system processing speed. It also said that wider adoption of SOCAMM2 could create a structure in which GPUs and High Bandwidth Memory (HBM) handle premium performance, while SOCAMM2 is responsible for optimizing the Total Cost of Ownership (TCO) of AI infrastructure.
SK hynix said SOCAMM2 is drawing attention as a next-generation memory solution because the AI market is shifting in earnest from training to inference, which demands low-power ways to run Large Language Models (LLMs).
Kim Joo-sun, President of AI Infrastructure and Chief Marketing Officer (CMO) at SK hynix, said, "By supplying the SOCAMM2 192GB product, we have set a new standard for AI memory performance." He added, "Based on close cooperation with global AI customers, we will establish ourselves as the AI memory solution company customers trust most."
Reporter Jo Eun-hyo (ehcho@fnnews.com)