NVIDIA And SK Hynix Build AI-Focused SSD Promising 10x Performance Boost

NVIDIA and South Korean memory giant SK hynix are teaming up to develop a powerful new solid-state drive (SSD) designed specifically to handle the heavy data demands of artificial intelligence (AI) inference — the stage where trained AI models generate real-time answers for users.

The companies are working on what they describe as a next-generation “AI SSD” that could deliver up to ten times the performance of today’s enterprise SSDs. Internally, the initiative is called “Storage Next” at NVIDIA and “AI-NP” (AI NAND Performance) at SK hynix.

SK hynix Vice President Kim Cheon-seong disclosed the project at Korea’s Artificial Intelligence Semiconductor Future Technology Conference, saying that the new SSD could eventually reach 100 million input/output operations per second (IOPS) — a level far beyond what current enterprise SSDs can achieve.
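To put the 100 million IOPS target in perspective, a quick back-of-envelope calculation shows the raw bandwidth it would imply, assuming 4 KiB per operation (a common random-I/O block size; the transfer size was not specified in the announcement):

```python
# Back-of-envelope: raw bandwidth implied by 100 million IOPS,
# assuming 4 KiB transfers (assumed block size, not from the announcement).
iops = 100_000_000
block_bytes = 4096  # 4 KiB per operation

bandwidth_gb_s = iops * block_bytes / 1e9
print(f"~{bandwidth_gb_s:.0f} GB/s at 4 KiB per operation")  # ~410 GB/s
```

That order of magnitude is closer to what HBM delivers today than to conventional SSD throughput, which is why the target is notable.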

Why AI Needs A New Kind Of SSD

The push stems from the rapid shift in AI computing from training models to running them in real-world applications, a process known as inference, where speed and efficiency are critical. Modern AI models rely on constant, low-latency access to enormous volumes of data, far more than traditional DRAM, or even high-bandwidth memory (HBM), can handle economically.

HBM, which SK hynix already supplies to NVIDIA, has played a critical role in NVIDIA's AI GPUs during training. But HBM is expensive, limited in capacity, and increasingly strained as AI models grow larger and more personalized.

The idea behind the new SSD is to turn NAND flash, traditionally used for storage, into something closer to a "pseudo-memory" layer optimized specifically for AI computation. By pairing advanced NAND chips with new controller architectures tuned for AI workloads, the SSD could feed data to GPUs far more efficiently than conventional storage, improving throughput, lowering energy consumption, and cutting the overall cost of running large-scale AI services.

Prototype Expected By 2026

The project is currently at the proof-of-concept (PoC) stage, with both companies aiming to have a working prototype ready by late 2026. If development stays on track, potential commercial deployment could follow in 2027.

SK hynix has framed the effort as part of a broader strategy to tailor memory and storage products directly to AI workloads, rather than relying on general-purpose designs. The company is also pursuing complementary technologies, including High Bandwidth Flash (HBF), developed in collaboration with SanDisk, which stacks NAND chips in a way similar to HBM, further blurring the line between memory and storage.

Market Impact And Supply Concerns

While the technology could significantly reduce AI inference costs and improve both AI performance and energy efficiency, analysts warn that a successful AI SSD may also increase pressure on global NAND flash supplies, potentially driving storage prices higher, much as has already happened with DRAM.

For NVIDIA and SK hynix, however, the goal of the project is clear: remove one of the biggest bottlenecks in AI infrastructure. As AI systems grow larger and more complex, storage is no longer just about capacity; it is becoming a critical performance component in its own right.

 

Kavita Iyer