
Samsung Secures Nvidia’s Approval for a Lower-Tier AI Memory!

  • Editor
  • January 31, 2025
    Updated

Key Takeaways

  1. Samsung’s 8-layer HBM3E chips were certified by Nvidia in December 2024, marking a step forward for the company in the AI chip market.
  2. These chips are intended for Nvidia’s AI processors in China, complying with U.S. export restrictions on advanced AI technology.
  3. SK Hynix continues to lead in AI memory technology, supplying 12-layer HBM3E chips, which are used in Nvidia’s most powerful AI accelerators.
  4. Samsung’s stock fell 2%, while SK Hynix shares dropped nearly 12%, reflecting concerns about future AI chip demand and competition.
  5. Samsung is investing heavily in R&D and restructuring its engineering teams to compete with SK Hynix in next-generation HBM4 memory, expected to launch in late 2025.

Samsung Electronics has secured a major approval from Nvidia, allowing it to supply its 8-layer HBM3E high-bandwidth memory (HBM) chips for Nvidia’s AI processors in China.

While this marks an important breakthrough for Samsung, it does not yet position the company as a leader in high-bandwidth AI memory.

SK Hynix continues to dominate the market with its more advanced AI memory technology.

Nvidia’s Approval: A Step Forward for Samsung, But With Limits

Citing people familiar with the matter, Bloomberg reported: “Samsung Electronics Co. has obtained approval to supply its high-bandwidth memory chips to Nvidia Corp.”

This means that Samsung is now an official supplier of HBM (high-bandwidth memory) to Nvidia, which dominates the AI chip industry with its GPU accelerators used in AI training and inference workloads.

However, it’s important to note that these chips are specifically for Nvidia’s AI processors targeting China rather than its flagship AI products used globally.

“Nvidia has approved a version of Samsung Electronics’ fifth-generation advanced high-bandwidth memory chips for artificial intelligence applications,” Bloomberg reported on Friday.

This strategic decision is largely driven by U.S. government restrictions, which limit Nvidia’s ability to export its most advanced AI chips to China.

Samsung vs. SK Hynix: The AI Memory Race Continues

While Nvidia’s approval is a positive step for Samsung, it does not immediately challenge SK Hynix’s dominance.

The report notes: “Nvidia’s approval has been long in the works, as Samsung races to catch up to fellow Korean chipmaker SK Hynix Inc., Nvidia’s go-to partner for supplying the most advanced HBM to pair with its AI chips.”

SK Hynix has been ahead of Samsung in both technological development and mass production.

  • First to mass-produce 8-layer HBM3E in early 2024.
  • Started shipping 12-layer HBM3E chips for Nvidia’s flagship AI accelerators at the end of 2024.

This means that Nvidia’s most powerful AI systems, which rely on high-performance memory, still depend on SK Hynix for their 12-layer HBM3E chips.

In contrast, Samsung’s newly approved 8-layer HBM3E chips are for less powerful AI applications, mainly in China.

Market Reaction: How Investors Responded

The stock market reacted to the news with caution.

  • Samsung’s stock fell by around 2%, reflecting investor concerns that the approval may not significantly boost its AI memory business.
  • SK Hynix shares dropped nearly 12%, possibly due to fears of future competition, as well as broader concerns about AI chip demand.

Micron Technology, another AI memory competitor, also saw its stock decline, signaling industry-wide uncertainty.

Samsung’s Plan to Catch Up: Investing in HBM4

Despite trailing behind SK Hynix, Samsung is not backing down. The company is aggressively improving its AI memory technology and competing for next-generation contracts.

Samsung’s chip division chief, Jun Young-hyun, has already acknowledged delays in securing Nvidia’s approval.

In response, Samsung has taken several key steps:

  1. Reorganizing its engineering teams to improve efficiency.
  2. Increasing research and development (R&D) investments in high-bandwidth memory.
  3. Developing next-generation HBM4, which is expected to be a critical component in AI chips launching in late 2025.

The report adds: “Under Jun’s leadership, Samsung has reorganized its team of engineers and ratcheted up research and development expenditure, hoping to reverse its market position with the next generation of HBM chips, or HBM4.”

What’s Next? The Race for HBM4

The real battle in AI memory is just beginning.

Both Samsung and SK Hynix are now focusing on HBM4, the next-generation high-bandwidth memory that will power AI accelerators launching in 2025 and beyond.

The winner of this race will secure multi-billion-dollar contracts with AI chipmakers like Nvidia, AMD, and Intel.

For now, SK Hynix remains ahead, but Samsung is making strategic investments to close the gap.

Samsung’s Nvidia approval is a significant milestone, but it does not immediately change the competitive landscape.

  • SK Hynix continues to dominate Nvidia’s AI memory supply with its 12-layer HBM3E chips.
  • Samsung’s approved chips are for Nvidia’s China-focused AI processors, rather than its flagship AI chips.
  • The next major competition will be in HBM4, where Samsung hopes to close the gap and challenge SK Hynix more directly.

While this news is a step in the right direction, Samsung still has work to do before it can fully compete with SK Hynix in the AI memory industry.

January 23, 2025: Samsung Debuts AI Smartphones With Qualcomm Chips and Slimmer Designs!

January 23, 2025: SK Hynix Outpaces Samsung in Profits for the First Time Amid AI Surge!

January 21, 2025: Samsung Galaxy Unpacked 2025: Galaxy S25 and AI Innovations Await!

For more news and insights, visit AI News on our website.


