Samsung Leads the Charge with Unprecedented 12-Stack HBM3E for AI!

  • Editor
  • February 27, 2024

Samsung Electronics, a global leader in memory technology, announced on Tuesday the development of its 12-stack HBM3E memory, which boasts the industry's highest capacity to date.

This cutting-edge chip enhances both performance and capacity by over 50%, addressing the growing demand for high-capacity, high-bandwidth memory (HBM) from AI service providers.

Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics, highlighted the chip’s design, tailored to meet the expanding needs of the AI industry.

“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” Bae said.

Bae emphasized Samsung’s commitment to technological leadership in the high-capacity HBM market, a crucial component for powering advanced generative AI models like OpenAI’s ChatGPT.

“This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era,” said Bae.

The demand for such high-performance memory chips is surging as generative AI models require significant memory to effectively process and recall vast amounts of data, facilitating humanlike interactions.

Samsung’s HBM3E 12H chip is expected to be a pivotal solution for future systems that demand more memory, offering superior performance and capacity.

“As AI applications grow exponentially, the HBM3E 12H is expected to be an optimal solution for future systems that require more memory. Its higher performance and capacity will especially allow customers to manage their resources more flexibly and reduce total cost of ownership for data centers,” Samsung Electronics said in a statement.

Mass production of the chip is scheduled for the first half of 2024, and samples are already being distributed to customers.

SK Kim, Executive Director of Daiwa Securities, anticipates that this development will positively impact Samsung’s share price, noting Samsung’s strategic move to secure leadership in the high-density HBM3E product segment for Nvidia.

This demand is part of a broader AI boom, with Nvidia, a U.S. chip designer, reporting a 265% increase in revenue in the fourth fiscal quarter, driven by the growing need for its graphics processing units to support AI applications.

Samsung's 12-stack HBM3E memory features a 12-layer stack yet maintains the same height specification as 8-layer chips, thanks to advanced thermal compression non-conductive film (TC NCF) technology.

This innovation packs more capacity into the chip without increasing its physical footprint. Samsung has also achieved the industry's smallest gap between chips, at seven micrometers (µm), and improved vertical density by over 20% compared with its HBM3 8H product.
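The capacity arithmetic behind the "over 50%" figure is straightforward. As a rough sketch, assuming each DRAM die in the stack is 24 Gb (3 GB), as reported for HBM3E:

```python
# Back-of-the-envelope check of the capacity claim.
# Assumption: each DRAM layer is a 24-gigabit (3 GB) die.
DIE_GB = 24 / 8  # 24 Gb per die = 3 GB

def stack_capacity_gb(layers: int) -> float:
    """Total capacity of an HBM stack with the given number of DRAM layers."""
    return layers * DIE_GB

hbm3e_8h = stack_capacity_gb(8)    # 24 GB
hbm3e_12h = stack_capacity_gb(12)  # 36 GB
gain = hbm3e_12h / hbm3e_8h - 1    # 0.5, i.e. the 50% capacity increase
```

Going from 8 layers to 12 yields 12/8 = 1.5x the capacity, matching the 50% improvement Samsung cites.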

For more AI news and trends, visit the news section of our website.


Dave Andre


Digital marketing enthusiast by day, nature wanderer by dusk. Dave Andre blends two decades of AI and SaaS expertise into impactful strategies for SMEs. His weekends? Lost in books on tech trends and rejuvenating on scenic trails.
