
AMD Launches AI Chip to Compete with Nvidia’s Blackwell!

  • October 11, 2024 (Updated)

Key Takeaways:

  • AMD has launched the Instinct MI325X AI chip aimed at competing with Nvidia’s Blackwell series, with shipments expected early next year.
  • AMD’s MI325X chip could potentially disrupt Nvidia’s market share by offering developers an alternative that may pressure Nvidia’s pricing.
  • Despite AMD’s advancements, Nvidia’s CUDA ecosystem poses a challenge for AMD as it tries to entice AI developers to its platform.
  • AMD is enhancing its ROCm software to make switching easier for developers accustomed to Nvidia’s CUDA framework, with a focus on performance for content creation and predictive AI models.

AMD recently launched its latest artificial intelligence chip, the Instinct MI325X, in a direct bid to challenge Nvidia’s dominance in the data center GPU market.

The new chip, which will begin production before the end of 2024, aims to attract developers and cloud providers who might see it as a competitive alternative to Nvidia’s AI products.

If successful, AMD’s entry could potentially put pricing pressure on Nvidia, which has maintained gross margins of around 75% due to high demand for its GPUs over the last year.

The growing demand for advanced generative AI technologies, like OpenAI’s ChatGPT, has driven companies to seek powerful data center GPUs, a market Nvidia has long dominated.

While AMD has historically held the second position in this market, the company is now strategically targeting a larger slice of an AI chip market it expects to be worth $500 billion by 2028.

“AI demand has actually continued to take off and exceed expectations. It’s clear that the rate of investment is continuing to grow everywhere,” AMD CEO Lisa Su stated at the announcement event.

The event, however, did not showcase any new major cloud or internet customers for AMD’s Instinct GPUs. Previously, AMD disclosed that both Meta and Microsoft utilize its AI GPUs, and OpenAI employs them for some applications.

The MI325X reportedly moves AMD to an annual product release cadence, positioning it to compete more effectively with Nvidia and benefit from the AI chip boom. It succeeds AMD’s MI300X, which started shipping in late 2023.

This latest launch could draw attention from investors searching for companies positioned to profit from the rapid expansion of AI technologies.

While AMD’s stock has seen a modest 20% increase in 2024, Nvidia’s shares have surged over 175%, underscoring Nvidia’s stronghold on more than 90% of the AI data center GPU market. AMD’s shares fell by 4% following the announcement, while Nvidia’s rose by 1%.

One of AMD’s primary challenges in competing with Nvidia lies in the latter’s proprietary CUDA programming platform, which has become the industry standard among AI developers.

This has created a barrier for AMD, as developers are often deeply entrenched in Nvidia’s ecosystem. To counter this, AMD has been focusing on advancing its competing software platform, ROCm, to enable AI developers to transfer more of their models to AMD’s hardware.
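To illustrate why ROCm lowers that switching barrier, here is a minimal sketch (an assumption-laden example, not from AMD or the article): PyTorch’s ROCm builds expose AMD GPUs through the same torch.cuda API that developers already use on Nvidia hardware, so much existing model code can run unchanged once a ROCm build of PyTorch is installed.

# Minimal illustrative sketch: the same PyTorch code path works on Nvidia (CUDA)
# and AMD (ROCm) builds, because ROCm maps AMD GPUs onto the torch.cuda API via HIP.
import torch

# On a ROCm build of PyTorch, this returns True when an AMD Instinct GPU is present.
if torch.cuda.is_available():
    device = torch.device("cuda")   # resolves to the AMD GPU under ROCm
else:
    device = torch.device("cpu")

# Model and tensor code written for Nvidia GPUs runs without modification.
model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)
y = model(x)
print(y.shape, device)

The design point is that developers do not have to rewrite their CUDA-era PyTorch code; ROCm’s compatibility layer handles the hardware difference underneath the familiar API.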

AMD’s approach to AI hardware also highlights its competitive edge in inference scenarios, where AI models generate content or make predictions, rather than in training models on massive datasets.

The advanced memory in the MI325X chip, for example, allows it to handle models like Meta’s Llama faster than some of Nvidia’s chips, with AMD claiming, “What you see is that the MI325 platform delivers up to 40% more inference performance than the H200 on Llama 3.1.”

In addition to GPUs, AMD’s core business remains its central processing units (CPUs), which are integral to almost every server worldwide.

The company’s data center sales more than doubled year-over-year to $2.8 billion in the June quarter, with about $1 billion attributed to AI chips.

AMD currently holds around 34% of the data center CPU market, still trailing Intel, which dominates the segment with its Xeon processors. AMD is working to change this with its new EPYC 5th Gen CPUs, which were also announced at the event.

These processors are available in a range of configurations, from cost-effective, low-power 8-core chips priced at $527 to powerful 192-core chips intended for supercomputers at $14,813 each.

The latest CPUs from AMD are particularly optimized for AI workloads that require high data throughput, and nearly all GPUs depend on an accompanying CPU to operate.

As Su explained, “Today’s AI is really about CPU capability, and you see that in data analytics and a lot of those types of applications.”

Overall, AMD is positioning itself as a formidable contender in the AI chip market, offering alternatives to Nvidia and Intel with a strong focus on scalability and innovation in AI hardware.

For more news and insights, visit AI News on our website.

Midhat Tilawat is endlessly curious about how AI is changing the way we live, work, and think. She loves breaking down big, futuristic ideas into stories that actually make sense—and maybe even spark a little wonder. Outside of the AI world, she’s usually vibing to indie playlists, bingeing sci-fi shows, or scribbling half-finished poems in the margins of her notebook.
