Key Takeaways:
AMD recently launched its latest artificial intelligence chip, the Instinct MI325X, in a direct bid to challenge Nvidia’s dominance in the data center GPU market.
The new chip, which will begin production before the end of 2024, aims to attract developers and cloud providers who might see it as a competitive alternative to Nvidia’s AI products.
If successful, AMD’s entry could potentially put pricing pressure on Nvidia, which has maintained gross margins of around 75% due to high demand for its GPUs over the last year.
The growing demand for advanced generative AI technologies, like OpenAI’s ChatGPT, has driven companies to seek powerful data center GPUs, a market Nvidia has long dominated.
While AMD has historically held the second position in this market, the company is now strategically targeting a larger slice of an AI chip market that it projects will be worth $500 billion by 2028.
The event, however, did not showcase any new major cloud or internet customers for AMD’s Instinct GPUs. Previously, AMD disclosed that both Meta and Microsoft utilize its AI GPUs, and OpenAI employs them for some applications.
The MI325X reportedly puts AMD on an annual product release cadence, enabling it to compete more effectively with Nvidia and benefit from the AI chip boom. It also serves as the successor to AMD’s MI300X, which started shipping in late 2023.
This latest launch could draw attention from investors searching for companies positioned to profit from the rapid expansion of AI technologies.
While AMD’s stock has seen a modest 20% increase in 2024, Nvidia’s shares have surged over 175%, underscoring Nvidia’s hold on more than 90% of the AI data center GPU market. AMD’s shares fell by 4% following the announcement, while Nvidia’s rose by 1%.
One of AMD’s primary challenges in competing with Nvidia lies in the latter’s proprietary CUDA programming platform, which has become the industry standard among AI developers.
This has created a barrier for AMD, as developers are often deeply entrenched in Nvidia’s ecosystem. To counter this, AMD has been focusing on advancing its competing software platform, ROCm, to enable AI developers to transfer more of their models to AMD’s hardware.
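As a rough illustration of what that portability looks like in practice, the sketch below assumes a PyTorch build with ROCm support; on such builds, AMD GPUs are exposed through the same torch.cuda interface that Nvidia devices use, so device-agnostic code can run on either vendor’s hardware without CUDA-specific changes. The model and tensor sizes are illustrative placeholders, not anything tied to the MI325X itself.

```python
# Minimal sketch: device-agnostic PyTorch code that can run on either an
# Nvidia (CUDA) or an AMD (ROCm) GPU. On ROCm builds of PyTorch, AMD GPUs
# are surfaced through the same torch.cuda API, so the code below does not
# need to change when moving between vendors. Sizes are placeholders.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# torch.version.hip is set on ROCm builds; torch.version.cuda on CUDA builds.
backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA or CPU"
print(f"Running on {device} ({backend})")

# A small placeholder network standing in for a real model.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 10),
).to(device)

x = torch.randn(32, 1024, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape)  # torch.Size([32, 10])
```

Framework-level support of this kind is the point of AMD’s ROCm investment: if the mainstream AI frameworks run the same code on both vendors’ hardware, the switching cost for developers entrenched in Nvidia’s ecosystem drops considerably.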
AMD’s approach to AI hardware also highlights its competitive edge in inference, when AI models generate content or make predictions, rather than in training, which requires processing vast amounts of data to improve a model.
In addition to GPUs, AMD’s core business remains its central processors, or CPUs, which are integral to almost every server worldwide.
The company’s data center sales more than doubled year-over-year to $2.8 billion in the June quarter, with about $1 billion attributed to AI chips.
AMD currently holds around 34% of the data center CPU market, still trailing Intel, which dominates the segment with its Xeon processors. AMD is working to change this with its new 5th Gen EPYC CPUs, which were also announced at the event.
These processors are available in a range of configurations, from cost-effective, low-power 8-core chips priced at $527 to powerful 192-core chips intended for supercomputers at $14,813 each.
The latest CPUs from AMD are optimized in particular for AI workloads that require high data throughput, and they matter for AI deployments because nearly every GPU needs an accompanying CPU in the same system to operate.
Overall, AMD is positioning itself as a formidable contender in the AI chip market, offering alternatives to Nvidia and Intel with a strong focus on scalability and innovation in AI hardware.