AMD has announced the launch of its latest AI chip, the Instinct MI325X, marking a bold move to challenge Nvidia’s dominance in the data center GPU market. The new chip, set to begin production by the end of 2024, aims to compete directly with Nvidia’s highly sought-after GPUs, which have become essential for building and training advanced AI models.
As AI technologies like OpenAI’s ChatGPT continue to drive massive demand for data center processing power, AMD sees an opportunity to capture a significant share of the booming market. The AI chip sector, projected to be worth approximately $500 billion by 2028, is poised for tremendous growth, and AMD is positioning itself to play a major role in this space.
AMD CEO Lisa Su highlighted the rapidly growing demand for AI technology, stating, “AI demand has exceeded expectations, and investment continues to grow across the board.” Although AMD did not reveal any new major cloud customers at the launch event, it has previously disclosed partnerships with Meta and Microsoft for its AI chips, and OpenAI utilizes AMD’s products for certain applications.
The new MI325X chip is designed to excel at inference, the stage where trained AI models generate content or make predictions, largely thanks to its advanced memory capabilities. AMD claims that its chip outperforms Nvidia’s H200 GPU by up to 40% when running Meta’s Llama 3.1 AI model, a significant advantage for certain AI workloads.
While Nvidia continues to hold more than 90% of the data center AI chip market, AMD’s new chip and its ROCm software ecosystem aim to make it easier for AI developers to transition away from Nvidia’s proprietary CUDA programming platform. This move could help AMD attract developers and companies looking for alternatives to Nvidia’s hardware.
AMD’s strategy includes a faster product release schedule, with plans to roll out new chips annually. Following the MI325X, AMD will introduce the MI350 in 2025 and the MI400 in 2026, in a bid to keep pace with Nvidia’s aggressive development schedule, including its upcoming Blackwell chips.
In addition to its AI-focused GPUs, AMD is doubling down on its core CPU business. The company announced its 5th generation EPYC CPUs, designed for data centers and AI workloads. These processors range from cost-effective 8-core models to 192-core, high-power units for supercomputers, positioning AMD to compete with Intel’s Xeon line.
With AI chips accounting for roughly $1 billion of its $2.8 billion in data center sales in the June quarter, AMD continues to challenge both Nvidia and Intel in this rapidly evolving market.