Advanced Micro Devices (NASDAQ:AMD) unveiled its latest artificial intelligence processor and presented an AI chip development roadmap for the next two years.
AMD CEO Lisa Su introduced the AMD Instinct MI325X accelerator, which will be released in the fourth quarter of 2024, at the Computex 2024 show in Taipei. The company also unveiled the AMD Instinct MI350 series accelerators, which are based on the AMD CDNA 4 architecture. These accelerators are scheduled for release in 2025 and are expected to deliver a 35x generational improvement in AI inference performance.
Additionally, AMD previewed the 5th generation AMD EPYC server processors, scheduled for release in the second half of 2024. AMD also introduced its third generation of AI-enabled mobile processors, the AMD Ryzen AI 300 Series, along with the AMD Ryzen 9000 Series processors for laptops and desktop PCs, respectively.
AMD detailed its expanded multi-generation accelerator roadmap, showing plans to deliver year-over-year performance and memory leadership for generative AI.
NVIDIA (NVDA) is also moving to shorten its release cycle to an annual cadence. On Sunday, Nvidia announced plans to upgrade its AI accelerators with the launch of its Blackwell Ultra chips next year and a next-generation platform called Rubin in 2026.
“With our updated annual cadence of products, we deliver the leadership features and performance that the AI industry and our customers expect, driving the next generation of advancements in data center AI training and inference at a relentless pace of innovation,” said Brad McCredie, Vice President, Data Center Accelerated Computing, AMD.
McCredie added that the Instinct MI300X accelerator continues to be adopted by partners and customers, including Microsoft (MSFT) for its Azure cloud platform, as well as Meta Platforms (META), Dell Technologies (DELL), Hewlett Packard Enterprise (HPE), and Lenovo.
The company said the AMD Instinct MI400 series, based on an architecture called AMD CDNA “Next,” will launch in 2026, bringing additional performance and efficiency for inference and large-scale AI training.
The MI325X accelerator provides 288GB of HBM3E memory. Meanwhile, the AMD Instinct MI350X accelerator, the first product in the AMD Instinct MI350 series, is built on an advanced 3nm process technology and also features up to 288GB of HBM3E memory.
The race to develop generative AI products has increased demand for the advanced chips used in AI data centers. Companies developing large language models (LLMs) rely on these advanced processors because training such models requires enormous computational power.