CPUs are the unsung heroes of AI

  • GPUs get all the love but CPUs are - and will continue to be - vital for AI
  • CPUs tend to take care of all the background work, like data preprocessing and workflow orchestration
  • Enterprises and hyperscalers are wising up to the importance of CPUs, with an upgrade cycle now in motion

The AI industry seemingly hangs on Nvidia’s every word, fawning as the dominant GPU player releases a new chip every year. But GPUs aren’t the only silicon essential to AI. It turns out good old CPUs – while less sexy – are the workhorses turning the wheels behind the scenes.

“CPUs are doing critical—but less visible—work managing the pipeline, feeding data to GPUs and scaling distributed AI systems,” Matt Eastwood, research SVP at IDC, told Fierce. “CPUs are essential for orchestrating agentic workflows, handling control logic, and powering edge and hybrid deployments.”

Eastwood’s comments mirror those made by Dan McNamara, SVP and GM of AMD’s Server Business Unit. McNamara told Fierce that CPUs will become especially important with the rise of agentic AI, which will require access to, coordination of and preprocessing of data.

“It’s all about data processing and moving data,” he explained. “You’ve got workloads like ETL – extract, transform and load – that’s all on CPUs.”
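The ETL pattern McNamara describes can be sketched in a few lines. This is a minimal, hypothetical illustration – all of the data and names below are invented – of the kind of sequential, CPU-bound preprocessing that typically runs before data ever reaches a GPU:

```python
# Minimal ETL sketch: extract raw data, transform it on the CPU,
# load the cleaned records into a downstream store.
import csv
import io

# Extract: read a raw source (here, an in-memory CSV for illustration)
raw = "user,score\nalice,91\nbob,87\n"
rows = list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    # Transform: normalize fields and cast types -- sequential CPU work
    return [{"user": r["user"].title(), "score": int(r["score"])} for r in rows]

def load(records, store):
    # Load: append cleaned records to the destination store
    store.extend(records)

store = []
load(transform(rows), store)
print(store)  # [{'user': 'Alice', 'score': 91}, {'user': 'Bob', 'score': 87}]
```

Each stage runs one record at a time on general-purpose cores – exactly the sort of sequential, branching data work that CPUs handle well and GPUs do not.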

Divvying up the AI work

Per IDC, AMD is one of the leading players in the CPU market alongside Intel, with Ampere, AWS’ Graviton and Apple’s silicon positioned as emerging competitors. So, it makes sense that McNamara would push this pro-CPU narrative.

But both Eastwood and McNamara told Fierce the same thing: CPUs are vital to the future of AI. Sure, GPUs are better at parallel processing, which is why they shine when it comes to model training and real-time inferencing. But there are still plenty of data-based sequential processing tasks that need to be done for AI as well.

Asked specifically about what AI-related tasks CPUs are best suited for, McNamara pointed to data preprocessing, memory management, data movement and integration and running both smaller models and edge inferencing workloads.

“GPUs tend to get all the attention because they enable headline-grabbing performance gains in model training and inference,” Eastwood said. But “CPUs will remain foundational to AI infrastructure.”

Apparently even the hyperscalers – some of the biggest GPU customers – have realized this. McNamara said some enterprises have begun to revamp their CPU holdings, adding that AMD is also seeing strong uptake of its Turin processors among cloud players.

“Refresh cycles have elongated, but we’re in the midst of one right now,” he concluded.