Big tech is grappling with a severe CPU shortage as AI demand surges. The scarcity has rippled across the semiconductor supply chain, creating a challenge and an opportunity for longtime CPU mainstay Intel.
Intel believes the CPU, once the cornerstone of modern computing, is poised for a resurgence that could shift the AI infrastructure market. CPUs could become the “indispensable foundation of the AI era,” Intel CEO Lip-Bu Tan said during a recent quarterly earnings call.
Intel’s nearly ubiquitous commercial computing CPUs remain an industry standard, but the company’s revenue has stalled, weighed down largely by the costly expansion of its custom foundry business.
As AI adoption evolves, Intel’s momentum is picking up. With the rise of agentic AI, the humble CPU is becoming increasingly vital for handling logic-driven tasks, such as autonomous decision-making, according to Intel executives.
Demand for Intel’s Xeon server CPUs is currently outpacing supply, and AI-related businesses represented 60% of revenue in the first quarter, with AI-related revenue growing 40% year over year, the company said during its Q1 2025 earnings call. Intel also reported that its first-quarter revenue was up 7% year over year.
AI infrastructure has generally depended on GPUs, which excel at the fast, parallel computing needed to train large AI models on vast datasets. Meanwhile, CPUs have played a supporting role, managing workloads and orchestrating data flow.
Over the last few years, the AI-driven GPU boom unseated Intel as the top player in the global semiconductor industry while Nvidia’s market share soared. As agentic AI usage spreads and inference workload volume rises, Intel could see its fortunes shifting once again.
“We don’t see CPUs ‘replacing’ existing AI infrastructure, but their importance and need are rising as AI moves from traditional chatbot-based inference to agentic inference, where multiple agents collaborate to reach outcomes,” Anil Nanduri, VP of AI Products and GTM at the Intel Data Center Group, told Channel Dive in an email statement. “This boom won’t be defined only by how much compute is added, but by how efficiently that compute is deployed across the stack and improves utilization of the overall datacenter for inference.”
Agentic AI will require CPU-intensive work, including multi-step orchestration, real-time reasoning and code generation at scale, according to Gartner VP analyst Gaurav Gupta.
“As this transition continues to take place, companies will invest in agentic infrastructure, and we'll see a growing prominence of CPUs,” Gupta told Channel Dive.
Brendan Burke, Research Director at Futurum, said cost-efficiency is also an important component of the CPU comeback.
“Using GPUs for streamlined tasks like inference and reinforcement learning rollouts is highly inefficient, because most GPU cores sit idle during that time,” Burke told Channel Dive. “So, you’re paying for a very large chip that isn’t being properly utilized. When there’s a tenfold price difference between a server CPU and a server GPU, and you can carry out similar operations on a CPU, it makes economic sense to use the CPUs available to carry out those low-level tasks, while reserving the GPUs for model training.”
Several tech giants have reinvigorated their own CPU projects in anticipation of surging demand. Gupta cited Nvidia’s upcoming Vera CPU; British semiconductor group Arm’s launch of its AGI CPU, with Meta as its first customer; and major ongoing investments by Amazon and Google in their proprietary chips.
Renewed focus on CPUs caught some in the tech industry by surprise. “I think the industry was unprepared for this re-prominence of CPUs,” Gupta said. “For the last couple years, the story was all about GPUs and everyone had sort of forgotten or taken CPUs for granted.”