Arm-based central processing units (CPUs) are on course to capture at least 90% of the host CPU market in custom artificial intelligence (AI) server infrastructure by 2029, up from approximately 25% in 2025, according to new research from global technology market intelligence firm Counterpoint Research. The projection signals one of the most consequential shifts in the semiconductor industry in a generation, as the world’s largest technology companies accelerate a departure from the Intel and AMD x86 processors that have dominated data centres for decades.
The transition, documented in Counterpoint Research’s latest Data Center AI Server Compute ASICs Shipment Forecast and Tracker from its High Performance Computing (HPC) service, has been building gradually but is now reaching a structural tipping point. The driving force is the rise of custom AI accelerators, or Application-Specific Integrated Circuits (ASICs), which require a new approach to server architecture that exposes the inefficiencies of legacy x86 designs.
Counterpoint Research Vice President Neil Shah said the shift is deliberate and commercially motivated rather than experimental. “The transition from x86 to Arm in AI servers is not a single switch. It has played out generation by generation, configuration by configuration. Hyperscalers are making deliberate choices based on their specific deployment needs, writing compatible and interoperable software, and the economics are very encouraging,” he said, adding that the transition is expected to “accelerate meaningfully in the second half of 2026.”
The economics are stark. Arm’s AGI CPU delivers more than twice the performance per rack compared with x86 platforms, and Arm projects up to $10 billion in capital expenditure savings per gigawatt of AI data centre capacity versus x86 alternatives. Counterpoint Research also notes that Arm-based CPUs have demonstrated up to twice the performance-per-watt of comparable x86 rack configurations, a critical advantage as hyperscalers seek to maximise compute density within fixed power envelopes.
The most significant recent development cementing this direction is Arm’s entry into production silicon for the first time in its more than three-decade history. Arm launched its AGI CPU, an Arm-designed CPU for AI data centres built to address a rising class of agentic AI workloads, with Meta serving as the lead partner and co-developer. Meta will deploy the Arm AGI CPU alongside its own custom Meta Training and Inference Accelerator (MTIA) chips. Meta confirmed it will also release its board and rack designs for the Arm AGI CPU under the Open Compute Project later this year, making the architecture available to the broader AI ecosystem.
Santosh Janardhan, head of infrastructure at Meta, said the partnership reflects how AI is fundamentally changing the structure of data centre hardware. “AI is reshaping how data center infrastructure is built and deployed at scale. Our collaboration with Arm to co-develop the Arm AGI CPU reflects the next phase of the Arm compute platform,” he said.
The shift extends well beyond Meta. For Google, the ramp-up of its Axion Arm-based CPU in its next-generation Tensor Processing Unit (TPU) infrastructure is described by Counterpoint Research as the most significant single event in the transition. Amazon Web Services has been shifting across successive Trainium generations, with its Arm-based Graviton processors playing a growing role in higher-density configurations. Microsoft has paired its Azure Cobalt Arm CPU with its Maia AI accelerator family from the outset.
Counterpoint Research Associate David Wu said the transition carries important implications for investors and technology strategists tracking which specific deployments are changing. “While x86 architectures currently maintain a significant presence in AI server infrastructure, our generation-by-generation analysis suggests this established stronghold is swiftly transitioning toward proprietary Arm-based designs. Understanding which specific hyperscaler and which ASIC generation is transitioning from x86 to Arm is where the actionable insight lives,” he said.
Arm estimates the AGI CPU will generate gross profits of approximately $500 per chip, compared with around $50 currently from selling intellectual property licences alone, a roughly tenfold increase in profit per socket. Arm projects that by 2031, AGI CPU sales will contribute $15 billion to its annual revenue, alongside $10 billion from its traditional intellectual property and compute subsystem business.
The shift also carries significant implications for the broader semiconductor supply chain, particularly for Taiwan Semiconductor Manufacturing Company, which manufactures both AI accelerator chips and Arm-based CPUs. As hyperscalers bring both layers of AI server hardware in-house simultaneously, advanced foundry capacity will face growing demand from two directions at once.
For Intel and AMD, the transition represents a meaningful threat to their server CPU businesses. Counterpoint’s Shah noted that the pressure is now on the x86 camp to protect its market share, whether through pricing, a better architecture, or software improvements.