Agentic AI Sparks a CPU Comeback as Substrate Orders Surge

AI CPU Demand Is Starting to Reshape the Semiconductor Supply Chain, Ibiden Forecasts

The rise of agentic AI computing is no longer just a story about GPUs and custom accelerators. Demand is now beginning to ripple through the broader semiconductor supply chain, including the market for CPU package substrates used in servers.

Japanese printed circuit board and package substrate supplier Ibiden has indicated that it expects stronger demand for products tied to general-purpose servers in the coming years. The company’s latest financial outlook suggests that AI workloads are pushing the industry toward a wider mix of chips, with CPUs becoming increasingly important alongside GPUs and ASICs.

Ibiden plays a key role in the upstream chip supply chain through its Electronics segment, which manufactures integrated circuit package substrates. These substrates act as the critical connection between a chip and the printed circuit board, making them essential for modern processors used in AI servers, data centers, and high-performance computing systems.

The company supplies substrates for several categories of advanced chips, including AI GPUs, server CPUs, and application-specific integrated circuits, commonly known as ASICs. ASICs are custom chips designed for specific workloads and are increasingly used by major technology companies to improve AI performance, efficiency, and cost control.

Ibiden’s latest financial materials show that the company is preparing for a stronger AI-driven growth cycle. For its fiscal year 2026, which ends in March 2027, Ibiden now expects revenue from its Electronics segment to reach 330 billion Japanese yen, up from its previous forecast of 310 billion yen.
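To put the revision in perspective, a quick back-of-the-envelope calculation (using only the two figures reported above) shows the size of the upgrade:

```python
# Illustrative arithmetic only, based on the figures cited in the article.
previous = 310  # billion yen, prior Electronics segment forecast
revised = 330   # billion yen, updated forecast

upgrade_pct = (revised - previous) / previous * 100
print(f"Forecast upgrade: {upgrade_pct:.1f}%")
```

That works out to roughly a 6.5 percent increase over the earlier guidance.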

The upgraded forecast rests on expected demand for general-purpose server products and switching IC products. The general-purpose server category is especially notable because it is closely tied to server CPUs, which are expected to play a larger role as AI systems evolve.

One of the biggest shifts highlighted by Ibiden is the movement from AI training toward AI “intelligence,” a term that points to more inference, reasoning, autonomous agents, and real-time decision-making workloads. While GPUs remain central to AI model training, CPUs are becoming increasingly relevant for managing complex AI workflows, coordinating tasks, handling data movement, and supporting agentic AI systems in large-scale server environments.

This shift could make server CPUs a more important part of future AI infrastructure spending. As AI moves beyond training massive models and into practical deployment across enterprise systems, cloud platforms, and automated digital agents, demand for balanced server architectures may increase. That means more CPUs, more switching chips, more custom silicon, and more advanced package substrates.

Ibiden expects its production load in calendar year 2026 to reach 1.8 times its 2024 level. Looking further ahead, the company forecasts that load could climb to 2.4 times by 2028. Growth is expected to come from AI servers, ASICs, and server CPUs, while demand connected to PCs is expected to weaken.
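The multiples above imply steep annualized growth. A short sketch (assuming the 1.8x and 2.4x targets are measured against the same 2024 baseline, as the article states) converts them into compound annual growth rates:

```python
# Illustrative arithmetic: annualized growth implied by Ibiden's
# production-load targets, expressed as multiples of the 2024 level.
load_2026 = 1.8  # times the 2024 level, two years out
load_2028 = 2.4  # times the 2024 level, four years out

cagr_2026 = load_2026 ** (1 / 2) - 1  # compound rate over 2024 -> 2026
cagr_2028 = load_2028 ** (1 / 4) - 1  # compound rate over 2024 -> 2028
print(f"Implied annual growth, 2024-2026: {cagr_2026:.1%}")
print(f"Implied annual growth, 2024-2028: {cagr_2028:.1%}")
```

In other words, the 2026 target implies production load growing at roughly 34 percent per year, while the longer 2028 target implies a still-brisk pace of about 24 percent per year.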

The forecast is another sign that the AI hardware boom is broadening. For much of the recent AI cycle, attention has focused heavily on graphics processors and high-bandwidth memory. However, the supply chain is now showing signs of demand spreading into other critical components, including CPU substrates, switching ICs, and custom accelerator packaging.

This matters because package substrates are not optional components. As chips become larger, faster, and more power-hungry, substrate technology becomes increasingly important for performance, reliability, and manufacturing scale. Any rise in demand for AI servers can quickly translate into stronger demand for the materials and components that support advanced processors.

The broader takeaway is clear: AI infrastructure is becoming more diverse. GPUs remain essential, but CPUs and ASICs are gaining importance as companies build systems capable of supporting training, inference, automation, and agentic AI workloads. Ibiden’s outlook suggests that this evolution is already being felt upstream, long before finished AI servers reach data centers.

If these trends continue, the next stage of AI hardware growth may not be defined by one chip category alone. Instead, it could be driven by a combination of server CPUs, AI accelerators, custom ASICs, networking chips, and the advanced substrates required to connect them all.