CSP CapEx to Soar Past US$520 Billion in 2026, Driven by GPU Procurement and ASIC Development
October 13, 2025 | TrendForce | Estimated reading time: 2 minutes

TrendForce’s latest investigations reveal that the rapid expansion of AI server demand is propelling major global cloud service providers (CSPs), namely Google, AWS, Meta, Microsoft, Oracle, Tencent, Alibaba, and Baidu, to boost investments in NVIDIA’s rack-scale GPU solutions, data center expansion, and in-house AI ASIC design. Total CapEx from these eight CSPs is expected to surpass US$420 billion in 2025, roughly equivalent to their combined spending in 2023 and 2024, marking a 61% YoY increase.
TrendForce predicts that as deployments of rack-scale solutions such as GB/VR systems grow in 2026, total CapEx from the eight CSPs will hit a new peak, surpassing US$520 billion with a 24% YoY increase. Investment priorities are also shifting from assets that generate immediate revenue toward shorter-lived infrastructure such as servers and GPUs, signaling a strategic emphasis on long-term competitiveness and market share over near-term profits.
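As a quick cross-check of the rounded figures above, the short sketch below back-calculates the implied 2024 base and 2023 spend from the stated totals and growth rates; the derived values are illustrative approximations, not TrendForce-reported data.

```python
# Back-of-the-envelope check of the CapEx figures quoted above.
# Reported (rounded) values come from the article; derived values are
# implied approximations, not reported figures. Amounts in US$ billions.

capex_2025 = 420            # reported: 2025 total CapEx of the eight CSPs (>US$420B)
yoy_2025 = 0.61             # reported: 61% YoY growth in 2025

capex_2024 = capex_2025 / (1 + yoy_2025)   # implied 2024 base, roughly US$261B
capex_2023 = capex_2025 - capex_2024       # implied 2023 spend if 2023 + 2024 roughly equals 2025

capex_2026 = 520            # reported: 2026 total CapEx (>US$520B)
yoy_2026 = capex_2026 / capex_2025 - 1     # implied 2026 growth, roughly 24%

print(f"Implied 2024 CapEx: ~US${capex_2024:.0f}B")
print(f"Implied 2023 CapEx: ~US${capex_2023:.0f}B")
print(f"Implied 2026 YoY growth: ~{yoy_2026:.0%}")
```

The implied 2026 growth of roughly 24% matches the rate cited above, and the implied 2023 and 2024 totals sum back to approximately the 2025 level, consistent with the article's framing.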
In 2025, NVIDIA’s GB200/GB300 rack systems are poised to become the main deployment targets for CSPs, driven by higher-than-anticipated demand growth. Beyond North America’s four leading CSPs and Oracle, emerging customers such as Tesla/xAI, CoreWeave, and Nebius are increasing their purchases for AI cloud leasing and generative AI workloads. In 2026, CSPs are likely to shift from GB300 racks to the new NVIDIA Rubin VR200 rack platform in the second half of the year.
Custom AI chip production continues to scale up
North America’s four leading CSPs are increasing their investments in AI ASICs to improve autonomy and manage costs for large-scale AI and LLM workloads. Google is partnering with Broadcom on the TPU v7p (Ironwood), a platform optimized for training that is set to scale up in 2026 and succeed the TPU v6e (Trillium). TrendForce predicts Google’s TPU shipments will remain the highest among CSPs, with more than 40% annual growth in 2026.
AWS is prioritizing its Trainium v2 chips, with a liquid-cooled rack version expected by late 2025. Trainium v3, developed jointly with Alchip and Marvell, is planned for mass production in early 2026. AWS’s ASIC shipments are anticipated to more than double in 2025, the fastest growth among major CSPs, followed by a further increase of roughly 20% in 2026.
Meta is deepening its partnership with Broadcom, with MTIA v2 set for mass production in Q4 2025 to boost inference efficiency and lower latency. In 2025, shipments will primarily support Meta’s internal AI platforms and recommendation systems, while MTIA v3, which integrates HBM, will launch in 2026, doubling total shipment volume.
Microsoft intends to mass-produce Maia v2 with GUC in the first half of 2026. However, the Maia v3 schedule has been postponed because of design changes, limiting Microsoft’s ASIC shipments in the near term and leaving it behind its competitors.