ForkLog

ARK Invest sees AI‑infrastructure outlays hitting $1.5trn by 2030

The falling cost of training neural networks is making the technology more accessible, while swelling demand is prompting heavy investment in computing power. By 2030, global spending on AI infrastructure could approach $1.5trn, according to ARK Invest.

Prices fall, demand surges

Analysts estimate that the cost of training neural networks is dropping by 75% a year. Inference for models scoring above 50% on benchmarks is getting cheaper still—by an average of 95%.
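A fixed annual percentage decline compounds quickly. A back-of-the-envelope sketch (our arithmetic, not ARK's model) of what those rates imply:

```python
# Rough sketch: fraction of the original cost remaining after a
# fixed annual percentage decline compounds for several years.
def remaining_cost(annual_decline: float, years: int) -> float:
    """E.g. annual_decline=0.75 means costs fall 75% each year."""
    return (1 - annual_decline) ** years

# Training costs falling 75% a year leave ~1.6% of the original after 3 years.
training_left = remaining_cost(0.75, 3)   # 0.25 ** 3 = 0.015625

# Inference costs falling 95% a year leave just 0.25% after 2 years.
inference_left = remaining_cost(0.95, 2)  # 0.05 ** 2 = 0.0025
```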

Source: ARK Invest.

Cheaper technology usually cuts spending. Not so with artificial intelligence: as training and deployment become more affordable, a wider array of tasks becomes economically viable.

Mass adoption of AI is occurring twice as fast as the internet's: penetration reached 20% in just three years, a level the web took more than six to hit.

Enterprise demand is surging too. Token requests via OpenRouter have risen 28-fold since December 2024. Anthropic lifted annual revenue from $100m in 2023 to $14bn by February 2026. By November 2025 OpenAI had 1m business customers.

Source: ARK Invest.

An infrastructure boom

Since ChatGPT’s debut, demand for accelerated computing has rocketed. Nvidia’s annual revenue rose from $27bn in 2022 to $216bn in 2025. Analysts expect it to reach $350bn in 2026.

Global investment in server systems, which grew about 5% a year in the decade to 2022, has accelerated to 30% a year over the past three years. According to ARK, GPU- and ASIC-based systems have become the dominant segment, accounting for 86% of server compute.

Private investment in AI infrastructure topped $200bn in 2025, of which about $80bn went to foundation-model developers. Hyperscalers are seeking alternative financing: Meta’s $30bn deal with Blue Owl was the largest private-capital transaction on record.

The chip wars

Booming demand has intensified competition among hardware makers. AMD has caught up with Nvidia on total cost of ownership (TCO) for inference on smaller models, but for large models Nvidia retains the performance lead thanks to its Grace Blackwell architecture.

Source: ARK Invest.

Hyperscalers are rolling out their own semiconductor designs. Google has been designing TPUs for a decade. SemiAnalysis estimates that using custom chips for internal workloads can cut compute costs by 62% compared with Nvidia-based systems.

Amazon is pushing Trainium, making it Anthropic’s preferred training platform. Microsoft is deploying second-generation Maia accelerators, optimised for inference.

Broadcom dominates back-end design, partnering on Google’s TPU, Meta’s MTIA and OpenAI’s forthcoming chip. Citi forecasts the company’s AI revenue will grow from $20bn in 2025 to $100bn in 2027.

Startups with novel architectures are gaining ground. Cerebras, known for its Wafer Scale Engine chip, plans to list this year. Groq, for its part, signed a $20bn licensing agreement with Nvidia.

Outlook

ARK reckons annual investment in AI infrastructure will reach $1.5trn by 2030—tripling in five years. Specialised ASICs’ share of computing capacity will rise to a third of the market.
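Tripling in five years pins down the implied growth rate. A quick check (our arithmetic, not a figure from ARK):

```python
# Rough check: the compound annual growth rate implied by a given
# multiple over a given number of years.
def implied_cagr(multiple: float, years: int) -> float:
    return multiple ** (1 / years) - 1

# Tripling in five years implies roughly 24.6% annual growth:
# ~$0.5trn a year today compounding to ~$1.5trn by 2030.
cagr = implied_cagr(3, 5)  # ≈ 0.2457
```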

Source: ARK Invest.

“The infrastructure being built today is not a bubble ready to burst, but the foundation of a platform shift that happens once a generation. Useful AI agents are only starting to be deployed; they are ‘token-hungry’ but far more capable than users are accustomed to. Scaling these agents to millions of businesses will require colossal computation, justifying the investment,” the experts concluded.

Earlier, analysts at Citrini Research predicted an economic collapse caused by artificial intelligence.
