$700 billion has poured into AI, and Americans are the first to feel the sting of inflation.

On April 1, St. Louis Fed economists Miguel Faria-e-Castro and Serdar Ozkan published a blog post. The title is restrained, but the conclusion is biting: AI optimism itself is an inflation driver. Not because electricity bills are rising, and not because chips are in short supply, but because everyone believes AI will make the future better, and that belief makes them start spending more right now.

On the same day, Fortune reported on a Deutsche Bank experiment: the bank had three AI models assess the "impact of AI on inflation." The conclusion was that even the models themselves think AI is pushing up prices.

There’s no shortage of posts on social media about soaring U.S. prices

Put these two things together and they point to an uncomfortable loop: the more money flows into AI, the higher inflation climbs, the further rate cuts get pushed out, and the more borrowing costs rise. And yet investment keeps accelerating.

An arms race that won’t stop

First, follow the money. According to each company's financial reports, the combined capital expenditures of Amazon, Microsoft, Google, and Meta totaled about $152 billion in 2023. By 2024, that figure had jumped to $251 billion, up 65%. For full-year 2025, it reached $416 billion, up another 66%.

The guidance for 2026 is even more aggressive. According to Wolf Street's compilation, Amazon's guidance is $200 billion, Google's is $175 billion to $185 billion, Microsoft's is $145 billion to $150 billion, and Meta's is $135 billion. Together, the four total about $663 billion. Add Oracle's $42 billion, and the five come to just over $700 billion.
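These growth rates and totals are easy to verify. Below is a minimal back-of-the-envelope check in Python using only the figures quoted above; taking the midpoints of the Google and Microsoft guidance ranges is my simplifying assumption.

```python
# Back-of-the-envelope check of the capex figures quoted above.
# All values are in billions of USD.

capex_big4 = {2023: 152, 2024: 251, 2025: 416}  # Amazon + Microsoft + Google + Meta

for year in (2024, 2025):
    prev = capex_big4[year - 1]
    growth = (capex_big4[year] - prev) / prev
    print(f"{year}: ${capex_big4[year]}B, up {growth:.0%} from {year - 1}")
# 2024: up 65%; 2025: up 66%

guidance_2026 = {
    "Amazon": 200,
    "Google": (175 + 185) / 2,     # midpoint of the $175B-$185B range
    "Microsoft": (145 + 150) / 2,  # midpoint of the $145B-$150B range
    "Meta": 135,
}
big4 = sum(guidance_2026.values())
print(f"Big-four 2026 guidance: ~${big4:.1f}B")        # ~$662.5B
print(f"Including Oracle's $42B: ~${big4 + 42:.1f}B")  # ~$704.5B, just over $700B
```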

From 2023 to 2026, the four companies' capital expenditures will have more than quadrupled. That pace has no precedent in U.S. corporate history. As Fortune reported, the scale already exceeds Sweden's full-year GDP.

One data center dwarfs an entire state's electricity use

Most of this money flows to data centers. And the biggest bottleneck for data centers is not land but power. According to EIA data, Vermont uses about 5,364 gigawatt-hours of electricity per year, which translates to an average load of 0.61 gigawatts. Rhode Island is slightly higher, at about 0.83 gigawatts.

Now look at what data centers are doing. According to company announcements, the Stargate project, OpenAI's collaboration with Oracle and SoftBank, has a planned total power capacity of 10 gigawatts, about 16 times Vermont's average load. Meta's Hyperion campus in Louisiana plans 5 gigawatts, with an investment of $27 billion. Musk's xAI has expanded its Colossus site in Memphis, Tennessee, to 2 gigawatts; according to Introl, it has deployed 555,000 Nvidia GPUs, costing about $18 billion. Project Rainier, co-built by Amazon and Anthropic in Indiana, plans 2.2 gigawatts.
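The conversion behind these comparisons is simple: divide annual energy use by the 8,760 hours in a year to get an average load, then set each project's planned capacity against it. A minimal sketch using the figures above:

```python
# Converting annual energy use (GWh) into an average load (GW) and
# comparing it with the data-center projects above. All inputs are
# the figures quoted in the article.

HOURS_PER_YEAR = 365 * 24  # 8,760

vermont_gwh = 5364  # EIA figure for Vermont's annual consumption
vermont_avg_gw = vermont_gwh / HOURS_PER_YEAR
print(f"Vermont average load: {vermont_avg_gw:.2f} GW")  # ~0.61 GW

projects_gw = {
    "Stargate (OpenAI/Oracle/SoftBank)": 10.0,
    "Hyperion (Meta, Louisiana)": 5.0,
    "Project Rainier (Amazon/Anthropic, Indiana)": 2.2,
    "Colossus (xAI, Memphis)": 2.0,
}
for name, gw in projects_gw.items():
    print(f"{name}: {gw} GW = {gw / vermont_avg_gw:.0f}x Vermont")
# Stargate alone: 10 / 0.61 ~= 16x Vermont's average load
```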

According to S&P Global data, U.S. data centers consumed 183 terawatt-hours of electricity in 2024, accounting for more than 4% of total national electricity use. By 2030, that figure is expected to triple.
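The same arithmetic shows what tripling implies. In the sketch below, the implied national total (backed out from the "more than 4%" share) and the flat-consumption assumption for 2030 are my illustrative simplifications, not figures from the article:

```python
# Rough share arithmetic from the S&P Global figures above.

dc_2024_twh = 183      # U.S. data-center electricity use, 2024
dc_share_2024 = 0.04   # "more than 4%" of national use (lower bound)

# Backing out an upper bound on national consumption from the share:
national_twh = dc_2024_twh / dc_share_2024
print(f"Implied national total: at most ~{national_twh:.0f} TWh")  # ~4,575 TWh

dc_2030_twh = dc_2024_twh * 3  # "expected to triple" by 2030
print(f"2030 data-center projection: ~{dc_2030_twh} TWh")  # 549 TWh

# If national consumption stayed flat (an illustrative assumption),
# data centers alone would claim roughly 12% of it:
print(f"Implied 2030 share: {dc_2030_twh / national_twh:.0%}")
```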

These power demands are not distant plans; they are already squeezing existing grids. According to a CBRE report, the vacancy rate for North American data centers fell from 3.3% in the first half of 2023 to 1.6% in the first half of 2025, the lowest level on record. According to Cushman & Wakefield data, the vacancy rate ticked up slightly to 3.5% in the second half of 2025, but only because a large wave of newly built capacity was being delivered; the absolute level remains near historic lows, and meaningful supply relief before 2030 is unlikely.

Even AI says it’s boosting inflation

As these investments drive up demand, lift power prices, and tighten chip supply, there is another, more hidden inflation channel at work.

According to a Fortune report on April 1, a team led by Deutsche Bank’s chief U.S. economist Matthew Luzzetti ran an experiment: they had Deutsche Bank’s in-house model dbLumina, Anthropic’s Claude, and OpenAI’s ChatGPT-5.2 each estimate the “probability that AI will raise inflation over the next year.”

Results: dbLumina gave 40%, Claude gave 25%, and ChatGPT-5.2 gave 20%. All three models agreed on the probability that AI would “significantly reduce inflation”: just 5%.

The inflation drivers cited by the three models were highly consistent: data centers are expanding at scale, semiconductor demand is surging, and the electricity consumption of AI workloads is rising rapidly—these are all demand-driven sources of upward price pressure.

This runs counter to an assumption popular among Wall Street investors. The Deutsche Bank team wrote in a research note: "Will AI become the main disinflationary force? Even AI itself doesn't think so."

On a five-year horizon, the models do tilt toward more disinflationary outcomes. But they still place "AI causing large-scale disinflation" in the tail-risk range.

Optimism itself is inflationary

The St. Louis Fed paper provides a theoretical framework to explain all of this.

Faria-e-Castro and Ozkan use a standard macroeconomic model and define the AI investment boom as a “news shock.” As described in the Fed’s blog post, the model’s logic is: when households see AI described as revolutionary technology, they expect future income to rise and increase consumption ahead of time. Firms expect productivity improvements and boost investment. Together, these forces cause demand to outpace supply quickly. The paper writes: “These forces jointly produce an inflationary surge in aggregate demand—this is a core feature of the news shock’s early phase.”

The model offers two paths. If AI truly delivers a leap in productivity, short-run inflation is absorbed by longer-run output growth and the economy enters a virtuous cycle. But if the productivity gains fail to materialize, the result is, in the paper's wording, "sustained low growth and stubbornly high inflation," that is, stagflation.
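To make the mechanism concrete, here is a deliberately crude toy simulation of the two paths. It is a minimal sketch in the spirit of the news-shock story only; the Phillips-curve-style inflation rule, the parameter values, and the timing are my illustrative assumptions, not the Faria-e-Castro and Ozkan calibration:

```python
# A toy "news shock": optimism boosts demand immediately, while
# productive capacity only catches up if the AI payoff actually arrives.
# Everything here is an illustrative assumption, not the Fed model.

def simulate(ai_payoff_arrives: bool, periods: int = 12) -> str:
    capacity = 100.0  # productive capacity (supply side)
    boost = 10.0      # extra demand from optimistic households and firms
    inflations = []
    for t in range(periods):
        # Crude Phillips-curve-style link: inflation sits at a 2% baseline
        # plus a term that rises with excess demand over capacity.
        inflations.append(2.0 + 50 * boost / capacity)
        # If the productivity leap arrives (after a 2-period lag),
        # capacity growth accelerates and absorbs the demand surge.
        capacity *= 1.06 if (ai_payoff_arrives and t >= 2) else 1.005
        boost *= 0.97  # optimism-driven spending fades only slowly
    avg_inflation = sum(inflations) / periods
    growth = capacity / 100.0 - 1
    return f"avg inflation {avg_inflation:.1f}%, capacity growth {growth:+.0%}"

print("payoff arrives:", simulate(True))   # inflation fades, strong growth
print("payoff absent: ", simulate(False))  # persistent inflation, weak growth
```

Under these toy parameters, the first scenario ends with inflation falling back toward target as capacity outruns the demand boost; the second leaves inflation elevated while capacity barely grows, the stagflation branch the paper warns about.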

Based on data cited in the Fed’s blog post, since the release of ChatGPT, the annualized growth rate of U.S. total factor productivity (TFP) has been 1.11%, below the historical average of 1.23%. So far, AI has not left a mark in the productivity data.

Meanwhile, according to BLS data, U.S. CPI rose 2.4% year over year in February 2026, with core CPI up 2.5%; neither has returned to the Fed's 2% target. The Fed's March dot plot shows a median year-end rate forecast of 3.4%, pointing to only one rate cut this year.

Roughly $700 billion is pouring into AI infrastructure. Whether that money is the cause of inflation or the prelude to a productivity revolution hinges on a question no one can yet answer: will the models running inside these data centers actually make the economy more efficient?
