‘FOMO has proven a stronger incentive than poor stock performance’: Goldman Sachs just issued a brutal verdict on the AI boom

Source: Fortune
May 6, 2026

The numbers coming out of Wall Street’s most influential research shop tell a story that Silicon Valley would rather not hear.

In two separate reports published last month, Goldman Sachs analysts examined the great AI infrastructure build-out from opposite ends of the telescope — one team studying how much the machine will cost to build, another studying whether the machine is actually working — and arrived at a rare institutional moment: two wings of a single firm arguing, simultaneously, that the machine costs more than anyone knows and produces less than anyone admits.

Notably, it is not the first time Goldman has said something like this. James Covello, the firm’s head of global equity research, has been one of Wall Street’s most prominent and consistent AI skeptics since he co-authored the original “Too Much Spend, Too Little Benefit?” report in June 2024 — a piece that landed like a thunderclap precisely because it came from inside one of the institutions most deeply enmeshed in financing the boom it was questioning. Goldman advises hyperscalers, underwrites chip company offerings, and sits at the table with the companies building the very infrastructure Covello was interrogating.

Two years later, Covello is back with an update. He was wrong about some things, he acknowledges. But on the central question — whether the spending is producing commensurate returns — he has only grown more convinced.

Trillion-dollar bill

Start with the cost. The Goldman Sachs Global Institute, in a report titled “Tracking Trillions,” projects roughly $7.6 trillion in cumulative AI capital expenditure between 2026 and 2031 — covering chips, data centers, and power infrastructure. Annual spending is expected to more than double over that period, from $765 billion this year to $1.6 trillion by 2031.

Those figures, the report is careful to note, are not forecasts. They are baseline estimates — and extremely sensitive ones at that. Change a single assumption about how quickly AI chips become obsolete, and cumulative spending swings by hundreds of billions of dollars. Build the next generation of data centers at $19 million per megawatt instead of $15 million, and total data center costs balloon by more than $500 billion over the projection period. The report’s central message is blunt: the $4 trillion to $8 trillion figures that have “featured prominently in recent market commentary” are “far more conditional than they appear.”

The physical reality underlying those numbers is staggering in its own right. Today’s leading AI systems pack 72 processors into a single rack, connected by hundreds of thousands of kilometers of cabling. The facilities housing them require industrial-scale liquid cooling, dedicated power delivery, and redundancy systems that didn’t exist in conventional data center design a decade ago. A standard cloud data center from the 2010s might have been built at $10 million per megawatt. The next generation of AI-optimized facilities costs $15 million to $20 million, and some facilities built just two years ago are already considered insufficiently equipped for the chips being manufactured today.

What is the return on investment?

Then there’s Covello’s perspective.

Covello writes that he spent two years tracking what all that investment is actually producing for the companies deploying it. His findings do not make for comfortable reading in the boardrooms of companies that have staked their technology roadmaps on artificial intelligence.

Covello cited the influential MIT report, as covered by Fortune, which found that despite $30 billion to $40 billion in enterprise investment in generative AI, 95% of organizations were getting zero return on their AI pilots. A 2025 EY survey found that 99% of companies in its sample reported financial losses due to AI-related risks, with an average loss of $4.4 million per company. A Wall Street Journal survey found a yawning gap between what C-suites say AI is doing for productivity and what workers on the ground actually report. One AI hiring startup tested frontier AI agents on 480 workplace tasks commonly performed by bankers, consultants, and lawyers. Every agent failed to complete a majority of its assigned tasks.

“56% of Americans say they use AI,” the report quotes one research firm saying, “yet 85% of the workforce does not have a value-driving AI use case.”

IT budgets, rather than shrinking as executives promised shareholders, are growing. Gartner projects global IT spending to rise from $5 trillion in 2024 to $6.15 trillion in 2026. The cost savings have not materialized. Harvard Business Review research cited in the report found that AI-generated errors — what researchers are calling “workslop” — cost a 10,000-person organization more than $9 million annually in lost productivity. Far from efficiencies, AI appears to be generating new headaches and expenses in many cases.

Nvidia: the AI economy’s big winner

Somewhere between the two reports