‘We’re Lucky Because We Have Our Own TPUs,’ Says DeepMind CEO Demis Hassabis — Yet Admits AI Memory Shortage Constrains Research

The global artificial intelligence race is accelerating, but even the most advanced labs are running into hardware limits.

Demis Hassabis, CEO of Google’s DeepMind AI unit, recently told CNBC that shortages in memory, graphics processing units and electricity are slowing AI deployment.

“Yes, I think that’s constraining a lot of the deployment for sure,” Hassabis said.

China remains central to the global AI competition.

Hassabis told CNBC that Chinese developers are likely still several months behind leading U.S. labs, though the gap may now be smaller than previously estimated.

He said there are “very talented teams” in China and that those AI teams’ “seed models are really good,” pointing to recent advances from companies including Alibaba and ByteDance. He said one or two additional breakthroughs are still needed before artificial general intelligence is achieved.

The constraints extend beyond product rollout and into experimentation. “Also, it does constrain a little bit the research,” Hassabis told CNBC.

Testing new ideas requires substantial chip capacity to determine whether they work at scale, he said. Demand for Gemini, Google’s main AI system, and its other models also exceeds what the company can currently supply.

The pressure on memory supply is affecting other technology companies as well. Apple Inc. (NASDAQ:AAPL) Chief Financial Officer Kevan Parekh said on the company’s fiscal Q1 2026 earnings call in January that memory pricing continues to rise significantly and is expected to have a greater impact in fiscal Q2.

HP Inc. (NYSE:HPQ) CFO Karen Parkhill echoed similar pressure in the company’s fiscal Q1 2026 earnings report last month. “With just one quarter behind us in a dynamic environment marked by increasing memory costs, we are holding our outlook for the year yet currently anticipate results to be closer to the low end of our range,” she said.

Google designs its own tensor processing units to power its AI systems, giving it greater control over its computing architecture.

“We’re lucky because we have our own TPUs, so we have our own chip designs,” Hassabis told CNBC.

Even so, he said reliance on a small group of suppliers for key components remains a factor. Capacity constraints at any point in the supply chain can create bottlenecks, and he said the broader ecosystem is under strain.


Source: finance.yahoo.com
