Fei-Fei Li picks Google Cloud, where she led AI, as World Labs’ main compute provider

Cloud providers are chasing after AI unicorns, and the latest is Fei-Fei Li’s World Labs. The startup just tapped Google Cloud as its primary compute provider to train AI models, a deal that could be worth hundreds of millions of dollars. But Li’s tenure as chief scientist of AI at Google Cloud wasn’t a factor, the company says.

During the Google Cloud Startup Summit on Tuesday, the companies announced that World Labs will spend a large chunk of its funding on GPU servers on Google Cloud Platform and, ultimately, use them to train “spatially intelligent” AI models.

A handful of well-funded startups building AI foundation models are highly sought after in the cloud services world. Some of the largest deals include OpenAI, which exclusively trains and runs AI models on Microsoft Azure, and Anthropic, which uses AWS and Google Cloud. These companies regularly pay millions of dollars for computing services, and could one day need even more as their AI models scale. That makes them valuable customers for Google, Microsoft, and AWS to build relationships with early on.

World Labs is certainly building unique, multimodal AI models with significant compute needs. The startup just raised $230 million at a valuation above $1 billion, in a deal led by Andreessen Horowitz (a16z), to build AI world models. James Lee, Google Cloud’s general manager of startups and AI, tells TechCrunch that World Labs’ models will one day be able to process, generate, and interact with video and geospatial data, a capability World Labs calls “spatial intelligence.”

Li has deep ties to Google Cloud, having led the company’s AI efforts through 2018. However, Google denies that the deal is a product of that relationship, and it rejects the idea that cloud services are just commodities. Instead, Lee said offerings such as its High Performance Toolkit for scaling AI workloads, along with its deep supply of AI chips, were the bigger factors.

“Fei-Fei is obviously a friend of GCP,” said Lee in an interview. “GCP wasn’t the only option they looked at. But for all the reasons we talked about – our AI optimized infrastructure and the ability to meet their scalability needs – ultimately they came to us.”

Google Cloud offers AI startups a choice between its proprietary AI chips, called tensor processing units (TPUs), and Nvidia’s GPUs, which Google purchases and has in more limited supply. Google Cloud is trying to get more startups to train AI models on TPUs, largely to reduce its dependency on Nvidia. All cloud providers are constrained today by the scarcity of Nvidia GPUs, so many are building their own AI chips to meet demand. Google Cloud says some startups train and run inference solely on TPUs; however, GPUs remain the industry’s favorite chip for AI training.

World Labs chose to train its AI models on GPUs in this deal. However, Google Cloud wouldn’t say what went into that decision.

“We worked with Fei-Fei and her product team, and at this stage of their product roadmap, it made more sense for them to work with us on the GPU platform,” said Lee in an interview. “But it doesn’t necessarily mean it’s a permanent decision… Sometimes [startups] move onto different platforms, such as TPUs.”

Lee would not disclose how large World Labs’ GPU cluster is, but cloud providers often dedicate massive supercomputers to startups training AI models. Google Cloud promised another startup training AI foundation models, Magic, a cluster with “tens of thousands of Blackwell GPUs,” each of which has more power than a high-end gaming PC.

These clusters are easier to promise than they are to deliver. Google’s cloud rival Microsoft is reportedly struggling to meet OpenAI’s enormous compute demands, forcing the startup to tap other providers for computing power.

World Labs’ deal with Google Cloud is not exclusive, meaning the startup may still strike deals with other cloud providers. However, Google Cloud says it will handle a majority of World Labs’ compute business moving forward.




