Microsoft’s CoreWeave Deal ‘Adds AI Pressure’ To AWS, Google

Microsoft is reportedly pouring billions into AI infrastructure startup CoreWeave as it looks to capture more mindshare in the red-hot artificial intelligence market.


Microsoft is looking to take ChatGPT and generative AI technology to the next level by signing a deal with cloud infrastructure and AI startup CoreWeave to support its growing AI portfolio.

One top executive from a solution provider that partners with Microsoft, Google and AWS on a global basis said there is an “open AI arms race” among the largest cloud companies on the planet, all striving to become the leader in “modern AI” and “generative AI.”

“Amazon, Google, Microsoft all need more, and stronger, infrastructure to power all these new AI use cases and large-scale workloads that we’re seeing from our customers,” said the executive from a U.S.-based multibillion-dollar solution provider, who declined to be named. “What this CoreWeave deal would do for Microsoft is add AI pressure to its biggest competitors and make them sort of have to look somewhere else.”


“You could say Microsoft called ‘first dibs’ on CoreWeave’s tech and partnership,” he said. “It looks like this will also help power OpenAI and GPT. … They understand the scale, the ecosystem partnerships, the innovation needed for AI solutions both near- and long-term.”

[Related: 5 Google AI Resources Made For Google Cloud Partners]

Microsoft, which backs ChatGPT owner OpenAI, has agreed to spend potentially billions of dollars over several years on cloud computing infrastructure from CoreWeave, according to a CNBC report citing anonymous sources familiar with the deal.

Microsoft and CoreWeave both declined to comment on the matter.

However, just this week, CoreWeave revealed it had raised an additional $200 million in funding, an extension of the $221 million Series B round it closed earlier this year.

In total, CoreWeave has raised $421 million in Series B funding in 2023, including $100 million from Nvidia.

AI Arms Race Between AWS, Microsoft And Google

The same solution provider executive said Microsoft wants to use CoreWeave’s cloud infrastructure and, specifically, its access to Nvidia’s GPUs, which help power AI offerings such as OpenAI’s GPT large language models (LLMs).

He highlighted CoreWeave’s serverless orchestration technology as a fast way for clients to get AI solutions to market. According to CoreWeave’s website, its serverless technology lets customers spin up workloads in as little as five seconds, with increased portability, less overhead and less management complexity.
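For illustration only: CoreWeave has not published the technical details behind this deal, but specialized GPU clouds are commonly consumed through Kubernetes-style APIs. The sketch below, written with the official Python `kubernetes` client, shows roughly how a customer might request a single Nvidia GPU for an inference job on such a platform; the image name, job name and namespace are hypothetical placeholders, not CoreWeave-specific values.

```python
# Illustrative sketch only: submitting a one-GPU batch job to a Kubernetes-based
# GPU cloud. Names and images below are hypothetical, not CoreWeave's actual API.
from kubernetes import client, config


def launch_gpu_job() -> None:
    # Assumes a local kubeconfig already points at the provider's cluster.
    config.load_kube_config()

    container = client.V1Container(
        name="llm-inference",
        image="example.com/llm-inference:latest",  # hypothetical container image
        resources=client.V1ResourceRequirements(
            limits={"nvidia.com/gpu": "1"}  # request one Nvidia GPU
        ),
    )

    job = client.V1Job(
        metadata=client.V1ObjectMeta(name="llm-inference-demo"),
        spec=client.V1JobSpec(
            template=client.V1PodTemplateSpec(
                spec=client.V1PodSpec(restart_policy="Never", containers=[container])
            ),
            backoff_limit=0,  # do not retry failed pods
        ),
    )

    # Create the job in the default namespace.
    client.BatchV1Api().create_namespaced_job(namespace="default", body=job)


if __name__ == "__main__":
    launch_gpu_job()
```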

“There’s a ton of demand for Microsoft cloud [infrastructure]—obviously there’s a ton of demand for GCP [Google Cloud Platform] and AWS too—but this gives Microsoft options and more ways to get those Nvidia GPUs,” he said. “It’s a win for Microsoft.”

Both AWS and Google have been forming their own AI partnerships to keep up with Microsoft.

In February, AWS partnered with AI company Hugging Face to create next-generation AI models and solutions. The cloud giant has also launched a slew of new AI technologies.

Google has been forming similar partnerships with AI companies such as Character.AI, while also diving into new AI go-to-market alliances with its most strategic channel partners, including Deloitte and Tata Consultancy Services (TCS). Google Cloud has also introduced a slew of new AI solutions in 2023.

What Is CoreWeave?

Founded in 2017, Roseland, N.J.-based CoreWeave is a specialized cloud provider that delivers massive scale of GPU compute resources to customers. CoreWeave builds cloud solutions for compute-intensive use cases, such as AI and machine learning, aimed at improving speed and cutting costs.

Santa Clara, Calif.-based chip player Nvidia, whose market valuation briefly crossed the trillion-dollar mark thanks in part to a generative AI-related boom, is a top CoreWeave backer.

CoreWeave provides simplified access to Nvidia’s flagship GPUs, which are popular for running AI workloads and models.

CoreWeave said the boom in generative AI technology has accelerated demand for its specialized cloud infrastructure to train, fine-tune and serve inference for AI models.

“In my 25-year career, I’ve never been a part of a company that’s growing like this,” said CoreWeave CEO Michael Intrator in a statement. “It’s an incredible moment in time. From a demand standpoint, revenue and client scale, the rise has been exponential.”