Elon Musk Says That Tesla’s AI Training Capacity Will Be Equivalent to Around 85,000 Units of NVIDIA’s H100 Chips by the End of 2024

Rohail Saleem

This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

Tesla's AI training compute capacity increased by 130 percent in Q1 2024 alone. And if Elon Musk's ambitions pan out, that capacity is heading for a nearly 500 percent increase through 2024.


We noted in a previous post that Tesla likely owns between 30,000 and 35,000 units of NVIDIA's H100 GPUs, based on Elon Musk's comments in an X post.

Source: Q1 2024 Earnings Deck

Today, as part of its Q1 2024 earnings deck, Tesla revealed that its AI training capacity has increased to nearly 40,000 equivalent units of NVIDIA's H100 GPU, broadly in line with Elon Musk's stated range.

Of course, back in January, while confirming a new $500 million investment in Tesla's Dojo supercomputer - a sum equivalent to around 10,000 units of the H100 GPU - Elon Musk announced that the EV giant would "spend more than that on NVIDIA hardware this year," as the "stakes for being competitive in AI" were "at least several billion dollars per year at this point."

Well, Elon Musk has now revealed the true scale of his AI-related ambitions by disclosing that Tesla's AI training compute capacity will increase by around 467 percent year-over-year to 85,000 equivalent units of the H100 GPU by the end of 2024.
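As a back-of-the-envelope check, an increase of around 467 percent corresponds to a multiple of roughly 5.7x, which puts Tesla's implied baseline at about 15,000 H100-equivalent units at the end of 2023 (85,000 ÷ 5.67 ≈ 15,000).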

This aggressive expansion is already weighing on Tesla's free cash flow. As part of its Q1 2024 earnings, the EV giant revealed negative free cash flow of $2.5 billion for the quarter, "driven by an inventory increase of $2.7B and AI infrastructure capex of $1.0B in Q1."

Elon Musk is also aggressively deploying AI compute capacity at his artificial intelligence-focused enterprise, xAI. We noted in a recent post that xAI now likely owns between 26,000 and 30,000 units of NVIDIA's AI-focused graphics cards.

Do note that NVIDIA's H100 chips are expected to cede the proverbial ground to the new GB200 Grace Blackwell Superchip sometime this year. The superchip combines one Arm-based Grace CPU with two Blackwell B200 GPUs, and NVIDIA says systems built around it can support AI models with up to 27 trillion parameters. Moreover, the company claims up to 30 times faster performance on LLM inference workloads - such as serving up answers from chatbots - relative to the H100.
