
Yokozuna Sumo

Elon Musk revealed the scale of his AI ambitions, stating that Tesla's AI training compute will grow roughly 467% year over year, to the equivalent of 85,000 H100 GPUs, by the end of 2024.
Meta is investing $1 billion in an Nvidia H100 GPU cluster.
On the training side:
• Total H100 shipments reached 1.2 million units by the end of 2023 (Omdia data).
• To train GPT-4 with 1.8 trillion parameters, 8,000 H100s are required.
This means the H100s currently on the market could train roughly 150 GPT-4-scale models. Even so, demand on the training side remains very high, driven by future GPT-5 and GPT-6 iterations, image models, and upcoming video models. According to a report published by FactorialFunds in March 2024, approximately 72,000 H100s will be required once Sora officially launches.
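For anyone who wants to sanity-check the arithmetic, here is a minimal Python sketch using only the figures quoted above (the Omdia shipment estimate, the 8,000-H100 figure for GPT-4, and Tesla's 467% growth target); the numbers are taken at face value from the post and are not independently verified.

```python
# Back-of-envelope math for the figures quoted in this post.
# All inputs are the post's own numbers and are not verified here.

h100_shipped = 1_200_000   # total H100 shipments through 2023 (Omdia, per the post)
h100_per_gpt4 = 8_000      # H100s cited to train a 1.8-trillion-parameter GPT-4

# How many GPT-4-scale training runs the shipped H100s could support.
gpt4_runs = h100_shipped / h100_per_gpt4
print(f"GPT-4-scale training runs supported: ~{gpt4_runs:.0f}")  # ~150

# Tesla: growing ~467% year over year to 85,000 H100 equivalents implies a
# starting base of roughly 85,000 / (1 + 4.67).
tesla_target = 85_000
tesla_base = tesla_target / (1 + 4.67)
print(f"Implied Tesla base before the increase: ~{tesla_base:,.0f} H100 equivalents")  # ~15,000
```

The first calculation reproduces the "around 150 GPT-4 models" claim; the second simply shows what starting fleet size a 467% increase to 85,000 H100 equivalents implies.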