
# AI Series | How high is the cost of ChatGPT? Annual running costs of up to \$475 million

What are the main costs of ChatGPT? The two most important are computing resource costs (that is, training costs) and running costs. On top of these comes R&D investment (mainly R&D personnel costs).
1. First, let's calculate the operating cost: the annual operating cost is as high as \$474.7 million.
How long does it take to generate one word? ChatGPT evolved from GPT-3.5, so it should have about 175 billion parameters, trained on roughly 45 TB of text data. A 3-billion-parameter model can generate a token in about 6 milliseconds on an NVIDIA A100 GPU (half precision + TensorRT + activation caching). Scaling that speed up as a reference point, a single NVIDIA A100 GPU would take about 350 milliseconds to print a single word of ChatGPT output.
Since ChatGPT (GPT-3.5) has more than 175 billion parameters, about five A100 GPUs are needed to perform the operations behind a single query. Factoring in ChatGPT's observed output speed of about 15–20 words per second, an average of at least 8 A100 GPUs is required.
Each word generated on ChatGPT costs about \$0.0003. At Microsoft's current rate, a single NVIDIA A100 GPU costs \$3 per hour. At 350 milliseconds per word, one GPU can generate roughly 10,000 words per hour, so each generated word costs about \$0.0003.
Each query costs close to \$0.01. Given that ChatGPT needs about 30 words on average to reply to a user's query, each query costs close to \$0.01, about three times the cost of a Google search.
The daily operating cost is as high as \$1.3 million. According to UBS estimates, ChatGPT's monthly active users were expected to reach 100 million in January, with about 13 million people using it every day. Assuming each of those 13 million users makes 10 queries a day, that is 130 million queries per day at \$0.01 each, for a daily cost of up to \$1.3 million.
Over 365 days, the annual running cost comes to as much as \$474.7 million, which is very expensive.
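The back-of-envelope chain above can be reproduced in a few lines. A minimal sketch, using the article's own assumed figures (the \$3 GPU-hour rate, 350 ms per word, 30 words per reply, 13 million daily users, 10 queries per user); the intermediate rounding steps explain the small gap between the computed total and the quoted \$474.7 million.

```python
# Back-of-envelope estimate of ChatGPT's operating cost,
# following the article's chain of assumptions step by step.

GPU_HOUR_COST = 3.00        # $ per A100 GPU-hour (Microsoft's quoted rate)
SECONDS_PER_WORD = 0.35     # one A100 takes ~350 ms per generated word
WORDS_PER_REPLY = 30        # average reply length assumed in the article
DAILY_USERS = 13_000_000    # UBS estimate of daily active users
QUERIES_PER_USER = 10       # assumed queries per user per day

words_per_gpu_hour = 3600 / SECONDS_PER_WORD          # ~10,286 words/hour
cost_per_word = GPU_HOUR_COST / words_per_gpu_hour    # ~$0.0003 per word
cost_per_query = WORDS_PER_REPLY * cost_per_word      # ~$0.009 per query

# The article rounds the per-query cost up to $0.01 before scaling.
daily_cost = DAILY_USERS * QUERIES_PER_USER * 0.01    # ~$1.3 million per day
annual_cost = daily_cost * 365                        # ~$475 million per year

print(f"cost per word:  ${cost_per_word:.4f}")
print(f"daily cost:     ${daily_cost:,.0f}")
print(f"annual cost:    ${annual_cost / 1e6:.1f} million")
```

Note that the unrounded chain (30 words at \$0.000292 each) gives closer to \$0.009 per query; rounding up to a full cent before multiplying is what produces the headline \$1.3 million per day.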
2. Computing resource cost (i.e. training cost): a single training run costs as much as \$4.6 million.
A large language model such as ChatGPT requires a cluster of at least several thousand GPUs to train, and building such a cluster is itself very expensive. A cluster can also sustain only a limited amount of computation; once its resources are exhausted, a new GPU cluster has to be built.
As for the training cost itself, a single training run costs about \$4.6 million, and a model typically needs at least ten to dozens of runs. Even at only 10 runs, the cost reaches \$46 million.
3. Summary
Such an expensive ChatGPT means that subsequent commercialization will need better ideas. In many cases it is not that the AI technology fails to meet requirements; it is simply unaffordable. Compared with the cost of a human, AI is still too expensive today. This is why some generative AI companies have gone bankrupt, and why OpenAI turned to Microsoft for funding.
This also explains why Google, known for its powerful AI, is so conservative about deploying it. On the one hand, Google already holds a search monopoly and is unwilling to bear such high costs, which would erode its profits. On the other hand, as you may have noticed when using ChatGPT, the output is sometimes wrong and sometimes pure nonsense, and Google worries that such mistakes would damage its search brand.
For Microsoft, however, Bing's market share is already tiny. With enough cash flow behind it, Microsoft is willing to use ChatGPT to fight for market share.
This time we talked about cost; next time we will talk about commercialization.
Disclaimer: Community is offered by Moomoo Technologies Inc. and is for educational purposes only.