
AI Series | How high is the cost of ChatGPT? Annual running costs of up to $475 million

Noah Johnson joined discussion · Feb 9, 2023 05:05
What are the main costs of ChatGPT? The largest are computing-resource costs (i.e., training costs) and running costs; on top of these there is R&D investment (mainly R&D personnel).
1. First, the operating cost: the annual operating cost comes to as much as $474.7 million.
A 3-billion-parameter model can generate a token in about 6 milliseconds on an NVIDIA A100 GPU (half precision + TensorRT + activation caching). ChatGPT evolved from GPT-3.5, so it should have around 175 billion parameters, trained on roughly 45 TB of text. Scaling that 6 ms figure up to 175 billion parameters, a single A100 takes about 350 milliseconds to print a single word on ChatGPT.
Since GPT-3.5 has more than 175 billion parameters, about five A100 GPUs are needed just to perform the operations behind a single query's output. Factoring in ChatGPT's observed output speed of about 15-20 words per second, an average of at least 8 A100 GPUs per query is obtained.
Each word generated on ChatGPT costs about $0.0003. At Microsoft's current rate, a single Nvidia A100 GPU costs $3 per hour. At 350 milliseconds per word, one GPU generates roughly 10,000 words per hour, so each generated word costs about $0.0003.
Each query costs close to $0.01. Since ChatGPT needs about 30 words to reply to a typical query, each query costs close to $0.01, roughly three times the cost of a Google search.
The daily operating cost is as high as $1.3 million. According to UBS estimates, ChatGPT's monthly active users reached 100 million in January, with about 13 million people using it every day. At 10 queries per user per day, that is 130 million queries per day; at $0.01 per query, the daily cost comes to about $1.3 million.
Over 365 days, the annual running cost comes to roughly $474.7 million, which is very expensive.
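The operating-cost chain above can be reproduced in a few lines. This is only a sketch of the post's own back-of-the-envelope arithmetic; the $3/hour A100 rate, 350 ms per word, 30-word replies, 13 million daily users, and 10 queries per user are the post's assumptions, not measured figures:

```python
# Back-of-the-envelope reproduction of the post's operating-cost estimate.
GPU_COST_PER_HOUR = 3.00        # assumed Azure rate for one A100, in USD
SECONDS_PER_WORD = 0.350        # post's estimate: 350 ms per generated word
WORDS_PER_REPLY = 30            # assumed average reply length
DAILY_USERS = 13_000_000        # UBS-based estimate of daily active users
QUERIES_PER_USER_PER_DAY = 10   # post's assumption

words_per_gpu_hour = 3600 / SECONDS_PER_WORD            # ~10,286 words/hour
cost_per_word = GPU_COST_PER_HOUR / words_per_gpu_hour  # ~$0.0003 per word
# The post rounds the per-query cost to a clean $0.01 before scaling up:
cost_per_query = round(cost_per_word * WORDS_PER_REPLY, 2)
daily_cost = cost_per_query * DAILY_USERS * QUERIES_PER_USER_PER_DAY
annual_cost = daily_cost * 365

print(f"cost per word:  ${cost_per_word:.4f}")
print(f"cost per query: ${cost_per_query:.2f}")
print(f"daily cost:     ${daily_cost:,.0f}")
print(f"annual cost:    ${annual_cost:,.0f}")
```

Note that carrying the rounded $0.01 through gives about $474.5 million per year; the post quotes $474.7 million, so its final figure appears to use slightly different rounding along the way.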
2. Computing-resource cost (i.e., training cost): a single training run costs as much as $4.6 million.
A large language model like ChatGPT needs a cluster of at least several thousand GPUs to train, and building such a cluster is itself very expensive. A cluster also supports only a limited amount of computation: after enough training runs, the resources are exhausted and a new GPU cluster has to be built.
A single training run costs about $4.6 million, and a model is typically trained at least ten to dozens of times. Even at only 10 runs, the total reaches $46 million.
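The training-cost tally works the same way; a minimal sketch using the post's figures (both the per-run cost and the run count are the post's estimates, not disclosed OpenAI numbers):

```python
# Training-cost tally using the post's assumed figures.
COST_PER_TRAINING_RUN = 4_600_000   # USD, post's estimate for one full run
MIN_RUNS = 10                       # post's lower bound: "ten to dozens"

min_total_training_cost = COST_PER_TRAINING_RUN * MIN_RUNS
print(f"minimum total training cost: ${min_total_training_cost:,}")
```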
3. Summary
Such an expensive ChatGPT means that commercialization will require better ideas. In many cases it is not that the AI technology fails to meet the requirements; it is simply unaffordable. Compared with the cost of a human, AI is still too expensive today. That is why some generative-AI companies have gone bankrupt, and why OpenAI turned to Microsoft for funding.
This also explains why Google, known for its powerful AI, is so conservative about deploying it. On one hand, Google already holds a search monopoly and is unwilling to take on such high costs, which would erode its profits. On the other hand, as you may have noticed when using ChatGPT, the output is sometimes wrong and sometimes pure nonsense, and Google worries that such mistakes would damage its search brand.
But for Microsoft, Bing's market share is already very small. With enough cash flow behind it, Microsoft is willing to use ChatGPT to compete for share.
That covers cost for this time; next time we will talk about commercialization.
Disclaimer: Community is offered by Moomoo Technologies Inc. and is for educational purposes only.
  • 74376197 : A couple of clarifications. When I left Google Search 5 years ago, they considered Bing to be 3 years behind Google Search. And Google itself says that Bing has 10.5% of the search market, so its share is not "very small", and I think it's growing. In the old days (pre-Ruth Porat, or 'PR'), Google would happily take profits from search and plow them into free services (such as Maps and Google Docs) as a "thank you" and to "enhance user trust"; I saw that discussed all the time while at Google, 2013-2018. AR (after Ruth), Google is much more miserly and tries never to use profits from one activity to subsidize other activities, which has slowed Google down and put a much meaner face on it.