Beating Llama 2, Closing in on GPT-4: 22-Person Mistral Reaches a €2 Billion Valuation in Just Six Months

The open-source marvel unfolds once again: Mistral AI has released the first open-source Mixture of Experts (MoE) large model.
Just a few days ago, a magnet link instantly sent shockwaves through the AI community.
An 87GB torrent, an 8x7B Mixture of Experts architecture – it looks like a mini version of an open-source GPT-4!
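The "8x7B" name refers to a Mixture of Experts layer: eight expert feed-forward networks, of which only a couple are consulted per token. The sketch below illustrates the general top-2 routing idea with toy linear layers; the dimensions, gating scheme, and all names are illustrative assumptions, not Mistral's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # the "8x" in 8x7B
TOP_K = 2         # experts consulted per token
DIM = 16          # toy hidden size (illustrative, not the real model dim)

# Each expert is a toy linear layer standing in for a full feed-forward block.
experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS))  # gating network

def moe_forward(x):
    """Route a token vector x to its top-2 experts and mix their outputs."""
    logits = x @ router                   # score each expert for this token
    top = np.argsort(logits)[-TOP_K:]     # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()              # softmax over just the chosen experts
    # Only the selected experts run; the other six cost nothing for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(DIM)
out = moe_forward(token)
print(out.shape)  # (16,)
```

The point of the design is that total parameters scale with the number of experts, while per-token compute scales only with the two experts actually selected.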
No press conference, no promotional videos – just a magnet link that kept developers up at night.
This AI startup, founded in France, has posted only three updates since creating its official account.
In June, Mistral AI launched. A seven-slide pitch deck secured the largest seed round in European history.
In September, Mistral 7B was released, touted as the strongest open-source model with 7 billion parameters.
In December, the open-source Mistral 8x7B, built on an architecture similar to GPT-4's, was released. A few days later, the Financial Times reported Mistral AI's latest funding round of $415 million, valuing the company at a staggering $2 billion, an eightfold increase.
Now, with just over 20 employees, the company has set the record for the fastest growth in the history of open-source companies.