Microsoft Is Finally Making Custom AI Chips — But They're Not for Displacing Nvidia

Analysts Notebook wrote a column · Nov 16, 2023 03:46
$Microsoft(MSFT.US)$ has launched its first AI chip, the Maia 100, part of its Azure Maia AI accelerator series. The company also unveiled its first Arm-based Azure Cobalt central processing unit for general-purpose cloud computing. With these two chips, Microsoft joins rivals $Alphabet-A(GOOGL.US)$ and $Amazon(AMZN.US)$ in developing custom chips to run their competing cloud platforms. The Maia 100 will be used for both cloud-based training and inferencing of AI models.
Microsoft's Maia and Azure Cobalt; Source: Microsoft
"Software is our core strength, but frankly, we are a systems company," Microsoft's Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, said in a statement. "At Microsoft we are co-designing and optimizing hardware and software together so that one plus one is greater than two. We have visibility into the entire stack, and silicon is just one of the ingredients."
Microsoft's AI Chip Can Be Used to Train LLMs
Microsoft's custom AI chip could be used to train large language models, potentially reducing its costly dependence on $NVIDIA(NVDA.US)$. The Azure Maia AI chip and the Arm-powered Azure Cobalt CPU are set to arrive in 2024, on the back of a surge in demand this year for Nvidia's H100 GPUs, which are widely used to train and run generative image tools and large language models. Demand for these GPUs is so high that some have fetched more than $40,000 on eBay.
The Maia 100 is currently being tested on GPT-3.5 Turbo, the same model that powers ChatGPT, Bing AI workloads, and GitHub Copilot. Manufactured on a 5-nanometer TSMC process, Maia has 105 billion transistors — around 30 percent fewer than the 153 billion found on $Advanced Micro Devices(AMD.US)$'s MI300X AI GPU. "Maia supports our first implementation of the sub 8-bit data types, MX data types, in order to co-design hardware and software," says Borkar. "This helps us support faster model training and inference times."
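The MX data types Borkar mentions are "microscaling" formats, in which a block of values shares a single scale factor while each element is stored in only a few bits, cutting memory and bandwidth costs relative to 16- or 32-bit floats. As a rough illustration of the idea only — this is a toy sketch, not Microsoft's actual MX implementation, and the function names are invented for this example — block-scaled quantization can be sketched like this:

```python
import math

def mx_quantize(block, bits=4):
    """Toy microscaling-style quantization (illustrative only):
    the whole block shares one power-of-two scale, and each element
    is reduced to a signed integer code of `bits` bits."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 7 for 4-bit signed
    max_abs = max(abs(x) for x in block)
    # Shared scale: smallest power of two such that the largest value fits
    scale = 2.0 ** math.ceil(math.log2(max_abs / qmax))
    codes = [max(-qmax - 1, min(qmax, round(x / scale))) for x in block]
    return codes, scale

def mx_dequantize(codes, scale):
    """Recover approximate values from the low-bit codes."""
    return [c * scale for c in codes]

values = [0.11, -0.42, 0.95, -0.07]
codes, scale = mx_quantize(values, bits=4)
approx = mx_dequantize(codes, scale)
# Each reconstructed value is within half a quantization step of the original.
```

Storing four bits per element plus one shared scale per block is far cheaper than 16 bits per element, which is why hardware support for such formats speeds up both training and inference.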
"We were excited when Microsoft first shared their designs for the Maia chip, and we've worked together to refine and test it with our models," says Sam Altman, CEO of OpenAI.
The $10 billion OpenAI-Microsoft partnership reportedly valued Altman's AI company at $29 billion.
"This Is Not Something That's Displacing Nvidia"
Diversifying its supply chain is important to Microsoft, particularly while Nvidia is the key supplier of AI server chips and companies are racing to buy them up. Estimates have suggested OpenAI needed more than 30,000 of Nvidia's older A100 GPUs to commercialize ChatGPT, so Microsoft's own chips could help lower the cost of AI for its customers. Microsoft has developed these chips for its own Azure cloud workloads, not to sell to others as Nvidia, AMD, $Intel(INTC.US)$, and $Qualcomm(QCOM.US)$ all do.
"We think this gives us a way that we can provide better solutions to our customers that are faster and lower cost and higher quality," said Scott Guthrie, the executive vice president of Microsoft's cloud and AI group.
Microsoft said it does not plan to sell the chips but instead will use them to power its own subscription software offerings and as part of its Azure cloud computing service.
"This is not something that's displacing Nvidia," said Ben Bajarin, chief executive of analyst firm Creative Strategies.
Source: Microsoft, Yahoo Finance, The Verge