Nvidia retreats nearly 10% from peak: Good buy or goodbye?

NVIDIA Pulls Back from Peak: Losing Steam or Preparing for the Next Surge?

Moomoo News Global joined discussion · Apr 11 09:12
The week has been marked by a series of new AI chip announcements: Intel gave details of its Gaudi 3 accelerator, while both Alphabet's Google and Facebook owner Meta Platforms announced chips designed in-house.
Can Intel's Gaudi hold its own against Nvidia?
Intel acquired Habana Labs, the Israel-based parent of the Gaudi chips, in 2019 for just $2 billion, a deal that now looks like one of prior management's smartest moves.
Intel's new Gaudi 3 AI accelerators have some impressive specs. According to Intel's presentation, the new chip delivers 50% better inference performance and 40% better power efficiency than the Nvidia H100, at a much lower cost. Intel also claims the Gaudi 3 achieves a 50% faster time-to-train than the H100.
Intel's leadership of large consortiums developing open-source AI solutions could pose a long-term threat, but Nvidia will release its new Blackwell architecture later this year. While the Gaudi 3 may offer a 50% performance improvement over the H100, Blackwell promises 2.5 to five times the H100's performance, handily outpacing the Gaudi 3 as well.
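The implied gap can be checked with quick arithmetic. Taking the H100 as the baseline and using only the performance ratios claimed in the vendors' own presentations (these are marketing figures, not independent benchmarks):

```python
# Relative performance vs. an H100 baseline, using the claimed figures above.
H100 = 1.0
GAUDI3 = 1.5 * H100                        # Intel's claim: ~50% faster than H100
BLACKWELL_LOW = 2.5 * H100                 # low end of Nvidia's claimed range
BLACKWELL_HIGH = 5.0 * H100                # high end of Nvidia's claimed range

# Implied Blackwell advantage over Gaudi 3: roughly 1.67x to 3.33x.
print(BLACKWELL_LOW / GAUDI3, BLACKWELL_HIGH / GAUDI3)
```

Even at the low end of Nvidia's claimed range, Blackwell would still lead Gaudi 3 by roughly two-thirds.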
Will Nvidia's market share be eroded?
BofA Securities analyst Vivek Arya said Nvidia is set to retain more than 75% of the overall AI accelerator market, which is worth around $90 billion this year and is expected to rise to around $200 billion in 2027.
"We have seen nothing thus far that changes this view, and indeed Nvidia's Blackwell product with its leading training and inference performance, plus Nvidia's strong enterprise presence makes us more confident in the company's ability to maintain/gain share," Arya wrote in a research note this week.
Arya specifically projected Intel's Gaudi 3 will get less than 1% market share of the AI accelerator sector, while custom chips from the likes of Alphabet and Meta might eventually take 10-15% of the market. Arya kept a Buy rating and a $1,100 target price on Nvidia stock.
Nvidia Bolsters AI Ambitions with Strategic Investments and Partnerships
Nvidia is fortifying its position in the artificial intelligence (AI) arena by acquiring stakes in promising AI ventures. One such company that has piqued Nvidia's interest is SoundHound AI, in which the chipmaking powerhouse has invested by purchasing 1.7 million shares. This move is part of Nvidia's strategy to tap into SoundHound AI's innovative voice AI technology, which has applications in the restaurant industry and the automotive sector. Its advanced voice assistant has been adopted by Stellantis for its DS Vehicles, providing drivers with an enhanced digital assistant experience that goes above and beyond traditional offerings.
Nvidia's approach extends beyond the semiconductor industry's usual focus on hardware and chip development; the firm is also carving out a significant presence in lucrative software sectors. Nvidia has been actively promoting its CUDA programming interface, which runs on its AI chips and lets software harness graphics processing units for more efficient general-purpose computing. Companies that have adopted Nvidia's CUDA interface include Oski Technology, Sinopec Technology, and even Advanced Micro Devices, highlighting its broad and influential reach.
Elon Musk's AI company is expected to use more Nvidia chips
In a recent interview, Elon Musk predicted the arrival of artificial general intelligence (AGI). He said AGI could surpass human intelligence within the next two years, but that reaching it will require enormous amounts of GPU compute and electric power.
According to Musk, his artificial intelligence company xAI is currently training its second-generation large language model, Grok 2, and expects to complete the next training phase in May. Training Grok 2 has taken about 20,000 Nvidia H100 GPUs, and future versions such as Grok 3 may require up to 100,000 H100s.
Musk noted that AI development currently faces two major constraints. First, high-end GPUs such as the Nvidia H100 are in short supply, and quickly obtaining 100,000 of them is not easy. Second, power demand is enormous. An Nvidia H100 consumes about 700 watts at full load, so 100,000 of these GPUs would draw 70 megawatts. Once servers and cooling systems are included, a data center equipped with 100,000 H100 processors would consume roughly 100 megawatts, equivalent to the power consumption of a small city.
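The arithmetic behind those figures can be sketched as follows. The 700 W per-GPU draw and the 70 MW and ~100 MW totals are the figures cited above; the overhead multiplier for servers and cooling is simply the ratio implied by those two totals, not an independently sourced number:

```python
# Back-of-the-envelope power estimate for a 100,000-GPU H100 cluster,
# using the per-GPU draw cited in the article.

GPU_COUNT = 100_000          # GPUs Musk cites for a future Grok model
WATTS_PER_GPU = 700          # approximate H100 draw at full load
OVERHEAD_FACTOR = 100 / 70   # servers + cooling, implied by the 70 MW -> ~100 MW figures

gpu_power_mw = GPU_COUNT * WATTS_PER_GPU / 1_000_000   # watts -> megawatts
total_power_mw = gpu_power_mw * OVERHEAD_FACTOR

print(f"GPU-only draw:  {gpu_power_mw:.0f} MW")    # prints "GPU-only draw:  70 MW"
print(f"With overhead: ~{total_power_mw:.0f} MW")  # prints "With overhead: ~100 MW"
```

At that scale, the facility's power budget becomes as real a bottleneck as chip supply.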
These two constraints highlight the challenges of scaling AI technology to meet growing computing demands.
Nonetheless, advances in computing and storage technology will make it possible to train even larger language models in the coming years. The Blackwell B200 GPU platform demonstrated by Nvidia at the technology event GTC 2024 is designed to support large language models that can be extended to trillions of parameters, heralding a key step in the development of AGI.
Musk says, “If AGI is defined as intelligence exceeding that of the smartest human being, I think it will probably be achieved within the next year, or two years.”
Disclaimer: Moomoo Technologies Inc. is providing this content for information and educational use only.