Is This the Beginning of the End of Nvidia's Artificial Intelligence (AI) Chip Dominance? | The Motley Fool (2024)

Nvidia's customers are ramping up in-house AI chip development efforts, but investors need to look at the bigger picture.

Nvidia (NVDA -2.38%) has been a massive beneficiary of the surge in demand for artificial intelligence (AI) applications that started at the end of 2022. Cloud computing giants have been lining up to get the company's data center graphics processing units (GPUs) to train and power their large language models (LLMs).

For Meta Platforms (META 1.23%), Microsoft, Amazon, and many others, Nvidia has been the go-to provider of AI chips. It is worth noting that these tech giants have been willing to wait as long as a year between order and delivery to procure Nvidia's chips, and they have been paying top dollar for them. Other chipmakers such as Intel and Advanced Micro Devices (AMD 0.07%) have been left in the dust in this niche: By some estimates, Nvidia controls a whopping 95% of the AI chip market.

As a result, Nvidia's earnings and revenue multiplied rapidly. However, some of its customers are making concerted efforts to reduce their reliance on its chips.

Making AI chips in-house

Nvidia's success in the AI GPU market can be credited to its A100 processor, which it launched in 2020. The graphics chip specialist built this GPU for high-performance computing applications, and it was manufactured using a 7-nanometer (nm) process node. OpenAI reportedly deployed thousands of A100 chips to train ChatGPT.

Interestingly, near the end of 2021, rival AMD started offering a competing data center accelerator that was built on a 6nm process node -- the MI250X. However, the A100 reportedly outperformed the newer AMD chip in LLM training tasks, per third-party estimates.

Then in 2022, Nvidia upped its game with the H100 processor, which is built on a custom 5nm process. The company packed 80 billion transistors into the chip as compared to 54 billion on the A100. The H100 turned out to be significantly more powerful than its predecessor. AMD, on the other hand, took until the end of 2023 to arrive with its next competing chip, the MI300.

This explains why Nvidia's H100 was in terrific demand last year, driving $47.5 billion in data center revenue for the company in its fiscal 2024 as compared to $15 billion in the previous year. Meta alone shelled out billions of dollars to Nvidia for H100s, and it wasn't the only big buyer to do so.

However, the lack of a potent alternative to the H100, combined with its high price and limited availability, explains why some of Nvidia's top customers started in-house AI chip development efforts to reduce their reliance on the chipmaker. Meta Platforms, for instance, recently announced the second generation of its own AI chip, which is built on a 5nm process node.

According to Meta, the new chip "more than doubles the compute and memory bandwidth of our previous solution while maintaining our close tie-in to our workloads. It is designed to efficiently serve the ranking and recommendation models that provide high-quality recommendations to users."

Moreover, Meta plans to continue its in-house chip development program as it looks to reduce the operating and development costs of its AI servers.

Something similar is happening at Microsoft. The tech giant revealed two custom AI chips toward the end of 2023, one of which is a 5nm AI accelerator called Maia 100. This AI chip reportedly has 105 billion transistors and has been built for running AI workloads in the cloud, including LLM training and inference.

Amazon, too, has gone down the in-house AI chip development route. In November, it revealed its latest offering, Trainium2, which it claims is four times more powerful than its predecessor. Amazon Web Services customers have the option of using these chips to train AI models. Meanwhile, Alphabet has jumped on the bandwagon with Axion, its newly revealed custom Arm-based data center processor.

Given that Meta, Microsoft, Google, and Amazon were among the top buyers of H100 processors last year, their focus on in-house chip development is no doubt a threat to the semiconductor giant's bottom line.

Investors, however, should focus on the bigger picture

While it is true that Nvidia's customers are looking to reduce their reliance on it, the fact remains that they are expected to continue buying its powerful GPUs. For instance, when Nvidia announced the launch of its next-generation Blackwell AI GPUs last month, all the companies mentioned above said they would deploy the new chips once they were available.

That's not surprising. Nvidia's upcoming GPUs are expected to be significantly more powerful, enabling customers to train even bigger LLMs. The chipmaker claims that the Blackwell GPU can run LLMs "at up to 25x less cost and energy consumption than its predecessor." Given that it is likely to price these new GPUs competitively compared to the H100, Nvidia's customers could witness stronger returns on their AI hardware investments with Blackwell processors.

As a result, demand for Nvidia's AI chips could remain robust. Another reason Nvidia could remain the dominant player in the AI chip market is its control over the supply chain. Nvidia's customers and rivals are turning to foundry giant TSMC to manufacture their own AI chips, but Nvidia reportedly consumes 60% of TSMC's advanced chip packaging capacity.

Of course, TSMC is looking to increase its capacity to meet the demand from Nvidia and other customers, but the GPU specialist is likely to lock up the biggest share of the foundry's added output, considering the massive lead it already enjoys in the AI chip market.

So, even if other big tech players continue their chip development efforts, Nvidia is likely to remain the top AI chip player for quite some time. Japanese investment bank Mizuho estimates that Nvidia could sell $280 billion worth of AI chips in 2027 as it projects the overall market hitting $400 billion. That works out to a roughly 70% share (down from the estimated 95% Nvidia controls today), so while Mizuho is forecasting that Nvidia's share of the AI chip market will shrink over the next three years, it still expects the company's data center revenue to rise significantly.

As such, Nvidia's data center revenue is likely to keep growing at a healthy pace thanks to the secular growth opportunity in AI chips, even if the company cedes some market share. That's why investors shouldn't worry too much about the chip development moves of Nvidia's customers. Instead, considering the impressive catalysts it is sitting on, they should view the stock's recent pullback as an opportunity to buy more shares.

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Meta Platforms, Microsoft, Nvidia, and Taiwan Semiconductor Manufacturing. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short January 2026 $405 calls on Microsoft, and short May 2024 $47 calls on Intel. The Motley Fool has a disclosure policy.
