Nvidia is going after the $30 billion custom chip market with a new unit

Written by Max A. Cherney and Stephen Nellis

SAN FRANCISCO (Reuters) – Nvidia is building a new business unit focused on designing custom chips for cloud computing companies and others, including advanced artificial intelligence processors, according to nine sources familiar with the company’s plans.

The dominant global designer and supplier of AI chips aims to capture a piece of the exploding market for custom AI chips and protect itself from the growing number of companies interested in finding alternatives to its products.

The Santa Clara, California-based company currently controls about 80% of the market for advanced AI chips, a position that has helped push its market value up 40% so far this year to $1.73 trillion, after it more than tripled in 2023.

Its clients, including ChatGPT creator OpenAI, Microsoft, Alphabet and Meta Platforms, have raced to snap up the dwindling supply of Nvidia chips to compete in the rapidly emerging generative AI sector.

Nvidia’s H100 and A100 chips serve as general-purpose AI processors for many of these key customers. But technology companies have begun to develop their own in-house chips to meet specific needs. Doing so helps reduce energy consumption and can potentially cut the cost and time required for design.

Nvidia is now trying to play a role in helping these companies develop custom AI chips, work that has so far flowed to rival companies like Broadcom and Marvell Technology, according to the sources, who declined to be identified because they were not authorized to speak publicly.

“If you’re really trying to optimize things like power, or optimize the cost of your application, you can’t afford to drop an H100 or A100 in there,” said Greg Reichow, general partner at venture capital firm Eclipse Ventures, in an interview. “You want to get exactly the right mix of compute and the type of compute you need.”

Nvidia hasn’t revealed pricing for the H100, which costs more than the previous-generation A100, but each chip can sell for anywhere from $16,000 to $100,000 depending on the volume purchased and other factors. Meta has said it plans to bring its total stock of the chips to 350,000 H100s this year.

Nvidia officials met with representatives from Amazon.com, Meta, Microsoft, Google and OpenAI to discuss making their own custom chips, according to two sources familiar with the meetings. Beyond data center chips, the company has pursued telecommunications, automotive and video gaming customers.

In 2022, Nvidia said it would allow third-party customers to integrate some of its networking technologies with their own chips. The company has said nothing about the program since then, and Reuters is reporting its broader ambitions for the first time.

An Nvidia spokesperson declined to comment beyond the company’s 2022 announcement.

Dina McKinney, a former Advanced Micro Devices and Marvell executive, heads Nvidia’s custom unit, and her team’s goal is to make the company’s technology available to customers in cloud computing, 5G wireless, video games and automotive, according to her LinkedIn profile. Those references were removed from the profile, and her title was changed, after Reuters requested comment from Nvidia.

Amazon, Google, Microsoft, Meta and OpenAI declined to comment.

A $30 billion market

The market for custom chips for data centers will grow to as much as $10 billion this year, and double that in 2025, according to estimates by Alan Weckel of research firm 650 Group.

The broader custom chip market was worth about $30 billion in 2023, representing about 5% of annual global chip sales, according to Needham analyst Charles Shi.

Currently, data center silicon design is dominated by Broadcom and Marvell.

In a typical arrangement, a design partner like Nvidia provides the intellectual property and technology, but leaves the chip manufacturing, packaging and additional steps to Taiwan Semiconductor Manufacturing Company or another chip maker.

Nvidia’s move into this area has the potential to eat into Broadcom’s and Marvell’s sales.

“With Broadcom’s custom silicon business at $10 billion, and Marvell’s around $2 billion, this poses a real threat,” said Dylan Patel, founder of silicon research group SemiAnalysis. “It’s a really big negative; there’s more competition coming into the fray.”

File photo: Computex Taipei technology trade fair in Taipei

Beyond artificial intelligence

Nvidia is in talks with communications infrastructure builder Ericsson for a wireless chip that incorporates the chip designer’s graphics processing unit (GPU) technology, according to two sources familiar with the talks.

650 Group’s Weckel expects the custom communications chip market to remain steady at approximately $4 billion to $5 billion annually.

Ericsson declined to comment.

Nvidia also plans to target the automotive and video game markets, according to sources and public social media posts.

Weckel expects the custom automotive chip market, currently in the $6 billion to $8 billion range, to keep growing at about 20% annually, while the $7 billion to $8 billion market for video game chips could expand with next-generation consoles from Microsoft’s Xbox and Sony.

Nintendo’s current Switch portable console already uses an Nvidia chip, the Tegra X1. The new version of the Switch expected this year will likely include a custom Nvidia design, according to one source.

Nintendo declined to comment.

(Reporting by Max Cherney and Stephen Nellis in San Francisco; Additional reporting by Supantha Mukherjee in Stockholm and Crystal Ho in San Francisco; Editing by Kenneth Lee, Peter Henderson and Jamie Freed)
