Meta and Google announce new in-house AI chips

Precise News


Hardware is emerging as one of the main areas of AI growth. Designing chips in-house lets Big Tech companies with the resources and expertise customize hardware for their own AI models, reducing reliance on external designers like Nvidia and Intel while improving performance and energy efficiency.

The in-house AI chips that Google and Meta recently announced pose one of the first significant challenges to Nvidia’s dominant position in the AI hardware market. Nvidia controls more than 90% of the AI chip market, and demand for its cutting-edge semiconductors is only rising. If its largest customers start producing their own chips instead, Nvidia’s stock price, up 87% since the beginning of the year, could take a hit.

Edward Wilford, an analyst at tech consultancy Omdia, told Fortune: “From Meta’s point of view… it gives them a bargaining tool with Nvidia. It lets Nvidia know that they’re not the only option, that they have alternatives. And it’s hardware designed with the AI they’re building in mind.”

Why do we need new chips for AI?

Training the large language models that underpin AI systems requires enormous amounts of processing power to churn through vast quantities of data. Conventional computer chips can’t process the trillions of data points these models are built on, which has created a market for AI-specific chips—often called “cutting-edge” chips because they are the most powerful devices on the market.

Nvidia has dominated this emerging market: the semiconductor giant’s $30,000 flagship AI chip is in such high demand that the company’s share price has risen nearly 90% in the last six months.

Rival chipmaker Intel, meanwhile, is battling to stay competitive; it recently unveiled the Gaudi 3 AI chip to challenge Nvidia head-to-head. With manufacturing capacity constrained, AI developers from Google and Microsoft down to small startups are all vying for a limited supply of AI chips.

Why are tech firms beginning to produce their own chips now?

The entire industry, including Nvidia and Intel, depends on Taiwanese manufacturer TSMC to fabricate its chip designs, so only a limited number of chips can be produced. With just one manufacturer seriously in the game, the lead time to manufacture these state-of-the-art chips stretches to months. That is a major factor behind the decision by Google and Meta, two of the biggest names in AI, to start designing custom chips. Alvin Nguyen, a senior analyst at Forrester Consulting, told Fortune that chips designed by companies such as Google, Meta, and Amazon won’t be as powerful as Nvidia’s top-tier products, but that could work to those companies’ advantage in terms of speed: they can be produced on less specialized assembly lines with shorter wait times, he added.

“I’ll buy something every day if it’s 10% less powerful but you can get it now,” Nguyen said.

Even if the native AI chips Google and Meta are developing are less powerful than Nvidia’s state-of-the-art AI chips, they could be better tailored to the companies’ specific AI platforms. According to Nguyen, in-house chips designed for a company’s own AI platform could be more efficient and less expensive by stripping out unnecessary functions.

“It’s like buying a car. Okay, you need an automatic transmission. But do you need the heated massage seats and the leather seats?” Nguyen asked.

“The benefit for us is that we can build a chip that can handle our specific workloads more efficiently,” Meta spokesperson Melanie Roe told Fortune in an email.

Nvidia’s top-of-the-line chips sell for approximately $25,000. They are extremely powerful tools suited to a wide range of tasks, from training AI chatbots to generating images to powering recommendation systems like those of Instagram and TikTok. That means a company like Meta, which has invested in AI mainly for its recommendation algorithms rather than consumer-facing chatbots, could benefit more from a chip that is slightly less powerful but more specialized.

“Nvidia GPUs are great in AI data centers, but they are general purpose,” Brian Colello, lead equity research analyst at Morningstar, told Fortune. “A custom chip might be even better for certain workloads and models.”

The trillion-dollar question

More specialized in-house chips could bring additional benefits, Nguyen said, because they can be integrated into existing data centers. Nvidia’s chips consume so much power and generate so much noise that tech companies may be forced to redesign or relocate their data centers to add liquid cooling and soundproofing; native chips that are more energy efficient and give off less heat could sidestep that problem.

Google and Meta’s AI chips are long-term bets. Nguyen estimated that these chips took roughly nine months to develop, and it will likely be several more months before they are widely deployed. For the foreseeable future, the AI world will remain largely dependent on Nvidia (and, to a lesser extent, Intel) for its computing hardware. Indeed, Mark Zuckerberg recently announced that Meta expects to purchase 350,000 Nvidia chips by the end of this year, by which point the company plans to have spent about $18 billion on chips. But a shift away from outsourced processing power and toward native chip design could loosen Nvidia’s grip on the market.

“The threat of these in-house chips is the trillion-dollar question for Nvidia’s valuation,” Colello said. “If these in-house chips significantly reduce the reliance on Nvidia, there’s probably downside to Nvidia’s stock from here. This development isn’t surprising, but how it plays out over the next few years is the key question in our valuation.”
