Microsoft, Google and Meta are increasing their spending on artificial intelligence




Major tech companies plan to invest hundreds of billions of dollars over the next two years in a bid to lead the next wave of artificial intelligence technologies, in what could be the biggest risk to shareholder capital since the internet’s inception three decades ago.

Nearly all of the world’s biggest tech companies are expected to see revenue gains from AI-related technologies in the coming years as they tap their massive datasets to boost sales of everything from complex pharmaceutical testing to drive-through dining.

Accessing that data requires large investments in computing infrastructure, often housed in cloud environments built and maintained by companies known as hyperscalers, such as Alphabet’s Google Cloud, Microsoft’s (MSFT) Azure and Amazon Web Services (AMZN).

Meanwhile, these hyperscalers are witnessing what may be the largest demand spike ever for their vital cloud services, with AI’s potential now unlocked by advancements in chip design, speed, energy consumption, and costs.

Building and maintaining those networks, however, is expensive and demands a large commitment of investor capital. Companies such as Google and Microsoft are spending billions on these projects even though revenue from their AI-focused products remains comparatively small.

Big Tech’s focus now is capital expenditure

That places capital expenditures, or capex, at the center of the AI investment narrative and cements Nvidia’s (NVDA) dominant position in the market, regardless of which tech behemoth emerges as the preferred hyperscaler.

With the likes of IBM and Alibaba Cloud posing only modest challenges to their more established rivals, Nvidia CEO Jensen Huang projects that the data center market will grow by $250 billion a year, on top of an installed base he estimates is already worth $1 trillion.

And, much like the men and women who sold pickaxes and sifting pans during the California gold rush, he seems largely indifferent to the outcome.

“We sell an entire data center in parts, so our percentage of that $250 billion per year is likely a lot, lot higher than somebody who sells a chip,” Huang told investors at the group’s GTC developers’ conference in San Jose, California, last month.

Nvidia, which currently holds an estimated 80% market share for processors that power artificial intelligence, used the GTC event to unveil the Blackwell GPU architecture, a new generation of its lineup that could fetch a 40% premium over the current range of H100 chips, which retail for between $30,000 and $40,000 (implying roughly $42,000 to $56,000 per unit).

That level of expense helps explain the enormous capital expenditures that Microsoft and Alphabet, two of Nvidia’s largest customers, unveiled earlier this week, as well as the spending plans laid out by Meta Platforms’ (META) Mark Zuckerberg and Tesla’s (TSLA) Elon Musk.

Alphabet (GOOG) CEO Sundar Pichai signaled capital expenditures of around $50 billion this year and $57 billion in 2025.

Winning the AI race will cost billions

“We are committed to making the investments required to keep us at the leading edge in technical infrastructure,” Pichai told investors on April 25. “Our capital expenditures have increased, as you can see. This will enable us to advance AI model development and spur cloud growth.”

Google’s first-quarter capital expenditures came to roughly $12 billion, almost double the year-earlier total, driven primarily by investments in its technical infrastructure, with servers accounting for the largest portion and data centers coming in second.

Microsoft, for its part, plans to spend roughly $50 billion on capital expenditures in its upcoming fiscal year, which starts in July, after its third-quarter spending jumped nearly 80% to $14 billion.

“We have been doing what is essentially capital allocation to be a leader in AI for multiple years now, and we plan to keep taking that forward,” CEO Satya Nadella told investors on April 25.

Although Amazon is still finalizing its annual spending plans, CFO Brian Olsavsky told investors on the company’s most recent earnings call that “we do expect capex to rise as we add capacity in AWS for region expansions, but primarily the work we’re doing with generative-AI projects.”

Nvidia GPUs are a key component of AI ambitions

Meta Platforms, which has committed to purchasing about 350,000 of Nvidia’s graphics chips, estimated its 2024 capital expenditure bill at between $35 billion and $40 billion. Tesla CEO Elon Musk, meanwhile, told investors he expects to deploy about 85,000 of Nvidia’s H100 graphics chips this year as part of his pivot toward artificial intelligence and autonomous driving.

Early estimates point to first-quarter revenue of around $24.5 billion for Nvidia, the great majority of it from data center sales. The company is scheduled to report those results on May 22.

Even though that would represent a startling 245% increase from the same period last year, the figure may be adjusted further now that some of Nvidia’s largest customers have revealed their spending priorities.

“Many of our customers that we have already spoken with talked about the designs, talked about the specs, and have provided us with their demand desires,” Nvidia CFO Colette Kress told investors at the company’s GTC conference last month. “And that has really aided us in starting our supply chain work, volume work and other plans.”

Nevertheless, much as Microsoft and Alphabet have told investors that data center demand has far outpaced their ability to supply it, Nvidia faces capacity constraints of its own.

Chip-on-wafer-on-substrate, or CoWoS, packaging is a critical step in AI chip production and has weighed on Nvidia’s supply for much of the past year.

Taiwan Semiconductor (TSM), the world’s largest contract chipmaker, has cautioned investors that “this condition will probably continue to next year” because of its inability to meet demand for its premium stacking and packaging technology.

AMD holds a commanding second place

“It’s very true though that on the onset of the very first one coming to market, there might be constraints until we can meet some of the demand that’s put in front of us,” Kress said on her call with analysts last month.

This could help Advanced Micro Devices (AMD), the only serious competitor to Nvidia’s hegemony in the global market, and its recently released MI300X chip.

AMD, looking to capitalize on Nvidia’s struggle to meet the worldwide surge in demand, said earlier this year that the new chip could generate sales of about $3.5 billion over the next year.


AMD is scheduled to release its own first-quarter earnings after the close of trading on April 30. Analysts, however, project that sales rose only about 4% to around $5.5 billion, primarily because of muted demand in the global PC market.

After rising 6.18% during the session and extending their year-to-date gain to about 225%, Nvidia shares closed at $877.35 apiece on April 26. At $2.19 trillion, the chipmaker is now the third-largest company in the world by market value.
