(A stock split is not one of them)

Shares of semiconductor giant Nvidia (NASDAQ:NVDA) have gained nearly 217% over the past year. There is no doubt that the rapid advancement and adoption of generative artificial intelligence (AI) applications and large language models have been the key drivers of demand for its AI-enabled chips and systems. The leader in graphics processing units (GPUs) has become both a catalyst for and a major beneficiary of the ongoing generative AI revolution.

Nvidia posted strong results for its fiscal 2025 first quarter, which ended April 28: revenue and profit climbed 262% and 690% year over year, respectively. For the full fiscal year, which ends Jan. 31, analysts expect revenue to rise 97% to $120 billion and earnings per share (EPS) to rise 109% to $2.71.

Beyond these exceptional near-term prospects, there are also at least three main reasons to expect significant growth from Nvidia in the long term.

A dominant player in accelerated computing

Revenue from Nvidia’s data center business soared 427% year over year to $22.6 billion in the fiscal first quarter. This segment accounted for 87% of its revenue and will play a vital role in the company’s future growth.

Hyperscalers (large cloud infrastructure providers), enterprises across verticals, and sovereign nations around the world are upgrading billions of dollars' worth of installed data center infrastructure built around network interface cards (NICs) and general-purpose processors by installing accelerated computing hardware. This infrastructure has become essential for training and running inference on large language models and other generative AI applications. Nvidia also expects customers to upgrade their existing accelerated computing infrastructure from the current Hopper-architecture H100 chips to the newer Hopper-architecture H200 chips and, eventually, to next-generation Blackwell-architecture chips.

The economics are very attractive to customers, especially cloud service providers. During the latest earnings call, an Nvidia executive claimed that “for every $1 spent on NVIDIA AI infrastructure, cloud providers have the opportunity to earn $5 in GPU instant hosting revenue over four years.”

Demand for Nvidia’s AI GPUs far exceeds supply, even as the company has worked to increase production capacity for chips like the H100 and Grace Hopper. Nvidia expects supply of the next-generation H200 and Blackwell chips to continue to lag demand through next year, which should ensure the company retains pricing power despite growing competition in this niche of the chip industry.

Besides its AI GPUs, Nvidia has also introduced the Grace Hopper Superchip (CPU + GPU), Blackwell-architecture chips, the AI-optimized Spectrum-X Ethernet networking platform, and Nvidia AI Enterprise software. These products help drive performance gains and reduce costs for users training and running AI applications.

According to Nvidia CEO Jensen Huang, AI is enabling the $3 trillion IT industry to…
