Nvidia Stock Is Going to Soar Over the Next 12 Months

Nvidia (NASDAQ: NVDA) is the world's leading supplier of graphics processing units (GPUs) for data centers, which are used in the development of artificial intelligence (AI). During the last two years alone, GPU sales have helped Nvidia add $3.2 trillion to its valuation.

The company just reported results for the fiscal 2025 third quarter (ended Oct. 27) after the market closed on Nov. 20, and they obliterated Wall Street's expectations. It has just begun shipping a new generation of GPUs based on its powerful Blackwell architecture, and demand is heavily outstripping supply.


Nevertheless, the stock sank 2.5% in after-hours trading following the third-quarter report. I predict shares will soar over the next 12 months, so here's why any weakness could be a buying opportunity.

Image source: Nvidia.

In the past, data centers were built with central processing units (CPUs), which were great at handling a small number of specific tasks with high efficiency. GPUs, however, are designed for parallel processing, meaning they can handle numerous tasks at the same time with very high throughput.

That is crucial when it comes to training AI models and performing AI inference, because those workloads require chips that can rapidly absorb and process trillions of data points.

GPUs built on Nvidia's Hopper architecture — like the H100 and H200 — have been the go-to choice for AI development so far. Data center operators like Microsoft and Amazon buy tens of thousands of these GPUs and rent their computing power to businesses and AI developers that can't afford to build their own infrastructure (a single H100 can sell for as much as $40,000).

Now, a new age of AI computing has arrived with Nvidia's Blackwell GPU architecture. The Blackwell-based GB200 NVL72 system can perform AI inference 30 times faster than the equivalent H100 system.

One recent estimate suggests an individual GB200 GPU inside an NVL72 system costs around $83,333, so developers are getting that 30-fold increase in AI inference performance for roughly a twofold increase in price compared to the H100.

In other words, Blackwell GPUs should drive an enormous increase in cost efficiency, so more businesses and developers can afford to deploy the most advanced AI large language models (LLMs).
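The cost-efficiency claim follows from simple arithmetic on the article's figures. A minimal sketch, assuming the estimated prices above (roughly $40,000 per H100 and $83,333 per GB200) and the quoted 30x inference speedup — these are third-party estimates, not official Nvidia list prices:

```python
# Back-of-the-envelope price/performance comparison using the
# article's estimated figures (not official Nvidia list prices).
H100_PRICE = 40_000        # approximate top-end price of a single H100
GB200_PRICE = 83_333       # estimated per-GPU cost inside an NVL72 system
INFERENCE_SPEEDUP = 30     # GB200 NVL72 vs. equivalent H100 system

price_ratio = GB200_PRICE / H100_PRICE              # ~2.1x the price
perf_per_dollar_gain = INFERENCE_SPEEDUP / price_ratio

print(f"Price ratio: {price_ratio:.2f}x")
print(f"Inference throughput per dollar: {perf_per_dollar_gain:.1f}x better")
```

Paying about twice as much for 30 times the inference throughput works out to roughly 14x more inference per dollar, which is the "incredible increase in cost efficiency" the argument rests on.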

Nvidia shipped 13,000 Blackwell GPU samples to customers during the third quarter. Microsoft, Dell, and CoreWeave have already started building Blackwell-based data centers, and Oracle customers will soon be able to access computing clusters with a staggering 131,000 Blackwell GPUs.


Copyright © 2024 Finapress. All Rights Reserved.