
Nvidia is actually cheap


You, an idiot, might think Nvidia is a great big fat bubble, given that it's worth almost as much as the entire UK stock market, trades at more than 40 times next year's forecast earnings, and has spawned its own victory parties, a grotty support act and a monstrous options ecosystem.

However, FT Alphaville has read some sell-side notes and can now say with confidence that Nvidia is actually cheap. Super cheap! It's basically a value stock at this point.

In fact, it represents a “generational opportunity,” Bank of America said in a note today, raising its price target to $190.

We reiterate our Buy rating, raising our CY25/26 pro forma EPS estimates by 13-20% and our PO from $165 to $190 (unchanged 42x CY25E P/E) on top AI pick NVDA. Our confidence in NVDA's competitive moat (80-85% market share) and generational opportunity ($400B+) is reinforced by: (1) supply-chain partners (MU, optical experts), the adoption pace of large language models, and capex commentary from top hyperscalers and NVDA execs regarding “crazy Blackwell demand”; (2) NVDA's under-appreciated enterprise partnerships (Accenture, ServiceNow, Oracle, etc.) and software offerings (NIMs); and (3) the ability to generate $200 billion in FCF over the next two years. Meanwhile, NVDA's valuation remains compelling in our view, at just 0.6x CY25E P/E-to-growth (PEG), well below the Mag-7 average of 1.9x. Data from the BofA strategy team suggests that NVDA is widely owned, but only about 1x market-weighted in active portfolios.
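For anyone allergic to sell-side shorthand, the PEG line is just the forward P/E divided by the expected earnings growth rate. Here is a back-of-envelope sketch; the roughly 70 per cent CY25 EPS growth figure is our own assumption, implied by pairing a 42x multiple with a 0.6x PEG, rather than a number from the note:

```python
# Back-of-envelope PEG check (illustrative figures only)
# PEG = forward P/E / expected EPS growth rate (in %)

forward_pe = 42.0        # BofA's unchanged CY25E multiple
eps_growth_pct = 70.0    # assumed CY25 EPS growth implied by a 0.6x PEG

peg = forward_pe / eps_growth_pct
print(f"PEG: {peg:.1f}x")  # ~0.6x, versus the ~1.9x Mag-7 average BofA cites
```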

To translate from the sell-side vocabulary a bit: various things have happened – TSMC's blowout results, for instance, and Nvidia's CEO saying demand for its latest chip was “crazy” – which have made Bank of America even more optimistic than it was a few months ago.

Adding in corporate partnerships with firms like Accenture, the bank now predicts Nvidia's earnings per share will rise to over $5.67 by 2027, which would bring its price-to-earnings ratio down to a more modest 24x by then. Total free cash flow will be $200 billion over the next two years, BofA predicts.
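The shrinking multiple is nothing fancier than the price-to-earnings identity rolled forward: hold the share price still and let the denominator grow. A quick sketch, assuming a share price of about $136 (roughly where the stock has been trading; the exact figure is our assumption, not BofA's):

```python
# Forward P/E = share price / forecast earnings per share
price = 136.0      # assumed current share price (illustrative)
target = 190.0     # BofA's price objective
eps_2027 = 5.67    # BofA's 2027 EPS forecast

print(f"P/E on today's price: {price / eps_2027:.0f}x")     # ~24x
print(f"P/E at the $190 target: {target / eps_2027:.0f}x")  # ~34x
```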

Wall Street analysts are almost universally positive about Nvidia – which, to be fair, has been exceeding expectations for a while – but that's pretty telling.

Of the 64 analysts surveyed by LSEG, 58 rate the company a “buy” and there are no sell ratings, but BofA's EPS forecast is the third-highest of all analyst estimates, with only China's Everbright and Brazil's Banco Itaú being more bullish. Its price target is exceeded only by Rosenblatt Securities and Elazar Advisors (no, us neither).

That makes Goldman Sachs' price target, which was raised to $150 just a week ago after a meeting with Nvidia CEO Jensen Huang, seem sober. But Goldman also believes that Nvidia is very cheaply valued, trading near its three-year average P/E ratio and well below its recent history relative to its peers.

Nvidia's next-12-month P/E ratio
Nvidia's next-12-month P/E ratio compared with similar companies in Goldman's coverage universe

Here are Goldman's main arguments for raising its price target, which we'll quote at length given the high level of interest in the stock:

Continued focus on accelerated computing: With classic Moore's Law delivering diminishing marginal returns (and, in turn, the need to innovate through architectural advances becoming more apparent) and the emergence of generative AI offering its customers the opportunity to increase revenue and/or improve productivity, Nvidia believes that data-center operators will continue to concentrate their investments on accelerated computing, and GPUs specifically. On the much-discussed topic of customer ROI, management noted that hyperscalers with large social media and/or e-commerce platforms, where customization is critical, are already seeing solid returns on capital. Beyond the major cloud service providers (CSPs) and consumer internet companies, Nvidia expects the next wave of AI adoption to be driven by enterprises, in the form of digital AI agents that collaborate with and support employees.

Blackwell ramp: Management highlighted Blackwell's architectural transformation relative to Hopper and the associated expansion of the company's market opportunity (e.g. new CPU configuration, introduction of Spectrum-X, new NVLink switches). By integrating seven chips, each playing a role in delivering better performance at the data-center level, we view Blackwell's introduction and ramp not only as a near- and medium-term revenue growth driver, but also as a dynamic that strengthens Nvidia's competitive advantage. The ramp of Blackwell-based products remains on track, with multi-billion-dollar sales expected in the January quarter, followed by further growth in April and beyond. Customers equipped with liquid-cooled infrastructure are expected to choose the GB200 NVL72 (i.e. 36 Grace CPUs and 72 Blackwell GPUs connected in a rack design), while others are likely to opt for other configurations, notably the HGX B100/B200.

Increasing inference complexity: Based on our conversations, investors had historically perceived inference as a relatively “easy,” less compute-intensive workload and a market in which Nvidia would face intense competition. However, as shown by the recent release of OpenAI's o1 models, which are designed to spend more time “thinking” or “reasoning” before responding, the complexity of inference (and therefore the compute required) is increasing significantly. In fact, demand for inference compute could grow exponentially as model developers seek high throughput and low latency. Most importantly, we believe Nvidia's full-stack approach positions it well to capitalize on this growth opportunity in inference (which already accounts for nearly 50% of the company's data-center revenue).

Competitive moat: Mr Huang talked about the company's competitive advantage, which rests on a) its large installed base (which in turn drives the virtuous cycle that attracts more developers), b) its ability to innovate not only at the chip level but also at the data-center level, and c) its robust and growing software offerings, including domain-specific libraries such as Nvidia Parabricks (i.e. genomic analysis) and Nvidia AI Aerial (i.e. software-defined and cloud-native 5G networks). On the subject of ASICs (application-specific ICs) and their value proposition compared with merchant GPUs, management reiterated its view that while ASICs have always had, and always will have, a place in the data center, particularly for applications such as video transcoding, general deep-learning ASICs are not considered direct competition because they do not have the agility, breadth (i.e. installed base) and reach (i.e. ability to work with or support any cloud service provider) that Nvidia GPUs can offer.

Forward view: With lead times for its GB200 NVL products approaching 12 months, Nvidia has strong forward visibility in its data-center business. Importantly, the company's contacts, especially with the leading CSPs, run deep and extend to their public product roadmaps (i.e. through 2027). By committing to an annual product cadence and providing transparency to its customers, the company also hopes to achieve a healthy supply-demand balance, in which customers procure enough hardware to meet near-term needs rather than front-loading capital expenditure, which (as is often the case) can lead to undesirable year-to-year volatility.

Supply outlook and foundry strategy: Given the current demand environment, Nvidia expects supply to remain tight for the foreseeable future, despite its partners' concerted efforts to support the company's growth. In HBM, Nvidia expects eventually to qualify Samsung as a third supplier, despite that company's challenges over the past year. On its foundry strategy, management pointed to a) its long-standing and successful relationship with TSMC, b) the fact that, while TSMC offers leading process technology, it is its agility, speed and strong, consistent execution that truly differentiate it from the competition, and c) the view that, while Nvidia wants to diversify its foundry footprint, if anything not working with TSMC would be the bigger risk.

Sovereign AI and AV/humanoid robots: In addition to the Blackwell launch and the growing opportunity in inference, management cited sovereign AI, autonomous vehicles and humanoid robots as current and future growth drivers for the company. On sovereign AI, recall that in its August earnings release the company raised its expectation for fiscal 2025 sovereign AI revenue from high single-digit billions to low double-digit billions.

The trick, of course, is that there is no price target so outlandish that it can't be made credible by some equally outlandish assumptions.

Making fun of Nvidia and optimistic analysts feels a little dangerous right now, given that the company continues to treat its estimates as annoying little puddles to hop over, but like the proverbial broken clock, our skepticism will eventually prove correct.
