
NVIDIA CEO: One day we'll have 1B robot cars on the road

Jensen Huang, CEO of Nvidia, said that someday we could have a billion cars on the road, and they could all be robot cars.

It sounds like science fiction, but as Huang said: "I'm science fiction." He made the comments on a conference call with analysts covering Nvidia's fiscal Q4 earnings for the quarter ended January 26, 2025 (here is our complete report on the earnings). Nvidia's stock was up about half a percent at $130.72 per share in after-hours trading.

Colette Kress, EVP and CFO, said on the call that the data center business grew 93% year over year and 16% sequentially at the start of the Blackwell ramp, alongside continued Hopper chip sales. Blackwell revenue exceeded Nvidia's expectations, she said.

"This is the fastest product ramp in the history of our company, unprecedented in its speed and scale," said Kress. "Blackwell production is in full swing, with supply ramping across several configurations and customer adoption expanding. Our fourth-quarter data center compute revenue rose 18% sequentially and more than 2x year over year. Customers are racing to scale infrastructure to train the next generation of cutting-edge models and unlock the next level of AI capabilities."

A wafer filled with Nvidia Blackwell chips.

With Blackwell, it will be common for these clusters to start with 100,000 graphics processing units (GPUs) or more, said Kress. Shipments have already begun for several infrastructures of this size. Post-training and model customization are fueling demand for Nvidia infrastructure and software, as developers and enterprises adapt models with techniques such as fine-tuning, reinforcement learning and distillation. Hugging Face alone hosts over 90,000 derivatives created from the Llama foundation model.

The scale of post-training and model customization is massive and can collectively demand orders of magnitude more compute than pre-training, said Kress. Demand for inference is accelerating, driven by test-time scaling and new reasoning models such as OpenAI o3, DeepSeek and others. Kress said she expected China revenue to rise sequentially, and Huang said China would likely remain about the same percentage of revenue as in the fourth quarter. That is about half of what it was before the Biden administration introduced export controls.

Nvidia has driven a 200-fold reduction in inference costs over the past two years, said Kress. She also said AI is expanding beyond the digital world, with Nvidia infrastructure and software platforms increasingly being used for robotics and physical AI development. In addition, Nvidia's automotive vertical revenue is expected to grow.

Nvidia Drive Hyperion

Regarding CES, she noted that the Nvidia Cosmos world foundation model platform was introduced there, and that robotics and automotive companies, including Uber, were among the first to adopt it.

Geographically, sequential growth in data center revenue was strongest in the U.S., driven by the initial Blackwell ramp. Countries around the world are building out their AI ecosystems, and demand for compute infrastructure is surging. France's €200 billion AI investment and the EU's €200 billion investment initiative offer a glimpse into the build-out set to reshape global AI infrastructure in the coming years.

Kress said that data center sales in China, as a percentage of total data center sales, remain well below the levels seen before export controls were introduced. Absent any change in regulations, Nvidia believes China shipments will stay at roughly the current level for data center solutions.

"We will continue to comply with export controls while serving our customers," said Kress.

Gaming and AI PCs

Nvidia's DLSS 4 AI tech pays off.

Kress noted that gaming revenue of $2.5 billion decreased 22% sequentially and 11% year over year. For the full year, revenue of $11.4 billion increased 9% from the previous year, and demand remained strong throughout the holiday season. However, Kress said fourth-quarter shipments were affected by supply constraints.

"We expect strong sequential growth in Q1 as supply increases. The new GeForce RTX 50 series desktop and laptop GPUs are built for gamers, creators and developers," said Kress.

The RTX 50 series graphics cards use the Blackwell architecture, fifth-generation Tensor Cores and fourth-generation RT Cores. The DLSS 4 software boosts frame rates to as much as eight times those of the previous generation by generating up to three additional frames for every traditionally rendered frame.

Automotive revenue was a record $570 million, up 27% sequentially and 103% year over year. Full-year automotive revenue of $1.7 billion increased 55% from the previous year. The strong growth was driven by the continued ramp in autonomous vehicles, including cars and robotics.

NVIDIA GR00T generates synthetic data for robots.

At CES, Nvidia announced that Toyota, the world's largest automaker, will build its next-generation vehicles on Nvidia Orin running the safety-certified Nvidia DriveOS. Kress said Nvidia recorded higher development costs in the quarter as more chips moved into production.

Nvidia expects FYQ1 revenue of $43 billion, with data center revenue from both compute and networking driving sequential growth.

Nvidia's next big event is its annual GTC conference, starting March 17 in San Jose, California, where Huang will deliver a keynote on March 18.

When asked about the blurring line between training and inference, Huang said there are "multiple scaling laws," including the pre-training scaling law, post-training scaling using reinforcement learning, and test-time compute or reasoning scaling. These methods are in their early days and will change over time.

"We run every model. We are great at training. The vast majority of our compute today is actually inference. And Blackwell was designed with reasoning models in mind, and when you look at training, it is many times more performant," he said. "But what is really amazing is that for long-thinking, test-time reasoning AI models, it is ten times faster and delivers 25 times higher throughput."

He noted that he is more enthusiastic today than he was at CES, and said that 1.5 million components go into each of the Blackwell-based racks. He said the work was tough, but all the Blackwell partners had done a good job. During the Blackwell ramp, gross margins will be in the low-70s percentage range.

Nvidia marries tech for AI in the physical world with digital twins.

"At this point, we are focused on accelerating our production to make sure that we can provide Blackwell chips to customers as soon as possible," said Kress. There is an opportunity to improve gross margins to the mid-70s later in the year.

Huang noted that the vast majority of software will be based on machine learning and accelerated computing. He said the AI startup scene is still very vibrant, and that agentic AI for enterprises is on the rise. He also pointed to physical AI for robotics and sovereign AI for different regions.

Blackwell Ultra, which Huang called "the next train," is expected in the second half of the year. He noted that the first Blackwell had a "hiccup" that cost a couple of months, and it has now fully recovered.

Asked about the core advantage Nvidia holds over competitors, Huang said the software stack is "incredibly hard" and that the company builds its full stack, including the architecture and the ecosystem that sits on top of the architecture.

When asked about geographical differences, Huang replied: "My take is that AI is software. It is modern software. It is incredibly modern software, and AI has gone mainstream. AI is used everywhere, in delivery services, in shopping services, everywhere. And so I think it is fairly safe to say that AI has gone mainstream and is being integrated into every application. This is now a software tool that can address a much larger part of the world's GDP than at any time in history. We are just at the beginning."
