NVDA: The Dawn of the AI Era

This is a great podcast on AI and NVDA. It was created a couple of days ago. It’s very long (almost 3 hrs). But it’s well worth a listen to better understand NVDA.

GauchoRico


Had they concentrated on ‘massively parallel’ they could have cut it down to 15 minutes, but they are in the podcast business… :imp:

It’s all there in one minute, starting at minute 10: https://www.youtube.com/watch?v=nFB-AILkamw&t=600s

Tesla’s Dojo was designed to be massively parallel to run machine-learning neural networks. As far as I know, it’s the only current competitor to Nvidia’s technology. It might in the future become a ‘product’ or an ‘AIaaS.’
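To put a concrete picture on ‘massively parallel’ (a minimal sketch of my own, not tied to Dojo or to any particular Nvidia chip): a GPU launches one lightweight thread per data element, and a neural network is mostly enormous arrays of such elementwise and matrix operations, which is why the architecture fits the workload so well.

```cuda
// Illustrative sketch: one GPU thread per element, launched by the
// hundreds of thousands at once -- the essence of "massively parallel."
#include <cstdio>
#include <cuda_runtime.h>

// Each thread applies ReLU (max(x, 0)), a common neural-net activation,
// to exactly one element of the array.
__global__ void relu(float* x, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] = fmaxf(x[i], 0.0f);
}

int main() {
    const int n = 1 << 20;                      // ~1 million activations
    float* x;
    cudaMallocManaged(&x, n * sizeof(float));   // memory visible to CPU and GPU
    for (int i = 0; i < n; ++i) x[i] = (i % 2) ? 1.0f : -1.0f;

    // Launch ~1 million threads in one call.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    relu<<<blocks, threads>>>(x, n);
    cudaDeviceSynchronize();

    printf("x[0]=%.1f x[1]=%.1f\n", x[0], x[1]); // expect 0.0 and 1.0
    cudaFree(x);
    return 0;
}
```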

Denny Schlesinger

As a side note, Nvidia designed their hardware to process images, not to train neural networks. What happened is what Stuart Kauffman, my favorite complexity scientist, calls "the Adjacent Possible."

The “adjacent possible” is the most salient, most shared and perhaps most important of a cacophony of colorful metaphors about biology, information, and networks offered us by Stuart A. Kauffman in his seminal “At Home in the Universe”.

What is the Adjacent Possible? Why our world is not predictable, and… | by Martin Erlic | Medium

I loved this book even if some of it was way above my pay grade.


I haven’t seen much discussion here on the size of the build-out for a more modern data center. Whether Nvidia continues to accelerate sales has been said to be a function of macroeconomics and/or supply and/or use cases for AI. Smorgasbord’s post on what ‘On Allocation’ means for Nvidia’s ability to see future demand was excellent. In another post he noted that Nvidia isn’t sitting still on the product development side either, having just announced the GH200 chip, which will ship next year. Both Amazon and Microsoft just opened up cloud instances of Nvidia H100 servers, so companies can now try out or ramp up AI work on Nvidia without having to buy hardware themselves. From VMware to Hugging Face to Snowflake to Accenture to ServiceNow, Nvidia has partnerships everywhere. And yes, CUDA remains a moat even if AMD could come out with a competing chip that somehow was faster than Nvidia’s newly announced parts.

From a recent Motley Fool article: “While Bloomberg Intelligence believes the market will grow at a 20% average growth rate, reaching $133 billion by 2032, others believe the growth will be much faster and bigger. Recently, the CEO of Foxconn parent Hon Hai Precision Industry (HNHPF -0.45%) claimed the AI server market would grow to $150 billion by 2027, up from $30 billion this year.” Full article:
Why Nvidia Retreated on Friday | The Motley Fool

I think we can all agree that Nvidia’s numbers over the last two quarters have moved the entire market. I recently added a smallish position. What took me so long to get back into Nvidia was getting my head around the future of Nvidia’s business cycle for data-center build-out.

After roughly six months of neglecting a closer look, I was reminded of what I consider the most salient data point for believing the GPU demand cycle will be long-lasting: even after the recent, massive increases in GPU sales, Nvidia CEO Jensen Huang said that supply of Nvidia GPUs is on track for another 3X in the next twelve months. I heard him say this in one of the many conferences I’ve watched; here’s a link to a review: Nvidia plans to triple production of its $40,000 chips as it races to meet huge demand from AI companies, report says

The moat is similar to that of Pure Storage, but it goes further: not only does the customer need to marry optimized hardware with otherwise unavailable software, but with Nvidia, once a customer has installed the hardware and implemented the needed Nvidia software, both are then integrated into the customer’s entire development stack.
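To make that “integrated into the entire development stack” point concrete, here is a purely illustrative sketch (my own, assuming a generic matrix-multiply step, not any customer’s actual code) of how Nvidia-specific APIs (CUDA memory calls, the cuBLAS library) end up woven through ordinary application code. Multiply this by the thousands of lines in a real training or inference pipeline and the switching cost becomes obvious.

```cuda
// Illustrative only: every call below is an Nvidia-specific API.
// Porting this to a competitor's accelerator means rewriting and
// revalidating all of it, which is the practical shape of the moat.
#include <cstdio>
#include <cuda_runtime.h>
#include <cublas_v2.h>

int main() {
    const int n = 1024;                        // multiply two n x n matrices
    size_t bytes = (size_t)n * n * sizeof(float);

    float *a, *b, *c;
    cudaMalloc(&a, bytes);                     // CUDA memory management
    cudaMalloc(&b, bytes);
    cudaMalloc(&c, bytes);
    // (Real code would fill a and b via cudaMemcpy from host data here.)

    cublasHandle_t handle;                     // Nvidia's cuBLAS math library
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                n, n, n, &alpha, a, n, b, n, &beta, c, n);

    cublasDestroy(handle);                     // teardown is Nvidia-specific too
    cudaFree(a); cudaFree(b); cudaFree(c);
    printf("done\n");
    return 0;
}
```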

Along with that build-out of the modern data center (both in enterprise on-premise data centers and at the hyperscalers), profit margin may drop some along the way. But if Tesla alone is 10xing its compute in the next twelve months, and it is (granted, video takes far more compute than most other workloads), then there could be continued constraints on GPU supply even with the 3X increase Nvidia is counting on over the next twelve months.

Best,

Jason
