AMD up to the $123 mark after hours from the earnings report

I’m wondering if this is a false move up in the aftermarket because the actual numbers aren’t that impressive. I haven’t heard the call yet…doc


That’s always possible. But I think we know tech stocks were very out of favor for quite a while. Many thought they were way overvalued, and many dumped them.

Now thanks to AI the market has discovered that tech stocks have true growth potential. So many are buying to fill the vacant slots in their portfolio.

Will the true earnings growth follow? Or will some over promise? Such is the real world of investing.

Someone pointed out recently that the dot com boom was fueled by real gains in productivity. Many hope the same will be true for AI. But no one knows if they can really deliver the earnings. Those buying clearly hope so. We shall see.


“Su said customer interest in the company’s MI300 series chips is “very high” and that AMD expanded its work with “top-tier cloud providers, large enterprises and numerous leading AI companies” during the third quarter,” which sounds very promising.

More from the report:
AMD has not given a detailed full-year forecast, but said it expects sales in its data center business, which will include MI300 sales, to be higher in 2023 than 2022’s $6.04 billion total.

Jenny Hardy, portfolio manager at GP Bullhound, which owns Nvidia and AMD stock, said that Nvidia still faces supply constraints, leaving an opening for AMD’s chip.

“So if AMD can ramp production and launch those MI300 chips in the fourth quarter, they will likely see strong demand because plenty of people cannot get their hands on Nvidia chips. So we would assume that AMD can effectively kind of fill some of that supply-demand gap,” Hardy said.

On the conference call, Su also said the company is seeing demand for its MI250 chip, an older offering that is still a “very good option” for less complicated AI tasks.
Let’s see if this holds up tomorrow…doc


“MI300, we’ve worked with the entire supply chain. We feel that we have ample supply for an aggressive ramp in the fourth quarter and into 2024”

To me this was one of the important points from the call: supply is the issue, and AMD seems confident that it has enough.


So there hasn’t been a lot of chatter in the last couple of weeks and we see what the stock has done. It looks like that was a false move up. But when is it time to start laying in more?

I think the Nvidia earnings coming out tomorrow will impact the next move for tech. If Nvidia blows earnings out of the water, then AI-type investments will rise with the tide. If Nvidia earnings just blow, then AI tech will retrace back down. Until AMD sees revenue from the AI GPUs, we are just waiting.

When I look at the potential AI market, what if it’s only $100 billion instead of $300 billion? Can AMD capture 10 or 20 percent of this market? I think yes, and I also think they can capture more than that. What would ten or twenty billion dollars do to the AMD stock price? I think we know what that would do.

Many of us have watched AMD grow over the last 4 or 5 years, and some of us have been following AMD since the ’90s here at the Fool. It’s looking like AMD might have the potential to grow into a $30, $50, or $80 billion company in the next few years, if not sooner. So when to increase one’s position is a great question. When I bought into AMD again some four or five years ago, when the price was in the $9-10 range, I saw that Nvidia had grown from $25 to $250 at the time, and my thought was that AMD could do the same over the next few years. It did. Now I’m thinking the same thing with this AI tech, Nvidia and AMD: AMD can grow just like Nvidia.

Right now we have to wait out the general market trends and for AMD to start moving some product, getting contracts, and getting some performance benchmarks out there. I am hoping for another significant win here, just like many of you long-time AMD holders/followers. Good luck to us…doc
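The back-of-the-envelope math above can be sketched in a few lines. To be clear, the market sizes and share percentages here are the post’s hypotheticals, not forecasts:

```python
# Back-of-the-envelope AI accelerator revenue scenarios from the post above.
# All numbers are hypotheticals for illustration, not predictions.

def amd_ai_revenue(tam_billions: float, share: float) -> float:
    """AMD revenue (in $B) if it captures `share` of a market worth `tam_billions`."""
    return tam_billions * share

# Pessimistic vs. optimistic market size, at 10% and 20% share.
scenarios = [(100, 0.10), (100, 0.20), (300, 0.10), (300, 0.20)]
for tam, share in scenarios:
    print(f"TAM ${tam}B at {share:.0%} share -> ${amd_ai_revenue(tam, share):.0f}B for AMD")
```

Even the pessimistic case ($100B market, 10% share) implies roughly $10B of new revenue, which is the crux of the post’s argument.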


I just sold my AMD. The results from yesterday showed me just how far behind AMD is in AI. I’m not buying the “just wait for…”


I have heard the latest AMD ROCm software is as good as CUDA. I have also heard the latest Intel oneAPI toolset is as good as CUDA. I think both remain to be seen. If the Intel or AMD products don’t “just work,” they are not going to sell many.


This was discussed on CNBC’s Halftime program today. Nvidia is the clear leader for now in AI chips and the software to program them. Other companies are developing chips. For now, Nvidia chips are best for the training stage. But once models are trained and running, less capable chips may well handle much of the routine inference. And some next-generation chips may be superior in selected applications.

Yes, Nvidia’s share will decline over time, but the market may well continue to grow for at least another year. Then much depends on which applications turn out to be popular and well ahead of the competition.

Well, I certainly don’t pretend to know where the AMD share price is going next, but I guesstimate that $100 is too low, especially with the AI product pipeline coming out soon. The lack of benchmarks for the MI300 is holding the price back, I think. I am fairly sure I saw a Moore’s Law Is Dead video recently arguing that the MI300 and Nvidia’s Hopper H100 would be roughly equal in performance, but with the MI300 at about 25% of the price and with lower power requirements. I bought a few more AMD shares and some Nvidia today.

It is somewhat fundamental that, on the same fabrication process, all comers will have about the same power/performance/cost characteristics. Nvidia is currently extracting monopoly profits, and it is unclear whether either AMD or INTC will break that monopoly. It doesn’t matter how cheap the AMD/Intel products are; if they don’t work in the target applications, they won’t sell.


AMD and even Intel have been doing work on the software ecosystems for a while now, and with demand so high and supply so tight I’m sure there will be interest in porting to alternative hardware. I think AMD will find ways to make money.

Still, I wonder how often Jensen and Lisa see each other, since they’re cousins – must make for interesting holiday dinner discussions :slight_smile:


The Palaiseau, France-headquartered Mipsology was founded in 2015 and its “flagship” Zebra AI software “will help accelerate AMD solutions for AI workloads,” according to a statement from the company.

Zebra is software that customers can deploy on preexisting hardware, though Mipsology has experience tailoring and optimizing for AMD components. Potential use cases range from autonomous vehicles and manufacturing robotics to smart retail and responsive traffic lights.

AMD also brought its newest gaming cards, with FSR3 for smoother gaming (I’m not a gamer), and new pricing below Nvidia’s competing products to Gamescom 2023:

Let’s see how this plays out for AMD…doc



I just sold my AMD. The results from yesterday showed me just how far behind AMD is in AI. I’m not buying the “just wait for…”

I hear you. Nothing prevents you from betting on the other horse if you think they’ve got a higher likelihood of providing the return you want. I have sizeable exposure to both at this point. And one of my great regrets is that, even knowing the rising importance of GPU-based compute, I didn’t pour money into Nvidia starting about 8 years ago. I was even interviewing for jobs with companies doing GPU-accelerated workloads, and I should have known better than not to bet on that horse.

I am fairly sure that AMD has developed a CUDA-like implementation for its own cards. It is not binary-compatible, but you can recompile CUDA code and run it. That would be an advance on anything AMD has done so far. I am not too worried; I think Nvidia’s price gouging is making everyone want an alternative.
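For context, AMD’s porting path (HIP, with its hipify tools) works largely by mechanically renaming CUDA API calls in the source and then recompiling for AMD hardware. A toy sketch of that idea, using a small hypothetical subset of the real mapping table:

```python
# Toy illustration of HIP-style porting: CUDA API names map almost 1:1 to HIP
# names, so a port is mostly a mechanical rename plus a recompile.
# This mapping table is a tiny hypothetical subset, not the real tool's table.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cuda_runtime.h": "hip/hip_runtime.h",
}

def hipify(source: str) -> str:
    """Naively rewrite CUDA identifiers in `source` to their HIP equivalents."""
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        source = source.replace(cuda_name, hip_name)
    return source

cuda_snippet = "#include <cuda_runtime.h>\nfloat *d; cudaMalloc(&d, 1024); cudaFree(d);"
print(hipify(cuda_snippet))
```

The real hipify-clang/hipify-perl tools are far more thorough than this string substitution, but the point stands: the port is a source-level translation plus a recompile, not binary compatibility with CUDA executables.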


It’s important to remember your thesis and not get carried away by the hype. I am sitting here in regret, having taken a look at AMD and NVDA side-by-side over the winter, and concluding that AMD was the better bet. I believed (and believe) NVDA to be a bit stronger overall as a company, but both are very good companies and AMD was much more attractively priced.

Importantly, AI was not part of my thesis for either company. If I had forecast GPUs to be the sole key to revenue and profit growth, I would have gone with NVDA. But I am excited about the diversity of AMD’s portfolio, the strength of their execution over the past several years, and their prospects of continuing to take market share across the board except perhaps in GPUs. What I didn’t foresee, and neither did anyone else, was the explosion of LLMs into the public consciousness and the accompanying explosion of demand for NVDA’s GPUs.

I’m annoyed I missed that, because all this time I would have been trimming (a LOT) to keep my portfolio in balance, harvesting huge gains. But if ChatGPT hadn’t been released, we wouldn’t be sitting here saying AMD and NVDA are about even in AI. We just wouldn’t be talking about AI at all. Eventually, the AI hype will die down. Companies currently massively ramping up GPU capacity in an arms race will optimize. And we’ll appreciate again all the diverse revenue streams AMD will continue growing in the meantime, and we won’t have paid 50-100x sales to own them.

I’m not saying I’ll definitely be proven right and AMD will turn out to have been the better investment. But I think it’s wrong to look at the current situation and say that AMD is a loser. They are not the leader in AI, of course, because they’re not the leader in GPUs, and that was always clear. But they can be a leader in many other areas that will still, I believe, prove to make long-term holders happy.


AMD’s recent domination of the top ten (five?) supercomputers has come with ROCm development funds in the $100 million range. That is not just to improve the user interface and compatibility with CUDA, but to build the software infrastructure needed to work efficiently on ExaFLOPS machines. Yes, that is definitely the very expensive part. Simulation doesn’t help; when you need to test version A against version B, you really do need a standalone ExaFLOPS machine to test on. And it is more like versions A, B, C, D, E, F… of network architectures to map onto the physical layout of the machine.

In other words, right now, if you want to run on an ExaFLOPS machine, you have to use ROCm, not CUDA. Yes, they are close to compatible. The recompilation step is not due to differences at the user-interface level; it recompiles to match the physical layout of the target machine and adds all the necessary goodness to take advantage of the huge machine. Does the user need to deal with the grody details of the particular machine? No. Moving from one supercomputer to another is again a recompile. Same for moving from a desktop system or a server to a big machine. But you might as well have an AMD CPU in that desktop or server, and an AMD GPU as well.