Bear's Portfolio through 12/2023

Important context for my portfolio reviews: I run a concentrated portfolio, and fair warning – the swings can be huge. From the 2021 high to the 2022 low, my portfolio fell more than 60%. For every $100 I had at the top, I had just $40 left! Staggering. So, before trying this style, even with a small portion of your total net worth, please understand the downside – it's much steeper than if you own an index or a bunch of megacaps. Also, don't follow or copy me, Saul, or anyone. We may sell a position or buy a new one at any time, so it's impossible to follow anyway. And to succeed with a concentrated portfolio, you must rely on your own decisions.
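A quick sketch of the drawdown math above, with the numbers straight from the text – the part people underestimate is that a 60% fall needs far more than a 60% gain to recover:

```python
# Drawdown arithmetic: a 60% fall leaves $40 of every $100,
# and getting back to even then requires a 150% gain, not 60%.
peak = 100.0
drawdown = 0.60
trough = peak * (1 - drawdown)       # value remaining at the low
recovery_gain = peak / trough - 1    # gain needed to reclaim the peak

print(f"Left at the trough: ${trough:.0f}")
print(f"Gain needed to recover: {recovery_gain:.0%}")
```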

In December I made some fairly significant changes to the portfolio, including exiting Nvidia and Celsius. With both, we all know they'll have great sales and profits in 2024, but we have no idea what the growth rate will be this coming year, and more importantly, thereafter. The same applies to ELF, but it just seems to me that the market isn't expecting them to crush it quite as much, so maybe there's more upside. The guide for this year is already $906M in revenue, and on average Yahoo analysts are expecting just $1.1 billion in 2025. I just think they can crush that, but I could be wrong. I still trimmed a little ELF, which was up more than 20% in December after being up 27.5% in November.

I also trimmed Remitly and Samsara for different reasons. Remitly has great potential, but just doesn’t deserve a large position because they have to prove they can be profitable first. Samsara on the other hand has proven its greatness and is well on the way to expanding profits, but it’s priced to near perfection. I just don’t see much upside, so I’ve got a small position…for now.

A word about the positions I own:

Axon - Their revenue is growing faster than most, the multiple is lower than most, and profitability is increasing. What's not to love? Axon is not as volatile as most others I follow – pretty steady, in fact: it was up just over 12% for the second straight month. I trimmed a tiny bit, and I'm happy to keep a very large position.

ELF - As I said above, if they can continue growing even close to how they are now, they will crush the market’s paltry expectations. I could actually see it doubling in the next year or two.

Monday - Following another great quarter reported on 11/13, Monday was up the most of any of my companies in November. I trimmed it considerably in November, but I have added some back in December…honestly because of lack of alternatives. I still like the price and upside here better than CRWD or ZS or any of our old favorites.

Remitly - The market sold them off pretty harshly on what I thought was a good report. Personally I think they’ll need to show some profits to start getting any respect. We’ll see if that happens.

Samsara - I trimmed again as it was up another 20% this month. Again, nothing bad to say about this company. I just don’t see much more upside in the short term.

Aehr Test Systems - Nice to see them up 20% at some points in December (I trimmed a tad). No idea why it was down today. This one is volatile…but they should report Q2 in January, so maybe we’ll learn something good! Fingers crossed!

Wrapping up:

I’m presently not seeing much (other than Axon and ELF and Monday) that I believe merits concentrating into. And by my large cash position you can tell that I’m not finding new companies worth owning…though admittedly, I try to set the bar high.

With a 47% year, I’m happy, and it seems holding 20% or 30% cash on average (and lately even more) hasn’t hurt me – in fact, a lot of the time I think it has helped. I’ve gotten to take advantage when bargains come along, as they often do.

Happy New Year!


“Compound interest is the eighth wonder of the world. He who understands it, earns it … he who doesn’t … pays it.” - Attributed to Albert Einstein

Previous Month Summaries

Dec 2016 (contains links to all 2016 monthly posts): Bear's Portfolio at the end of 2016 - Saul’s Investing Discussions - Motley Fool Community

Dec 2017 (contains links to all 2017 monthly posts): Bear's Portfolio through Dec 2017 - Saul’s Investing Discussions - Motley Fool Community

Dec 2018 (contains links to all 2018 monthly posts): Bear's Portfolio through Dec 2018 - Saul’s Investing Discussions - Motley Fool Community

Dec 2019 (contains links to all 2019 monthly posts): Bear's Portfolio through Dec 2019 - Saul’s Investing Discussions - Motley Fool Community

Dec 2020 (contains links to all 2020 monthly posts): Bear's Portfolio through Dec 2020 - Saul’s Investing Discussions - Motley Fool Community

Dec 2021 (contains links to all 2021 monthly posts): Bear's Portfolio through 12/2021 - Saul’s Investing Discussions - Motley Fool Community

Dec 2022 (contains links to all 2022 monthly posts): Bear's Portfolio through 12/2022

Jan 2023: Bear's Portfolio through 01/2023

Feb 2023: Bear's Portfolio through 02/2023

mid-Mar 2023: Bear's Mid-March Update

Mar 2023: Bear's Portfolio through 03/2023

Apr 2023: Bear's Portfolio through 04/2023

May 2023: Bear's Portfolio through 05/2023

Jun 2023: Bear's Portfolio through 06/2023

Jul 2023: Bear's Portfolio through 07/2023 (and Aug 3)

Aug 2023: Bear's Portfolio through 08/2023

Sep 2023: Bear's Portfolio through 09/2023

Oct 2023: Bear's Portfolio through 10/2023

Nov 2023: Bear's Portfolio through 11/2023


Surely we have some idea of Nvidia’s growth rate for 2024. From a recent TMF free article:

We can see both Data Center and Total Revenue trends:

And for next quarter Nvidia has said total revenue will be $20B. That's a smaller jump than the previous two quarters' jumps, but it reflects management's expectation of the impact of export restrictions.

I think there are things holding back Nvidia right now:

  1. China exports are basically non-existent this quarter. Nvidia has yet to announce a new AI chip that meets the US export requirements for countries like China, although they have said multiple times they're working on a set of such chips. And they just announced a graphics chip that meets the export restrictions, so work is certainly underway for an AI chip.
  2. In the past, Nvidia investors have been burned by bumps in revenue and profitability that came from market changes that turned out to be short-lived. The most recent and impactful was the sale of Nvidia chips for crypto/bitcoin mining. My belief is Wall Street thinks AI could be yet another short-lived pop that will fade away. But that's not going to be the case - I previously linked to a McKinsey report indicating AI is a sea change for how businesses will operate, starting today and growing over the next decade or two.

Which brings us back to Bear’s argument that we don’t know what beyond 2024 looks like. I don’t understand this argument - 2025+ is a guess for any of our companies, if only due to new competition, especially in retail spaces like energy drinks or make-up.

If anything, we know more about Nvidia's competition for the next 24 months than we do for most of our other companies. Chip design and manufacturing is a slow process. Nvidia's only potential competition comes from AMD or Intel, both of which just announced new products that aren't better than what Nvidia offers - and we KNOW Nvidia has a new, faster chip coming out in mid-2024 (the H200). It's almost certainly going to take AMD at least two years before its next fastest AI chip comes out, and companies like MSFT and Meta are at least 12-18 months away from having their own chips available, meaning Nvidia will still be the king of speed through 2025 at least.

I will grant that there's a question as to whether everyone doing AI needs the fastest chips available. And if there is a price-sensitive market for AI chips, then AMD probably does have a price/performance advantage. But what we're seeing with AI is a trend toward massive data centers that can't get enough of the fastest chips. Users of AI don't want to wait minutes or hours for their results. In the kinds of business queries McKinsey envisions for AI, users will want results in seconds, because they'll want to base new queries on the information returned. The value in AI for business is the reduction in white-collar headcount needed, so the ROI on even Nvidia's expensive data center products is still strong. And it's likely that Nvidia will discount older AI chips as it rolls out new ones, so it may even be competitive on pricing in the mid-market in 2024/2025.

And all this is before even considering Nvidia's new push into data center chips using the ARM architecture. Nvidia has a super-fast ARM-based "Grace" CPU and even a combined "Grace Hopper" offering that pairs the Grace CPU with the H100 "Hopper" AI GPU on a single board, enabling results that combine AI and conventional computation.

And even more, Nvidia is pursuing a number of cloud-based server offerings. I suspect most of Wall Street's analysts covering Nvidia are limited to the semiconductor space and don't "get" how cloud computing can expand the AI market.

To summarize, I think Wall Street's worries about exports and AI longevity are artificially keeping the price down. Nearing $500 a share might seem scary, but look at the revenue and profit growth trends, the future products we know about, and the competition. If someone has a reasonable scenario where Nvidia doesn't continue growing next year at a rate that justifies its current forward P/E of about 25, I'd like to hear it.
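For anyone newer to the metric, forward P/E is just today's price divided by expected next-twelve-month earnings per share. A minimal sketch – the ~25 multiple matches the text, but the price and EPS inputs here are hypothetical placeholders, not actual NVDA consensus figures:

```python
# Forward P/E = current share price / expected next-12-month EPS.
# The ~25 multiple is from the text; the inputs are hypothetical.
price = 500.0        # near the $500/share level mentioned above
forward_eps = 20.0   # assumed consensus NTM earnings per share
forward_pe = price / forward_eps
print(f"Forward P/E: {forward_pe:.0f}")
```

The argument in the text is then that if earnings grow faster than ~25% while the multiple holds, the price follows.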


Well said. A further comment on the data center retooling (moving data center architecture to Nvidia AI chips) that Huang has spoken about – as much as I understand all this, and I admit to many gaps, MOST companies that are champing at the bit to use AI will be using it via cloud services from the major providers. They will not be buying Nvidia chips directly, probably not ever. So the talk of Meta etc. wanting their own chips (wanting and having are two different things, btw) and that possibly impacting Nvidia is quite speculative. Nvidia will remain the premier supplier of AI chips, networking tech, software, and who knows what else for the foreseeable future. Recall that there are now sovereign states already embarking on AI (synonymous with Nvidia) programs.


Yes, but for companies with subscription revenue it's a matter of how fast it grows. Will it be 40%? 30%? 20%? With NVDA, the question is whether it will grow at all. As your chart shows, it has sometimes gone negative, both sequentially (QoQ) and YoY (as in Q4 23 and Q1 24).

But honestly, I’m not trying to make an argument against NVDA. I hope all holders do very well. It’s just not for me.



First, I just want to acknowledge that an investment in NVDA is not for everyone, and I can understand why different people choose differently. I certainly respect Bear’s decision, especially as his past performance has been so great.

I have two responses to that. First is that stock valuation is an expectation game. If a subscription-based company’s revenue growth declines from 40% to 20%, its stock price is going to take a big hit even though revenue is still growing at 20%.

Second is that subscription revenue isn’t guaranteed to grow every quarter/year. For instance, Disney subscription revenue fell 7% earlier this year.

For me, the value in subscriptions is mostly the typically easier scalability of the business - but that's a function of the typical subscription service's delivery mechanism rather than its billing model. Snowflake, for instance, keeps insisting that it doesn't have a subscription model but a consumption-based model, yet it has the same easily scalable business model (via public clouds) as other subscription services.

Now granted, we're seeing the effect of not being a cloud service with Nvidia today. Most of its business is selling hardware, so it has had to ramp up physical production to meet demand. And as much as it has ramped, demand still outstrips its production capabilities. So that has actually limited how much Nvidia was able to grow in the past few quarters, which seems crazy given its high growth during those quarters. Had chip production scaled like video meetings, Nvidia would have had Zoom-like growth! That said, Nvidia is nowhere near hitting its TAM the way Zoom did during the pandemic.

Besides, the same non-subscription argument against NVDA applies equally to energy drink and make-up suppliers, except that Nvidia has technological moats against competition and isn't subject to retail customer whims.

This is where, for me, looking at the business is more important than looking at past numbers. In a previous post I pointed out that Nvidia has had more than its share of pops and declines in the past, whether that was gaming or crypto mining. I argue that this time is different than those, as AI is certainly going to be an area that won’t deflate like crypto mining did back in the day, or hit a TAM wall like gaming. AI will power more and more applications and is nowhere near reaching any kind of saturation.

But your worries in this regard, I think, mirror what Wall Street analysts are apparently also worried about. They may not believe AI is going to be a lasting force in the world. Or, even if they do believe that, they're concerned it may be like the Dot Com Boom/Bust, where speculation about the future got way ahead of reality, and it took several years for reality to catch up with and exceed that speculation. Here again, I argue that that risk is smaller with NVDA today than it was with AMZN at its 1999 forward P/E.

I'm honestly trying to think of scenarios where Nvidia suffers negative growth, and I can't come up with any in the short to medium term. Application of AI technology is just starting. Competition for Nvidia's products is at least a year behind, probably more. There might be a chiplet-enabled path for AMD to stake out a share of the mid-market based on price/performance, but I don't see or hear of anyone saying they have enough AI compute power for what they want to do and so are looking for cheaper/slower chips. At most, there's a push for inference on existing end-user devices, but the training side remains firmly on the server, and with AI cloud services, inference on the server isn't going away either.

As for whether Nvidia itself could get disrupted, let's remember that Nvidia's hardware has already withstood large architectural changes in the software that powers AI. In 2017, Google introduced Transformers, which replaced RNNs (recurrent neural networks), yet it was Nvidia's GPU architecture that remained (if not grew more) dominant over Google's TPU hardware despite the Google-driven software change. However the processing is done, true AI needs to look at and compare data across a large set, so the parallel processing abilities of GPUs are unlikely to be overthrown in the mid-term. Heck, even if one thinks AI can be done on regular CPUs, don't forget Nvidia has one of the fastest ARM-based CPU offerings in the world.

For those that want more technical details on AI software progression, this article:

is quite good and understandable even for non-techies. And while Transformers will certainly be improved upon in the future, what's most likely to happen is that more efficient computing techniques will be invented that still rely on GPU-style hardware architectures, since parallelism remains a hallmark of AI processing.

At first blush, more efficient software may appear to mean that fewer high-powered Nvidia chips will be needed, but I expect software optimization will mostly result in more applications gaining AI capabilities thanks to the cost reduction, so the need for chips will continue to grow - just spread across many more affordable clusters rather than a few big ChatGPT-style clusters. Think about the high-powered mainframe computers of the mid-twentieth century: as software got more efficient and hardware got both better and cheaper, the market expanded rather than contracted, because more applications for processing came into being.

Of course, it may be possible that other companies can then start to compete with Nvidia, but that's years off, and Nvidia management has been very savvy with its new products; I don't see why that would change.

Sorry to have turned Bear’s portfolio review into a Nvidia pro/con thread, but it did seem relevant given his arguments. I’ll stop now.


I want to try to say this better. I really feel like we don’t have a lot of great investing choices right now. I do believe Axon and ELF and Monday have a lot of long term potential. Still, while Axon and Monday are SaaS, they’re certainly not growing at 90% YoY like some of our SaaS companies were a couple years ago. Same with Samsara. Those four companies are high-confidence holdings for me…but none are hypergrowth, except maybe ELF, and it’s not subscription. So we’re still not in the greatest environment with tons of amazing choices for our investing dollars. (Oh and Remitly and Aehr…those have big upside, but they are not high confidence for me.)

However, even though it’s only down about 8% ytd, I have built Samsara back up the last couple days, so it’s now my #4 position at ~7% of the portfolio.

Hopefully we can find some new promising companies in 2024.



Smorg and Bear,

I really enjoyed this discussion, so for my part, no apology needed for debating NVDA. I also exited NVDA but I have to say it was due to not doing (all) the work, and therefore potentially passing on a great opportunity.

My thinking went something like this: NVDA had a fantastic couple of quarters, and the $ amount of market cap added was the biggest of any company on the market this past year: they went from half a trillion dollars to 1.2 trillion dollars in a year - how much of that was an overshoot? Can they do anything similar in the coming year? I judged the odds of them going from 1.2 trillion to 2 trillion as small, as that would make them the third most valuable company in the US (on earth?) after Apple and Microsoft - and make them more valuable than Alphabet (current #3) and Amazon (current #4).
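The arithmetic behind that judgment, using just the figures in the text:

```python
# Market-cap math: $0.5T -> $1.2T is a 2.4x year; getting from
# $1.2T to $2T would take roughly another +67%. Figures from the text.
start_cap, end_cap = 0.5, 1.2   # $ trillions, over the past year
target_cap = 2.0                # $ trillions, the level judged unlikely
past_multiple = end_cap / start_cap
needed_gain = target_cap / end_cap - 1
print(f"Past year: {past_multiple:.1f}x")
print(f"Needed for $2T: +{needed_gain:.0%}")
```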

China is problematic, and that clouds the outlook for me, and lastly, many big companies and customers seem to have the strategic intent of moving away from NVDA as a single supplier. In addition, my position was relatively small, and I hadn't spent enough time really understanding the company or the path to an even higher market cap – so I exited.

NVDA therefore remains very much on my watch list. With that in mind, I was hoping you could expand on your assumptions for the two things I quoted above:

  1. the performance for next year - what are you looking at for revenue and margins for next year and
  2. what do you see as their TAM?

Thanks for the discussion.



Management historically only guides for the next quarter (called Q4 FY24), which they did at $20B +/- 2%. I’ve seen WS consensus of only $58.7B for FY24 or as much as $78B. But those assume negative growth beyond the current quarter. Explain that one to me, please.
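To make that point concrete (the guide and consensus figures are the ones quoted above): annualizing the guided quarter gives an $80B flat run-rate, so any full-year estimate below that number implies revenue shrinking from here.

```python
# Annualize the guided quarter and compare against the consensus range.
guided_quarter = 20.0                        # $B, management's Q guide (+/- 2%)
flat_run_rate = guided_quarter * 4           # $B/yr if revenue merely stays flat
consensus_low, consensus_high = 58.7, 78.0   # $B, the WS estimates quoted above
print(f"Flat run-rate: ${flat_run_rate:.0f}B")
print(f"Low consensus sits ${flat_run_rate - consensus_low:.1f}B below flat")
print(f"High consensus sits ${flat_run_rate - consensus_high:.1f}B below flat")
```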

As for TAM, Mizuho’s Vijay Rakesh, (very high tipranks rating) , says:

“We see the AI compute market growing 10x over the next 5 years to >$400B/yr as companies continue to invest in AI applications, with NVDA maintaining 75-90% market share throughout the ramp powered by its Grace and Hopper offerings,” the 5-star analyst explained. “With the push for greater AI adoption, we believe NVDA could potentially see a ~$65B revenue opportunity by C27E, up >4x from the $15B we saw in F23.”

McKinsey talks about TAMs in the single and double-digit $Trillions, with AI being a decades-long lasting trend.

And most of the WS analysts are ignoring Nvidia's push into cloud services, since they're all hardware chip-focused. There's the "GeForce NOW" cloud service for gaming (no need to buy a high-end graphics card to get high-end graphics performance), and for AI there's DGX Cloud, which provides hosted, cloud-based access to an Nvidia HGX AI platform (using 8 GPUs) and is also available through Azure. I think Nvidia is going to keep adding adjacent businesses.

Heck Nvidia makes more off AI networking than AMD hopes to make in all of AI this year! And Nvidia now has a really compelling ARM-based CPU solution for non-AI servers, as well as combo boards with both CPU and GPU.

Here’s Nvidia’s investor presentation from the last quarter, chock full of information:

From that presentation:

• The U.K. government announced it will build one of the world's fastest AI supercomputers with almost 5.5K Grace Hopper Superchips
• German supercomputing center Jülich will build its next-gen AI supercomputer with close to 24K Grace Hopper Superchips and Quantum-2 InfiniBand
  • Will be the world's most powerful AI system with over 90 exaflops of AI performance
  • Marks the debut of a quad NVIDIA Grace Hopper

That's almost 30,000 Nvidia chips right there, each with a five-digit USD price tag.

None. The P/E ratio has shrunk by a huge amount (OK, up a bit on today's CES pop).

Mr. Market's revenue targets suggest growth has stopped or will even go negative. I'd be shocked to see Nvidia get less than $100B in revenue next fiscal year. And if you care, the top-ranked analyst I quoted above, Vijay Rakesh, has a $625 price target for NVDA. BofA has a $700 price target, with a FCF target of $100B.

Has it really been that long since Exxon was the S&P’s #1 company by market cap? Who’d have thought back at the turn of the century that an online retailer would beat them?

How much can AI grow? Even leaders of great companies often don't see the true growth potential. Here are two famous (likely apocryphal, but illustrative) quotes:

“I think there is a world market for maybe five computers.”
Thomas Watson, president of IBM, 1943

"640K of memory should be enough for everyone."
Bill Gates, CEO of Microsoft, 1981

For me, the only question is when the shift from AI “picks and shovels” hardware to AI software will happen, and who will be leading that. Behemoth Microsoft has made some savvy plays so far and between Azure and its “CoPilot” productivity aids may do very well. And, Elon Musk just stated:

That all said, there’s going to be lots of “AI buzzword compliance” being bandied about in company press. The trick will be figuring out which companies have truly compelling products/solutions. In the meantime (next couple of years in my estimation), AI picks and shovels from Nvidia will continue to rule the roost.