SNOW: it's not for QoQ investing

This post is based on the three investor conferences SNOW just participated in (Deutsche Bank on 8/31, Piper Sandler on 9/13, and Goldman Sachs on 9/13). There are some great nuggets of info in these that weren't really found in the recent earnings call, especially some of Slootman's words that are relevant to the title of this post!

The links to the webcasts are here:

Goldman 9/13: https://event.webcasts.com/starthere.jsp?ei=1566266&tp_k…
Piper 9/13: https://statusproconf.com/Admin/TalkPointRegistration.aspx?s…
Deutsche 8/31: https://event.webcasts.com/starthere.jsp?ei=1564008&tp_k…

1.) Snowflake is not anywhere close to saturation in spending from its large customers.
Average spend could grow well beyond the current $3.5 million across the $1M+ customers. It could be as big as $10M, or even greater.
Do the math on the FY2029 projection SNOW has put out of $10 billion in revenue: if they grow to 1,000 customers spending over $1M each, and the average spend of those customers is $10M a year, that's $10 billion in revenue right there, not even counting the customers spending less than $1M! And their 2022 investor day targets actually call for 1,400 customers with >$1M spend by FY2029. I am sure this guidance remains conservative and will increase as each FY passes, but you get the idea that the projected revenue CAGR here is quite astounding.
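The back-of-the-envelope math above can be sketched quickly in Python. The scenario numbers are the post's illustrative assumptions, not company guidance (the 1,400-customer figure is the FY2029 investor-day target):

```python
# Back-of-the-envelope check of the FY2029 scenario sketched above.
# Inputs are illustrative assumptions from the post, not company guidance.

def projected_revenue(num_large_customers, avg_spend_per_customer):
    """Revenue contributed by the $1M+ customer cohort alone."""
    return num_large_customers * avg_spend_per_customer

# Scenario from the post: 1,000 large customers averaging $10M each.
scenario = projected_revenue(1_000, 10e6)
print(f"1,000 x $10M = ${scenario / 1e9:.0f}B")  # $10B from large customers alone

# At the investor-day target of 1,400 large customers, what average
# spend would be needed to hit $10B from that cohort by itself?
required_avg = 10e9 / 1_400
print(f"Avg spend needed at 1,400 customers: ${required_avg / 1e6:.1f}M")  # ~$7.1M
```

Either way, the large-customer cohort alone can plausibly cover the whole $10B target, which is the point of the exercise.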

CFO: If you look at the Global 2000, we gave you that metric. We have about 500 of them, and the average revenue is $1.2 million. That is not very big spend on a Global 2000.
If you look at our $1 million-plus customers, we’re at $3.5 million. It tells you there’s a lot of room for growth in those companies.
And it’s funny. I get all these questions from people and they’re saying, well, all customers and the macro uncertainty you’re trying to cut costs. Well, you want to know what? $1.2 million in spend, you could let go 5 people, and that covers $1.2 million when you get companies that have 100,000 employees or more. This is not a huge spend for most of our customers.

I would have thought our largest customer [Capital One] was a saturated account 2 years ago, and it’s kind of gone from $29 million to $48 million, $49 million run rate. And I know there’s more opportunity there. So I don’t really know what a saturated account looks like. I haven’t seen it. I’m not saying it’s there. I’m not aware of any that are, I would say, are fully saturated because there’s always more use cases.

…there’s no reason why Global 2000, I’m not going to say this is going to happen within 6, 7 years, but they could easily spend $10 million a year on average…

2.) The time required for new customers to reach their contracted consumption rates is shrinking: about 240 days (8 months) previously, now down to about 210 days (7 months).

CFO: It was taking 240 days to get them up to that contracted rate that they were consuming within the first year. Now we’ve accelerated that with the help of partners and tools, that is now about 210 days to get them fully ramped into production.

3.) The biggest MOAT Snowflake currently has is data sharing. None of its competitors, including Databricks, has anything like it. This data sharing is the "act 2" of SNOW in progress (see further down for CEO Slootman's comments on the three SNOW acts). Other differentiators of SNOW are its simplicity (think about how Apple products or iOS in general beat out Android), ease of use, and multicloud deployment.

CFO: I would say the biggest moat for us though or the real differentiator that creates the stickiness is the data sharing…And by the way, we’re the only one doing data sharing the way we’re doing it. People talk about data sharing, but they’re really not doing the same thing…There are some companies out there that talk about data sharing, but they’re really transferring the data. We don’t transfer the data. we continue to keep that in our Snowflake instance…Security and governance is the number one thing, but also cost because you’re only storing it once, it’s a huge benefit.

We coexist with Databricks in many, many accounts. I think they do really well on the data science side. They don’t have the data sharing capabilities. But what really distinguishes us from all of those is the simplicity of use in that you do not need to be a technical – have a lot of technical skills to be able to use Snowflake, unlike those other products out there.

4.) This is a very opportune time for SNOW to continue to acquire companies that speed up their product development on the cheap!
Here, the CFO says they spent $200M for Applica, a company that was valued at over $600M before the recent market crash.

CFO: It’s about $200 million for the company.
By the way, that was the company, a while ago was looking at valuation of $600 million in pre-money funding. So there’s a lot of those companies out there right now that thought they were worth $600 million, $700 million that are now entertaining $100 million to $200 million to buy them.

5.) SNOW's gross retention is revealed to be in the mid-to-high 90s (above 95%)

CFO: The gross retention is like – north of 95%. And I don’t even know at 97% maybe. I don’t – Jimmy is here. He knows the number exactly. So Jimmy, 97% gross retention, thereabouts? Around there.

6.) Now I thought this was a bombshell. Ever since it was revealed on the recent earnings call that SNOW does 80% of its business with AWS, it sounds like MSFT executives have already reached out to SNOW to try to offer more competitive pricing (cheaper costs)!

CFO: GCP is the most expensive cloud for us to run and their pricing is not very good with GCP, where AWS gives us great pricing. We’re in the midst of renegotiating a contract with AWS. And I know Microsoft will follow…It’s funny, everyone was concerned last quarter that how can you guys be growing when Microsoft was seeing weakness, and that’s why I pointed out 80% of our business is in AWS. It was funny, that got – Microsoft execs to actually call us…

7.) I really do think the $10 billion revenue guidance for FY2029 is very conservative. The CFO confirms the guidance is based solely on the products available TODAY. So it isn't including Snowpark, Unistore, or the native app development ecosystem in that grand plan.

Q: What has to go right to get to that $10 billion target that you have out there?

CFO: I will tell you, we can get to that $10 billion with our product set today and the things we have announced already. When we do our forecasting, we forecast based upon historical consumption patterns of products that customers are using. We don't forecast based upon new products, because you really don't know until customers start to use it.

8.) Here, Slootman explains the iterations of SNOW 1.0, 2.0, 3.0. The idea that “cloud application development” is analogous to “mobile app development” in the iOS vs Android world made it very clear to me the massive opportunity that lies ahead for SNOW. Just think of all the ridiculous amounts of consumption that can take place on SNOW as a result.
CFO Scarpelli also hammered it home in his conference that the “real end game” of SNOW is the native app development ecosystem.

CEO: We started out by disrupting analytics through something I referred to as a Snowflake 1.0.
Then we moved on to collaboration, data collaboration sharing. That was Snowflake 2.0, and that is in absolutely full flow right now, especially in financial industries.
But Snowflake 3.0, and we’ve been working on this and made announcements over the last several years, it’s all about driving cloud application development.
…we have live data. That is a huge advantage relative to anybody else…
Cloud application development has not been conquered by anybody yet the way it has been, for example, in the mobile side. Mobile, you have iOS, you have Android. But on the cloud side, that hasn’t been conquered yet.
Of course, the public cloud companies themselves would like to conquer that, but here comes Snowflake, we’re across all clouds. We have the workload characteristics, and we have the data. So Snowflake 3.0 is about conquering application development. And that will power our business model perfectly because we’re a consumption model, we drive workloads, we drive revenue. So that’s the method to our madness.

CFO: The other thing about Snowflake that’s unique…We’re multicloud. We can run on any one of those clouds. And that experience, from a customer perspective, they had – unless you – if you never told them, they would not know which cloud they’re running on, it is the exact same experience for them. And that’s a really important thing.
And that’s part of our strategy too and what the real end game of Snowflake is native app development. And when you can offer a platform for native app development that it will automatically work that app in any cloud…when you build an application to an AWS, you have to port that application to GCP or Azure, that’s a lot of work, it doesn’t just work and that’s really important.

9.) The biggest lesson I took from SNOW: there are just some companies you can't really do 'QUARTER ON QUARTER' investing with. With some of them you can and SHOULD (like UPST), but then there's the diamond-in-the-rough company like SNOW, where you shouldn't talk yourself into reducing position size over some nitpicked quarterly detail (of course, valuation is a separate lesson, in which selling out of a company for valuation reasons alone can be very valid).
Slootman gave some scorching remarks to this effect!

CEO: We give 10-year guidance. Why? Because we don’t want people to ‘pick the fly s*** out of the pepper’, so to speak, and you have my views of the quarter.
But say, look, is your thesis intact over the long period of time. And those are always the questions that we try and answer.
I cannot run a company on a quarterly basis. There are quarterly aspects to it. Fundamentally, almost everything that we do has a much longer time horizon. When you look at P&Ls, most of the money we spend is not related to the current period, right? And sometimes it’s hard to convey that.
I don’t really feel that encumbered by public market…if you don’t like it, you can go buy IBM. We’ll tell you what we’re doing. We’ll give you a long-term view. We hope you want to sign up for the journey over a long period of time. Those are the kinds of investors we want. But if you want to get out in 90 days and make a bunch of money, I have nothing to tell you.

169 Likes

Hey Jon,

Good recap, I have been listening live to all their presentations, especially because Scarpelli and Slootman are quite candid and offer very good information. I think there were two more very important points raised by Slootman today.

  1. On the current demand environment/macro

Frank Slootman

We got discussions about the macro all the time. I’m sure it’s on everybody’s mind here in the room. But the macro is like an elephant. It’s a very big animal. And depending on where you touch it, you get a totally different experience. So you and I might touch the same elephant but have a completely different set of signals that we’re getting from our marketplace, right?

We are not getting those signals. And I always tell people, look, I can turn on CNBC and see what they have to say about the macro, but I only react to what our experience is. We’re touching that part of the world. Our world is good. It’s not euphoric.

But it’s solid. It’s sober, rational, realistic. People are not stopping. They’re not cowering. They’re not sort of jumping under a rock and waiting for it all to be over. So we feel very solid as far as we can see about the demand situation.

  2. On competition/Databricks

Frank Slootman

Well, I mean, it’s inevitable that companies like Snowflake are going to attract competition and funding of competitive ventures. You’re like, hey, we’re not going to let you have it by yourself. That’s just the nature of capitalism. And we – and by the way, I don’t mind it either, because it actually makes us better. We were just sitting there by ourselves, we get lazy and sloppy over time, because we can. So I like the fact to have competition on a level playing field. I’m totally fine with it. But to answer your question more specifically, Databricks comes from a very different background. They really have inherited the mantle of Hadoop and it was sort of one of the attempts at big data analytics, which we feel glaringly failed because Hadoop is being replaced wholesale all over the place.

So they're different worldviews, I would say, and they're definitely not the same. The other answer I would give you is that Databricks and Snowflake were partners.

And we are de facto still partners in most enterprises around the world. It's just that their ambitions have evolved. So it certainly sets up a future contest. And that's fine. Because I think for customers, it's like, hey, you've got choices.

akhenaton: The Databricks/Snowflake frenemy battle over the next several years will be one to watch. While many firms today use them both (like where I work), if they both continue to enhance their offerings and move into each other's spaces, that could change.

38 Likes

I lean more toward Frank's viewpoint. I don't see how Databricks (DB) replaces Snowflake as a core cloud data warehouse. DB is more for running on-the-fly analytics, and it's good at heavy computation in a short amount of time, but for traditional reporting applications, which the majority of enterprises need, all you need is fast retrieval, which SNOW excels at.

3 Likes

but for traditional reporting applications, which the majority of enterprises need, all you need is fast retrieval, which SNOW

Until it isn't. When operational ML (rather than people) becomes the bigger consumer of data, the tides will shift. That's where Databricks is playing the long game.

Best,
—Kevin

Long SNOW, but not forever.

5 Likes

Kevin, that’s an interesting point. Could SNOW not plan for that and adjust in time to capture that opportunity if it saw the tide was shifting in that direction?

4 Likes

Yes. But. Spark (and Databricks) is used universally in data science curriculums; Snowflake is not. That's partly because Snowflake does not support massively parallelized machine learning training at scale, so it doesn't fit into those curriculums. Its architecture is not set up that way.

When I said Databricks is playing the long game, it's also about the investment they made in the practitioner community. They have built a deep support community that caters to data scientists who build models professionally and want to operationalize them.

Snowflake isn’t at all in that court.

Best,
—Kevin

Long SNOW for the business and data analysts, long Databricks for the data science practitioners

7 Likes