Reflections on SNOW's Conference Call

Now that the dust has settled, I re-read the earnings call transcript.

• CFO Scarpelli mentioned the impact of the macro-environment in his prepared remarks, and it seems a switch flipped in many of the analysts’ heads: suddenly they weren’t asking about Snowflake anymore, they wanted insights into the macro-environment. One analyst even felt the need to say:
So instead of trying to dream up the 16th way of asking you about the macro and the impact it’s having, I want to maybe put that aside for a sec, and I mean you’re delivering amazing growth at scale, and certainly, that shouldn’t be lost.

Note that this concern about business slowing due to the macro-environment was finally put to bed with Slootman’s response (snipped for length):
the reality is that, our type of workloads become very heavily grafted into core business processes…these workloads are so heavily grafted into operational processes. So these things are not going anywhere. They’re not optional. They’re not like, what do I feel like doing today? When people put data in Snowflake, I mean, they have some serious intentions on what they’re going to do with that data…it is so embedded into core business processes. It’s not something that you can just sort of shut off for a while until things get better.

As a side-note here, Slootman took a dig at companies like Databricks, saying those serve more optional workloads:
That, by the way, there are workloads like that, that’s far more on the data lake side, where essentially you have a massive repository of files.

For me, I take Scarpelli at his word when he says the ups and downs they’re seeing have made them a bit more conservative in their guidance:
…large customers where we saw a decline, we’ve taken that on their forecast, but we have others that are offsetting partially some of that. What I will say is April, we did see weakness in week-over-week growth in our total revenue by customer. But to be honest, the last two weeks of March or May has been very strong. But just given everything in the macro headwinds we’re hearing, we’re going to be a little bit more cautious going into the full year.

Which was later clarified to:
Scarpelli: the last two to three weeks has been very strong and week-over-week growth in revenue that gives us the conviction even with lowering on an individual basis, some of our larger customers that are still growing, just not at the pace we saw them grow last year.

Alex Zukin – Wolfe Research – Analyst
Got it. And is there any – in terms of just maybe quantifying or sizing that impact, is that possible?

Mike Scarpelli – Chief Financial Officer
It is possible, but I told you what I’m going to tell you.


Scarpelli also said about these customers: “You hear them on CNBC talking about cutting costs. And those are the ones who are being prudent and lowering some of our forecast consumption from them. And by the way, we’re going into those customers to help them optimize as well, too.”

Which addresses at least one thread we’ve had here recently on optimization of consumption.

And when Mark Murphy of JPMorgan tried to wrangle more non-Snowflake information:
So you had commented on slower consumption from consumer cloud. We’re probably associating that with Facebook, Netflix, Peloton, Snap types of public companies that have missed.

Scarpelli responded:
Well, I’m not going to name the customers, but none of them were the ones you mentioned.


But, then helpfully:
…there definitely is a focus on top of mind for many CFOs to find out where they can cut costs. And that’s the beauty of a consumption model is you only expense what you use, and that’s what our customers like as well, too. And so I feel good about the forecast that we just laid out, taking into consideration the macro environment and literally looking at our customer’s continued usage of Snowflake, and I don’t see that changing and in aggregate.

Yes, I’m sure there’s going to be some customers that are going to underperform. But likewise, there’s going to be many customers that overperform. And long-term, none of these customers are moving off of Snowflake and most have plans to do more with Snowflake.

• Customer Adds
Scarpelli made it clear that they’re not focused on the aggregate number of customers but on the number of “quality” customers, specifically: “We want to land those customers that can be the $1 million plus customers.”

• Revenue Prediction
I don’t think this has been discussed enough: It was pointed out that 94% of Snowflake’s revenue in a year comes from existing customers, because it takes 6-9 months for them to ramp up. So, the customers they signed up in Q3 and Q4 will start to come online next quarter and the following quarter.

Additionally, the large customers they signed up last quarter and this quarter will take 6-9 months to hit revenue. Since we know they’ve been successful signing up large customers, I feel secure about Snowflake’s near-term revenue.

• Pricing
They pushed back hard on questions about changing their pricing model. Slootman said:
we sell best-of-breed, we sell value, we sell impact. We have verticalized our business because we’re really adopting and owning the customers’ mission, the customers’ outcome rather than just sort of whining about whether we’re going to pay a fraction of a cent more for compute credit, right? I mean those are really non-sensical conversations out with customers when they – when a hospital is trying to save lives or improve quality of life. Those are the core missions of the institutions that want to use Snowflake for really amazing things.

And that’s the conversations that I’m certainly having with customers. And people are not coming at us with a hammer like that’s to be cheaper.

With Scarpelli adding:
I’m not aware of a single customer, a significant customer opportunity we’re going after where we lost it over pricing – not a single one.

• Impact of Performance Improvements
As board readers know, there has been a lot of consternation over Snowflake improving performance without raising prices. Management talked about the actual impact of that so far:

Christian Kleinerman – Senior Vice President of Product
We’ve heard from many of our customers that they see the concurrency improvement, the lower latency, and many of them have expressed intent of bringing additional workloads on to Snowflake.

Mike Scarpelli – Chief Financial Officer
There’s been nothing but positive impact from our customers on these performance improvements.

• Data Sharing
As Snowflake followers know, this is a growing portion of the business, and is close to unique to Snowflake. A “stable edge” is simply data that’s made available to others. It needs to be “stable” in order to have others be able to use it.

From the call:
the number of stable edges grew 122% year on year. 20% of our growing customer base has at least one stable edge, up from 15% a year ago. Snowflake’s Data Marketplace listings grew 22% quarter over quarter.

Scarpelli repeated that he expects NRR to decline but stay above 130%. Interestingly, NRR this past quarter was 174%, down just slightly from last quarter’s 177% but above the quarters before that. Snowflake has now had 5 straight quarters of NRR above 168%. Amazing!

NRR is really driven by, a, our largest customers continuing to grow quite quickly, but also new customers that are coming into the cohort that are ramping. And this shouldn’t surprise – this is the real focus that we’ve had over the last two years seasonally and this is why this isn’t just the number of customers, but quality customers.

• Long-Term Growth
CFO Scarpelli: For fiscal year 2029, we are reiterating our target for product revenue of $10 billion, growing 30% year over year.

Investors with a longer time horizon might want to consider making an (additional) investment in SNOW at current levels and then not even looking at it for at least half a decade.

Slootman also reiterated that he’s focused on the long term, and that with $5B in cash on hand, they will continue to invest in the business and not panic over short-term bumps:
We can’t be crash tacking the ship every time people get a little nervous around the table, right? So as long as we can drive growth as we are with the economics that we are producing in terms of the unit economics, in terms of operating efficient, in terms of cash flow, why would we not be doing it?

I mean, the behavior of our customers and how they’re – the type of contracts they’re stepping up to, they are not in a mode yet where they’re sort of in a massive avoidance mode of doing contracts or trying to do a natural acts in terms of the expenses that you’re generating as part of the platform.

And for me, this remains key. While we rightfully concentrate on how our companies’ businesses are doing, we shouldn’t ignore the impact of the macro-environment on those businesses. If our companies’ customers (consumers or other businesses) cut back for whatever reason (macro or not), that’ll impact our companies’ performance. So to that degree it pays to pay attention. What I got from this conference call is that the workloads being put into Snowflake are critical to Snowflake’s customers and so won’t be removed. Since this is a consumption model, those workloads may ebb and flow to some degree, but management was clear that the most they’re seeing is that some customers aren’t growing consumption as much as they had been. Still growing, mind you. And with the sign-ups of new large customers in the past few quarters, it seems that Snowflake’s near-term revenue growth is secure.


Smorg, what a great, insightful write-up. Can you tell us more about the significance of stable edges? I also saw that on the call but had no idea of its importance.



Can you tell us more about the significance of stable edges?

Yeah, the name “stable edges” isn’t great, but that’s engineers for you. It’s like Tesla engineers talking about “4680” while General Motors has a whole department to come up with the “Ultium Platform.”

Anyway, there really isn’t more to say other than what I said:
A “stable edge” is simply data that’s made available to others. It needs to be “stable” in order to have others be able to use it.

Here’s a recent podcast with Snowflake VP Jeff Frazier:… which includes the following snippet:

When those two agencies share, which they hadn’t previously done in a meaningful way…they become stable. We call those stable edges. When I’m a justice entity in California, and I’m sharing my data in a partnership with an education entity, I can contrast that with information from census and information from Department of Health and Human Services, et cetera. When I can start to aggregate that and put it into a bucket, I create a whole different world of insight to help people reduce risk and harm or to help them achieve. It’s super powerful.

When you have a stable edge, when one agency is sharing with other agencies, now they can do that much more efficiently through cloud enabled technologies and native cloud technologies. You start to see a very different impact on the mission. We did this with COVID and tracking. In California, we figured out, “Hey, we know where the vaccines are. We can see where the testing is. Now, we see where it’s cropping up. Let’s put the two together and let’s start finding out how to get people help faster.”

So, counting stable edges is one way to measure how much data in Snowflake is being shared with other Snowflake-using companies. From the conference call:

• Last quarter stable edges grew 122% YoY.
• A fifth of Snowflake’s customers have at least one stable edge, up from 15% a year ago.
• Listings in Snowflake’s Data Marketplace (where companies go to find data shared by other companies) grew 22% quarter over quarter, to 1,350 listings.

The significance of Data Sharing is that it’s a way for Snowflake customers to monetize their data in a way that would be hard with other Data Warehouses. Imagine you’re GM, and with OnStar you have data on how many miles your cars are driven, at what speeds and on what roads, and also, especially, data on accidents (from airbag deployments or 911 calls). You can now sell portions of that data (anonymized) to entities like insurance companies, which will use that data to figure out what to charge car owners. For GM, this is almost pure profit since it’s data they’re already collecting and storing.

GM can charge Allstate or whatever insurance company for the data. Snowflake doesn’t charge an additional fee for sharing; they make their money, as always, from the storage of data and compute on that data - in this case, from the insurance company taking GM’s data and running it through its analysis calculations. The financial arrangements for sharing are done outside of Snowflake - Snowflake provides mechanisms to control and monitor what is shared, so GM can bill Allstate and/or terminate sharing at any time.
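To make the billing split concrete, here’s a back-of-the-envelope sketch in Python. Every rate, volume, and fee below is made up for illustration; only the structure (provider pays storage, consumer pays compute, data fee settled outside Snowflake) comes from the discussion above.

```python
# Who pays whom under Snowflake's sharing model, per the GM/Allstate example.
# All numbers are hypothetical -- only the who-pays-whom structure is real.
CREDIT_PRICE = 3.00      # $/compute credit (made up)
STORAGE_PRICE = 23.00    # $/TB/month (made up)

gm_storage_tb = 50            # GM stores the telematics data once
allstate_credits = 400        # Allstate's analysis queries burn compute credits

gm_snowflake_bill = gm_storage_tb * STORAGE_PRICE          # GM pays storage only
allstate_snowflake_bill = allstate_credits * CREDIT_PRICE  # Allstate pays compute only
allstate_pays_gm = 20_000     # negotiated data fee, invoiced outside Snowflake

print(gm_snowflake_bill, allstate_snowflake_bill, allstate_pays_gm)
```

The point of the sketch: Snowflake’s revenue scales with usage on both sides of the share, while the data fee itself never touches Snowflake’s income statement.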

Data sharing both helps drive companies to use Snowflake and once there, to stay on Snowflake.


I want to provide one possible example of what they meant by a “stable edge.”

We have a CRM tool (not Salesforce) from which we ingest data through its API into our data warehouse for analysis. Before 2021, we used a SaaS data integration tool as part of our ETL pipeline to load data hourly from this CRM’s API to AWS S3, then Snowpipe it from S3 into Snowflake for analysis.

But last year this CRM vendor joined the Snowflake Marketplace, and now we subscribe to their Data Share directly. This means we can query this CRM’s published schema directly in Snowflake as soon as the data is generated by my company. We pay this CRM vendor a small subscription fee for the Data Share.

What does this mean? It means Data Share reduced the complexity of our software system greatly.

Before: CRM → data integration tool → AWS S3 → our Snowflake
Now: CRM’s Snowflake → our Snowflake

We cut the number of vendors in the process from 4 to 2. Data arrives at our warehouse faster - this CRM pushes updates to their schema within minutes of new data arriving - and we didn’t need any extra infrastructure to support this, since it is all within Snowflake and maintained by the vendor. When something breaks we can also complain directly to the CRM’s support rather than to the intermediate vendors.

This is probably what Snowflake means by “stable”: we keep getting data from the CRM tool as long as we continue to use it for customer support - and I really don’t see how we’d move away from it. It’s not like a one-time purchase of, say, a 2020 US Census dataset.

There are many SaaS data integration vendors like Fivetran, Stitch, and Airbyte. You can look them up; they are common in today’s data world, and you either use them or write your own. The Snowflake Marketplace reduces the need for these types of tools.


I have been watching stable edges for a while. Here are some of my notes and one question:

Data Sharing Metrics. Peter Offringa on Q4:
As I discussed in my prior post on Snowflake, I think that data sharing creates strong network effects to draw in new customers for the Data Cloud. In the Q4 report, we received an update on progress with data sharing. Leadership revealed that the total number of stable edges increased by 130% during the year. At the end of FY2022, 18% of customers had at least one stable edge, up from 13% at the end of FY2021. Applying these percentages to the total customer counts at each point in time reveals that the absolute number of customers with a stable edge almost doubled year/year. Data sharing remains a popular feature for customers.

Q1 2021, from the call:
the number of stable edges grew 122% year on year. 20% of our growing customer base has at least one stable edge, up from 15% a year ago.

Me here, from the slide presentation:
• Total Customers at the end of Q1 2021 = 6,322.
So, I get 1,264 (20% of the growing customer base) with at least one stable edge.
But that only works out to +85% YoY growth in the number of customers utilizing at least one stable edge.
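For what it’s worth, the ~+85% arithmetic checks out. A quick sketch in Python, using the 6,322 total from the slides and the 680 year-ago count that muji computes below (treating 680 as the year-ago baseline is my assumption here):

```python
# Reconstructing the ~+85% figure. The 6,322 total and the 20%/15%
# percentages come from the call; 680 is muji's year-ago count of
# customers with at least one stable edge (an assumed baseline).
total_customers = 6322
current = round(total_customers * 0.20)  # customers with >=1 stable edge now
year_ago = 680                           # ~15% of last year's customer base

growth = current / year_ago - 1
print(current, f"{growth:+.1%}")         # prints: 1264 +85.9%
```

So the ~85% number is internally consistent; the gap versus the reported +122% is about which metric is being counted, not the arithmetic.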

Muji Q2 ‘2021
16% of customers are using stable edges (aka are sharing data in a stable consistent way with an external user), from 15% last Q and 10% before. So customers that are fully utilizing sharing capabilities went from 414 → 680 → 798. The overall number of edges grew +32% sequentially, a small acceleration from last Q’s +31%. More and more customers are discovering the data sharing capabilities – critical to my thesis (and Snowflake’s own vision). I want to see these trends continue.


2021      Q1      Q2      Q3      Q4
          680     798
2022      Q1

I’d love to be able to fill in the missing Q3 and Q4 raw numbers for customers with stable edges, and to understand what I did wrong to get only +85% YoY growth of stable edges when Scarpelli said growth was +122%. I could work backwards from the given percentages to get raw numbers; but when I’m already getting suspect percentages from raw numbers…I thought I’d ask if anyone has done this already and could share?





The +122% that mgmt mentioned (and the +130% that Peter O talked about last Q) was for the number of stable edges, for which mgmt doesn’t share raw numbers - only the % of customers using them.

Since Q321, customers w/ stable edges have gone from 11%, 13%, 15%, 16%, 17%, 18%, 20%. (In that, 11% and 17% are guesses based on % gains they mentioned.) So customer counts would be 538, 680, 798, 921, 1070, 1264, for net adds of 138, 142, 119, 122, 149, 194.

  • muji

(Sorry for not responding earlier, Jason) Muji is correct. The difference in the percentages is that the increase in stable edges represents the total number of data-sharing links between customers. Because stable edges are one-to-many (one customer to many partners), it’s possible (and expected) for the growth of stable edges to be higher than the growth of customers with at least one stable edge.
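That one-to-many point can be made concrete with a toy example. The snapshots below are invented purely to illustrate the two metrics and how they can diverge:

```python
# Hypothetical sharing snapshots: provider -> list of consumers.
# These numbers are made up purely to illustrate the two metrics.
year_ago = {"A": ["X"], "B": ["Y"]}                   # 2 edges, 2 providers
now = {"A": ["X", "Y", "Z"], "B": ["Y"], "C": ["X"]}  # 5 edges, 3 providers

def total_edges(snapshot):
    """Count every provider->consumer link (the 'stable edges' metric)."""
    return sum(len(consumers) for consumers in snapshot.values())

def customers_with_edge(snapshot):
    """Count providers with at least one edge (the '% of customers' metric)."""
    return sum(1 for consumers in snapshot.values() if consumers)

edge_growth = total_edges(now) / total_edges(year_ago) - 1
customer_growth = customers_with_edge(now) / customers_with_edge(year_ago) - 1
print(f"edges: {edge_growth:+.0%}, customers with an edge: {customer_growth:+.0%}")
# prints: edges: +150%, customers with an edge: +50%
```

Here edges grew +150% while customers with an edge grew only +50%, simply because customer A added partners; the same dynamic explains +122% edge growth against ~+85% customer growth.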

Also notable regarding stable edges is that they have to represent an active relationship in order to be counted (that is the stable part). Snowflake defines this as “the two parties must consume 40 or more credits of Snowflake usage each day over a 6 week period for the data sharing relationship.”

This press release on the Healthcare and Life Sciences Data Cloud provides a good graphic to visualize data sharing relationships:…

Also interesting is the press release below from Capital One that they are launching an enterprise B2B software business focused on providing cloud and data management solutions for other companies. The idea is that they want to share their expertise and best practices developed in managing their large data sets serving over 100M customers.

The first product for that new business unit is called Capital One Slingshot, which provides data management solutions for customers of Snowflake. Overall, it is designed to help other businesses accelerate their adoption of the Snowflake Data Cloud by delivering automation, tools and analytics around usage. Capital One is a large customer of Snowflake and has five years of experience working with the Data Cloud. Now they want to sell consulting and software services to share that insight. One side effect, though, is that they advertise helping companies optimize Snowflake spend. That’s healthy for adoption in the long run, but may smooth out spikes in utilization growth.…