Snow: Did I hear the same Conference Call?

In reading through the multitude of posts on SNOW’s ER and Call today, I can’t help but think that my Seeking Alpha feed was hacked and someone substituted an outstanding call (https://seekingalpha.com/article/4492387-snowflake-incs-snow… ) for the one everyone else seems to have heard:

• Product revenue grew 106% YoY.
• RPO grew 99% YoY.
• 52% of RPO expected to be recognized in the next 12 months; that current portion grew 85% YoY.
• NRR 178%.
• Data Sharing “edges” up 130% YoY.
• Data Marketplace listings up 195% this year.
• Revenue from 6 of their top 10 customers grew faster than total revenue, again.
• Number of $1M trailing spend customers grew 24% QoQ.
• Gross product margin 75%.
• 100% of Snowflake customers surveyed said they would recommend Snowflake to other organizations for the fifth year in a row.

Product Improvements:
• Snowpark Java on AWS released, Snowpark Python is oversubscribed in preview.
• ITAR (International Traffic in Arms Regulations) compliance achieved in AWS and Azure government regions.
• Snowflake is constantly improving its product and service. Last year they found a way to compress data even further, and the change happened with no impact to customer workflows. This year they found a way to increase compute performance, again with no impact on customer workflows.
It’s interesting that one poster here claimed Snowflake is more expensive than Google’s BigQuery service, yet Snowflake’s business is larger and growing faster - and we see Snowflake’s product getting more economical every year. Since Snowflake compute is now 10% to 20% more efficient, I expect more companies to choose Snowflake over the competition, and existing customers to expand their usage.
• Expect an announcement of a new Security offering this year, on top of a “best-selling” add-on already available.
• Slootman pointed out that uptake on unstructured data has been “quite strong.” If you read my prior post “What is Snowflake,” you’ll recall that Snowflake handles both structured and semi/un-structured data and Slootman talked about “some really interesting new opportunities where data models are looking for relationships between unstructured data types and other types of data types, things that just weren’t possible before that are now enabled by the platform.”

Sales Orientation:
In my view, this doesn’t get enough attention. Under Slootman, the sales organization has shifted from geographically based to industry-based, at least for large accounts.
My impression is that Snowflake has been picking up easy wins (Slootman: “bread and butter”) from companies migrating from on-premise servers, but is now focusing more on helping potential customers solve their use cases.
Co-selling with the cloud vendors is bringing in as much contract value as direct selling ($1.2B last year).

Guidance:
• Next quarter (2023Q1), expected product revenue growth of 79%-82%.
• Next year (2023), expected product revenue growth of 65%-67%.
• NRR expected to be 150% to 169%, down from the recent astronomical 178%.
• In an interview with Jim Cramer (summarized here: https://www.msn.com/en-us/money/markets/snowflake-ceo-projec… ), it was pointed out that last year’s guidance was for 82% YoY growth, but Snowflake achieved 106% YoY growth.
• And predicting the future is hard since this is a consumption model, not pure SaaS. Slootman: “We have tons and tons of customers that we have zero history with that we somehow have to project exactly what they’re going to do and how they’re going to grow.” It’s clear guidance is very conservative, which in these unstable times is something to be welcomed.
• Overall: "I think it’s important for everybody on the call to understand that we are super early innings in terms of the total opportunity."

My Summary:
It’s clear to me that Slootman and Snowflake management are playing the long game. Their engineers are constantly making the product better, which will lead existing customers to expand their usage and attract new customers. They have refined their sales approach to be customer-focused - not just migrating existing workflows from legacy on-premise servers, but building new workflows that solve customer problems. Snowflake customers contract for minimum usage commitments to qualify for lower pricing (hence the RPO figure), and while those commitments are growing, Snowflake continues to see customers spend above the contract minimums.
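
For readers newer to the consumption model, here is a minimal sketch of how the commit-plus-consumption mechanics work (illustrative Python with hypothetical numbers, not Snowflake’s actual pricing):

    # Illustrative commit-plus-consumption mechanics (hypothetical numbers).
    # A customer commits to a minimum spend (which flows into RPO) to qualify for
    # discounted credit rates; revenue then tracks actual consumption, which can
    # run above the committed minimum.
    annual_commit = 1_000_000    # $ contracted minimum, counted in RPO
    credit_price = 2.50          # $ per credit at the discounted tier
    credits_consumed = 480_000   # credits actually burned during the year

    consumption_spend = credits_consumed * credit_price
    overage = max(0.0, consumption_spend - annual_commit)
    print(f"Consumption spend: ${consumption_spend:,.0f} "
          f"(overage above the commit: ${overage:,.0f})")
    # -> Consumption spend: $1,200,000 (overage above the commit: $200,000)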

In terms of the short-term headwind from an effective compute price cut, I think back to the middle years of Amazon, when their mantra was to cut prices as much as possible. Bezos believed that losing some short-term profit was worth gaining customers’ trust that Amazon would always provide the best possible pricing, and that this trust would drive continued business to their site. It worked. I see Slootman and Snowflake doing exactly the same thing here. Putting lots of your company’s data onto one provider (Snowflake) and keeping it there for subsequent analysis is a big commitment, as moving that data to another provider is costly and time-consuming. Existing and potential Snowflake customers now know that Snowflake will always strive to give them the most efficient product possible without gouging them later for “enhancements.” I think this builds a huge level of trust in Snowflake that helps win these big commitments from big customers. I’m happy to accept a roughly 10% penalty for six months in exchange for lasting long-term gains.

My belief is that Mr. Market is giving us a gift in SNOW valuation.

137 Likes

I agree that everything looked good from the conference call, and they clearly have a great long term vision, so I am long SNOW.

But there’s no getting around the fact that they guided to 67% top-end revenue growth. Even in their explanation (we beat by so much last year), they point out that they guided to 82% growth the year before. So by their own admission, they are expecting a slowdown of about 15 percentage points. This is probably reasonable as they scale, and it’s certainly a number they expect to beat, but the market is not so naive as to actually expect only 67% growth.

Slootman is experienced and knows the software industry well. His prior long-term projections basically assume about 80% growth endurance (each year’s growth rate is roughly 80% of the prior year’s) for 6 or 7 years, and 67% is in the ballpark of that number relative to their guidance from a year ago, as the sketch below shows.
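
A quick back-of-the-envelope sketch (my own arithmetic under an assumed ~80% growth endurance, not Slootman’s actual model):

    # Hypothetical projection under ~80% "growth endurance": each year's YoY
    # growth rate is assumed to be about 0.8x the prior year's.
    def project_growth(initial_growth, endurance, years):
        rates, g = [], initial_growth
        for _ in range(years):
            rates.append(g)
            g *= endurance
        return rates

    # Starting from last year's 82% guided growth:
    for year, rate in enumerate(project_growth(0.82, 0.80, 4), start=1):
        print(f"Year {year}: {rate:.0%}")
    # -> Year 1: 82%, Year 2: 66% (right at the 65%-67% FY23 guide), then 52%, 42%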

So they’ll have to beat by a good amount. 80% growth next year isn’t going to cut it for a company valued so highly. We know these companies are generally valued on expected revenues and revenue growth over the next 12-24 months. While they may be introducing service improvements for long-term success, in the medium term we need them to step up revenue generation, or we will see expectations, and thus valuation, reset. Cloudflare has perhaps been the best at managing growth expectations while still investing to improve longevity.

8 Likes

The Earnings Call was great, I listened to it and read the Transcript.

I am presenting my Q&A notes thematically, not chronologically. I chopped up sentences to get to the essential points: the concerns raised, and how management addressed them. All condensed for comprehension and future reference.
I am adding my own commentary below in the format (NOTE: commentary).
I hope people find this short summary valuable.

Baconski

PLATFORM ENHANCEMENTS IMPACT REVENUE

(NOTE: by far the most questions were asked about the loss of revenue due to platform enhancements for customers’ workloads. You can see the consistent answers from Mike; it’s almost silly how he keeps repeating the same thing over and over - that customers will move more workloads over to Snowflake when they see price improvements, and that they have a proven track record on this. And yet analysts kept asking the same question. I left his comments in on purpose to show how consistently he answered each one.)

Question: how should we think about the slope of the curve and the cadence of future improvements that clearly benefit the customer, but also the impact that then has on the model? And maybe speak to the confidence you have in calling out the $97 million?

Mike Scarpelli:

  • Q4: our warehouse scheduling service improvement was rolled out for three weeks (January) and we saw a $2 million impact (to revenue) - an improvement for customers: using less (lower consumption) while doing the same number of queries.
  • (NOTE: upcoming improvement) And then, there are other platform improvements that we’re doing that we rolled out in beta at the end of the quarter, and it’s starting to be rolled out now throughout the year.
  • Impact (to revenue) is much bigger than what we were anticipating, and the full year impact of that next year is quite significant.
  • the gross is actually more like $160 million, but I do expect that will be offset by over $60 million in additional workloads coming from our customers (NOTE: hence roughly $97M less revenue in FY23)
  • (NOTE: where is the $60M coming from?) what we generally see is, when we do these things, there’s usually a lag of about six months before we start to see more workloads move to Snowflake. (NOTE: they expect more workloads because the platform enhancements make Snowflake cheaper for customers)

Question: other vendors, when they have product improvements that save their customers cost, usually have a sharing model: the customer gains something and you gain something, through maybe price increases, et cetera. Is that something that could happen over time? Or are you really happy to continue giving all the benefits back to customers?

Mike Scarpelli:

  • our whole philosophy is: any improvement we make will benefit the customer, but it benefits us long term, too, because anything we do allows them to do more with the credits they bought. They can run more queries per credit they buy.
  • when customers see their performance per credit trending such that it is getting cheaper for them to run things, they realize they can do other things more cheaply in Snowflake, and they move more data into us to run more queries.
  • We have no intention to increase the pricing for existing customers. What I will say is on new customers coming in, we will be very disciplined around discounting with new customers.

Question: on the platform improvements, how do you think about it conceptually from the kind of benefit you get from delivering those improvement factors to your customers?

Mike Scarpelli:

  • the benefit is, first of all, the customers see an immediate price-performance improvement. And our customers are always looking at price performance. And when they compare our price performance versus running it in another cloud or on-prem in their existing data warehouses, they make the move to bring more things into Snowflake.
  • As a reminder, we have landed hundreds of customers to do these big on-prem Teradata migrations… completely shut down a little over 30 of those, maybe in the mid-30s now. There are piles of other workloads that they plan on moving, and when customers see the price performance, they will accelerate the movement of those other workloads to us; we have historically seen that.
  • we see the gross impact of about $162 million for the year, and we think we will make up about $65.5 million in revenue. … It could be a one-month lag, it could be a six-month lag, before they realize that and move more workloads. But based upon what we’re seeing, we think there will be about $65 million coming back in to get to that net $96.7 million, if you want to be precise about what we’re estimating.
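
(NOTE: a quick sanity check on Mike’s arithmetic, as a short Python sketch; the quoted figures are rounded, so they don’t reconcile to the decimal:)

    # FY23 revenue impact of the platform improvements, using figures quoted on the call.
    # All values are in $ millions; the quoted numbers are rounded, so the net is approximate.
    gross_headwind = 162.0    # gross revenue handed back to customers via efficiency gains
    expected_offset = 65.5    # workloads expected to migrate once customers see the savings
    net_headwind = gross_headwind - expected_offset
    print(f"Net FY23 headwind: ~${net_headwind:.1f}M")
    # -> ~$96.5M, versus the ~$96.7M net figure quoted on the call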

Question: on these platform improvements. Companies normally don’t willingly make changes that cut 5% out of their revenues. So I’m curious, were you getting pushback from customers around price performance relative to alternative products, and decided to alleviate that pushback by making this change? What’s the broader context for doing this, because it’s very rare?

Christian Kleinerman:

  • We’ve been doing this since the very beginning of Snowflake.
  • We’ve always been focused on improving the performance of the system, and we are very cognizant that it improves the economics for our customers.
  • And the rationale behind it is that there’s so much more data being created every day.
  • And as the marginal cost and effort of getting value out of the data decreases, we know there’s a lot more value for the company to generate out of the data. We see it time and time again: the more the economics of the platform improve, the more use cases come to Snowflake. So we’re looking at this with a very long-term view.

Frank Slootman:

  • This is not philanthropy. This stimulates demand.
  • we can prove that to ourselves by going back years, because we’ve done this over and over, and it does stimulate demand
  • but it doesn’t do it in real time, there’s a lag involved in this process.

Mike Scarpelli:

  • I think this is probably the biggest magnitude impact at one time in any platform improvement that we’ve done since I’ve been here.

CONSUMPTION GROWTH:

Question: about slower-than-normal return to consumption growth in January. Plenty of other software companies saw slower consumption over the holidays. I’m curious, did you get any sense of what occurred in January that might have driven that behavior? And have you seen that change in any direction so far in February?

Mike Scarpelli:
There was a holiday effect going into January - people were taking longer vacations - but we did see consumption return to more normal levels in January. For example, last week was Presidents’ Week and ski week for a number of people; we see consumption decrease there as well, but then we see a return.

  • about 70% of our work is really driven by machines,
  • the other 30% is humans.

The machine layer stays consistent on a daily basis, and it’s the human interaction that changes.

Question: touch back on the seasonality and consumption. Can you just talk about how the consumption piece came in versus your expectations and maybe compare that with how new bookings and sales productivity came in versus expectations?

Mike Scarpelli:

  • the quarter actually came in pretty much where we were expecting - slightly off on consumption in January, but not by a huge amount.
  • we were surprised by the profound impact of an enhancement we rolled out. It was only out for a few weeks in January and had a $2 million impact.
  • I was really surprised by the strength in bookings in the quarter. We closed over $1.2 billion in contract value in the quarter, growing our RPO to $2.6 billion. That was well above what we were planning internally.
  • We co-sold with the cloud vendors, $1.2 billion in contract value for the year as well.

DISCUSSING STREAMLIT

Question: As we contemplate the results and the guide for next year… the Streamlit acquisition… Is this a response to competitive changes in the market? Or is this something that has been teed up as part of the vision for a while?
(NOTE: I think the question he meant to ask is: how are you going to make money from the Streamlit acquisition?)

Frank Slootman:

  • definitely part of a strategy: focus on driving workloads from the developer to Snowflake. We’ve obviously been super successful driving it from the data engineering, data warehousing, and data analytics side. But with the initiatives around Snowpark and all the programmability options for us to really address the Python developer community, this is going to be a superb asset for Snowflake. We have to address workloads across the spectrum, and this is going to help us do that in places where we historically have not been as well represented as in other areas.

Question: You mentioned Streamlit was in the guide but didn’t give us much detail in terms of how it was in the guide. Is there a significant revenue contribution or operating margin impact that we should expect from the acquisition?

Mike Scarpelli:

  • There is about $25 million in expenses associated with Streamlit. There is no revenue.
  • we won’t have a product ready on Streamlit until the end of the year, so we’re not factoring in any revenue that could come sooner.

(NOTE - Streamlit acquisition $800 million, 80% stock, 20% cash)

Question: best case scenario for Streamlit, two or three years from now - what is Streamlit going to allow Snowflake to pursue in machine learning and data science that you would consider a success?

Christian Kleinerman:
At Investor Day last June, we shared our vision to help organizations of all sizes build applications, data applications, and data experiences on Snowflake. What we see with Streamlit is their super easy-to-use framework powering all sorts of applications, both for internal consumption of data within companies and also coming to our marketplace and helping entire businesses. Some of them industry-vertical businesses, some of them horizontal experiences. But at the end of the day, it is about unlocking the power of data and creating new data experiences.

(NOTE: the initiative around Snowpark is to drive workloads from the developer to Snowflake. Streamlit is all Python ecosystem. Get Python developers using Snowflake data.)

FOCUS ON VERTICALS

Question: as of February 1, you implemented changes with respect to verticalizing your sales motion. Can you elaborate on that? It sounds like you’re seeing real progress with respect to verticalization. Just trying to get a sense of how substantial these changes may be as you kind of go after that opportunity.

Frank Slootman:

  • this is not a reorientation of our entire selling motion.

  • it’s the upper stratum in terms of our large account focus.

  • we replaced the geographical backbone with an industry equivalent because we don’t think, for a large account, the geographical breakdown really adds anything. This is not new; we’ve discussed it on previous calls.

  • The sales organization worked all of last year on making this transition happen, so by the time February 1 came around, everybody was fully up to speed; we were locked and loaded to let it go.

  • today, 9 out of 10 conversations are industry-specific - very, very industry-specific - oftentimes not necessarily with IT types but with business people and data science types, people really trying to drive predictive insights into the business, things that are becoming possible that have never been done before.

  • the company really wants to evolve towards this posture in the marketplace. We think the industry posture is all about us assuming the customer’s point of view rather than our own, and we think that’s the correct way to do things.

SNOWPARK OUTLOOK:

Question: eventually getting Python to be GA is going to become a big deal and Streamlit clearly enhances your exposure to Python. But what are your expectations of adoption of Snowpark over the next 12 months? How do you see this progressing?

Christian Kleinerman:

  • I don’t know how to estimate the percentage of overall consumption. But if you just look at current adoption, Java is trending quite well. We see migrations from Spark, from Hadoop, and from other workloads. And I can also share that for Python, right now we have way more customers requesting access to the preview than we can currently onboard. So interest is super high, and they generate a lot of consumption. Trends are very positive.

CUSTOMER GROWTH

Question: How should we think about the pace of total customer growth going forward? Is it beginning to stabilize? And are we getting to a point where maybe you’ve landed a large portion of the Fortune 500 and the focus begins to shift more from landing new customers and more towards expanding within existing?

Mike Scarpelli:

  • to be honest, we don’t focus on the absolute number of customers.
  • it’s more on the quality of customers.
  • Fortune 500 is not a great metric because it’s too U.S.-centric,
  • and we’re actually focused more on Global 2000 (G2K customers).
  • the Global 2000 excludes the public sector and large private enterprises.
  • really, going after quality large customers is what we’re doing.
  • fluctuation in the number of new customers we land in a quarter tends to come from small customers.
  • with these large customers, we don’t find an opportunity in the quarter and close it in the same quarter. These are one-, two-, sometimes three-year sales cycles to break into these large organizations, and those are the ones that become the $10 million-plus customers.

SECURITY PRODUCT

Question: how much demand are you seeing around customers wanting to build security data lakes or security analytics on your platform. And is there anything you guys may look to do to lean in more aggressively?

Frank Slootman:

  • going to make announcements on this topic later on this year.
  • it’s one of the best add-on selling motions that we have in large accounts.
  • Snowflake is an ideal platform for hosting that type of capability.
  • you will see us lean into that opportunity a lot more going forward.

DATA SHARING

Question: for the customers that have embraced data sharing, how long does it typically take them to get there? Can that happen quickly, or is it typically with customers that have been on the platform for a bit? And how much of an inflection in consumption does that typically drive?

Frank Slootman:

  • Most of the time, our customers have other priorities in terms of transitioning their databases and their workloads before they get on to data sharing if they weren’t doing that before. But it’s also quite possible that we have workloads that are driven by data sharing as a core premise.
  • We now have 18% of our customers with at least one stable edge as part of their platform, up from 13% last year. So the data cloud is really happening, and that is with a rapidly growing customer base underneath it.
  • there has NOT been an extraordinary inflection in consumption from data sharing per se. But data sharing is a core underlying capability of the overall workload footprint - really important in that sense, rather than separating data sharing out as a specific workload driver.

FCF

Question: your long-term free cash flow guide at Analyst Day was for 15% at $10 billion of product revenue, and it looks like you’re guiding to a 15% free cash flow margin for FY 2023. Give us a sense for what’s driving that big outperformance and how you’re able to achieve it so much sooner. Is it mainly the efficiencies you’ve discovered over the last year? Just help us understand that.

Mike Scarpelli:
A couple of things: we’re entering into larger customer relationships, and you can see customers’ consumption is picking up, which is resulting in contracts renewing early, and that drives free cash flow. I do fully expect, as I said on the last call, that we will be revisiting our longer-term free cash flow and operating margin guidance.
I do expect it will come up considerably, as I told you before. And I want to remind people: don’t be surprised when there’s a really big free cash flow number in Q1 because of how big our bookings were. But that 15% is for the full year, and there is seasonality, with Q1 and Q4 being the highest of the four quarters.

UNSTRUCTURED DATA

Question: any update on the unstructured data opportunity? Because I remember that was a big focus for this year.

Frank Slootman:

  • on unstructured data, the uptake has been quite strong, and we are seeing really interesting new opportunities where data models are looking for relationships between unstructured data types and other types of data, things that just weren’t possible before that are now enabled by the platform.
  • we’re driving this hard and we have tremendous expectations for unstructured data in general and the potential for data science, innovation and new data applications also in the context of the Streamlit acquisition. This is going to get very interesting for us.

NEW WORKLOAD TYPES

Question: a large number of enterprises are turning to Snowflake for supply chain, customer support, sales enablement, even machine learning workloads. I get that data warehouse migrations will be the bread-and-butter business, but how big of an opportunity do you see in expanding the Snowflake footprint into these departmental areas? And how fast are those workloads shifting to Snowflake?

Frank Slootman:

  • it’s important for everybody on the call to understand that we are super early innings in terms of the total opportunity.
  • many things that have never been done before. And it’s not like throwing a switch and all of a sudden everything is blinking green. (NOTE: ha ha, funny guy you are, Frank)
  • every day, customers try to do predictive things with data that they’ve never done before; the challenge they have is with skill sets
  • there’s very normal, natural friction in the evolution of that, as we’re all trying to learn how to do these things.
60 Likes