SNOW Q1 2024 Earnings

I’m always interested in SNOW because outstanding leadership is #1 for me. Leadership with capability to burn will almost always find a way to bring the company along through difficult times. Pretty sure we have a consensus that Slootman is one of the best.

However, I thought I would pass on a cautionary comment from CNBC tech commentator John Forte, who relayed a bearish take from an unnamed analyst, which tells me it might be Forte’s own view. He said that he was skeptical that SNOW would be an AI winner. Rather, he said the emergence of AI, LLMs, etc. is all about obtaining data from Internet sources, and the need for SNOW-type services is at risk of withering.

Forte did not seem to have the time to elaborate and I have little competency to assess the assertion, which is why I disclose it here. Perhaps the tech pros here will think it worth checking out.

Forte is a credible technology guy, but as with all tech revolutions, where this is going is likely full of surprises even to the most astute.

10 Likes

Data for machine learning needs to be stored somewhere. Also, the raw data goes through several processing steps before it can be consumed by the model. Data that has been transformed into these intermediate structures must also be stored somewhere.

For example, OpenAI is collaborating with Microsoft and running their processes on Azure cloud (Microsoft and OpenAI extend partnership - The Official Microsoft Blog).
“As OpenAI’s exclusive cloud provider, Azure will power all OpenAI workloads across research, products and API services.”

So presumably all data captured by OpenAI is stored in Azure. Within Azure, there are many ways to store data, including Microsoft products as well as products from other providers such as Snowflake and MongoDB as just two examples.
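To make the point about intermediate structures concrete, here is a minimal sketch (table names and numbers are invented, and SQLite is used purely as a stand-in for whatever cloud store is actually involved):

```python
# Minimal sketch of an ML data pipeline where every stage is persisted.
# Table names and numbers are made up for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a cloud data store
conn.execute("CREATE TABLE raw_events (user_id INT, clicks INT, label INT)")
conn.execute("INSERT INTO raw_events VALUES (1, 12, 1), (2, 3, 0), (3, 7, 1)")

# Transformation step: raw data is cleaned/filtered into an intermediate
# table that also has to live somewhere before the model ever sees it.
conn.execute("""
    CREATE TABLE training_features AS
    SELECT user_id, clicks, label FROM raw_events WHERE clicks > 0
""")

rows = conn.execute("SELECT clicks, label FROM training_features").fetchall()
X = [[clicks] for clicks, _ in rows]  # features handed to the model
y = [label for _, label in rows]      # labels
print(f"{len(X)} training rows staged from intermediate storage")
```

Every arrow in a real pipeline like this has persistent storage behind it, whether that’s a warehouse like Snowflake or something else, so the data doesn’t go away just because a model consumes it.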

22 Likes

Hi everyone,
I’m confused and hoping for illumination: Snowflake claims an NRR of 151%, but they say their year-on-year revenue increase was only 50%. And they also claim that the number of customers increased significantly. Normally revenue comes from both new customers and old ones. How are they calculating NRR? Are they ignoring the customers that stopped using their product? If you’ve got an NRR of 151%, how can you have growth of less than 51% even if you didn’t capture a single new customer?
My thanks to the brain trust,
Larry

6 Likes

Hey chemfool2,

That’s because it’s a backward-looking metric, trailing two years. The current NRR is likely already lower than 151% and might even be as low as 130%.

Once businesses begin to consume more again, you can expect the opposite effect: NRR will take some time to recover and may appear worse than it actually is for a while.

From Snowflake’s investor presentation:

Net Revenue Retention Rate. To calculate net revenue retention rate, we first specify a measurement period consisting of the trailing two years from our current period end. Next, we define as our measurement cohort the population of customers under capacity contracts that used our platform at any point in the first month of the first year of the measurement period. The cohorts used to calculate net revenue retention rate include end-customers under a reseller arrangement. We then calculate our net revenue retention rate as the quotient obtained by dividing our product revenue from this cohort in the second year of the measurement period by our product revenue from this cohort in the first year of the measurement period. Any customer in the cohort that did not use our platform in the second year remains in the calculation and contributes zero product revenue in the second year. Our net revenue retention rate is subject to adjustments for acquisitions, consolidations, spin-offs, and other market activity, and we present our net revenue retention rate for historical periods reflecting these adjustments. Since we will continue to attribute the historical product revenue to the consolidated contract, consolidation of capacity contracts within a customer’s organization typically will not impact our net revenue retention rate unless one of those customers was not a customer at any point in the first month of the first year of the measurement period.
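To make that definition concrete, here’s a toy illustration (my own made-up numbers, not Snowflake’s) of how the trailing two-year cohort math works and why the result can sit well above the most recent year’s growth:

```python
# Toy illustration of the cohort-based NRR definition quoted above.
# Numbers are invented; "cohort" = customers active in the first month
# of year 1 of the trailing two-year measurement window.
year1_revenue = {"cust_a": 100, "cust_b": 50, "cust_c": 30}  # cohort, year 1
year2_revenue = {"cust_a": 180, "cust_b": 90, "cust_c": 0}   # cust_c churned, stays in at $0

nrr = sum(year2_revenue.get(c, 0) for c in year1_revenue) / sum(year1_revenue.values())
print(f"Net revenue retention: {nrr:.0%}")  # 150% in this toy example

# Because the window trails by up to two years, the expansion it captures
# already happened; the most recent quarter's YoY growth can be much lower
# while NRR is still working its way down.
```

In other words, a 151% NRR doesn’t guarantee 51%+ growth right now; it only tells you how the older cohort’s spend evolved over the trailing window.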

Hope it helps.

mooo

34 Likes

The best differentiation is selling products with higher value and a faster ROI. The fact that they’re not sacrificing future business by giving away value now, especially during a challenging macro environment, differentiates them from their competitors that have to play games with pricing in order to get incremental business.

Slootman is supposed to be a great sales leader. Great sales leaders know how to lock up customers without giving away value, while still helping them in times of economic distress.

This is what Snowflake is doing.

When economic times change and on-hold projects get approved and optimizations run their course, we’ll most likely see a significant improvement in Snowflake’s growth.

I don’t know how long this will take. But I’m willing to wait a little while for this to happen.

DJ

21 Likes

Hi everyone,

And thank you for the great insights!

I keep learning from all of you every day and I’m very thankful for that.

I’ve spent more time thinking about SNOW and also read and heard a lot of great input from you and other fellow investors.

I organized all my thoughts again into a better-to-read and slightly adjusted write-up.

There aren’t really any unique “new insights” in it, so please don’t expect too much; I just want to share it with you for the sake of completeness.

Here we go.


Snowflake’s Q1’24 earnings report brought us a mixed bag of results. Customers are scrutinizing their Snowflake bills wherever they can. On top of that, we got yet another reduced full-year outlook. However, it wasn’t all doom and gloom.

Let’s dive in and unpack Snowflake’s financial performance.

Revenue & Guidance

Snowflake’s product revenue for Q1 was a better-than-expected $590.1M, up 50% YoY and 6.3% sequentially, beating guidance by 3%. The sequential deceleration in product revenue seems to be settling:

  • Q1’23: 9.7%
  • Q2’23: 18.2%
  • Q3’23: 12.1%
  • Q4’23: 6.2%
  • Q1’24 (this Q): 6.3%

Looking ahead to Q2, guidance projects $625M in product revenue, indicating a 5.9% QoQ increase. A realistic beat of 0.5% or more would lead to sequential acceleration. I expect around $640M.
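As a quick sanity check on that beat math (back-of-the-envelope, using only the figures above):

```python
# Rough check of the sequential-acceleration math (figures from the write-up).
q1_product_rev = 590.1   # $M, Q1 actual product revenue
q2_guide = 625.0         # $M, Q2 product revenue guidance

guided_qoq = q2_guide / q1_product_rev - 1
print(f"Guided QoQ growth: {guided_qoq:.1%}")  # ~5.9%

# To accelerate past Q1's ~6.3% QoQ growth, Q2 needs to land above:
accel_threshold = q1_product_rev * 1.063
beat_needed = accel_threshold / q2_guide - 1
print(f"Acceleration needs ~${accel_threshold:.0f}M, "
      f"a beat of ~{beat_needed:.1%} over guidance")  # ~$627M, ~0.4%
```

So a beat of roughly half a percent over guidance would indeed tip Q2 into sequential acceleration, and my ~$640M expectation would imply roughly an 8.5% QoQ increase.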

While Q1 and Q2 numbers look fine for a consumption-based business amidst the current environment, it was the reduced full-year guidance that disappointed investors. For the second time!

  • Snowflake preliminarily guided for 47% YoY growth in fiscal 2024.
  • In Q4’23, they revised it down to 40% YoY,
  • only to reduce it again to 34% YoY this Q.

Needless to say, the market was disappointed and reacted with an 11% aftermarket sell-off. At least Snowflake reiterated its goal to roughly quadruple its annual revenue to $10B by fiscal 2029 (i.e., calendar 2028).

Customer Activity

Net Retention Rate declined further from 158% to 151%, as large customers optimize their usage and purchase credits only based on short-term needs.

Customer Growth:

  • Snowflake added 43 large customers (> $1M in trailing 12-month revenue), matching the Q4 2023 record. Impressive!
  • Total new customer count of 317 is the lowest in three years. While Snowflake prioritizes large customers, it’s important to note that most start small and increase spending over time. Thus, the weak showing in total new customers is a negative.
  • New Forbes Global 2000 customers are comparatively low, but it’s a lumpy metric due to multi-year sales cycles, so I don’t read too much into it. @mooo recently covered Snowflake’s Forbes 2000 numbers in his portfolio summary: https://discussion.fool.com/t/moritz-portfolio-summary-march-2023/90883#snowflake-10.

Remaining performance obligations (RPO), an indicator for future customer commitments, was $3.4B, up 31% YoY, but a significant -7.1% sequential drop. Moreover, current RPO growth turned negative QoQ for the first time, signaling limited short-term outperformance.

It’s worth noting that Q1 RPO is typically low, but this quarter’s level is even lower than usual.




Snowflake’s FCF. Source: Q1’24 Investor Presentation.

Cash & Profitability

Snowflake generated a record non-GAAP adjusted free cash flow of $287M, up 58% YoY, representing a record FCF margin of 46%.

Non-GAAP Product Gross Margin increased by 2pps to 77%, powered by “favorable pricing with our cloud service providers, product improvements, scale in our public cloud data centers, and continued growth in large customer accounts”.

Op Expenses and Operating Margins are worth watching. Operating Expenses (GAAP & non-GAAP), R&D, and S&M expenses went up in total amounts, but look healthy as a percentage of revenue.

Non-GAAP Operating Margin was 5%, lower than the 6% that Snowflake guided for. Still, it was relatively strong for a seasonally weak Q1, “benefiting from revenue outperformance and savings on sales and marketing spend.”

So much for the “raw numbers”. Let’s dive into more qualitative commentary to get more pieces to this puzzling report.

Snowflake on Artificial Intelligence

In the current AI gold rush, we don’t get through the first analyst question without extensive AI commentary. And that’s good. Amidst the hype, it’s crucial to recognize that AI advancements are real and will reshape the world we live in.

I’ve written about AI generically in the past on my personal blog.

In Q1, 1,500+ customers used Snowflake for Data Science, Machine Learning, and AI use cases. Up 91% YoY.

This strong increase, nearly 2X (!) Snowflake’s YoY revenue growth, shows traction.

One core success factor of AI is data. Oversimplified, we can say that AI is a model:

  • ingesting input (data),
  • doing something to it (AI),
  • then providing output (data).

If you enter garbage, you’ll get garbage (“Garbage In Garbage Out”). This is why data quality matters. Unlike public data, Snowflake data is controllable, governed, and secure.

And this is a great reason to use Snowflake (data) to feed AI models. Slootman, CEO of Snowflake, called out a large US financial institution using Snowflake data to train its AI model.

Moreover, data is more useful in the correct format. Recently, Snowflake acquired Applica. Its language model enables users to turn unstructured data like documents, contracts, etc. into structured properties and reference them for analytics, data science, and AI.

Finally, Snowflake acquired Neeva, an AI-based search engine. It chose Neeva for its team of senior, AI-savvy engineers and its AI-based search technology. The acquihire will “enable Snowflake users and application developers to build rich, search-enabled and conversational experiences.”

The north star is that Neeva’s tech can be integrated and extended to enable “asking really hard analytical questions that take people weeks and weeks or even months to figure out” and answering them in seconds.


→ Snowflake will discuss Generative AI in detail at its Snowflake summit.


Products & Usage



Snowflake’s Data Cloud Metrics shared in Q1’24 Investor Presentation.

Data Sharing Stable Edges

This Cloud Metric tracks customers with stable edges, one stable edge representing an active data sharing relationship between two businesses.

Active data sharing drives Snowflake consumption and network effects, as well as value creation for its customers.

Customers with stable edges grew 61% YoY, and their share of all customers rose from 23% to 25% this quarter. Sequentially, new stable edges were slightly lighter but still healthy:

  • Q1 2024: 13.4% QoQ
  • Q1 2023: 18.4% QoQ
  • Q1 2022: 26.4% QoQ

Powered by Snowflake Program

It enables customers to build apps on Snowflake, reducing infrastructure overhead and time to market while increasing Snowflake consumption.

The numbers are strong: customer adoption is up 128% YoY and accelerating sequentially, up 18.1%. Despite limited historical data, the program shows a stable trajectory and a record total customer adoption.

Data Marketplace Listings

This was my lowlight of the Data Cloud Metrics, growing only 3% QoQ. A sharp deceleration! It was not addressed on the conference call. I’m puzzled: How does this fit the AI “powered by data” narrative?!

Subscribing to Snowflake’s data marketplace grants customers access to 3rd-party data, enhancing their own data, insights, and decision-making.

On top of that, it boosts Snowflake’s “data gravity” and ecosystem while maximizing data value for customers.

This is the definition of Marketplace Listings:

Each live dataset, package of datasets, or data service published by a data provider as a single product offering on Snowflake Marketplace is counted as a unique listing.

It’s not actually data usage that is decelerating, but rather the number of new datasets listed in the marketplace by data providers. I can only speculate about the reason behind the deceleration, but @CMF_muji had some additional insight that this is likely connected to all sorts of new marketplaces popping up, e.g. from the hyperscalers.

However, understanding the definition of this metric better makes me less freaked out by the sharp deceleration in terms of “product adoption”.

Cloud Cost Optimization

Snowflake is dealing with multiple optimization angles at once. On a positive note: some of these help its customers do more with less, which is a customer-centric approach and should pay off in the long run.

Currently, Snowflake calls out three optimization patterns:

  1. Optimizations by cloud vendors: better hardware results in more efficient performance for customers.
  2. Snowflake’s regular optimizations: these improve performance and make things cheaper for customers. Optimizations 1 and 2 create a 5% annual headwind to revenue.
  3. Changes in storage retention policies: Snowflake’s largest customers are reducing how long they retain data, pressuring storage revenue. Additionally, queries run quicker: another headwind for revenue. A similar retention-policy optimization was seen at one of the hyperscalers, presumably AWS. However, this lower data retention comes with a cost for customers: it makes queries on datasets that are now distributed across multiple sources more challenging. Going forward, I expect one or both of these scenarios:
      • Retention policies revert to previous levels, and consumption and storage tick back up, or
      • Data retention remains at a reduced level. In this case, the optimization would likely be a one-time thing, since customers cannot indefinitely cut storage time.

Besides these optimizations by existing customers, Snowflake also mentioned that the “[…] CFO being in the business […] artificially constraining the demand because of the general anxiety that exists in the economy. So that really needs to start lifting. And that will happen.”

The picture that emerges: Snowflake is at the epicenter of cost optimizations from all sides, this quarter especially from older customer cohorts with high Snowflake bills.

Couple that with uncertainty and lack of visibility by Snowflake’s management, et voilà: We’ve got a revised full-year outlook.

Here are some exemplary quotes about uncertainty, optimization, and lack of visibility:

“This may well continue near term, but cycles like this eventually run their course. Our conviction in the long-term opportunity remains unchanged.”

“It is challenging to identify a single cause of the consumption slowdown between Easter and today.”

“This, in the past, is we literally saw four weeks in April where there was no week-over-week growth per se or not material.”

“[…] customers remain hesitant to sign large multi-year deals.”

“[…] and that we don’t have real strong visibility in terms of, "Okay, when is it all going to be different?”

Sounds dark.

Fortunately, we can also report positives:

The number of queries is up 57% YoY, outpacing revenue. However, queries are running more efficiently, so upside might be muted by the efficiency gain.

Another positive:

In just one quarter, Snowpark went from 20% to 30% “of our customers using it on at least a weekly basis. We think that’s going to go to 100%.” Awesome! Snowflake also highlighted the competitive advantage of Snowpark, not just from a technical perspective, but also from a price perspective.



Snowflake’s Manufacturing Cloud. Source: snowflake.com

Finally, Snowflake launched another industry vertical of its Data Cloud, the Manufacturing Cloud: “supply chain management will be the most networked segment of all industries that we’re operating in.” Exciting!

Long-term thesis intact. Short-term: cloudy

I was reminded that AWS is the closest proxy for Snowflake’s performance, and AWS also issued an uninspiring full-year outlook. Guiding down two quarters in a row shows that Snowflake has little to no visibility.

I also disliked low total customer growth and decelerating RPO.

Healthy data sharing metrics, record Free Cash Flow and strong large customer growth are bright spots amidst cloud cost optimization.

Short-term, the upside for Snowflake seems limited. Its rich valuation is not supported by its fundamentals, even though that could change quickly with a consumption-based business like Snowflake. I just wouldn’t count on it for the remainder of this year.

What if we look beyond 2023?

Long-term, Snowflake is a promising company in the early innings of its potential, powered by an in-demand product, strong network effects, and multifold tailwinds:

  • AI & Data Gravity
  • Cloud transformation
  • Platforming undigitized industry verticals, like supply chain

My Investing Decision

The lack of short-term upside, uncertainty, and lack of visibility reduce my current conviction in Snowflake. For now, I’ll opportunistically reduce my allocation to a low-teens position.

Long-term, I’m optimistic, and I think the market is as well: despite the after-hours sell-off, Snowflake is up 19% since its report. Partly, this is also due to the general NASDAQ rally.

The main questions investors need to ask themselves are:

  • Investing short-term or long-term?
  • It’s not just buy or sell: how should the allocation be sized?

Ultimately, that’s a very personal decision to make.


I hope this is helpful and I wish all fellow investors good luck (and returns) :slight_smile:

Cheers: LisaOnCloud9

99 Likes

If I were SNOW, I would partner with Salesforce, as they just announced an architecture to apply LLMs/GPT against enterprise data while fully respecting privacy. That is SNOW’s opportunity: allow customers to apply the new Generative AI and LLMs to their own data without leaving their realm. That should be a massive upside for SNOW, and for Salesforce.

13 Likes

I think you’re touching on a very interesting topic: bringing the compute to the data, which NVIDIA and Snowflake just announced they’re going to do. Snowflake has the data (“Data Factory”, as Huang called them), as well as the governance and infrastructure to properly secure and process the data, and NVIDIA will add the computation part by integrating NeMo.

From what I saw, Salesforce and Snowflake already have a zero-copy partnership, but I think the use cases will become ever more exciting, as you suggest, with emerging LLM use cases helping regular, non-techie users query them (e.g. customer data from CRM) in natural language.

For me, this is the most exciting part about LLMs as of today and MongoDB said it best:

“Generative AI is creating a once-in-a-generation shift in how end-users interact with applications.”

24 Likes

For some ideas, you could search online for Salesforce AI Cloud or search on YouTube; there are several visionary demos already.

If you would like to see the investor day materials Snowflake just gave - both the presentation slides and the 2.5 hour recording are here:

https://investors.snowflake.com/events-and-presentations/default.aspx

If you want to see a replay of the keynote sessions that Lisa was referring to then you can watch here…

The investor day recording is a must-view for any holders, IMHO.
Ant

16 Likes