HDP - Guidance

HDP is not necessarily the type of company I generally invest in. To get a sense of why that is, just take a gander at the Operating Margins of -110% and -59% for the quarter (GAAP & Non-GAAP). In fact, those numbers would generally drive me directly away from an investment. I’ll admit to being a bit enamored with Hochfeld’s article on the company, and admit to not doing the typical amount of research I generally would do prior to entering an investment.

I’m questioning my investment primarily based on the company’s guidance for full-year 2017 revenues between $235M and $240M. At the top end that represents 30% growth over 2016, in a market supposedly expected to grow 50%+ per year for the foreseeable future. I can understand giving conservative guidance, but I really have a problem with guidance so far below overall market growth.

Now, Hochfeld’s article was more focused on the company achieving Adj. EBITDA break-even, which they did this quarter. And if the company continues on this path, they avoid another round of dilution. This achievement was likely the reason the stock ended up on Friday after Thursday’s earnings. However, it ended the day up only about 5% after being up 15%+ intraday (it’s a very small-cap stock).

The revenue guidance is concerning. I checked to see if the company guided overly conservative in 2016. They did not. Revenues came in about as expected back in Feb '16.

Finally, they continue to increase deferred revenue which I haven’t taken into account simply because I don’t have a true understanding of the impact of deferred revenues.

I would certainly like to hear from others here.

Take care,
A.J.

Hi AJ, I have some ambivalence as well with HDP, but here are some of the things I look at:

For the Year:

Total revenue was $184.5 million, up 51%. There aren’t many companies growing revenue at 51%.

Operating billings (the aggregate value of all invoices sent to their customers) were $270 million, up 63% from $166 million.

Adj gross profit was $118 million, up from $69.5 million. That’s up 70%.

Adj gross margin was 64% up from 57% a year ago.

Adj EBITDA reached break even last quarter.

Deferred revenue was $185.4 million, up 18% sequentially and up 74% from a year ago. I assume that’s money already taken in on annual contracts that they can only book to revenue monthly.
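As a rough illustration of how that works (a hypothetical sketch with made-up contract numbers, not HDP’s actual accounting): an annual contract invoiced up front sits in deferred revenue and gets recognized ratably, and billings can be approximated as revenue plus the change in deferred revenue.

```python
# Hypothetical illustration of deferred revenue mechanics.
# Contract numbers are made up; HDP's actual revenue recognition is more complex.

def recognize_annual_contract(invoice_total, months_elapsed):
    """Straight-line recognition: an up-front annual invoice is booked
    as deferred revenue and recognized evenly over 12 months."""
    monthly = invoice_total / 12
    recognized = monthly * months_elapsed
    deferred = invoice_total - recognized
    return recognized, deferred

# A hypothetical $120k annual support contract, 3 months in:
recognized, deferred = recognize_annual_contract(120_000, 3)
print(recognized, deferred)  # 30000.0 90000.0

# Billings ~ revenue + change in deferred revenue (a common approximation):
revenue = 184.5                  # $M, FY2016 revenue from the post
deferred_now = 185.4             # $M, year-end deferred revenue
deferred_prior = 185.4 / 1.74    # $M, implied by "up 74% from a year ago"
approx_billings = revenue + (deferred_now - deferred_prior)
print(round(approx_billings, 1))  # ~263, in the ballpark of the $270M reported
```

The approximation doesn’t land exactly on $270M because billings also reflect other balance-sheet movements, but it shows why booked deferred revenue matters when judging the top line.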

Cash totaled $89.2 million

Dollar-based Net Expansion Rate was 131%, averaged over four years.


In the conference call they said the following:

We continue to target operating cash flow breakeven sometime between Q3 and Q4 of this year.
Since going public, our operating model has evolved. At that time we offered a single Hadoop-based solution, deployed on-premise and primarily linked to an annual support contract. For customers that required a fully managed service, it was HDInsight on Microsoft Azure.

Now let’s fast forward to today where we offer a comprehensive connected data platform. Customer use cases are varied and deployment options range from on-premise to hybrid environments and may also span many public clouds.

Customer consumption behavior has evolved as well, requiring flexibility across our respective business relationships with them. This evolution has in fact shaped many of the larger deals that we executed in 2016, including several of the nine deals over $1 million that we did in Q4 2016. We’re now operating at a scale, with a $200 million a year revenue run rate, where these broader deployment and consumption trends impact traditional metrics.

In particular, billings becomes less relevant when upsells come with renewals, durations change, and usage models in the cloud become more prominent. In some cases an isolated billings number may not adequately illustrate the economics of a particular transaction.

So we provide the following guidance. For the first quarter we expect total revenue of $52 million and adj gross profit margin between negative 65% and negative 60%. On a full-year basis, revenue between $235 million and $240 million, GAAP operating margin between negative 85% and negative 80%, and adj operating margin between negative 50% and negative 45%.

Some of that I don’t understand, I’ll admit, but they sound confident.


In December I had the following conversation with Bert:

I asked: What am I missing??? I see that in each of the past three quarters their loss was greater than the year before. And, we’re not talking about a Shopify-like symbolic loss of 1% or 2% of revenue. For Horton the loss is of the same order of magnitude as revenue. That’s massive! And Horton isn’t growing nearly as fast as Shopify. How do you imagine Horton will reach break even within a year? Even with deferred revenue? And not need a cash raise? That sounds a bit overly optimistic to me. I really don’t understand how it will be possible.

Bert responded: So far this year, their bookings have grown by 122%, 49% and then 66%. That is a pretty strong pattern of growth. Not many companies are growing quite that fast. It’s hoping to reach EBITDA profitability this current quarter and has laid out that path pretty specifically. It forecasts reaching positive cash flow generation in the middle of next year. It sells large deals to large companies, and it books far more than it reports as revenues, so it builds up a large balance of deferred revenues.

Even though Q2 was disappointing, and led to some restructuring, 49% growth is not all that terrible. I focus on bookings growth, EBITDA, and cash, as opposed to reported earnings and revenues, which poorly represent their progress. The shares are priced for a miss on the EBITDA projection. Reaching EBITDA break-even is predicated on another quarter of significant bookings growth. I think the math requires another $15 million. That is what I hope will happen. Hadoop is amongst the fastest growing spaces in software and this is one of 3 companies that sell the tools necessary to install Hadoop at scale.


There were also a couple of very useful posts by a Hadoop user in January:

I can make a few comments on Hadoop. I’m a Hadoop user; I routinely access data on a Hadoop cluster for statistical modelling. To put it simply, Hadoop is a data dumping ground in a distributed computing system. It’s cheap, it’s great for storing data, and it allows you to run applications on clusters of commodity hardware.

What is Hadoop good for? It’s good for “big data” and “machine learning.” Because Hadoop is a cheap data dumping ground and puts no constraints on the data format, you can store all kinds of data you want. In a world where businesses are collecting endless amounts of data about customers and goods, Hadoop serves as an easy solution. Better still, many open source software packages can be used when working on Hadoop clusters (and most of the ecosystem is open source).
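For readers who haven’t seen the programming model behind Hadoop: its MapReduce core splits a job into a map step that emits key/value pairs and a reduce step that aggregates them by key. Here’s a toy single-process sketch in plain Python (illustrative only, not actual Hadoop code; Hadoop distributes these phases across a cluster):

```python
from collections import defaultdict

# Toy single-process sketch of the MapReduce model that Hadoop
# runs at scale across many machines. Not actual Hadoop code.

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the counts for each key.
    (In real Hadoop, a 'shuffle' step groups pairs by key first.)"""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big clusters", "data everywhere"]
print(reduce_phase(map_phase(lines)))
# {'big': 2, 'data': 2, 'clusters': 1, 'everywhere': 1}
```

The value Hadoop adds is running exactly this pattern over terabytes of data spread across commodity machines, with fault tolerance; the vendors discussed here sell the tooling and support around that.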

One commenter earlier worried that HDP may be seen as just another IT cost with dubious benefit. Yes and no. It has a cost, and the benefit depends on what you can do with your data. Businesses that live and die by real-time data, like Netflix and Amazon, can realize lots of benefits by going with Hadoop.

I am just an average employee who uses Hadoop. What I have just said above is just a simple and overly generalized picture of Hadoop based on my understanding. I am basically just telling you what IT folks have told me when our company decided to go with Hadoop.

Finally, it is true that you don’t need HDP when you go with Hadoop. But people have told me that HDP provides better support, and is easier to work with, than others such as, say, IBM. We actually just dumped IBM for Hortonworks, so there is that.

-------------
Hadoop is an open source project. A worldwide pool of enthusiasts contributes code to it. However, only select organizations and people are allowed to write to the open source code, and they are called committers. The below is a link to who those folks are…. What is important is how many folks from Hortonworks are there. Out of the 112, Hortonworks has 34. That’s a pretty dominant position. This gives Hortonworks great influence, from setting the agenda, to deciding what goes into the final version, to all sorts of mundane things. It also implicitly means that HDP will always have a great pool of Hadoop knowledge and resources.


Jan 2017 – My Conclusions (Saul) – After reviewing the whole thing, my conclusions are the same: Stay with Hortonworks. They will reach EBITDA break-even this quarter (Dec). Moving towards cash-flow break-even in 2017. Remember they have a lot of deferred revenue that doesn’t figure into revenue and earnings!

I hope that helps. It’s certainly not a sure thing, but it seems to be coming off a bottom and is 62% below its top a year or so ago.

Saul

For Knowledgebase for this board,
please go to Post #17774, 17775 and 17776.
We had to post it in three parts this time.

A link to the Knowledgebase is also at the top of the Announcements column
on the right side of every page on this board

13 Likes

Incredible analysis, Saul. I’m continually in awe of your focus and clarity of thought. I had all the information I needed to put all this together, but until you did I had completely failed to see it this way. However, I will venture to add something on valuation.

As you said, this company has recognized $184M of revenue in the last year. At a market cap of $650M, that’s a PS ratio of roughly 3.5 – ABSURDLY low for a company growing this fast. However, that’s only half of it, because as you pointed out, actual billings this year were $270M. Comparing the market cap to billings yields a price-to-billings ratio of an even MORE ABSURDLY LOW 2.4.
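Those ratios check out against the post’s own figures (using the roughly $650M market cap cited above, not a live quote):

```python
# Sanity check of the valuation ratios cited above, using the
# post's own figures (all in $ millions; market cap is approximate).
market_cap = 650.0
trailing_revenue = 184.5
trailing_billings = 270.0

ps_ratio = market_cap / trailing_revenue     # price-to-sales
pb_ratio = market_cap / trailing_billings    # price-to-billings

print(round(ps_ratio, 1))  # 3.5
print(round(pb_ratio, 1))  # 2.4
```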

I just feel the need to point out that this is the same world where SPLK and PAYC, possibly growing slower than HDP, have PS ratios over 9. Maybe another way to say what I’m trying to say is that HDP shares would have to almost triple to have a PS over 9.

Then there’s the profitability issue but the fact that they came through on their promise of EBITDA break-even is a strong signal in the right direction.

Wall Street is usually pretty smart, but this company does kind of defy traditional valuation. I think it’s severely undervalued and I’m loading up. I’ve also taken a small position in TLND, another company that works with HDP and which Bert has written about.

Bear

3 Likes

Hadoop is amongst the fastest growing spaces in software and this is one of 3 companies that sell the tools necessary to install Hadoop at scale.

Really - only 3? How does one draw the boundaries to get the number this low?

Overall, Hadoop is not overly complex. It is very simple but very good at what it does. I believe the 3 mentioned refer to Hortonworks (HDP), Cloudera, and MapR. Of course there are a couple of other distributions, but probably not as widely adopted or as well supported as the above 3.

http://www.networkworld.com/article/3024812/big-data-busines…

http://hadoopilluminated.com/hadoop_illuminated/Distribution…

2 Likes

So, Amazon Web Services, Microsoft Azure, IBM, and Pivotal don’t count?

Ignoring the cloud-based alternatives is just silly in this day and age, don’t you think?

1 Like

So, Amazon Web Services, Microsoft Azure, IBM, and Pivotal don’t count?

Ignoring the cloud-based alternatives is just silly in this day and age, don’t you think?

Not sure what you’re getting at. I was just trying to clarify a statement that wasn’t made by me.

Also, Hortonworks runs on AWS and Microsoft Azure, so their distribution is not necessarily just “on-premise”. I cite this just as an example, to be clear that even though AWS has its own Hadoop offering (EMR), there’s no reason why you can’t run Hortonworks on AWS or Microsoft Azure, which is essentially a cloud implementation of their distribution.

http://hortonworks.com/products/cloud/

2 Likes

Not sure what you’re getting at.

Just that there are more than 3 players in the Hadoop space. The article you linked ignored 4 others - two of them because they weren’t on premise.

And while you can run Hortonworks on AWS, you can also run MapR’s distribution. Perhaps even easier is to run Amazon’s own distribution.

Hi Smorgasbord1,

You can deploy Hortonworks’ Hadoop service within Azure and AWS. As a matter of fact, our company dumped IBM and decided to go with Azure and HDP. Azure and AWS are not direct competitors to HDP, as HDP can and does work with both cloud-based services.

All the best,
FG

3 Likes

So, Amazon Web Services, Microsoft Azure, IBM, and Pivotal don’t count?

AWS and Azure don’t count. IBM is a competitor for HDP and I don’t know about Pivotal.

Of course AWS counts:
https://aws.amazon.com/emr/details/hadoop/

https://aws.amazon.com/emr/pricing/

With EMR you don’t need a third party provider like Hortonworks.

http://blogs.perficient.com/integrate/2016/05/19/two-choices…

1 Like

With EMR you don’t need a third party provider like Hortonworks.

With Hortonworks it is pure-play open source. You can move at will, while with EMR you are basically locked in with Amazon, at least that’s my understanding. Also, EMR is a few versions behind Hadoop, and in an emerging field like Hadoop that is a severe limitation.

2 Likes

With Hortonworks it is pure-play open source.

I don’t think that’s the primary driver for most companies.

Also, EMR is a few versions behind Hadoop, and in an emerging field like Hadoop that is a severe limitation.

Depends on how cutting edge your application is. Most companies stay behind on purpose for stability reasons. Like how many people don’t update their phone software right away - they wait a couple weeks to see if other people report issues.

Here’s a recent article on Hortonworks on Amazon: http://www.zdnet.com/article/hortonworks-comes-to-the-amazon…

Back to AWS: the obvious question is why use HDCloud instead of defaulting to EMR? Hortonworks is differentiating by optimizing for Hive and Spark workloads, leveraging a feature borrowed from Ambari that streamlines configuring compute nodes. Hortonworks is also promoting its ability to provide more granular security for Hive at the row and column level.

EMR has long had the edge with its own proprietary data access optimizations. HDCloud is leveraging recent enhancements that came with Apache Hadoop 2.7 to get into the same ballpark as EMR’s performance against S3.

At any rate, I stand by my previous post that not counting AWS is wrong.

I don’t think that’s the primary driver for most companies.

Actually it is. I see it all the time. Remember, Big Data is a very nascent field, and corporate buyers worry about things like the staying power of providers, their ability to quickly adopt the leading edge, etc.

Often you see big data and analytics going together, employed on true differentiation initiatives like market research, customer research, product research, etc. That’s why companies go with Cloudera or Hortonworks versus the IBMs of the world. They are willing to go with small niche players rather than trusted big names. In such scenarios, things like staying power and being on the leading edge all matter.

Most companies stay behind on purpose for stability reasons.

That may be true for an ERP system, or for systems that have been in place a long time, where the bias is toward stability and there aren’t a whole lot of reasons to be on the leading edge.

At any rate, I stand by my previous post that not counting AWS is wrong

Feel free. AWS can close the gap pretty quickly too. For now, AWS offers Hortonworks. The nature of the ecosystem is such that they will cooperate and compete. I need to check the Gartner Magic Quadrant to see where AWS is on this.

2 Likes

I need to check the Gartner Magic Quadrant to see where AWS is on this.

http://www.gartner.com/newsroom/id/3051717

You need a login to get more than the summary data, but that includes:

Only 26 percent of respondents claim to be either deploying, piloting or experimenting with Hadoop, while 11 percent plan to invest within 12 months and seven percent are planning investment in 24 months. Responses pointed to two interesting reasons for the lack of intent. First, several responded that Hadoop was simply not a priority. The second was that Hadoop was overkill for the problems the business faced, implying the opportunity costs of implementing Hadoop were too high relative to the expected benefit.

“With such large incidence of organizations with no plans or already on their Hadoop journey, future demand for Hadoop looks fairly anemic over at least the next 24 months. Moreover, the lack of near-term plans for Hadoop adoption suggest that, despite continuing enthusiasm for the big data phenomenon, demand for Hadoop specifically is not accelerating,” said Merv Adrian, research vice president at Gartner. “The best hope for revenue growth for providers would appear to be in moving to larger deployments within their existing customer base.”

1 Like

Only 26 percent of respondents claim to be either deploying, piloting or experimenting with Hadoop, while 11 percent plan to invest within 12 months and seven percent are planning investment in 24 months. Responses pointed to two interesting reasons for the lack of intent. First, several responded that Hadoop was simply not a priority. The second was that Hadoop was overkill for the problems the business faced, implying the opportunity costs of implementing Hadoop were too high relative to the expected benefit.

Those numbers seem meaningless without context…if last year 5% of respondents were planning to deploy Hadoop, 26% this year would be massive growth.

Bear

Only 26 percent of respondents claim to be either deploying, piloting or experimenting with Hadoop, while 11 percent plan to invest within 12 months and seven percent are planning investment in 24 months. Responses pointed to two interesting reasons for the lack of intent. First, several responded that Hadoop was simply not a priority. The second was that Hadoop was overkill for the problems the business faced, implying the opportunity costs of implementing Hadoop were too high relative to the expected benefit.

Could you find out who the respondents were? If the sample was representative of the overall economy, then I would say 26% is phenomenal, as big data is still a very small part of overall business activity.

It is also important to know who the respondents were. If they were CIOs, then I am sure Hadoop would not be at the top of their agenda, since only a small part of the business depends on such architecture, unless you live and die by big data like credit bureaus, Google, Amazon, etc. However, ask any leader who heads a data-driven group and I am sure the percentage who say they are converting to or already using Hadoop would be way above 26%, if not 100%. This is in no way indicative of Hortonworks’ prospects, but Hadoop is a niche. It is a niche for a specific type of field, and the field is a very specialized one that requires people to have certain requisite training, knowledge, education, and interest.

Just my 2 cents,
FG

1 Like

Only 26 percent of respondents claim to be either deploying, piloting or experimenting with Hadoop, while 11 percent plan to invest within 12 months and seven percent are planning investment in 24 months.

Hi Smorgasbord, Just curious. How do you reconcile that with HDP’s revenue being up 51% for the year, and billings being up 63%?

The report is old, and our internal (my employer’s) projections show that at least 65% of our big accounts are planning to deploy Big Data/Analytics/Digital.

AWS and Azure don’t count.

Sorry to keep harping on this, but it appears that Microsoft’s own Azure HDInsight offering (https://azure.microsoft.com/en-us/services/hdinsight/) was actually developed by Hortonworks together with Microsoft, and that Hortonworks gets a fee per usage hour from Microsoft.

If you read the earnings transcript, you would have seen this from CEO Rob Bearden: “We have an arrangement with Microsoft where, based on usage, we get paid per usage hour, and we can’t disclose that contractually.”

3 Likes