A first look at TomG's Appian

Appian is one of two companies (the other being Varonis) that TomG suggested might fit the investing style here. They IPOed in May 2017 at $12, peaked at $28, and now sit around $20 after a secondary offering at $20.25. Appian has created a platform that allows companies to develop apps with little or no programming. I’ve been hearing more and more about this “low code” app development lately, as it lets non-programmers build applications through a graphical interface without being experts. A friend of mine who is a computer engineer and has worked for various heavy tech companies (Facebook, Yahoo, etc.) has a love-hate relationship with these tools. Generally the output from these low-code efforts works, but it is not very customizable, so he has to go back and rewrite it to optimize and integrate other people’s efforts. My feeling is that low-code environments are great for non-tech businesses that want to get a customized app out there but don’t want or need the most efficient or best app possible.

I’m finding it difficult to get much in the way of financials because of the recent IPO, so much of this is taken from the 10-Q on their website, their most recent earnings call transcript from Seeking Alpha, and the following article: https://seekingalpha.com/article/4126347-appian-stuck-rut

Earnings reported 10/2. Revenue of $44.65M (+45.2% Y/Y) beat by $3.76M, and they increased guidance.

(60.2 million and 52.4 million basic and diluted shares outstanding for the third quarter of 2017 and the third quarter of 2016, respectively.)
Total revenue $44.6 million, up 45%
Subscription revenue $20.1 million, up 35% (cloud subscriptions)
Professional services revenue $22 million, up 4% (consulting on using their platform)
Subscription revenue retention rate was 122% (suggests customers are increasing use once they sign on)
Non-GAAP operating loss $4.9 million (they had guided to $9.1 million), down from $6.2 million
Non-GAAP gross margin was 63%, compared to 64% last year
Non-GAAP operating expenses were $33.2 million, an increase of 29% from $25.7 million in the year-ago period. Sales and marketing was 43% of revenue in the third quarter, compared with 47% of revenue in the prior year

Total deferred revenue was $71.8 million, up 32% year-over-year. (They specifically pointed out that their billing differs by customer, some quarterly, some annual, some monthly, so deferred revenue is not the most useful metric for them.)
(An increase in guidance.) For the full year 2017, subscription revenue is now expected to be in the range of $81.5 million to $81.7 million, representing year-over-year growth of 36%.
Total revenue between $167.6 million and $168.1 million. Non-GAAP loss from operations is now expected to be in the range of $23.6 million to $23.1 million, with a non-GAAP net loss per share of $0.39 to $0.38.
This assumes 57.1 million basic and diluted common shares outstanding.
For the fourth quarter of 2017, subscription revenue is expected to be in the range of $22.2 million to $22.4 million, representing year-over-year growth of 34% to 35%.
Total revenue is expected to be in the range of $41.4 million to $41.9 million (essentially flat from this quarter, with professional services decreasing and subscription revenue increasing).
Non-GAAP loss from operations is expected to be in the range of $9.7 million to $9.2 million, with a non-GAAP net loss per share of $0.16 to $0.15.
This assumes 60.5 million basic and diluted common shares outstanding.
I especially loved this quote from the earnings call:
“I feel there’s so much potential everywhere. The real emphasis I want to leave you with is, every one of these verticals is exceptionally fertile. These organizations, every medium to large organization in the world, needs to be a software company, needs to customize some of their processes and procedures and behaviors with software, and they’re just looking for a way to make it easy to build that and to change it. So I feel like we’ve got a winning position in every one of these verticals, the ones we are investing in and the ones we haven’t yet. But if I had to pick one that will be the next breakout, it will be healthcare.”

How often do you hear a CEO say their target market is “exceptionally fertile”? They sound excited about their prospects.

My comments:
A company like this actually has a pretty good moat, for two reasons. 1) As they penetrate an industry, they are able to build the software hooks that allow them to send and receive data with that industry’s specific set of software. The first customer in an industry that uses their platform has to do the most work, the second gets to build on top of the first’s work, and so on and so forth until it becomes relatively trivial for anyone in that industry to build. 2) I’d also think their product is relatively sticky, because software needs to be continuously updated, and once you go through the effort and expense to develop on a platform you are pretty committed to it.

I’m still learning about Appian, so I don’t know if they are the company that is going to reach (or has reached) the critical mass where they become the obvious choice. In their call they mentioned they have good integration with financials and government, and they think their next markets will be pharmaceuticals and healthcare. On a side note, MuleSoft provides an integration platform for Appian.

Appian has two different sources of revenue that I think are worth a look. One is subscription revenue, your standard cloud-based revenue (per-user monthly pricing); the other is professional services revenue, which is basically consulting with companies on how best to use the Appian platform. In a perfect world, if the low-code approach were fully mature, companies wouldn’t need to consult with Appian, but I’m not sure that is a realistic goal: no matter how good your platform is, you are ultimately taking a very complicated engineering process and trying to democratize it. As referenced above, they are growing their subscription revenue quickly while their professional services revenue is basically flat, although I would expect the latter to grow in a lumpy fashion, since companies will need more help up front.

So is now a good time to invest? Well, I think the price is being pressured lower by the secondary offering and the lockup expiring on Nov 21st. That pressure will be felt for a little while…how long? Who knows. The company has shown accelerating revenue growth (33% to 45%), although they are guiding for the mid-30% growth range. They are ahead of their goals on profitability, revenue is growing faster than expenses, and their EV to forward-twelve-month revenue multiple is roughly middle of the pack for these SaaS/PaaS companies. I got those numbers from https://seekingalpha.com/article/4126347-appian-stuck-rut . It seems like they have the ability to grow far into the horizon, but I do think their growth will be lumpy as they try to break into new industries: get the first few customers in an industry, and then the easier growth follows.

I’m curious to hear what you all think,



Essentially, Appian has a code generator tool. I have no experience with the tool and haven’t the foggiest about how it works, but I have some very dated experience in the field.

Quite a number of years ago I was the manager of a project that was chartered to find a code generator for a very large aerospace firm. The executive who launched the project observed that we spent a ton of time and effort developing specifications from which programmers created the code that delivered functionality to the end users. His thinking was that if we set forth disciplined specification standards, we should be able to automate the coding part. It was not an unreasonable thought. At the time I was the IT manager of Methods and Tools and had already done a study of specifications written by various groups (the IT department was pretty much organized by which functional organization it supported). The results of my study were that there was little commonality among the specs written by different groups. In fact, the only commonality was that they all contained user interface designs; logic, dataflow, execution schedules (not everything was real time in those days), test cases, etc. were all over the place.

We’re going back quite a few years. COBOL was the most frequently used language (with occasional bits of Fortran and assembler when COBOL just didn’t cut it). JCL was the job scheduler (an IBM shop). IMS and DB2 SQL were the DBMS choices, with Oracle SQL a close second (BTW, don’t for a minute think generic SQL is used anywhere). IMS-TP was the telecommunications protocol. On-line presentation was a fixed-screen, 24-line x 80-column, green-on-black display (orange-on-black monitors were introduced a little later).

I won’t belabor it. My team made a recommendation. It was ultimately rejected, the reason being that the generated code was virtually impossible to maintain over time. Debugging was a nightmare in that it was extremely difficult to map a piece of flawed code back to the exact spec that generated it. Testing presented the same problem: when a test failed, which piece of code was responsible, and which part of the spec created it? If the programmer changed the code rather than the spec, the linkage between the code generator and the actual code was broken. The DBA community flat out said they wouldn’t accept machine-generated SQL code, as it would require optimization anyway. These were all valid concerns.
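The traceability gap described here, mapping a piece of flawed generated code back to the spec that produced it, is concrete enough to sketch. Below is a toy Python illustration, entirely hypothetical and not how any generator of that era actually worked: each emitted line carries the ID of the spec clause that produced it, so a failing line can be blamed on a requirement.

```python
# Hypothetical sketch: one way to preserve spec-to-code traceability.
# Each generated line is tagged with the spec clause that produced it,
# so a failing line can be mapped back to its source requirement.

SPEC = {
    "REQ-101": "discount = 0.10 if total > 100 else 0.0",
    "REQ-102": "total = total * (1 - discount)",
}

def generate(spec):
    """Emit code lines annotated with the originating spec clause."""
    lines = []
    for req_id, stmt in spec.items():
        lines.append(f"{stmt}  # traceto: {req_id}")
    return lines

def blame(generated_lines, line_no):
    """Given a failing line number, recover the spec clause behind it."""
    marker = "# traceto: "
    line = generated_lines[line_no]
    return line.split(marker)[1] if marker in line else None

code = generate(SPEC)
print(blame(code, 1))  # REQ-102
```

Of course, the scheme collapses exactly where the post says it did: the moment a programmer edits the generated lines by hand, the annotations lie.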

None of that has really changed. We all use apps every day, and so we think the app is the user interface, because that’s what we see. But there’s a backend. There’s a DBMS involved. There’s a mountain of security issues (especially if the app supports financial transactions). There are communication protocols. So on and so forth.

I’m not saying it can’t be done successfully. To a rather large extent this is what SHOP does with their store builder. But, it’s not trivial.

That being said, I think there’s a market for the tool, but I think it’s also somewhat limited. If a company becomes successful, they are going to want to add unsupported customization to their apps. They will abandon the tool and hire some IT contract folks to get the job done. Or maybe even put on some dedicated IT staff.

The world of IT has changed a lot very rapidly. I am ready to admit I could be all wrong, but I’m not putting any money in this company. Not now anyway.


For the record, the customization problem can be addressed by a thorough commitment to Model Based Development. Done properly, one makes all changes in the model or in the templates which drive the generation and never touches the generated code (except maybe temporarily for debugging). MBD has been around for a while and there are certain domains where one simply can’t compete any other way, but it has been very slow to spread to other domains.
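The discipline described above, making every change in the model or the templates and never touching the generated code, can be shown in miniature. This is a toy Python sketch with made-up names, not a real MBD engine; the point is that the model plus the template fully determine the output, so regeneration is cheap and hand edits are never needed.

```python
# Toy model-based development flow: all edits happen in MODEL or
# CLASS_TEMPLATE; the generated source is a disposable artifact.

from string import Template

MODEL = {
    "entity": "Invoice",
    "fields": [("number", "int"), ("amount", "float")],
}

CLASS_TEMPLATE = Template(
    "class $name:\n"
    "    def __init__(self, $args):\n"
    "$body"
)

def generate(model):
    """Render the model through the template into Python source."""
    args = ", ".join(f"{f}: {t}" for f, t in model["fields"])
    body = "".join(f"        self.{f} = {f}\n" for f, _ in model["fields"])
    return CLASS_TEMPLATE.substitute(name=model["entity"], args=args, body=body)

source = generate(MODEL)
namespace = {}
exec(source, namespace)          # materialize the generated class
inv = namespace["Invoice"](42, 99.5)
print(inv.number, inv.amount)    # 42 99.5
```

Add a field to MODEL and regenerate, and the class updates; nobody ever edits `source` by hand, which is the whole point.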


For the record, the customization problem can be addressed by a thorough commitment to Model Based Development. Done properly…

True, MBD was not a practice when I performed my study. It took root among the folks developing complex real-time embedded product systems long before business system development. My recollection is that the first time models were introduced to the specification process was with the slow adoption of Structured Analysis as set forth by Yourdon/DeMarco. These models (primarily Data Flow Diagrams, DFDs) were not rigorous enough to generate code.

A little later, Entity Relationship Diagrams (ERDs) were introduced for database design. Oracle introduced a tool that generated database-definition SQL from ERDs. Some DBAs (that I knew) used it to generate a “first draft” from a 3NF (third normal form) model, but then proceeded to tweak the DDL SQL for optimization and to incorporate stored procedure calls, which could not be generated from the model.
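That “first draft” workflow is easy to illustrate. Here is a toy Python sketch (the real Oracle tool worked graphically from ERDs; the table and column names here are invented): the model yields baseline CREATE TABLE statements, which a DBA would then hand-tune.

```python
# Toy "first draft" DDL generation from a flattened entity model.
# Real ERD tools carried relationships, keys, and constraints; this
# just shows the model-to-SQL direction of the workflow.

ENTITIES = {
    "customer": [("id", "INTEGER", True), ("name", "VARCHAR(80)", False)],
}

def ddl(entities):
    """Emit one CREATE TABLE statement per entity."""
    stmts = []
    for table, cols in entities.items():
        col_defs = ",\n  ".join(
            f"{name} {sqltype}" + (" PRIMARY KEY" if pk else "")
            for name, sqltype, pk in cols
        )
        stmts.append(f"CREATE TABLE {table} (\n  {col_defs}\n);")
    return "\n".join(stmts)

print(ddl(ENTITIES))
```

The DBA’s tweaks (tablespaces, indexes, stored procedure hooks) would then live outside the model, which is exactly where the round-trip breaks down.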

There was a lot of competition among different modeling languages, techniques, and graphic representations for a while, but eventually, in the 90s, the Unified Modeling Language (UML) emerged as the de facto standard. I don’t know if this is still true or if it has been superseded by more recent techniques.

In any case (and in all cases when it comes to specifications), the devil is in the details. It’s the “done properly” part of your comment that’s the rub. At least in the business systems arena UML gained widespread adoption, but in many cases the rigor was lacking and the models were more like cartoons than disciplined design models.

In all honesty, I moved on from my focus on methods and tools in the 90s and retired in 2010. I’ve made no effort whatsoever to keep up with system development methods. Quite possibly MBD has advanced to the point of being able to generate robust code. Then you’re only stuck with the knowledge, skill, and discipline level of the practitioner. Most likely the appropriate knowledge, skills, and disciplines are not going to be found in some entrepreneur running a small business who wants to launch an app.


I don’t know if this is still true or if it has been superseded by more recent techniques.

UML has continued to evolve in an apparent attempt to become the Universal Modeling Language, which some of us think has weakened the language. To do MBD, one really only needs three diagrams plus Action Language, a tiny subset of the whole.

The progress of MBD has, in my opinion, been weakened by the fact that the dominant engines are highly proprietary. There are open source versions, but they are pretty skeletal.

If anyone is interested in the subject, I highly recommend the following:


I am also in the SW dev field. These tools tend to work for simplistic situations. I have no experience with Appian. Tom


The DBA community flat out said they wouldn’t accept machine generated SQL code as it would require optimization anyway. These were all valid concerns.

None of that has really changed

We manage over 5,000 databases, and typically we use about one DBA per 100 databases. Our DBAs don’t know what the application does, don’t care, and often don’t even know who the customer is. The paradigm shifted: the database is just a platform, keep it running well, and whatever runs on it, optimized or not, is someone else’s problem.

Hiring a DBA in the US, even an average one in the South, will easily cost $100K (fully loaded), and it is a lot cheaper to buy extra compute capacity, memory, faster disks, or just flash.

The same goes for coding. A very talented programmer obsessing over efficient code is relevant in some areas. For most tasks, just get the code out to meet the business need and don’t waste time on optimization. Optimization done at the tool level is better and can be leveraged more efficiently.

Same goes for security. Trusting each individual programmer to write his code with proper security is insane.


Maybe you’re correct. I retired in 2010 and have made zero effort to keep up with the rapidly changing pace of IT technology. At the time I retired, cloud based computing and specifically how a major corporation might participate in it was still a matter of debate, now not so much.

Application development was rapidly moving to COTS products for all mainline business applications. I saw several applications I had helped bring to life get retired in favor of purchased software (with the users often being forced to give up functionality). The cost-savings argument was too powerful. Only highly specialized applications that simply didn’t have a wide enough audience to make it profitable for vendors to offer a product were still being developed in house.

Once you move your storage and apps to the cloud, optimization is no longer your problem. It becomes a matter of the specified performance requirements that have been contractually agreed upon. How the performance is delivered is up to the cloud provider. As you noted, there are hardware solutions. Not every bit of performance needs to be wrung from the software. And the middleware (i.e., the DBMS, network s/w, etc.) can also provide performance improvements. When I retired our shop was in the process of making a major shift from one unnamed relational DBMS to a different one due to the price/performance differential.

And that was before Hadoop, Mongo and other Big Data stores were in general use. Although it appears that the Big Data world has since more or less consolidated to Hadoop as the store of choice.

“And that was before Hadoop, Mongo and other Big Data stores were in general use. Although it appears that the Big Data world has since more or less consolidated to Hadoop as the store of choice.”

Hadoop isn’t for data storage, it’s for data analysis. Its components HBase/HDFS can be used for storage, but they have other problems. Hadoop itself is becoming obsolete, since the map/reduce model is better suited to batch processes and more problems are stream-processing in nature. Spark has taken share, and Flink, Akka, Storm, and others are at Spark’s heels. For data storage there are many alternatives. You can go to https://db-engines.com/en/ranking and look for items in “document store”, “wide column store”, “key/value store”, or even “search engine” (which overlaps as a category). Graph databases are worth mentioning, too.

Mongo (MDB) is the only public one of interest. Google (Spanner), Microsoft (Cosmos), and Amazon (Dynamo) are well-known products but are minuscule parts of their operations overall. There are a few private big data stores as well, beyond the scope of this post.


Thanks for the clarification. “Big Data” was just beginning to be a subject of interest when I retired in 2010. I’ve obviously made no effort to try and keep up with the pace of developments in information technology. Seven years doesn’t seem like such a long time, but with IT it represents maybe 4 generations or more.

I’ve made a number of comments on this board based on my former 30 years of IT experience in a Fortune 50 company. I try to couch my comments with the fact that my observations, especially about specific technologies, are somewhat stale. What’s probably more relevant are my comments about how new information technologies are received by managements, IT staff and the general business user. It was my experience that these attitudes did not change very much over 30 years. Yes, there have been changes, but they came much more slowly.

A couple of examples: not long after I joined the IT staff I heard the comments of a senior VP in the company where I worked. When asked about the future of computing in business, he held up a pencil and observed, “All the computing power I will ever need is right here in the tip of this pencil.” Prescient . . . Another very senior officer of the company was being shown how to use the new Mac that had just been set up in his office. The tech showed him how the mouse worked and told him (as an obvious dig at the PC) that the mouse was super easy to use because it only had one button. The officer disdainfully replied, “I see. That’s exactly one button too many.” No executive would ever touch an office machine of any sort. Operating office equipment was the role of the women of the secretarial staff. Executives were far too important to interact with a trivial machine. They wouldn’t be caught dead with their fingers on a keyboard of any sort.

Now we have a president who can’t keep his fingers off his phone’s keyboard. Some things have changed.


Since they are essentially in the BPM software space, which lets companies leverage all their IT software across CRM, ERP, and legacy systems and make it work more efficiently at the front office, I think this market will happen, because at the moment the biggest issue in most organizations is the silos of data. This is typically where software is going in the future, because the current way of developing software is not as nimble and costs a lot to maintain.

It’s growing at ~25%, though I suspect once this new wave of software goes mainstream it will catch fire. Another company that works in this area is PEGA, a much bigger company going through its transition from license to SaaS revenue streams; they seem to have had $500M+ in revenue from BPM last year.



I had missed this going through the first time, but now plan to read through this thread. My apologies for this post having no value add other than my selfish desire to be able to more easily navigate back to it.

need to have the absolutely most efficient or best app.

not a programmer but it would seem that one of the advantages of ever faster computers and SSD rather than HDD is that programs do not have to be as sparse or elegant, “good enough” will do.

not a programmer but it would seem that one of the advantages of ever faster computers and SSD rather than HDD is that programs do not have to be as sparse or elegant, “good enough” will do.

That is generally true. Programmers (wetware) are expensive; hardware is cheap. Good-enough code is cost-effective. But even when you need high performance, only a relatively small part of the code needs to be optimized: the part that repeats a lot.
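Finding “the part that repeats a lot” before optimizing anything is exactly what a profiler is for. A minimal sketch using Python’s standard-library cProfile (the function names here are illustrative): profile first, then spend effort only on what dominates the report.

```python
# Profile-before-optimizing sketch: run the program under cProfile
# and inspect which functions dominate cumulative time.

import cProfile
import io
import pstats

def hot_loop(n):
    # the inner loop that dominates runtime
    return sum(i * i for i in range(n))

def cold_setup():
    # incidental work; not worth optimizing
    return list(range(10))

def program():
    cold_setup()
    return hot_loop(200_000)

profiler = cProfile.Profile()
profiler.enable()
result = program()
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())  # hot_loop sits at the top of the report
```

In practice the report almost always confirms the poster’s point: one or two functions account for nearly all the time, and everything else is already “good enough.”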

I am a programmer from way back, my first computer used vacuum tubes! I spent a lot of time getting the code of my first professional program to fit into that “big iron” mainframe. To optimize it I had to get the code to run fast enough not to slow down the card reader, losing a read cycle would have doubled the time it took to run it. Those difficulties are long forgotten. :wink:

A low code environment probably can solve 80% or more of the coding jobs – just a guess.

BTW, the cost is not in writing the code but in debugging it! A low code environment is probably also a low bug environment - just another guess.

With code running on multiple remote machines, debugging gets even more complicated. I was having difficulties with a website that was reporting errors, but only from remote clients, so I wrote Class phpDebug v 1.0.0, which works together with phpErrorReporter v 1.2.0, which sends me emails when website code bombs. If anyone here has a use for them, you can download them from softwaretimes.com

Class phpDebug v 1.0.0

phpDebug – Has this ever happened to you? A website that has been running spotlessly for years needs some maintenance work to bring the code up to date with new versions of the software or to add a feature, and as a consequence a new bug shows up, but one that is only triggered by a remote client. To complicate matters, the error log does not have enough information to let you reproduce the bug. What to do? Write some code to solve the problem! That’s how Class phpDebug was born!


Update: phpErrorReporter v 1.2.0

phpErrorReporter – New versions of PHP often deprecate some old function, and this change can generate thousands of lines of warnings in the PHP error log, to the point of bringing down the server. To solve this problem I added code that checks the log’s file size and sends timely email warnings to the administrator via the noticeReporter() function added in version 1.1.0.



Denny Schlesinger

PS: I no longer write commercial code, just stuff for my own use. My current project is my Portfolio web-app which I use to track and manage my stocks, bank accounts, and credit cards.


BTW, the cost is not in writing the code but in debugging it!

Not to mention the cost of changing it in response to changing requirements … that is what gets really expensive, particularly if the original code has been written in such a way as to be hard to maintain.

So called “orphan code” can become a really big problem.

BTW, code generators are almost as old as computers themselves. While at IBM we had RPG (Report Program Generator) which was a quick and easy way to produce reports. I won a bottle of Chivas Regal Whisky with RPG. I bet a fellow that I could create bar charts with it. LOL

Back then NCR had NEAT-3, a “macro language” to generate assembly language code.

Stuff like WordPress is also in the same category, for creating web pages.

Putting on my investor’s hat, I wonder how much traction these products can have. But then, I have a bias, I like writing code.

Denny Schlesinger

The historical problem with most program generators and 4GLs and the like is that they significantly reduced the time to produce part of the code, but were incomplete and one either had to resort to time consuming tricks to get the rest or even to break out of the system and hand write the difficult bits in some other language. There is an 80/20 or 90/10 type rule with code as well, i.e., that 80% of it can be easily written by almost anyone or even automated by fairly simple systems, but that the 20% or 10% which remained was very hard to automate and, even if one was writing the whole thing by hand, the hard part would take 80% of the time to get right. So, you can see that a tool which provided a 4X improvement on the 80% part actually only gave one a 15% improvement over the whole of the project since that piece only took 20% of the hand coding time.
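The arithmetic in that 80/20 claim checks out; it is just Amdahl’s law applied to coding effort. A few lines of Python make it explicit:

```python
# Verifying the paragraph's arithmetic: a 4x speedup on the easy
# portion of the work, which consumes only 20% of hand-coding time,
# shrinks the whole project by just 15%.

def overall_saving(easy_share, easy_factor):
    """Fraction of total time saved when only the 'easy' share
    of the work is accelerated by easy_factor (Amdahl's law)."""
    new_time = easy_share / easy_factor + (1 - easy_share)
    return 1 - new_time

saved = overall_saving(easy_share=0.20, easy_factor=4)
print(f"{saved:.0%}")  # 15%
```

It also shows why a complete system matters so much: set easy_share to 1.0 and the same 4x factor saves 75% of the project, which is the neighborhood of the 10X-class gains described below for tools like the Progress 4GL.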

Occasionally, one encounters complete systems which break through this barrier. The Progress ABL/4GL was one of those in which one could write complete systems without breaking out of the 4GL except for very specialized tasks and achieve close to 10X improvements on the overall project. That is more complex today, since web-based systems often include a whole range of technologies, especially if they include support for phones. While there are systems which provide significant productivity improvements in each part, orchestrating them is non-trivial.

I have personal experience creating such tools and they can produce transformative results. It can suddenly be cost effective to make a change that would have been neglected in the past, so systems can be significantly more responsive to changing business conditions. The big difficulty which one can often encounter is that the system which was wonderful at supporting one architecture, cannot support some new architecture, e.g., a shift to a web UI.



I have been trying to get through the Harvard CS50 course for well over a year. Life keeps getting in the way.

But I find it interesting and challenging. However, if I have a choice between writing code and figuring out problems with calculus, I will take the math every time. It is just as creative, but not as messy.

On the other hand I find they go hand in hand.



On the other hand I find they go hand in hand.


A tip from an old hand: when math meets code, remember that series don’t start at ONE but at ZERO. Zero is possibly India’s greatest contribution to mathematics. Start at ZERO and you’ll have fewer bugs and other difficulties.
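A tiny Python illustration of the zero-based tip: when indices start at zero, slice and chunk arithmetic needs no +1/-1 fencepost corrections.

```python
# Zero-based indexing keeps index arithmetic free of off-by-one fixes.

items = ["a", "b", "c", "d"]

# element k of a run starting at `start` is simply items[start + k]
start, k = 1, 2
assert items[start + k] == "d"

# splitting n items into chunks of width w needs no fencepost terms
n, w = len(items), 2
chunks = [items[i:i + w] for i in range(0, n, w)]
print(chunks)  # [['a', 'b'], ['c', 'd']]
```

With one-based indexing the same formulas sprout corrections (items[start + k - 1], ranges from 1 to n+1), which is where the bugs creep in.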

An entry-level course taught by David J. Malan, CS50x teaches students how to think algorithmically and solve problems efficiently. Topics include abstraction, algorithms, data structures, encapsulation, resource management, security, software engineering, and web development. Languages include C, Python, SQL, and JavaScript plus CSS and HTML. Problem sets inspired by real-world domains of biology, cryptography, finance, forensics, and gaming.

That sounds terribly complicated. Computer code, like a building, has form and function. All the rest are just technical (engineering?) details. Object-oriented programming recognizes this, which is why it creates objects (the buildings) that execute methods (the functions). Think of yourself as an Architect of Code.

Happy Coding!

Denny Schlesinger


I just started to explore this company… Thought I would share Gartner’s opinion of how Appian stacks up


They are a leader in the industry. Gartner is generally pretty good at identifying trends in the industry.

