Why Cloudflare Looms Among Emerging AI Stocks Amid OpenAI Cloud Deal

https://www.investors.com/news/technology/ai-stocks-cloudflare-earnings-openai-chatgpt-microsoft/

OpenAI’s launch of ChatGPT, a conversational chatbot, has triggered an AI war between Google-parent Alphabet (GOOGL) and Microsoft (MSFT). Both Microsoft, the biggest investor in OpenAI, and Google aim to embed content-creating generative AI technologies into their internet search products.

“What investors might have missed is that the ‘performance and security’ of chat.openai.com is done by Cloudflare.”

In a preview of Cloudflare’s fourth-quarter earnings, RBC analyst Matthew Hedberg said: “In terms of incremental 2023 drivers, we look to Apple and ChatGPT as interesting growth opportunities.”

One concern for Cloudflare shareholders, though, is that the company could lose 2023 revenue from cryptocurrency customers. That could offset some upside from OpenAI.

According to Auty: “The demand for ChatGPT drives usage of Cloudflare’s network, resulting in more revenue for Cloudflare. In our eyes, Cloudflare’s opportunity to be the CDN (content delivery network) and security vendor of choice for not just ChatGPT, but other AI platforms is massive.”

(I did not know they made money on crypto transactions, but I guess they go over the internet)

Pete

56 Likes

Thanks, Pete, for this interesting piece of information. I was wondering why NET’s stock price has been outperforming since the beginning of this year. This might be one reason.

Looking forward to the ER from NET tomorrow.

Cheers,
Nick

Long NET (biggest position)

11 Likes

Microsoft CEO Satya Nadella declared: “I have not seen something like this since I would say 2007-2008 when the cloud was just first coming out.”

I grant that Satya is a credible source. I’m left wondering: are there any AI pure plays that we retail investors might take advantage of (OK, other than Upstart)?

4 Likes

As a non-techie I can suggest two different approaches to finding companies in the area: an ETF and a book.

I cannot buy individual companies in my 401(k), and of my two main ETFs, one is LRNZ, a small, concentrated ETF that specializes in companies closely tied to the AI space. Sam, the PM, rarely offers webinars, but he has been a tech PM since before the dot-com bubble. He looks at various areas, including biotech, and prefers newer companies over behemoths like MSFT. So you could look at the holdings for some ideas for further study. Lots of good friends over there. Normally they do not rebalance mechanically, but they did for 2023, and I like that, given the peculiar 2022 we had. SNOW, DDOG, CRWD, and ZS used to be the largest holdings.

On to the book.
As a non-techie, I am not sure what a pure play would be, though. On paper, it’s C3.ai. Its CEO, Tom Siebel, wrote an interesting book, Digital Transformation.

In the book, you can find a table or flow chart or whatever it was that lets you see all that goes into making AI work. [I cannot find the book right now, so I cannot give you the page number.] That will help explain the content of LRNZ. Making AI work includes a data warehouse/lake, so SNOW is critical; it includes network speed, processing, security, and IoT connectivity, so NET, ZS, CRWD, and S all relate. And so on.

One of his main points is that companies like JPM like to try to build everything in-house and then, when they inevitably fail, turn to specialists like his company. So there is a divide between those who think AI is an easy add-on/in-house project and those who think that deep-learning AI in particular is the preserve of a few specialized companies. Siebel’s favorite examples are predictive maintenance for the USAF and AI for major utilities, which helps identify cheating patterns, for example in southern Italy.

I briefly owned the stock; glad I lost very little :slight_smile: The problem was slow growth relative to the giant story sold by the CEO (I have a similar problem with NET, which I finally sold, but it is worse with C3, IMO). If they are that great, I want higher growth. I get that government and very large business contracts are slow, but either it is too early or there is more competition than Siebel likes to acknowledge. I need to “reconnect”; I have not followed it in a year.

Anyway, now everybody will be selling AI in their speeches, reports, marketing materials, etc.

13 Likes

I’m astounded to be posting about this, but C3.ai is the very small position I’ve been holding for my brother since March of 2021. I add to it for his birthday and Christmas each year. It is his to sell or not as he wishes, but it sits in my brokerage account, so I watch it. Ticker is AI.

The first shares I bought two years ago were at $84.94. His Christmas present that year only cost me $39.50/share. His birthday present in early 2022 only cost me $18.81 and by this past Christmas, when the shares were just over $11, I decided it was throwing money away and got him a more tangible gift instead.

I haven’t followed the business that closely, since it’s not mine to decide, but when it jumped 27% at the end of last month, I started to pay attention. The announcement that sent it soaring was the launch of a toolset for generative AI applications. From a Dow Jones article:

“C3 Generative AI fundamentally changes the human computer interaction model of enterprise application software,” C3.ai CEO Thomas Siebel said in a statement. “Combining the full potential of natural language, generative pre-trained transformers, enterprise AI, and predictive analytics will change everything about enterprise computing.”

The company said its software will “accelerate transformation efforts across business functions and industries, including supply chain, sustainability, reliability, CRM, ESG, aerospace, oil and gas, utilities, CPG, healthcare, financial services, and defense and intelligence.”

The general release of the new tools is set for next month. They report March 2. The stock price is up a whopping 132.5% from its December low. It has not had a red day since that announcement, with several days up double digits.

So, now I’m sorry I didn’t give him some $11 shares for Christmas!

While the stock price is certainly riding the Chatbot wave, I have questions about the business itself. Here’s a link to an interview with the CEO. Statements like this trouble me:

Despite all of the investor excitement, Siebel said he hasn’t figured out yet how he will seek to monetize the new AI search tool which his customers will get beginning next month. He said he has been primarily focused on getting the product developed and in the hands of customers.

That’s not how the CEOs of best-of-breed companies think. Of course they think about product development and deployment, but if you don’t also think about monetization, you have a hobby, not a business.

Also from that interview:

C3.ai in December reported a second quarter net loss of about $68.9 million, about 21% larger than in the previous year. Revenue grew nearly 26% to $62.4 million.

Unlike many tech companies in the past year, C3.ai grew its workforce. It employs about 867 now, compared to 691 in January 2022. With more customers coming on board, Siebel said he expects to increase employment again this year.

“We have cut back on marketing by a lot and found other places to reduce our costs in order to meet our goal to become profitable,” he said. “We need the people to handle our new business.”

If the new tools are good, then the company should do well. It has always touted itself as the only real pure-play AI company. It will ride that tailwind. How high will it ride? That remains to be seen.

JabbokRiver

10 Likes

Thanks for this post, @JabbokRiver42 .

“That’s not how the CEOs of best-of-breed companies think.”

I wonder if he got caught off guard by the popularity of these “minor” things.

The reason is that he seems to have built the business around the monumental projects he talks about in the book. He touts C3 AI as the way to get a true AI solution, built from the ground up for the customer’s needs, as opposed to being stitched together from multiple existing ad-hoc tools.

Uses like ChatGPT were simply not part of his vision. His preferred customers are massive organizations like the big French utility, Shell, and the DoD.

Good problem to have though if they can suddenly find a revenue stream that they had not previously anticipated.

6 Likes

What investors might have forgotten in the current AI stock frenzy is an almost two year old press release from Cloudflare.

Today’s applications use AI for a variety of tasks from translating text on webpages to object recognition in images, making machine learning models a critical part of application development. Users expect this functionality to be fast and reliable, while developers want to keep proprietary machine learning models reliable and secure. Cloudflare is seamlessly solving for their security, performance, and reliability needs while NVIDIA provides developers with a broad range of AI-powered application frameworks including Jarvis for natural language processing, Clara for healthcare and life sciences, and Morpheus for cybersecurity.

The combination of NVIDIA accelerated computing technology and Cloudflare’s edge network will create a massive platform on which developers can deploy applications that use pre-trained or custom machine learning models in seconds. By leveraging the TensorFlow platform developers can use familiar tools to build and test machine learning models, and then deploy them globally onto Cloudflare’s edge network.

Internally, Cloudflare uses machine learning for a variety of needs including business intelligence, bot detection, anomaly identification, and more. Cloudflare uses NVIDIA accelerated computing to speed up training and inference tasks and will bring the same technology to any developer that uses Cloudflare Workers.

Artificial intelligence is a core tenet behind my investing and constantly on my mind, so I remember this release very well and have been patiently waiting for an update ever since. Anyway, this is a capex-heavy investment, so in the current environment it might be a long wait.

26 Likes

Circling back to this, since recent information and breadcrumbs might suggest “imminent” rather than a long wait.

Next week, Cloudflare will have Developer Week with focus on developer productivity and AI. Judging from what’s been said and from looking at recent activity in their (public) code base, I expect some interesting announcements.

I’ve recently noticed a thing called Constellation showing up in their code base, which allows you to list and upload AI models to do… well… something. From a pull request (here) in the developer documentation, it’s now pretty clear what Constellation is about, namely: running machine learning models natively on Cloudflare Workers.

Looks like Cloudflare will provide a set of ready-made models for common tasks, along with the ability to upload and run your own pre-trained models.

Speculation: Seems quite possible that they’ll have an announcement about this next week, possibly as open beta. But do note the word speculation.
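
To make that speculation a bit more concrete, here’s a toy sketch of what “running a model natively on Workers” might look like from a developer’s point of view. Every name in it is invented for illustration; this is not the actual Constellation API, which isn’t public yet.

```ts
// Illustrative only — the binding name, its run() signature, and the response
// shape are all made up; the real Constellation API will surely differ.
export interface Env {
  SENTIMENT_MODEL: { run(input: number[]): Promise<number[]> }; // hypothetical model binding
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { features } = (await request.json()) as { features: number[] };
    const scores = await env.SENTIMENT_MODEL.run(features); // inference runs on the edge node itself
    return Response.json({ scores });
  },
};
```

The appeal would be that the model is served from the same edge network that already terminates the request, so there’s no round trip to a centralized inference cluster.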

27 Likes

" Running machine learning models natively on Cloudflare Workers

I’m starting to get lost, and I’m a tech guy!

Don’t machine learning models require access to a large database?

If so, in the case of machine learning models running natively on Cloudflare Workers: where does the database go?

Surely the plan is NOT to deploy a big ol’ database to each node in the Cloudflare network?

5 Likes

Once models are trained on their data, they have a series of weights that are embedded in the model object. In the simplest case, imagine you trained a linear regression model and the result was y=5x + 2. You don’t need to store all the data you used to train the model, just the 5x + 2 result.

LLMs and other large machine learning models can have billions of parameters, so that’s obviously a lot more data to store for the weights/embeddings, but still not on the order of the data they were trained on.
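
A tiny sketch of what that means in practice, using the y = 5x + 2 example (purely illustrative TypeScript):

```ts
// After training, the "model" is just its learned weights; the training data
// is no longer needed. Toy case: a linear regression that learned y = 5x + 2.
const weights = { slope: 5, intercept: 2 }; // everything the model has to store

function predict(x: number): number {
  return weights.slope * x + weights.intercept; // inference: apply the weights to new input
}

console.log(predict(10)); // 52 — computed from the weights alone
```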

11 Likes

DSNerd,

…so a trained model that can answer open-ended questions to pretty much anything can be cached on a Cloudflare node?

I’m still missing something…

1 Like

Once models are trained on their data, they have a series of weights that are embedded in the model object. In the simplest case, imagine you trained a linear regression model and the result was y=5x + 2. You don’t need to store all the data you used to train the model, just the 5x + 2 result.

This is how ChatGPT-like models can now run on your cell phone without an internet connection.

This stuff is moving fast, one reason why I’m not trying to time things as much anymore.

Best

Jason

8 Likes

A common data “platform” used for AI/ML training is a data lake, and R2 can serve as a data lake. In fact, PLTR is using R2 as a data lake instead of the hyperscalers. In addition, NET talked about R2 increasing 25% QoQ, then later talked about AI use cases increasing over 20% QoQ. Coincidence? I don’t think so. I think Cloudflare’s biggest $$ benefit in AI is going to be R2. You can still run your models from wherever you want (e.g., Databricks, Snowpark).
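
For anyone wondering what “R2 as a data lake” looks like mechanically: R2 exposes an S3-compatible API, so any S3-aware training pipeline can pull data straight from it. A rough sketch, where the account ID, bucket, and key are placeholders, not real values:

```ts
// Reading a training file from R2 through its S3-compatible endpoint.
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const r2 = new S3Client({
  region: "auto", // R2 uses "auto" as its region
  endpoint: "https://<ACCOUNT_ID>.r2.cloudflarestorage.com", // placeholder account ID
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

// The same call works from Spark, Databricks, or any other S3-aware tool.
const object = await r2.send(
  new GetObjectCommand({ Bucket: "training-data", Key: "events/2023-05.parquet" })
);
```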

16 Likes

This type of deployment isn’t my area of expertise, but I would imagine so (or they’re at least working towards it).

There are two phases of ML models–the training phase and the inference phase. When you hear of a model taking huge sets of GPUs, massive datasets, and months of processing, that’s the training phase.

When you prompt ChatGPT through the web browser and it responds almost instantaneously, that’s the inference phase. The reason it can respond so quickly is that it’s already been trained and your question is an input into the model and then ChatGPT predicts a response based on the weights it’s developed during training. (In my earlier example of 5x+2, it’s the same as inputting x=10 and the model returning 52).

Here’s an article where they walk through installing and running Facebook’s LLaMA model on an “entry-level PC”: How to Locally Run a ChatGPT-Like LLM on Your PC and Mac | Beebom.

In the article he says “To give you some idea, my PC is powered by a 10th-Gen Intel i3 processor with 256GB of SSD and 8GB of RAM. For GPU, I am using Nvidia’s entry-level GeForce GT 730 GPU with 2GB of VRAM”. He even mentions some users getting it to run on a Raspberry Pi (albeit with very slow responses).

12 Likes

DSnerd is correct. Don’t confuse the training phase and the inference phase. By “run”, the speculation was referring to inference (using the model in production), not training.

Cloudflare would NOT be used for training, which for current LLM models requires a supercomputer with 100s/1000s of NVIDIA GPUs or specialized cloud offshoots. Hyperscalers rule that space right now. However, the use case of AI engines using R2 is exactly this training purpose, as it gives the AI engines more flexibility in where they can train. But that training will not happen on Cloudflare; the point is to have the R2 data available “everywhere” (very near whatever cloud/region they decide to train in).

The types of models that could run on Cloudflare would be of the extremely small variety… more tiny AI models for IoT, not LLMs. Though Google just announced that PaLM 2 has a model small enough to run on a phone.

PS nice find on the doc commit, @Raylight!

-m

13 Likes

In layman’s business terms, which product is now easier to sell? Is it Workers? R2? Something else?

I know NET doesn’t break out sales segments, but I’m wondering exactly where this fits.

Inference will drive Workers compute more than data, though it will require GPUs and so will likely have its own pricing model apart from the extremely generous Workers price tiers.

This new Constellation feature means we will finally see that NVIDIA partnership from long ago (Apr-21) come to fruition.

ps Constellation is very different from the AI Gateway they hinted at during the Connect keynote

13 Likes

The progress in generative AI has been really fast. While I am not convinced by the conclusions drawn in this article, the timeline part is very interesting. In a nutshell, foundational models (Llama, GPT-4, etc.) are extremely expensive to train, but fine-tuning is extremely cheap. Fine-tuning lets you customize the foundation model for personalized or domain-specific use cases. It also allows the model size to be drastically reduced. At one extreme, you can have one model respond to all of your customers (e.g., ChatGPT); at the other extreme, every customer can have their own fine-tuned model that runs at the edge.

Cloudflare has a huge opportunity to pivot the Workers platform to run fine-tuned models that can be customized for location, the preferences of people in the region, etc. Cloudflare should either offer its own foundational models or partner with an existing foundation-model provider, but focus on making it trivial for its customers to fine-tune models based on their preferences and deploy them in a region close to them. I really want to see Cloudflare make some product announcements in this area within the year.
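
If Cloudflare did go this way, the developer-facing piece might look something like the sketch below. The model IDs and the runModel() helper are hypothetical; only the request.cf geolocation data is a real Workers feature.

```ts
// Hypothetical: a Worker routing each request to a regionally fine-tuned model.
const REGIONAL_MODELS: Record<string, string> = {
  EU: "assistant-ft-eu-v1", // invented model IDs
  NA: "assistant-ft-na-v1",
};

async function runModel(modelId: string, prompt: string): Promise<string> {
  // Stand-in for whatever edge inference API Cloudflare might eventually ship.
  return `[${modelId}] response to: ${prompt}`;
}

export default {
  async fetch(request: Request): Promise<Response> {
    // Workers attaches geolocation info to request.cf on every request.
    const continent = ((request as any).cf?.continent as string) ?? "NA";
    const modelId = REGIONAL_MODELS[continent] ?? REGIONAL_MODELS.NA;
    return new Response(await runModel(modelId, await request.text()));
  },
};
```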

10 Likes

When LLaMA was “leaked” from Meta to the open-source community, it quickly became much, much more. My understanding is that Meta still owns the rights to what LLaMA has become. Individuals may not be sued for using what may soon be the best AI, but I’m not sure Cloudflare could even support the now open-sourced result.

Anyone here understand licensing?

3 Likes

No, I wasn’t talking about Cloudflare supporting LLaMA or the fine-tuned open-source versions. I was talking about Cloudflare either creating its own foundational model or partnering with existing providers. The key part is making it trivial for its customers to deploy a fine-tuned model at the edge.

6 Likes