Very upbeat earnings call. The CEO covered the main points of how AI is impacting their business. If you want to listen, it starts at about 11:15 on the call recording. I’ll copy that portion of the transcript below.
I wanted to end by talking a little bit about what we’re seeing in AI. We believe Cloudflare has four distinct opportunities in AI. The first is the same as every other company. We’re getting more efficient in our business processes using AI systems.
That’s not particularly interesting these days. Though I’m proud that, more often than not, we’re finding we can build these functions on our own infrastructure rather than contracting with others. The second is that AI makes our performance and security products smarter for customers. At some level, although we never really had the hubris to describe it this way, Cloudflare has always been an AI company.
The thesis was that if we could get enough Internet traffic flowing through us, we could spot security threats that no one else could see. And today, our machine learning-based security systems regularly discover new security threats that no human had identified before. That, again, feels like table stakes for us. The third opportunity is where I think things start to get interesting.
The killer application for Cloudflare Workers is turning out to be AI. Its programming model is uniquely suited to building tools like AI agents, and our serverless architecture, which allows you to pay only for what you use based on CPU or GPU type, positions Workers to become the go-to platform for developers who want the best price-performance for AI inference and agentic workflows. I talked last quarter about a large AI customer that was building their interface on top of our inference platform. From their perspective, the partnership has gone extremely well.
But behind the scenes, our team has been able to do the hard engineering work to drive the efficient use of our GPU infrastructure. Inference tasks are generally highly variable, and there’s a two-times difference between this customer’s peaks and valleys, which means they would have to pay approximately 250% more to stay provisioned on a hyperscaler to run the same number of inference tasks, compared with Workers AI’s efficient pay-per-inference serverless model. Additionally, more and more developers are discovering AI Gateway, with some realizing more than 10x price-performance improvements for their AI agents by serving requests directly from Cloudflare’s cache instead of the original model provider. And just last month, the world was amazed at the efficiencies that a team of clever engineers in China was able to deliver in the field of AI training with the DeepSeek model.
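The caching win the CEO describes comes from a simple idea: if the same prompt is sent to the same model repeatedly, you only need to pay the provider once. Here is a minimal sketch of that idea, not Cloudflare’s actual AI Gateway implementation; `call_model` and `InferenceCache` are hypothetical names used purely for illustration.

```python
import hashlib

class InferenceCache:
    """Illustrative response cache sitting between an app and a model provider."""

    def __init__(self, call_model):
        self._call_model = call_model  # upstream (paid) inference function
        self._cache = {}               # cache key -> stored response
        self.hits = 0                  # requests served from cache (no provider cost)
        self.misses = 0                # requests that hit the upstream provider

    def _key(self, model, prompt):
        # Key on the model name plus the exact prompt text.
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def infer(self, model, prompt):
        key = self._key(model, prompt)
        if key in self._cache:
            self.hits += 1             # cache hit: fast and free
            return self._cache[key]
        self.misses += 1               # cache miss: pay for one upstream call
        response = self._call_model(model, prompt)
        self._cache[key] = response
        return response
```

With agentic workflows that replay identical sub-prompts, every repeat after the first is a cache hit, which is where the claimed price-performance multiple comes from. A real gateway would also handle cache expiry and prompts that should never be cached.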
We are seeing that there are equivalent optimizations that can be made with AI inference on Cloudflare’s platform, resulting in faster performance and lower prices for customers, and higher margins and less capex for us. We believe inference is a bigger opportunity than training, and our team continues to find step-function breakthroughs that put us well ahead of any alternative. But all that may pale in comparison to the fourth opportunity. Cloudflare counts many of the most important AI companies as customers.
We also count a huge portion of the world’s content creators as our users. Being between those two puts us in an important role to help figure out the business model of the post-search web. Cloudflare sits in a unique position to help figure out how content creators are compensated, what agents are allowed where and on what terms, and how the AI-driven web of the future will fit together. It’s early days, but the conversations we’re having with all the relevant parties feel foundational for the future.