What are Edge Networks?

Hi folks. I finally wrote up (more than) a few words on Edge Networks like Fastly and Cloudflare, how they came to exist from their CDN background, and what is so appealing about them now and going forward.

https://hhhypergrowth.com/what-are-edge-networks/

In reading it, you might be able to tell why I moved my long form writing off of the Fool boards. I drew a few images to help explain things along the way and could then embed them right into the commentary - which I found to be a really really helpful way to explain some of the highly technical concepts.

I had many epiphanies while researching Edge Networks. I think the biggest for me was seeing that CDN features are but one application on top of the edge networks at Fastly and Cloudflare, the first of many more to come. Do not limit your thinking by lumping them in w/ legacy CDNs in a commoditized market! That CDNs are but an app is supported by how Cloudflare used Workers (their edge compute platform) to create a cybersecurity platform that clones Zscaler. And they plan on using Workers for all new product lines from here! Fastly is assuredly doing the same w/ Compute@Edge.

Another epiphany was how the strength of edge networks is in both their programmable network AND edge compute. The power is BETWEEN the edges. It all makes for a potent combination that is adept at handling traffic flows in ANY direction - not just origin to user that CDNs do.

Edge compute does not mean a standalone server. It can control the network inflows and outflows. It can intercommunicate with other compute nodes to distribute workflows and compute jobs. It can maintain network health and redirect traffic and compute jobs as needed. And most of all… it can scale! It can handle the massive increase in data originating from endpoints, and the increase in traffic from 5G and IoT. All while saving companies money through reduced infrastructure and bandwidth needs.

Edge networks are not aspirational … they are solving today’s problems while also being well positioned to solve tomorrow’s problems. **It is an exciting time to be an edge network! FSLY and NET are both quite compelling for similar reasons (yet divergent strategies).**

Hope you enjoy! Any epiphanies stand out to you?

-muji
long FSLY and NET

109 Likes

Great article that cleared up many misconceptions for me. Thank you Muji.

1 Like

@muji

Two items in particular came to me while reading and I am hoping for your input.

1) Does edge compute open up additional market opportunity for ZS? Or, given the direction the first movers are taking by self-developing security features for the edge, will this restrict ZS TAM?

2). Regarding “the edge”: As we have evolved through time to discuss “the internet” and “the cloud” as pseudo-geographical locations in the world of technology; is “the edge” moving in a similar direction?

Thanks Muji. It was a great read.

1 Like

Muji,

That write up was truly incredible. Everyone should read that just because of the time you must have put into it!

I honestly have not followed many here into Fastly. I’m still not sure I understand these edge network companies’ technology (although your explanation was great and I am getting much closer to understanding).

Here is the one question I have:

Is it easy for a developer to switch from Fastly to Cloudflare? Why or why not? Here is my thinking. If Fastly is charging on usage and Cloudflare has one time charge not based on usage, would not everyone who starts off on Fastly just move to Cloudflare eventually once their usage spiked? Is this a concern?

Thanks in advance for your reply and the hard work with which you share freely.

1 Like

Fastly is charging on usage and Cloudflare has one time charge not based on usage, would not everyone who starts off on Fastly just move to Cloudflare eventually once their usage spiked? Is this a concern?

This is a terrible misunderstanding: it would be like buying one haircut and getting all future haircuts for free.

Cloudflare charges a MONTHLY FEE,

Pro $20 a month
Business $200 a month
Enterprise Ask for quote
https://www.cloudflare.com/plans/?&_bt=378710819190&…

Denny Schlesinger

8 Likes

Any epiphanies stand out to you?

This may be controversial, but it appears to me there’s quite a bit of irrational exuberance around Edge Computing and Edge Networks.

There is nothing magical about the edge. Doing compute or serving up content from the edge can reduce latency. That’s it. A number of advantages being attributed to Edge Computing actually come from distributed computing, and the big cloud providers already support distributed workflows.

There was a thread here a while back on 5G and Edge Computing. 5G doesn’t demand Edge Computing. 5G itself reduces latency from endpoints into the network, so one way to look at it is that edge computing is needed less, not more. Sure, there may be new use cases requiring even less latency than is possible today that might be enabled by the combination of 5G and Edge Computing. But those use cases may not be well described or known, and their business value maybe hasn’t been fully established. Some people have taken 5G/Edge Computing too far, with examples like remote surgeries performed 1,000 miles apart, where speed-of-light obstacles require science fiction, not 5G or Edge Computing, to overcome.

Back to the edge and latency: we have a great example with Fastly of “being closer to the edge is always better” being wrong. Akamai is arguably closest to the edge with its hundreds of thousands of servers spread across thousands of POPs, yet Fastly often serves up content faster. That’s because network latency is not the only factor determining performance.

The same analysis (overall performance, not just network latency) needs to be done with Edge Computing use cases. Many applications require central computing services, such as querying a central database for information. If your database is mostly static (doesn’t change often), then you can distribute the database out to the edge and potentially gain some performance advantage by servicing the query at the edge. This would work, for instance, for authorizing users when they log in. Their (hashed) passwords don’t change frequently, so copies can be kept and updated at all the edge servers, and when people log in from anywhere you can authorize them without the network hop to a central repository. Cool.
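
To make that concrete, here’s a minimal sketch in the style of a Cloudflare Worker (TypeScript). The USERS namespace, the key scheme, and the unsalted SHA-256 are all made up for illustration; the point is only that a read-mostly lookup like this can be answered entirely from the edge node:

```typescript
// Illustrative sketch only: a login check answered entirely at the edge,
// assuming (hashed) credentials are replicated to an edge key-value store.
// "USERS" is a hypothetical KV binding; a real system would use salted
// hashes (or better), not a bare SHA-256.

// Minimal stand-in for the Workers KV binding type, to keep the sketch self-contained.
interface KVNamespace {
  get(key: string): Promise<string | null>;
}

interface Env {
  USERS: KVNamespace; // hashed credentials, copied to every edge location
}

async function sha256Hex(text: string): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", new TextEncoder().encode(text));
  return [...new Uint8Array(digest)].map(b => b.toString(16).padStart(2, "0")).join("");
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { user, password } = (await request.json()) as { user: string; password: string };

    // Edge-local read: no round-trip to a central origin.
    const storedHash = await env.USERS.get(`hash:${user}`);
    const ok = storedHash !== null && storedHash === (await sha256Hex(password));

    return new Response(ok ? "authorized" : "denied", { status: ok ? 200 : 401 });
  },
};
```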

However, many databases do have their contents change frequently. Moving these to the edge means you’re constantly pushing updated values from the central origin server to all the edge servers. That creates additional bandwidth charges and may not actually help the user, since a query to the edge may not have the latest information yet, and may not even know that it doesn’t. So, if you have a world-wide eCommerce store and want to show your customers what’s in stock and what isn’t, you’re probably not going to keep your in-stock database at the edge. Every time someone bought something you’d have to tell all the other edges, even if no one there was looking at those items. And if you delayed updating them, well, then the edge has to contact the central server and you’ve lost the network latency advantage.
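
And a companion sketch for the frequently-changing case (same Workers style, made-up names again): if the edge copy only gets a short freshness window, any query outside that window goes back to the origin anyway, so you’ve paid to keep the copies warm without actually avoiding the round-trip:

```typescript
// Illustrative sketch only: serving an in-stock count from an edge copy.
// "STOCK" is a hypothetical KV binding; ORIGIN_URL is a hypothetical origin API.
// The freshness window is the whole tradeoff: too long and shoppers see stale
// stock levels, too short and most requests fall through to the origin anyway.
interface KVNamespace {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

interface Env {
  STOCK: KVNamespace;
  ORIGIN_URL: string;
}

const MAX_STALENESS_MS = 5_000;

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const sku = new URL(request.url).searchParams.get("sku") ?? "";
    const raw = await env.STOCK.get(`stock:${sku}`);
    const cached = raw ? (JSON.parse(raw) as { count: number; updatedAt: number }) : null;

    if (cached && Date.now() - cached.updatedAt < MAX_STALENESS_MS) {
      // Fresh enough: answered at the edge.
      return new Response(JSON.stringify({ sku, count: cached.count, source: "edge" }));
    }

    // Stale or missing: we pay the origin round-trip after all.
    const fresh = (await (await fetch(`${env.ORIGIN_URL}/stock/${sku}`)).json()) as { count: number };
    await env.STOCK.put(`stock:${sku}`, JSON.stringify({ count: fresh.count, updatedAt: Date.now() }));
    return new Response(JSON.stringify({ sku, count: fresh.count, source: "origin" }));
  },
};
```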

This is why CDNs are at the edge - there is much content that doesn’t change frequently, like a company’s JPEG logo or even an article in the NY Times. Once published, it’s rarely updated (e.g., only for corrections). Again, even here we see from Fastly that closest to the edge isn’t always best. A 7-11 on every corner won’t have a full stock of items, but you can’t put a Costco warehouse on every corner. Maybe a few Safeways centrally located in urban centers are more efficient.

When we look at edge computing, what are the use cases that are improved or enabled? Cloudflare, for instance, cited “image processing” as an example for their expanded serverless edge compute offering. I can imagine a case where a camera might send an image to an edge computer to be analyzed for changes from a previous image to detect movement. But, if you need to compare that image to some other image from another camera located on the other side of the country, or against some large stock of images gathered from multiple cameras and updated frequently, then you may need central computing anyway.
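
As a sketch of where that line might fall (made-up names, and a real system would use a perceptual diff rather than an exact hash): the cheap “did this camera’s frame change?” check can live at the edge, while frames that did change still get shipped to a central service holding the large, frequently updated image corpus:

```typescript
// Illustrative sketch only: the edge node does the cheap per-camera change check,
// and only changed frames are forwarded to a central service for the heavy,
// cross-camera comparison. "LAST_FRAME" and CENTRAL_URL are hypothetical.
interface KVNamespace {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

interface Env {
  LAST_FRAME: KVNamespace;
  CENTRAL_URL: string;
}

async function fingerprint(bytes: ArrayBuffer): Promise<string> {
  // Exact-bytes hash, standing in for a real motion/perceptual diff.
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return [...new Uint8Array(digest)].map(b => b.toString(16).padStart(2, "0")).join("");
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const cameraId = new URL(request.url).searchParams.get("camera") ?? "unknown";
    const frame = await request.arrayBuffer();

    const fp = await fingerprint(frame);
    const previous = await env.LAST_FRAME.get(`fp:${cameraId}`);
    await env.LAST_FRAME.put(`fp:${cameraId}`, fp);

    if (previous === fp) {
      // Nothing changed: handled entirely at the edge, nothing sent upstream.
      return new Response("no change", { status: 200 });
    }

    // Something changed: the comparison against imagery from other cameras
    // still needs the central service.
    await fetch(`${env.CENTRAL_URL}/analyze?camera=${encodeURIComponent(cameraId)}`, {
      method: "POST",
      body: frame,
    });
    return new Response("forwarded for central analysis", { status: 202 });
  },
};
```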

And even if the compute can be done at the edge, the question becomes whether you should push the edge to the actual endpoint itself. Apple does facial recognition right on the phone. That eliminates the network completely. Yes, that increases the compute you need at the endpoint, but there are often other solutions. For instance, Wyze (maker of cheap home surveillance cameras) just came out with a new outdoor camera that runs on a battery and so can’t do complex computations or constant network uploading that would kill battery life. Instead of image processing to detect movement, they added a cheap PIR sensor that reduces battery, computation, and network costs.

So, what I’m saying is that not all, heck, not even most, computing is going to move from central cloud computing to the edge, and even some that does move to the edge may move beyond the network edge to the endpoint device itself. How big is the edge computing market? No-one really knows yet.

I think we need to understand just what use cases are enabled by edge computing, understand the differences between edge computing and distributed computing, and what alternatives companies have for their compute needs. Like 5G, Edge is becoming a buzzword. I don’t invest in buzzwords. As Fastly has shown, a few additional milliseconds of latency are not always worth eliminating.

I am bullish on companies like Fastly. I’m just trying to be realistic about what the long-term future really holds for them. They may announce blow-out earnings Wednesday afternoon based on their CDN offerings, but I’ll be most interested to hear whether they can add any color around what value their beta customers are seeing with edge computing.

72 Likes

Thanks Muji,

In this article (linked here) you wrote:
‘Edge compute could be making decisions around how to best maintain latency, such as stepping down the bandwidth requirements of a feed until the optimum combination is found, or lowering all of your traffic speeds across all simultaneously occurring network requests when you are nearing bandwidth capacity.’

Is this how Cloudflare got ‘zero nanosecond cold starts’?

And, do you see AYX’s new cloud initiative working well with, as you put it, ‘cloud edge network’ capabilities in ‘stitching’ information together?

Thanks again and love your new site. I’m a happy new member.

jason

Hi Smorg,
5G + Edge + AI will be a really interesting convergence of technologies for Augmented Reality, and in particular AR games. Think Pokémon Go on steroids: a 5G headset that overlays reality with realistic, high-res augmentations that operate and respond in real time. Games are like social media sites with regard to the potential addressable audience. Combine a large audience with micropayments for game features and the opportunity is huge.

B

2 Likes

The Captain wrote out a few of the tiers of Cloudflare’s price structure, but he didn’t include their consumer-facing product.

Cloudflare charges a MONTHLY FEE,

Pro $20 a month
Business $200 a month
Enterprise Ask for quote
https://www.cloudflare.com/plans/?&_bt=378710819190&…

Denny Schlesinger

I’ve been using Cloudflare’s 1.1.1.1 App for my personal use for a few months now. The App is free unless you add their WARP product for $5/month, which I believe gives you access to their Argo tunnel. It’s advertised as increased security and is not fully explained on their website.

The App has a toggle for turning it off, for when viewing ad-supported streaming video as far as I can tell. Those are the only sites I can’t get to (e.g., Hulu) when the App is running.

I haven’t seen anyone mention this in Cloudflare’s TAM assessments. Everyone seemed to get excited when OKTA was going to go consumer-facing. Do you think Cloudflare’s bottom line could be demonstrably increased by this?

Thanks,

Jason

1 Like

Some initial answers and responses. Longer retort to Smorg later.

HMC:

Does edge compute open up additional market opportunity for ZS? Or, given the direction the first movers are taking by self-developing security features for the edge, will this restrict ZS TAM?

ZS is likely using some kind of compute on their edge servers, but unlike Cloudflare and Fastly, they are not granting outsiders access to the internals. Nothing w/in their edge network is programmable or controllable by customers. You enter a point of their edge, and you get routed to the service you asked for. End of story.

So my answer is NO. I don’t see ZS moving towards Cloudflare and being a programmable edge network - but I obviously DO see Cloudflare moving towards Zscaler (and Fastly eventually too).

With Cloudflare now a direct competitor, certainly that affects the total market share that ZS can get. TAM (total addressable market) is not affected, as it is TOTAL. I’d also say TAM is probably growing in this environment of everyone having a highly scattered workforce.

How quickly Cloudflare replicated Zscaler’s platform on their own Workers platform is what should scare Zscaler investors. I mean, it was a year or more in the making as the individual products were created, but the fact Cloudflare wrapped them up and bundled it all into a Zscaler clone was pretty damn impressive.

Fastly could easily do the same given their extremely similar architecture, or if not them, a customer could develop the same thing on their edge network as a service.

PS this highlights something I only scratched the surface of… who knows what customers, aka the developers on these edge networks, are going to build from here! But I fully expect Fastly and Cloudflare to start being the “Salesforce” of next-gen networking and cybersecurity focused product lines, just like CRM does for Veeva and nCino.

Regarding “the edge”: As we have evolved through time to discuss “the internet” and “the cloud” as pseudo-geographical locations in the world of technology; is “the edge” moving in a similar direction?

Absolutely. Cloud providers will try to blend edge networking into the cloud more and more - but as a term and an architecture, it is here to stay as its own thing. Cloud infrastructure vs. Edge infrastructure.

RetirementDough:

Is it easy for a developer to switch from Fastly to Cloudflare? Why or why not? Here is my thinking. If Fastly is charging on usage and Cloudflare has one time charge not based on usage, would not everyone who starts off on Fastly just move to Cloudflare eventually once their usage spiked? Is this a concern?

Unknown. I’d guess porting code is probably not that terrible between edge platforms, as both support WebAssembly binaries. But the change to the other provider’s APIs for controlling the edge network is probably where the difficulty would be. So that probably nets out to the answer being NO, it is not easy. And likely there are different features between the edge network APIs that aren’t 1-to-1. So I would think that these edge networks are going to be extremely sticky; once a custom application is created, deployed, and successful, there is not going to be much impetus to change it.
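
To illustrate where I think the friction sits, a rough TypeScript sketch (all names invented): the pure request-handling logic could travel between platforms reasonably well, but the thin adapter that talks to each provider’s edge APIs (geo data, caching, KV, routing) has to be rewritten per platform, and those APIs aren’t 1-to-1.

```typescript
// Illustrative sketch only: the portable part vs. the platform-specific part.

// Portable core: plain input-in/result-out logic. Kept free of any provider API
// (or compiled to WebAssembly), this part could move between edge platforms.
export function handleCore(path: string, country: string): { status: number; body: string } {
  if (path.startsWith("/eu/") && !["DE", "FR", "ES", "IT"].includes(country)) {
    return { status: 403, body: "not available in your region" };
  }
  return { status: 200, body: `hello from the edge, visitor from ${country}` };
}

// Cloudflare-Workers-style adapter (sketch). A Fastly Compute@Edge deployment
// would need its own adapter written against Fastly's request/geo/caching APIs,
// which don't map 1-to-1 onto this one. That adapter layer is the switching cost.
export default {
  async fetch(request: Request): Promise<Response> {
    // request.cf is a Workers-specific extension carrying geo data, among other things.
    const country = (request as Request & { cf?: { country?: string } }).cf?.country ?? "unknown";
    const { status, body } = handleCore(new URL(request.url).pathname, country);
    return new Response(body, { status });
  },
};
```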

“One time charge”? Not sure where you got that from. Cloudflare has tiered pricing on Workers (what they now call Workers Bundled), which are capped usage levels that are very restricted in the amount of compute/memory they can employ. The new Workers Unbound is going to be priced like what we assume Fastly’s will be: usage-based and not capped. Once both are in GA we can compare pricing and features better.

WillO:

“Edge compute could be making decisions around how to best maintain latency”…
Is this how Cloudflare got ‘zero nanosecond cold starts’?

That wasn’t what my point was about - I was referring to how the nodes in the edge network could work together to heal the network, work around issues, etc. This isn’t about how Workers loads in the customer’s app modules.

Cloudflare is cheating a bit on how they greatly reduced cold starts. It seems a gimmick and it’s a bit surprising given the great CEO blog post that I linked where he claims speed is not everything. I think they want the head-to-head stats to look better against FSLY but the way they did it is a band-aid.

And, do you see AYX’s new cloud initiative working well with, as you put it, ‘cloud edge network’ capabilities in ‘stitching’ information together?

Now that’s an interesting observation. AYX could absolutely be using edge compute to run models in a place more local to the data source. But we need to see exactly what their cloud efforts on the server side are even going to look like. It’s interesting to ponder the potential though.

Do you think Cloudflare’s bottom line could be demonstrably increased with [consumer-facing apps]?

Yes absolutely. But I don’t believe it will contribute anything near what the enterprise side does.

-muji
long FSLY, NET

21 Likes

Denny, appreciate the response. My choice of words was poor. I realized that Cloudflare had a recurring revenue charge. The question still remains.

Fastly is charging on usage and Cloudflare has a recurring charge not based on usage, so would not everyone who starts off on Fastly just move to Cloudflare eventually once their usage spiked? Is this a concern?

2 Likes

Fastly is charging on usage and Cloudflare has a recurring charge not based on usage, so would not everyone who starts off on Fastly just move to Cloudflare eventually once their usage spiked? Is this a concern?

I don’t have an answer except to use myself as an example. I love free, who doesn’t, but I’m more than happy to pay for mission critical stuff. If that is the way the rest of the world sees it, then your question changes to, “Is Fastly more mission critical than Cloudflare?” Or put another way, “Can Fastly charge more because it is better?” It’s not just what you pay but what you get for your money. Traditional CDN has pretty much become a commodity but Fastly re-architected the whole concept. Is that good enough to charge more? I wouldn’t dare to guess but revenue data will give us the answer.

I was asked by email what I considered “basic numbers.” That varies case by case. To answer this question, revenue data IS “basic numbers.”

Denny Schlesinger

2 Likes

Traditional CDN has pretty much become a commodity but Fastly re-architected the whole concept. Is that good enough to charge more? I wouldn’t dare to guess but revenue data will give us the answer.

My impression from the SSI reports on both companies and the companies’ website postings is that, while there is some overlap, NET and FSLY service different user communities. With a completely restructured and reprogrammed architecture, FSLY targets the enterprise developer community, offering not just speed but superior tools. NET less so.

Both, I think, offer better avenues for interactive applications, but I can’t discern the character of any differences there, except for that of system structure, the benefits of which I can only guess at.

draj

I mentioned in this earnings-related thread, “I just want to understand a bit more about Workers and how it compares with, or lives alongside, Fastly’s Compute@Edge. I don’t like how NET advertised this zero-second cold-start when it doesn’t apply to most sites (I’m not even clear if this is CDN or edge computing related)…” (https://discussion.fool.com/cloudflare-and-fastly-have-to-build-…)

First, the “zero cold start” thing is indeed about edge computing (glad I didn’t have to rethink that one!):
https://cloudflare-rss.livejournal.com
"When are zero cold starts available?
Now, and for everyone! We’ve rolled out this optimization to all Workers customers and it is in production today. There’s no extra fee and no configuration change required. When you build on Cloudflare Workers, you build on an intelligent, distributed network that is constantly pushing the bounds of what’s possible in terms of performance.

For now, this is only available for Workers that are deployed to a “root” hostname like “example.com” and not specific paths like “example.com/path/to/something.” We plan to introduce more optimizations in the future that can preload specific paths."

The reason my memory of this was confused is that this is about edge computing and not delivering content for webpages (CDN). In this context I’m not sure how domain URLs are related to deploying Workers (perhaps “deploying” is about being triggered to answer a request and not distributing a version created by a developer), which means I don’t really know how big of a deal this limitation is. I just don’t understand the language used here. I haven’t seen this limitation discussed outside of this blog post and a few articles that re-publish the feed though.

Regardless, as mentioned upthread, the “zero” part of this is just a trick. It simply starts up while the page is loading so you don’t notice the startup time. It isn’t really loading instantly:
https://engineeringjobs4u.co.uk/eliminating-cold-starts-with…
“Previously, Workers would only load and compile after the entire handshake process was complete, which involves two round-trips between the client and server. But wait, we thought, if the hostname is present in the handshake, why wait until the entire process is done to preload the Worker? Since the handshake takes some time, there is an opportunity to warm up resources during the waiting time before the request arrives.”

…The load time simply isn’t noticed because the user is waiting for the page to load. Again, I’m not real clear on what edge computing has to do with web page loading. Why do you have to load a webpage to do some computing? It sounds like they need this extra layer to seem fast? Maybe this is all so narrowly focused on webpage usage that I’m missing the explanation of the bigger picture. After all, it isn’t the webpage requesting the work really, it is just one possible trigger for it. What about a process that calls another process? No page loads going on there, right? Chain enough processes and that “zero” will become a number bigger than zero real fast!
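
For what it’s worth, here’s my mental model of the trick as a rough TypeScript sketch (this is not Cloudflare’s actual code, and the helpers are made up): the Worker load is kicked off as soon as the hostname is known from the first handshake message, so it runs concurrently with the remaining handshake round-trips instead of after them.

```typescript
// Illustrative sketch only, not Cloudflare's implementation: the "zero" cold start
// comes from overlapping the Worker load with the rest of the TLS handshake,
// not from eliminating the load time.

// Hypothetical stand-ins so the sketch runs on its own.
async function loadWorkerFor(hostname: string) {
  await new Promise(resolve => setTimeout(resolve, 5)); // pretend compiling the Worker takes ~5 ms
  return { run: async (req: Request) => new Response(`handled ${new URL(req.url).pathname} for ${hostname}`) };
}
async function finishHandshake(): Promise<void> {
  await new Promise(resolve => setTimeout(resolve, 10)); // pretend the remaining round-trips take ~10 ms
}

async function acceptConnection(sniHostname: string, firstRequest: Request): Promise<Response> {
  // The hostname arrives in the first handshake message (SNI), so start loading now...
  const workerLoading = loadWorkerFor(sniHostname);

  // ...while the rest of the handshake proceeds in parallel.
  await finishHandshake();

  // By the time the request arrives, the load has usually already finished, so no
  // *visible* cold start. The load time didn't vanish; it was hidden in the handshake.
  const worker = await workerLoading;
  return worker.run(firstRequest);
}

// A chained process-to-process call has no handshake to hide behind, which is
// exactly why the "zero" stops being zero once you chain enough of them.
```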

Apparently Cloudflare’s edge computing platform, Workers, has been “generally available” for over 2 years and “more than 10% of all requests flowing through our network today use Cloudflare Workers.” This is a good read: https://blog.cloudflare.com/cloudflare-workers-serverless-we…

You will notice while reading the above link that Cloudflare continuously points out that they see high-performance needs as a niche that isn’t all that important to them. It seems to me that this is the fundamental difference between Cloudflare and Fastly. Fastly is built to handle the pricier niche need for high-performance applications while Cloudflare is focusing on solving a wider need. This is how I have been thinking of the difference in their CDN offerings, but it seems this will be the difference between the companies in all things. I guess this makes sense if both of their networks are built for this dynamic and CDN was just built on top of that anyway.

Thoughts on all of this? Am I totally off base here?

- Rafe
Long both Fastly and Cloudflare and getting increasingly excited about Cloudflare

P.S. I didn’t get a chance to read all of your blog post, Muji, so I apologize if this was covered. I am looking forward to reading it all though! Thank you for creating it!

4 Likes

Thoughts on all of this? Am I totally off base here?

It’s very simple: there is no such thing as a ZERO cold start; you have to fake it. They say how they do it:

But wait, we thought, if the hostname is present in the handshake, why wait until the entire process is done to preload the Worker? Since the handshake takes some time, there is an opportunity to warm up resources during the waiting time before the request arrives.

BTW, this reminds me of a computer program I wrote in the early 1960s. Nothing seems to ever change! The input was IBM punched cards, and there were two types of cards, one of which took longer to process than the other. The shorter process was short enough to run the card reader at full speed, but only if you didn’t wait until the end of the process to start reading the next card. Pre-starting the next card saved one complete read cycle and practically cut the processing time in half.*

They are doing essentially the same thing 60 years later! LOL

Denny Schlesinger

* IBM charged by the hour for this job and this little “trick” cut the customer’s cost in half!

5 Likes