If your website is popular enough
The site I'm concerned about is indeed a top-shelf site with a solid corporate reputation, great backlinks, and decent traffic, though not massive viral traffic. I'm new on board and haven't seen the analytics yet, but I'm guessing it's only about 200K visitors per month.
That said, these services have a lot to offer less popular sites. I'm testing a couple of them on less important sites that only get 5K to 10K page views per month. The benefits include things like:
- reducing the number of requests on the origin server
- serving up assets from servers closer to the user
- splitting requests between the cache and the origin server (again, a speed advantage)
- identification of bad bots, spammers, and so forth, plus various filtering as a first-line defense against comment spammers. I'll know how effective this is in another week, when I can compare Mollom/Akismet data from before and after applying these.
- some offer various file aggregations (combining CSS files or, sometimes, inlining them) and image caching optimized to device size (essentially "responsive" images).
So you can get a nice performance boost, as reported in YSlow or PageSpeed, even if the site only gets one request a day. Okay, in practice you need enough requests to keep assets live in the cache, but a page request per minute would do that for shared assets.
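As a rough illustration of the cache/origin split mentioned above: a shared cache (a CDN edge, say) decides from the response headers whether it may store and reuse an asset at all. Here's a minimal sketch in Python; it's deliberately simplified, since real caches follow RFC 9111 with far more rules.

```python
# Sketch: decide whether a shared cache (e.g., a CDN edge) may store a
# response, based on its Cache-Control header. Simplified illustration;
# real caches implement RFC 9111 with many more directives and defaults.

def shared_cache_may_store(cache_control: str) -> bool:
    """Return True if a shared cache is allowed to store the response."""
    directives = [d.strip().lower() for d in cache_control.split(",") if d.strip()]
    names = {d.split("=")[0] for d in directives}
    if "no-store" in names or "private" in names:
        return False
    # Anything marked public or given an explicit freshness lifetime is storable.
    return bool(names & {"public", "max-age", "s-maxage"})

# Static assets typically send long lifetimes so edges keep them warm:
print(shared_cache_may_store("public, max-age=31536000"))  # True
print(shared_cache_may_store("private, max-age=0"))        # False
```

The point of the "page request per minute" caveat is exactly this: only responses the edge is allowed to store, and that keep getting requested, stay warm in the cache.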
But that brings us to....
Are you having slowdowns due to heavy traffic?
working on other aspects
You're certainly right that it's preferable to simply reduce requests at the origin if possible.
That's phase I, and if we get the numbers we want, we'll stop there. I think we can handle this on our end by reducing the number of requests, static caching (possibly a local reverse proxy like Varnish rather than a service like Cloudflare), and so on.
But I'm trying to think ahead in case we can't get there, and just generally trying to raise the topic. The site is already using sprites, for example, so I can't reduce requests that way. At a certain point, without a redesign (not in the budget), it may need extra help.
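On the reducing-requests front, the "combining CSS files" aggregation that some of these services offer is also easy to do yourself as a build step. A minimal sketch in Python (the file names are just examples; a real build step would also minify and fingerprint the output):

```python
# Sketch: concatenate several CSS files into one to cut the number of
# HTTP requests. File names below are hypothetical examples.
from pathlib import Path

def combine_css(sources: list[str], dest: str) -> None:
    """Write the contents of all source files, in order, to dest."""
    parts = []
    for src in sources:
        css = Path(src).read_text(encoding="utf-8")
        parts.append(f"/* --- {src} --- */\n{css}")
    Path(dest).write_text("\n".join(parts), encoding="utf-8")

# Example: combine_css(["reset.css", "layout.css", "theme.css"], "site.css")
```

Order matters here, since later rules override earlier ones at equal specificity, so the source list should follow the order the files were originally loaded in.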
So to answer your question: I don't think the slowdowns are due to high traffic, but I've just come aboard and don't really have the data to support that. So hopefully not, but if so, I want to be prepared and understand the implications.
extra potential point of failure
It is and it isn't.
1. It is
- nameserver fails and poof! You're offline
- bot blocking goes astray and blocks legit users
2. It isn't
- if the origin server is down or slow, the reverse proxy can still serve up cached documents and keep your site online when it would otherwise be offline.
So before deciding on the impact on uptime, I'd have to think about the whole system and watch the numbers; it's not a simple is/isn't thing.
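The "it isn't" half above is essentially stale-if-error behavior: when the origin errors out, serve the last good cached copy instead. A toy sketch in Python, with a caller-supplied fetch function standing in for the real origin (Varnish and most CDNs offer this via grace/stale-if-error settings rather than anything hand-rolled):

```python
# Toy sketch of stale-if-error: keep the last good copy of each document
# and fall back to it when the origin fails. Illustration only; real
# proxies add TTLs, grace periods, and cache-size limits.

class StaleServingProxy:
    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin  # callable(path) -> body; may raise
        self._cache = {}                 # path -> last good body

    def get(self, path: str) -> str:
        try:
            body = self._fetch(path)
            self._cache[path] = body     # refresh the cached copy
            return body
        except Exception:
            if path in self._cache:      # origin down: serve stale content
                return self._cache[path]
            raise                        # nothing cached; error propagates
```

So whether the proxy is a net plus for uptime depends on how much of the site is cacheable versus how often the new failure modes (DNS, over-aggressive blocking) bite.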
local traffic for sales should get a local host
Thanks for that observation. It doesn't apply to the site in question, but this was one of the issues I was thinking about in terms of the SEO effect of a CDN. Any additional thoughts? My thinking is that since local hosting is relatively rare, it's far more important to have the address, including the zip code, and the phone area code on the site (even if the main phone number is toll-free).
- the tighter your security is, the more visitors you will wrongly label and block.
- the more security checks your CDN performs, the slower your site becomes.
Good points. It's relatively hard to see who is being blocked illegitimately. For testing, I've turned security down to the lowest settings, to do the fewest checks and block only the most nefarious traffic (90% of the blocked IPs are from China and Russia, and the test site has little to no appeal there; I don't think I've seen one US IP blocked yet; on the flip side, I'm still letting most bots through).
Anyway, thanks for the thoughtful feedback guys. I appreciate it.