Forum Moderators: skibum


Amazon.com launches "Amazon CloudFront"

Associates e-mail plugs new Amazon Web Service (AWS)


Pfui

9:10 am on Jan 15, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



FWIW, from the site:

"Amazon CloudFront is a web service for content delivery. It integrates with other Amazon Web Services to give developers and businesses an easy way to distribute content to end users with low latency, high data transfer speeds, and no commitments. ..."

[aws.amazon.com]

farmboy

6:49 pm on Jan 26, 2009 (gmt 0)




I read that email. Still not sure what type of content they have in mind or why I would want them to deliver my content.

FarmBoy

stajer

7:37 pm on Jan 26, 2009 (gmt 0)

10+ Year Member



Farmboy - think about all your static content:

- images (even images within a page that is otherwise served from your servers)
- media: video, audio
- text files (pdf, doc, xls, .js)

If you store that content with them, they can deliver it much faster to your end user. AWS puts that content on the edge of the network very close to your end user. It is replicated in many different places. It speeds up page delivery times and increases user satisfaction with your service.
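stajer's offloading idea can be sketched as a simple URL rewrite at page-render time: static assets get pointed at a CDN hostname while dynamic pages stay on your own server. The hostname `cdn.example.com` and the extension list below are illustrative assumptions, not anything Amazon prescribes.

```python
import re

# Hypothetical CDN hostname -- in practice this would be your
# CloudFront distribution's domain, or a CNAME you point at it.
CDN_HOST = "https://cdn.example.com"

STATIC_EXTENSIONS = (".png", ".jpg", ".gif", ".css", ".js", ".pdf")

def rewrite_static_urls(html: str) -> str:
    """Point src/href attributes for static files at the CDN host."""
    pattern = re.compile(r'(src|href)="(/[^"]+)"')

    def repl(match):
        attr, path = match.groups()
        if path.lower().endswith(STATIC_EXTENSIONS):
            return f'{attr}="{CDN_HOST}{path}"'
        return match.group(0)  # leave dynamic URLs untouched

    return pattern.sub(repl, html)

page = '<img src="/images/logo.png"> <a href="/about.html">About</a>'
print(rewrite_static_urls(page))
```

The image URL comes back pointed at the CDN; the page link is left alone, so your own server still handles the actual content.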

farmboy

8:26 pm on Jan 26, 2009 (gmt 0)




images (even images within a page that is otherwise served from your servers)

So if I have a page of content and an image on that page, the image would be served to the visitor via Amazon and the content would be served via my server? That seems like a problem waiting to happen and there doesn't seem to be any problem with the speed of my images now.

I guess I still don't understand the value of this.

FarmBoy

topr8

10:19 pm on Jan 26, 2009 (gmt 0)




no, as far as i understand it you upload your files (eg. css, images etc) to amazon, and they distribute these files onto their servers around the world.

when the image is called they serve it from their server (probably you have to do some dns stuff on a subdomain pointing to aws).

when i looked into this there is a caching system, basically it benefits files that are very, very popular ... but it is less beneficial for less popular files
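topr8's point about popular vs. unpopular files can be shown with a toy edge cache: a frequently requested file stays warm after its first fetch, while rarely requested files keep missing and falling back to the origin. This is a simplified LRU sketch for illustration, not how CloudFront's caching actually works internally.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU cache standing in for a single edge node."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def request(self, path):
        if path in self.store:
            self.hits += 1
            self.store.move_to_end(path)        # mark as recently used
        else:
            self.misses += 1                    # would fetch from origin
            self.store[path] = True
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)  # evict least recently used

cache = EdgeCache(capacity=2)
for i in range(10):
    cache.request("/logo.png")         # one very popular file
    cache.request(f"/rare-{i}.pdf")    # ten one-off files

print(cache.hits, cache.misses)  # 9 11: the popular file hits after its first fetch
```

The one-off files miss every time, so the CDN adds little for them; the popular file is served from the edge on every request after the first.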

ergophobe

11:21 pm on Jan 26, 2009 (gmt 0)




Actually, it would be like having Google host your jQuery library (if it's not jQuery it's something like that) or using the YUI libraries.

So consider the case of files that change infrequently and get used repeatedly.

Benefits to you:
- files distributed across thousands of servers around the world, so you reduce network latency.
- browsers that can only open two concurrent streams per host can download two files from your server and meanwhile grab two static files (javascript libraries, basic CSS, header image and so forth).
- and of course you take some load off your own server.
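The second bullet is the per-host connection limit: serving static files from a second hostname lets the browser open more parallel connections. A deterministic hash keeps each asset mapped to the same hostname, so browser caching still works across page views. The shard hostnames below are made up for illustration.

```python
import hashlib

# Hypothetical shard hostnames -- e.g. CNAMEs that all resolve to the CDN.
SHARDS = ["static1.example.com", "static2.example.com"]

def shard_for(path):
    """Deterministically map an asset path to one shard hostname,
    so the same asset is always requested from the same host."""
    digest = hashlib.md5(path.encode()).digest()
    return SHARDS[digest[0] % len(SHARDS)]

for asset in ["/css/site.css", "/js/jquery.js", "/img/logo.png"]:
    print(asset, "->", shard_for(asset))
```

Hashing rather than round-robin matters here: a random assignment would let the same file land on different hostnames on different page loads, defeating the browser cache.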

Pfui

9:46 pm on Jan 29, 2009 (gmt 0)




FWIW Redux: They announced new, lower pricing tiers [aws.amazon.com] yesterday. Effective Feb. 1.

-----
ergophobe:

If they host our content, there goes any current ability we have to control who gets to -- or more importantly, who doesn't get to -- access that content, right? For example, I'm a major fan of mod_rewrite. But if AWS hosts, their servers => their access controls; and all logs are theirs, too...

Hmm. Even if my data would be aggregated (dunno if that's part of any AWS deal, actually), the scheme feels too remote for my needs. Perhaps a huge operation would benefit -- but I would think they'd be just as keen on the privacy of their logs. And wouldn't they be scaled up already?

Anyway. I'm with farmboy about the iffy utility, at any price. Besides, after amazonaws.com's unceasing, unauthorized conduct [webmasterworld.com] on my server, I'm disinclined to trust them with my, and my visitors', data.

[edited by: Pfui at 9:47 pm (utc) on Jan. 29, 2009]

ergophobe

11:27 pm on Jan 30, 2009 (gmt 0)




>>that content.

Despite the email, this isn't really about content hosting in the usual definition of that word. It's about resource hosting, i.e. hosting frequently used objects that are part of your page. As they say on the site:

Amazon CloudFront is designed for delivery of objects that are frequently accessed – “popular” objects... Websites often contain a small number of files that are shared by every page on the site. This can include graphics, like your site’s logo or navigational images, or supporting files such as cascading style sheets or JavaScript code.

So do you really care who accesses your CSS files? How about your jQuery library? The latter isn't even unique to you; it exists on every site in the world that runs Drupal, as just one example. Or let's say you have a CMS with TinyMCE or some other not-so-tiny Rich Text Editor. You could get it off your server if you're Dugg/Slashdotted.

I don't know what sort of tracking Amazon provides, but except for the bandwidth cost of some super busy site hitting your JQuery library, why would you even care who is grabbing it and why?

Nobody is saying you should give up control of actual content, the URLs on your site or any of that. I wouldn't host unique images there, but I might host resources there to decrease latency and speed up a site.