|Alltheweb prefers small page sizes - WHY?|
doesn't this encourage cloaking?
From my own research, and from that of others, I believe alltheweb strongly favours small page file sizes, but I can't see a good reason for it.
Okay, some large pages take a while to download to the browser, but there are many websites with more than 50k on the index page which download perfectly well, and users wouldn't know any different unless they scrolled down the page immediately. Other search engines, e.g. Google, don't seem to have a problem, so why does alltheweb?
Surely this system positively encourages cloaking.
It may just be that smaller pages tend to be more focused than longer, rambling pages, and thus rank higher.
On Google, off-page factors enter into the algo more so than in Fast, and long rambling pages, particularly if they're linked well, probably have a better chance.
Pardon my ignorance, but I just cannot see any relation between an absolute necessity to use cloaked pages and download time. In my book, only insensitive and irresponsible Web designers keep a huge part of the market waiting for a page to load. If visitors have to wait a minute before knowing where they are, and where they can go from there, they will click elsewhere. If cloaking is some way of making things happen without hurting some big-star Web design ego, then cloakers should go for it!
Some niche market sites can live with 300k+ of Flash or other gizmo stuff; most can't.
I believe that if Fast adjusted its algo for fast-loading pages (just as many others have), it was to respond to the preferences of users, not cloakers. Maybe this is why a couple of cloakers make fast-loading pages?
Fast likes fast loading pages because users want fast loading pages.
I think my point was perhaps not made so clearly. If I could respond to your points:
> I just cannot see any relation between an absolute necessity to use cloaked pages and download time.
There is no absolute necessity to cloak - I didn't say that. Take, for example, a well-themed, legitimately optimized website doing well on the major search engines, including Google. The index page is 50K in size, which in my opinion will load quickly enough to keep the user interested even on a 56K modem.
Now, to achieve a high ranking in Alltheweb (and I am fully aware of exceptions) you are faced with a choice: either,
a) shave some 20k (or more) off your index page, thus potentially losing the successful listings with the other search engines - who rather like the website as it is; or,
b) use cloaking techniques to give alltheweb a smaller page size.
These choices are of course dependent on the theory that alltheweb prefers smaller page sizes - I have no proof of this, I just share a belief in the theory.
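For what it's worth, option (b) usually boils down to serving a trimmed page whenever the request comes from the search engine's spider, identified by its User-Agent string. A minimal sketch of the idea, not a recommendation - "FAST-WebCrawler" was, if I recall, the name alltheweb's spider reported, and the page contents and sizes below are entirely made up:

```python
# Sketch of user-agent cloaking: serve a trimmed page to a known crawler,
# the full page to everyone else. The crawler name and page bodies are
# illustrative assumptions, not real data.

CRAWLER_SIGNATURES = ("FAST-WebCrawler",)

FULL_PAGE = "<html>" + "x" * 50_000 + "</html>"     # ~50K page human visitors get
TRIMMED_PAGE = "<html>" + "x" * 25_000 + "</html>"  # ~25K page served to the spider

def select_page(user_agent: str) -> str:
    """Return the trimmed page for known crawlers, the full page otherwise."""
    if any(sig in user_agent for sig in CRAWLER_SIGNATURES):
        return TRIMMED_PAGE
    return FULL_PAGE
```

The engine then indexes (and measures) the small version while users download the big one - which is exactly the mismatch discussed further down this thread.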
> Some niche market sites can live with 300k+ of Flash or other gizmo stuff; most can't.
300k is quite ridiculous - I was only talking about some 50K.
I am neither advocating nor condemning the use of cloaking - I personally don't use it. I am simply offering a theory that alltheweb may well be attracting cloaked pages.
I know the two choices I offered above for gaining success with alltheweb are simplistic, and there are other factors to consider, but if you have exhausted all other possibilities and feel sure that your file size is the factor preventing you from a good listing, then the choice may become reality.
Take this search [alltheweb.com]. These should be fairly well optimis(z)ed sites, right? Look at the spread of page sizes. I admit #5, at 120 Kb, is a bit of an outlier, but I still cannot see a strong preference for small pages.
That's true, Heini. But...
The site you mention with the 120k size is actually only 9k.
The next largest site in the list is supposedly 51.3k - it is actually only 13K.
Everything else is less than 26k.
Right stavs, I had checked the 120K page too and saw it was a wee bit smaller :)
But I must admit I don't grasp it (not being much of an expert when it comes to cloaking): to my understanding, Fast has indexed, and ranked 5th, a 120K version of the page. What the user sees is a small version. Am I missing something? (It's waaay after midnight over here, I possibly should get some sleep...)
Heini, it's late here as well - UK? - I'm using matchsticks to keep my eyes open ;)
I don't know if those sites are cloaking, or whether alltheweb has miscalculated the file sizes, or what - it is bizarre.
Seeing small pages at the top does not mean that page size is an important part of the Fast algo. The causality does not work that way.
As Robert pointed out, it is easier to keep a small page well focused. Small size is just a by-product of well-designed, well-focused pages.
Splitting your pages into subsections instead of cloaking should allow you to focus each page on the right keywords, so that your users can find exactly what they are looking for. That is what you want, I guess.
I do not think cloaking is necessary in this case: while it would help you attract more traffic, it would also result in users getting the "heavy" page, which is slow to load and probably too long for them to find the info they are looking for.
If the page is already "unsplittable", and you still need to give it a push, cloaking is probably the right approach...