
Does page size bury links in the footer?

     
9:46 am on Sept 17, 2000 (gmt 0) - tedster (Senior Member)

I just had one of those panic attack thoughts.

On several sites, I list text links at the bottom of each page. I cross-link the site this way, and also ensure navigation for browsers with images turned off.

I never gave this a lot of thought before now, but don't spiders have a limit on how big a page they will take? In other words, if the page is over a certain size, do those text links that I placed so carefully down at the bottom just languish, unread, uncrawled, unnoticed?
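One way to put a number on that worry: a spider that truncates a long page reads the raw HTML from the top, so what matters is how many bytes in the footer links begin. A minimal Python sketch, assuming a hypothetical marker comment placed just before the footer navigation block (the file name is a placeholder):

# Report how deep into the raw HTML the footer links start.
# "<!-- footer-links -->" is a hypothetical marker you would place
# just before the footer block; "index.html" is a placeholder.

def footer_link_offset(path, marker=b"<!-- footer-links -->"):
    html = open(path, "rb").read()
    offset = html.find(marker)
    return (offset, len(html)) if offset >= 0 else None

result = footer_link_offset("index.html")
if result:
    offset, total = result
    print("footer links start at byte %d of %d (%.1fk into the page)"
          % (offset, total, offset / 1024.0))
else:
    print("marker not found")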

2:16 pm on Sept 17, 2000 (gmt 0) - rcjordan (Senior Member)

Yes, but I don't think it's supposed to be a problem unless you approach 45-50k on the page. Does anyone have a breakdown of how many K the major SEs index before they leave a page?

5:06 pm on Sept 17, 2000 (gmt 0) - brett_tabke (Administrator)

For pure indexing purposes, you don't get into trouble until the html gets to around 60k. The cutoff point is 64k with Alta, Excite, and Ink. Google and Fast will take larger pages: up to 102k for Google and 128k for Fast. However, I think there is pretty clear evidence that pages over 32k of html suffer in rankings (in general). Google has the worst page size bias, with Alta running a distant second. You can find many pages in the top ten on Alta that are in excess of 32k, while it is a rare exception to find one top-ranked in Google that is that large.
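
Those figures lend themselves to a quick sanity check. A minimal sketch in Python that compares a page's raw HTML size against the cutoffs quoted above (the file name is a placeholder):

import os

# Crawl cutoffs as given above, in kilobytes of raw HTML, plus the
# softer 32k mark suggested for rankings. "index.html" is a placeholder.
CUTOFFS_K = {
    "32k ranking comfort zone": 32,
    "AltaVista / Excite / Inktomi": 64,
    "Google": 102,
    "Fast": 128,
}

def check_page(path):
    size_k = os.path.getsize(path) / 1024.0
    print("%s: %.1fk of html" % (path, size_k))
    for name, limit in sorted(CUTOFFS_K.items(), key=lambda kv: kv[1]):
        print("  %-28s %3dk  %s" % (name, limit,
                                    "ok" if size_k <= limit else "OVER"))

check_page("index.html")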

6:36 pm on Sept 17, 2000 (gmt 0) - pete (Preferred Member)


Tedster, I have also resigned myself to this for my top client's site. The site is approximately 1000 pages and provides the referral platform for a massive offline marketing campaign. As a result, it has been decided by committee that no compromise can be made on design for the search engines. So the menu structure (with all the main keywords in it) is dynamically built and provides no food for spiders to crawl. I have very little license regarding the copy and layout, and they are using quite a bit of javascript as well. The content-to-code ratio doesn't look great either!
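
For anyone wanting to put a rough number on that content-to-code ratio, here is a crude standard-library sketch: it counts visible text characters against total HTML bytes, skipping script and style blocks. The file name is a placeholder, and the method is deliberately simple.

from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.skip = 0          # depth inside <script>/<style> blocks
        self.chars = 0         # visible text characters seen

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.chars += len(data.strip())

def content_to_code_ratio(path):
    html = open(path, encoding="utf-8", errors="ignore").read()
    parser = TextExtractor()
    parser.feed(html)
    return parser.chars / max(len(html), 1)

print("content-to-code ratio: %.0f%%"
      % (100 * content_to_code_ratio("index.html")))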

As a result, I have gone for a very basic optimization of the current site, including the text links in the footer (which feels like providing the engines with the crumbs instead of the bread). The average page size is about 20k, well below the 32k stipulated by Brett. This comes as a relief; initially I thought it was around the 40k mark but wasn't sure.

I will focus primarily on using other domains to provide feeder traffic. It's a pity; this would have been a fantastic site to build as a collection, and I think it would have done well in the engines.

I am quite keen to see how you shape up with this, Tedster, and whether you have any problems getting spiders to crawl. Our site will be completed at the end of this month. I will be watching my submissions and spider activity like a hawk and will let you know if I encounter any problems.

7:18 pm on Sept 17, 2000 (gmt 0) - tedster (Senior Member)


Thanks for all the input. This forum community has saved me so much time and aggravation. I'm very grateful.

I was wondering why Google would prefer the shorter pages. My first thought was that sites with shorter pages are stickier, but I'm not certain about that.

From what I recall, Direct Hit measures stickiness. Does anyone know if DH also shows a preference for shorter pages? Do any other SEs measure stickiness and not merely clicks?

pete: I'm very interested in how exclusively using feeder domains works. I also have a client who has untouchable copy. Right now, they're relying on banner ads.

7:21 pm on Sept 17, 2000 (gmt 0) - NFFC (Senior Member)


>As a result, it has been decided by commitee that no compromise can be made on design for search engines.

In a similar situation here, pete. Great fun, isn't it?

My approach has been similar to yours in trying to gain at least some minimal input on the main site. This has been very hard work and in my case will have very little impact on the overall placement; even the title tag is out of bounds. Looking on the bright side, the main site is superbly optimised for the phrase "Welcome to". ;)

The only success has been convincing them to have a "printer friendly" version of the site, although everyone is aware of the main purpose of these pages. We have almost full control over both the layout and link structure of these pages, subject to a quick once-over from the "committee". The important factor is that link popularity is retained, in contrast with "promotional" domains.
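
For the curious, the mechanics of such a printer-friendly copy can be as simple as stripping the heavy markup while keeping the text and plain href links intact, so the pages still pass link popularity. A hypothetical sketch using the BeautifulSoup library; the tag lists and file names are guesses, not anything from NFFC's actual setup.

from bs4 import BeautifulSoup

def printer_friendly(html):
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "img", "iframe"]):
        tag.decompose()        # drop non-text payload entirely
    for tag in soup(["table", "tr", "td"]):
        tag.unwrap()           # flatten layout tables, keep their contents
    return str(soup)

# Placeholder file names for illustration.
lean = printer_friendly(open("page.html").read())
open("page-print.html", "w").write(lean)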

11:39 am on Sept 21, 2000 (gmt 0) - pete (Preferred Member)


Ted, apologies for the delay, but I have been hellishly busy.

Setting up separate feeder domains, or 'promotional domains', is pretty simple really.

1) Register domains and open up hosting accounts - how many depends on the client's demands and budget.
2) Design search-engine-friendly pages with optimized content. I used to have single doorways which did very well in the engines and provided great traffic. I have started to extend these to satellite sites (a la themes) which all lead back to the main site. It is critical that these promotional sites do not create the impression of being glorified banner ads. This is a design challenge.

The above strategy is far from bulletproof. Firstly, one needs buy-in from the client to extend their brand. NFFC, I like the printer version idea! Secondly, it is very difficult to get these satellites into directories and build up decent link popularity. I have extended some of these satellites to 60-70 pages to conform to themes and created new identities to help with getting directory listings.

Also, if not implemented correctly or abused, this strategy could lead to all the domains and the main site being banned (even if it is a dog!).

That said, this has worked for me (definitely so if your only other source of traffic is banner advertising). Give me a shout if you need any other input.

Regards,
Pete

11:08 pm on Sept 21, 2000 (gmt 0) - tedster (Senior Member)


Thanks, Pete. Also NFFC, for the "printer-friendly" idea. That one just might fly.

In terms of feeders, my guess is that full-blown satellite sites will be the answer here -- not just one or two doorway pages on a domain. Maybe this will evolve into something Air will let me call "driveway pages [webmasterworld.com]" ;) Visitors will navigate through the satellite site just like a driveway, and eventually be drawn by links to the main site.

Pete: Using your present feeder site strategy, does the main site also start to show up in the rankings, at least on the engines that count links? Or is that too much to hope for?