Forum Moderators: martinibuster
Page A: home page
Page B: contact us
Page C: category pages
Page D: product 1 page
You could create a basic link structure like this:
A links to B and B links to A
A links to C and C links to A
C links to D and D links to C
  A
 / \
B   C---D
  A
 / \
B---C---D
A links to B and B links to A
A links to C and C links to A
A links to D and D links to A
B links to C and C links to B
B links to D and D links to B
C links to D and D links to C
A --- B
|  X  |
C --- D
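The full mesh above can be sketched in a few lines of code: every unordered pair of pages links in both directions. This is just an illustrative sketch using the page names from the example.

```python
from itertools import combinations

pages = ["A", "B", "C", "D"]

# In a full mesh, every pair of pages links reciprocally.
links = set()
for a, b in combinations(pages, 2):
    links.add((a, b))  # a links to b
    links.add((b, a))  # b links to a

# 4 pages -> 4 * 3 = 12 directed links (6 reciprocal pairs)
print(len(links))  # 12
```

In general an n-page mesh carries n × (n − 1) directed links, which is why full meshing is only practical for a small set of category pages.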
thanks to Martinibuster for helping me straighten this out
What you are trying to do when you mesh a website is have all of the pages link to each other, or at least the category pages all link to each other.
example
main categories
produce
snacks
frozen
sub categories
produce - apples, oranges, lemons
snacks - cookies, chips
frozen - tv dinners, ice cream
When you hit the produce main category page it will link to the two other main category pages AND ALL of the produce sub category pages. Additionally each of the produce sub category pages will link to ALL of the other produce sub category pages AND each of the other main category pages.
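The grocery-store mesh described above can be sketched as a small function. The category names come from the example; the function itself is just an illustration of the linking rule, not any particular CMS feature.

```python
# Main categories mapped to their sub-category pages (from the example above).
site = {
    "produce": ["apples", "oranges", "lemons"],
    "snacks": ["cookies", "chips"],
    "frozen": ["tv dinners", "ice cream"],
}

def links_from(page, site):
    """Return the pages this page should link to under the mesh rule:
    a main category links to the other main categories plus all of its
    own sub-categories; a sub-category links to its sibling sub-categories
    plus the other main categories."""
    mains = list(site)
    if page in site:  # a main category page
        return [m for m in mains if m != page] + site[page]
    for main, subs in site.items():  # a sub-category page
        if page in subs:
            siblings = [s for s in subs if s != page]
            return siblings + [m for m in mains if m != main]
    raise ValueError(f"unknown page: {page}")

print(links_from("produce", site))  # ['snacks', 'frozen', 'apples', 'oranges', 'lemons']
print(links_from("apples", site))   # ['oranges', 'lemons', 'snacks', 'frozen']
```

Either way, a spider landing on any category or sub-category page is at most one hop from every other page in that section.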
What this does is make as many pages as possible available to the spider right away, without having to climb through a hierarchy.
I have one website now that has just over 40 pages; all of the main sections are connected like this, about 30 pages, and each of those pages has a PR5. The homepage has the same PR as the category pages and the links pages. Makes it a heck of a lot easier to get links if people see they are going on a PR5 page instead of the traditional links directory with a PR2 or PR3.
low value - 1 page
medium value - 2 pages
high value - 4 pages
money pages - 8 pages
That's 15 pages x 4 topics = 60 pages
60 pages + 1 homepage = 61 pages
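The arithmetic above can be double-checked in a couple of lines (the tier names are from the example):

```python
# Pages per topic: low (1) + medium (2) + high (4) + money (8)
pages_per_topic = 1 + 2 + 4 + 8   # 15
topics = 4

total = pages_per_topic * topics + 1  # + 1 homepage
print(pages_per_topic, total)  # 15 61
```

So even a homepage linking to every one of the 60 interior pages stays under the 100-link guideline.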
Well below the 100 link limit. You need to establish the hierarchy visually with font sizes, indents or some other element.
Look at his quote:
Try to link to as much deep content on your index page as possible - yes I know - it is a tall order. However, spiders like first level content. If it is linked off your root page, then many search engines will tend to follow at least those links or rank those linked pages higher.
He had it right back in 1999.
However if you have good content deep inside your website, there's no reason why you can't have some partners link straight into there, that way you are assured of being spidered from deep to deeper. This is a good way of overcoming home page limitations, a way to "push" content to the top.
My homepage has links to all products I'm listing and it's around 500... do I need to break this down into 5-6 subpages with 100 products each?...
Never heard of this 100 link limitation - I've heard about the 100kb page size though...
GoogleGuy has mentioned this on numerous occasions. Google clearly mentions it at their webmaster faqs [google.com].
When learning about the search engines, it's a good habit to look in the search engine "about us" and "help" pages.
"If the site map is larger than 100 or so links, you may want to break the site map into separate pages."
This is just a mere suggestion. There IS evidence however (and a great thread on it six months to a year or so ago) that only the first 100K on a page will be spidered/followed.
It's SO tiring hearing people publish half-truths like they know what they're talking about, but hey, it's stuff like that that makes guys like me rich. ;)
This is just a mere suggestion. There IS evidence however (and a great thread on it six months to a year or so ago) that only the first 100K on a page will be spidered/followed.
How much content is indexed weighed by the kb, and how many links are on a page and subsequently spidered are two different issues.
If Google and people from Google are recommending keeping links to under 100 on a page, I'm not going to play Edward G. Robinson to Google's Moses and insist on a detailed explanation of why they published those recommendations. I'm going to take the hint.
I'm not saying everyone should do it. It's up to you. Me, I'm going to take the hint.
Because most sites have standard menus on each page, the advice to limit links to 100 is to give some breathing room to the page so that you can have your normal menus on the page. I usually have a dozen menu links and a couple of other buttons or text links on each page that each point to interior pages, thus I can easily add 100 additional links and they'll all get indexed, and won't bust the real limit.
Because PR is divided among all pages linked to from a page, the limit was probably put in to keep from having to share PR with more than 100 pages; the calculation could get tedious with more links than that. Remember, PR calculation is an iterative process, one that is repeated until PR settles to a stable value, not necessarily a precise value. Why did they limit the PR iterative cycle to only acquire a stable value rather than a precise one? Because the dang calculation can get out of hand if you don't limit it, and they found that values stabilize after only a few iterations, and that additional iterations of hundreds of calculations did little to make results more accurate. So, if they found they needed to limit the number of iterations in the calculation, I can reasonably assume they set the limit on links per page for the same reasons.
Google only uses integers to rank pages. Pages in SERPs are ranked #1, #2, #3, #4, #5, and so on and so forth, not #1, #1.02795, #2.8473, and #6.2240. So, what is important is that stable relative values be calculated, not necessarily exact values.
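The "iterate until the values stabilize" idea from the posts above can be illustrated with a toy power-iteration sketch. This is an assumption-laden simplification of PageRank (damping factor 0.85, no dangling-page handling), run on the 4-page full mesh from earlier in the thread:

```python
def pagerank(links, d=0.85, tol=1e-6, max_iter=100):
    """Iterate rank values until they change by less than tol -
    a stable value, not a closed-form exact one."""
    pages = sorted({p for pair in links for p in pair})
    n = len(pages)
    # Out-degree of each page (every page here has outbound links).
    out = {p: sum(1 for a, _ in links if a == p) for p in pages}
    pr = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        new = {}
        for p in pages:
            # Each inbound link contributes its source's PR split
            # among that source's outbound links.
            incoming = sum(pr[a] / out[a] for a, b in links if b == p)
            new[p] = (1 - d) / n + d * incoming
        if max(abs(new[p] - pr[p]) for p in pages) < tol:
            return new  # stabilized
        pr = new
    return pr

# The 4-page full mesh: every page links to every other page.
mesh = [(a, b) for a in "ABCD" for b in "ABCD" if a != b]
ranks = pagerank(mesh)
print({p: round(r, 3) for p, r in ranks.items()})
# A perfectly symmetric mesh stabilizes at equal ranks of 1/4 each.
```

The convergence check (`tol`) is the "stable rather than precise" cutoff the post describes: once another pass barely moves the numbers, the relative ordering is settled and further iterations are wasted work.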
I may have just gone off on a tangent, but two cents often are.
According to G they will only crawl 100 links. It's hard to say without seeing but you probably are going to need to have some categories and sub categories.
They definitely crawl more links. I have a page that links to about 500 internal pages. Some introductory text and then just about 500 links. Everything crawled, everything in the index, everything ranking.
OK, it wasn't planned to be so. The list grew because the underlying data grew and I didn't have the time to break it down into a hierarchical structure.
But there is just one of those pages in the whole site, and
it has been there for more than a year
and the PR of the page is rather high and
they seem to like the pages - they all rank nicely.