|Will Google Follow These?|
Reducing the number of links per page...
One of our sites recently fell significantly in the Google SERPs. When checking what the cause might be, I saw in the Google guidelines that it says to keep the links on a given page to a reasonable number (fewer than 100). It also recommends breaking up site map pages if there are more than 100 links.
We do have several pages on the site that have more than 100 internal links. These are clearly and cleanly listed and are not designed to try to get better indexing, etc., but rather for making navigation easier for our users.
Here is the sample code:
<select name="go" onchange="go(this)">
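That opening tag is only part of the pattern. A fuller version of the dropdown-navigation idea might look like the sketch below; the option values and the body of the go() helper are assumptions for illustration, not the poster's actual code:

```javascript
// Hypothetical drop-down navigation handler. The markup would be
// something like:
//
//   <select name="go" onchange="location.href = go(this);">
//     <option value="">Choose a category...</option>
//     <option value="category-1.html">Category 1</option>
//     <option value="category-2.html">Category 2</option>
//   </select>
//
// go() just reads the selected option's value, so the URLs live only
// in <option> values, not in <a href> links a crawler would follow.
function go(sel) {
  return sel.options[sel.selectedIndex].value;
}
```

Since the URL comes out of a form control rather than an anchor tag, a form-ignoring crawler would have nothing to follow here.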
Should I be doing this in another way to make sure that Google or search engines such as Yahoo don't think we're trying to over-optimize?
Thanks in advance for any help or insights anyone can provide...
That code should stop Google from indexing the links. Since Googlebot doesn't submit forms, it shouldn't look at the option values.
Thanks very much for your input on this. Somehow it still bothers me that the full URLs are just sitting there in the HTML code, so to be safe I finally decided to have it use an external Perl CGI script, so that the URLs are not anywhere on the HTML page, or in any accessible file.
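The script described would presumably map a short key sent by the form to a real URL and then redirect. A minimal sketch of that lookup-and-redirect idea (shown in JavaScript rather than Perl, with made-up keys and paths):

```javascript
// Hypothetical server-side lookup: the page only ships short keys
// (e.g. "cat1"); the server script maps them to real URLs before
// redirecting, so no destination URL appears anywhere in the HTML.
const pageForKey = {
  cat1: "/widgets/category-1.html",
  cat2: "/widgets/category-2.html",
};

// Returns the redirect target for a submitted key, falling back to
// the homepage for unknown keys.
function redirectTarget(key) {
  return pageForKey[key] || "/index.html";
}
```

The actual CGI script would emit a `Location:` header with the looked-up URL; the table itself stays on the server.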
I generally structure my sites to be user-friendly first and foremost, as Google suggests. But I still have to live with their guideline that a maximum of 100 links on a page is reasonable, and with the implication that they don't like pages with more than 100 links, which would add some negative weighting to a page's SERPs if it exceeds that limit. (Interesting that major sites like Yahoo and others often have pages with more than 100 links, though.)
It may not work if the rumblings about a JS-parsing test bot are true, but anyway...
<select name="go" onchange="go(this)">
Over-optimise, more like de-optimise. If you are planning on doing this on your own site don't forget that you will also stop the flow of PageRank to these pages.
Also, it will stop the link being counted as a backlink, and no anchor text or other goodies will be passed ;)
|Over-optimise, more like de-optimise. If you are planning on doing this on your own site don't forget that you will also stop the flow of PageRank to these pages. |
Thanks for this important point. In addition to the drop-down box on all of the pages that uses the CGI script, I still have a standard link on all of the pages to the single page that includes the regular list of links, kind of like a site map page.
So now I no longer have the identical list of 100+ links repeated on all of the thousand-plus pages of the site, even though it was user-friendly.
So the user can now use the drop-down list to find the specific categories (still user-friendly), or if they don't like drop-down lists, they can follow the link to the single page that has the full list of standard HTML navigation links (and Google can too). That single page lists the first 80-90 links and continues on another page, to keep below Google's 100-link limit.
In this case, wouldn't I still get the benefits of having all of the links to the pages, since Google can still follow the trail to all of the pages?
To continue the subject:
I have a site map of, let's say, 500 links.
Google's cache of this site map shows all 500 links.
I am using my PR7 homepage to hold this site map so that each linked page gets a good ranking.
If I split it into, say, five PR6 pages, I would then have to link from those, and the PR passed would be smaller.
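Under the simplified textbook PageRank model (each outlink passes d × PR / outlinks, with damping factor d), splitting costs roughly one extra damping hop per page. A back-of-the-envelope sketch, with purely illustrative weights rather than real toolbar values:

```javascript
// Simplified PageRank share per linked page: each outlink passes
// d * PR / (number of outlinks). PR here is an abstract weight.
const d = 0.85;       // classic damping factor
const homePR = 1.0;   // treat the PR7 homepage as weight 1.0

// Option A: all 500 links directly on the homepage.
const directShare = d * homePR / 500;

// Option B: homepage links to 5 hub pages of 100 links each.
const hubPR = d * homePR / 5;         // weight each hub page receives
const splitShare = d * hubPR / 100;   // weight each final page receives

// splitShare / directShare is roughly d: the extra hop through a hub
// page damps what each final page receives by one factor of d.
```

So under this model each page does get a little less after splitting, though only by the damping factor, not by the full difference between a PR7 and a PR6 page.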
Is there something wrong with that? Is there really a proven limit of 100 links per page?
I don't know if it is proven or not, and I do have a site where the homepage has more than 100 links and it ranks very well.
It does catch my attention, though, that Google mentions something so specific in their guidelines.
I would guess that it's just one factor among hundreds, but I would expect it to factor in somehow, since they mention it so specifically.
It would be interesting to see what others think about this...
How well do those big pages work for the user, though? Is there some sort of organization to the pages that might work better for the customer if they were broken into separate pages? E.g., if links.html has 200 links spread over 4 topics, it might be worthwhile to just make 4 pages with 50 links apiece. Or something like that.
Adding extra, unneeded trash to your HTML just so you can fool a search engine is bad policy, especially since the reason for doing so has no merit.
Try a pre-test on Poodle:
See how search-engine-friendly your site is. Can the spider crawl it easily? Will it get good rankings?