
Google News Archive Forum

    
Will Google Follow These?
Reducing the number of links per page...
rover
msg:99423
8:17 pm on Mar 20, 2004 (gmt 0)

One of our sites recently fell significantly in the Google SERPs. While checking for possible causes, I saw that the Google guidelines say to keep the links on a given page to a reasonable number (fewer than 100). They also recommend breaking up site map pages if they have more than 100 links.

We do have several pages on the site with more than 100 internal links. These are clearly and cleanly listed, and they exist not to game indexing but to make navigation easier for our users.

So, to adhere to the Google guidelines but still keep the site user-friendly, I am now looking at driving these links through JavaScript. Does anyone know if the following structure will stop Google from counting these as the type of links it means when applying the 100-links-per-page criterion?

I've heard that Google doesn't follow JavaScript links, but it does seem the links would be easy for Google to read, since the URLs sit right there in the markup, so I wonder if this will work:

Here is the sample code:


<script>
// send the browser to the URL stored in the selected option's value
function go(e){
window.location = e.options[e.selectedIndex].value;
}
</script>

<select name="go" onchange="go(this)">
<option value="h**p://www.ourdomain.com/url1.html">Country1</option>
<option value="h**p://www.ourdomain.com/url2.html">Country2</option>
<option value="h**p://www.ourdomain.com/url3.html">Country3</option>
Etc.
</select>

Should I be doing this in another way to make sure that Google or search engines such as Yahoo don't think we're trying to over-optimize?

Thanks in advance for any help or insights anyone can provide...

 

mcavic
msg:99424
9:58 pm on Mar 21, 2004 (gmt 0)

That code should stop Google from indexing the links. Since Googlebot doesn't submit forms, it shouldn't look at the option values.

rover
msg:99425
5:25 pm on Mar 22, 2004 (gmt 0)

Thanks very much for your input on this. Somehow it still bothers me that the full URLs are just sitting there in the HTML code, so, just to be safe, I've decided to route the links through an external Perl CGI script, so that the URLs don't appear anywhere on the HTML page or in any other accessible file.
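Roughly what I have in mind is something like the sketch below (the go.cgi name, the dest parameter, and the %pages map are all made up for illustration, and ourdomain.com is a placeholder): the drop-down submits an opaque key, and the key-to-URL mapping lives only on the server.

#!/usr/bin/perl
# go.cgi -- redirect script; the key-to-URL map lives only on the
# server, so the real URLs never appear in the HTML a spider reads
use strict;
use CGI qw(param);

my %pages = (
country1 => 'http://www.ourdomain.com/url1.html',
country2 => 'http://www.ourdomain.com/url2.html',
country3 => 'http://www.ourdomain.com/url3.html',
);

# look up the opaque key from the form; fall back to the home page
my $url = $pages{ param('dest') || '' } || 'http://www.ourdomain.com/';

# emit an HTTP redirect to the resolved URL
print "Location: $url\n\n";

The select would then carry keys like country1 in its option values and submit its form onchange, rather than holding the URLs themselves.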

I generally structure my sites to be user-friendly first and foremost, as Google suggests, but I still have to live with their guideline that a maximum of 100 links on a page is "reasonable", and with the implication that they don't like pages exceeding that, which could add some negative weighting to a page's SERPs. (Interesting that major sites like Yahoo often have pages with more than 100 links, though.)

JasonD
msg:99426
5:28 pm on Mar 22, 2004 (gmt 0)

Why not change it slightly, so a full URL isn't viewable without parsing the JavaScript?

It may not hold up against the rumoured JS-parsing testbot, but anyway.....


<script>
// write the select out via JavaScript, with the URL split into
// concatenated pieces so it never appears whole in the HTML source
document.write('<select name="go" onchange="go(this)">');
document.write('<option value="h**p://www' + '.' + 'ourdomain' + '.' + 'com/url1.html">Country1</option>');

// etc. etc.

document.write('</select>');
</script>

creative craig
msg:99427
5:30 pm on Mar 22, 2004 (gmt 0)

Over-optimise? More like de-optimise. If you are planning on doing this on your own site, don't forget that you will also stop the flow of PageRank to these pages.

Also, it will stop the link being counted as a backlink, and no anchor text or other goodies will be passed ;)

rover
msg:99428
6:00 pm on Mar 22, 2004 (gmt 0)

Over-optimise? More like de-optimise. If you are planning on doing this on your own site, don't forget that you will also stop the flow of PageRank to these pages.

Thanks for this important point. In addition to the drop-down box on all of the pages (which uses the CGI script), I still have a standard link on every page to the single page that carries the regular list of links, kind of like a site-map page.

So I no longer have the identical list of 100+ links on every one of the thousand-plus pages of the site, even though that list was user-friendly.

The user can now either use the drop-down list to find the specific categories (still user-friendly) or, if they don't like drop-down lists, follow the link to the single page that has the full list of standard HTML navigation links (and Google can follow it too). That page lists the first 80-90 links and continues on another page, to keep below the 100-link Google limit.

In this case, wouldn't I still get the benefit of having all of the links to the pages, since Google can still follow the trail to all of them?

kalalau beach
msg:99429
9:37 pm on Mar 22, 2004 (gmt 0)

To continue the subject:
I have a site map of, let's say, 500 links.
Google's cache of this site map shows all 500 links.
I am using my PR7 homepage to hold this site map so that each linked page gets a good ranking.
If I split it into a few pages, I will then have to link from five PR6 pages, and the PR passed on will be smaller.
Is there something wrong with that? Is there really a proven limit of 100 links per page?
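To make the trade-off concrete, here is a back-of-the-envelope sketch using the simplified PageRank formula from the original paper. This works on raw scores, not the logarithmic 0-10 toolbar values, so treat it purely as a shape-of-the-math illustration; d is the damping factor, commonly quoted as 0.85.

PR(A) = (1 - d) + d \sum_i \frac{PR(T_i)}{C(T_i)}

Linking all 500 pages directly from the homepage, each destination receives

d \cdot \frac{PR_{home}}{500}

Splitting across five hub pages of 100 links each, a hub gets roughly PR_{hub} \approx (1 - d) + d \cdot \frac{PR_{home}}{5}, so each destination receives

d \cdot \frac{PR_{hub}}{100} = \frac{d(1 - d)}{100} + d^2 \cdot \frac{PR_{home}}{500}

i.e. the homepage-derived share picks up one extra damping factor d for the extra hop.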

rover
msg:99430
9:57 pm on Mar 22, 2004 (gmt 0)

I don't know whether it's proven or not; I do have a site whose homepage has more than 100 links, and it ranks very well.

It does catch my attention though when Google does mention something so specific in their guidelines:

h**p://www.google.com/webmasters/guidelines.html

I would guess that it's just one signal among hundreds, but I'd expect it to count for something, since they mention it so specifically.

It would be interesting to see what others think about this...

antsaint
msg:99431
10:25 pm on Mar 22, 2004 (gmt 0)

How well do those big pages work for the user, though? Is there some sort of organization to the pages that might work better for the customer if they were broken into separate pages? E.g., if links.html has 200 links spread over 4 topics, it might be worthwhile to just make 4 pages with 50 links apiece. Or something like that.

steveb
msg:99432
10:40 pm on Mar 22, 2004 (gmt 0)

Adding extra, unneeded trash to your HTML just so you can fool a search engine is bad policy, especially since the reason for doing so has no merit.

Keypoint
msg:99433
12:37 am on Mar 23, 2004 (gmt 0)

Try a pre-test on Poodle:
h**p://www.gritechnologies.com/tools/spider.go

See how search-engine friendly your site is: can the spider crawl it easily? Will it get good rankings?
