Forum Moderators: open

Message Too Old, No Replies

What's the best way to STOP Google from following certain links?

Is this a good thing to do? Javascript cleverness or robots.txt?

         

matthew2003

2:37 pm on Apr 3, 2003 (gmt 0)

10+ Year Member



We have a PR6 site with about 6000 pages. For usability and legal reasons we'd like to have a standard footer on every page with links to pages like 'TERMS OF USE', 'CONTACT US', etc.

By having links to these pages on every page in the site we are surely boosting their importance unnecessarily?

If you agree, what is the best way to stop Google from following these links?

I am thinking either:
- stick the pages in a non-crawlable folder (disallowed via robots.txt)
or
- link to these pages with some JavaScript link (if you think this is best, can you suggest some example code which you know Google will not follow?)
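For reference, here is a minimal sketch of both options. The folder name and filenames are hypothetical placeholders, and whether a spider follows the JavaScript form is exactly the question at hand:

```
# robots.txt (placed at the site root) -- keeps compliant
# spiders out of a hypothetical /footer-pages/ folder
User-agent: *
Disallow: /footer-pages/
```

```html
<!-- a hypothetical JavaScript-only link: no URL in the href,
     navigation happens in the onclick handler instead -->
<a href="#" onclick="window.location='/footer-pages/terms.html'; return false;">
  Terms of Use
</a>
```

Note that robots.txt stops crawling of the target pages regardless of how they are linked, while the JavaScript approach only tries to hide the link itself.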

Any input appreciated.

Matthew

scottyman

5:41 pm on Apr 3, 2003 (gmt 0)

10+ Year Member



I have also been investigating this recently. If you use JavaScript you need to make sure the http:// is missing. Google can follow a JavaScript link if it contains the full http://www.mysite.com/page, but not www.mysite.com/page on its own. Or that is my understanding.
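To make that distinction concrete, here is a hypothetical pair of links (the URLs are placeholders). The claim above is that a spider scanning the script text can pick out the first, fully qualified URL but not the second:

```html
<!-- full http:// URL embedded in the script:
     the claim is that this can be extracted and followed -->
<a href="#" onclick="window.location='http://www.mysite.com/page'">Page</a>

<!-- no http:// scheme in the script:
     the claim is that this will not be extracted -->
<a href="#" onclick="window.location='www.mysite.com/page'">Page</a>
```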

Psycho1

10:39 pm on Apr 3, 2003 (gmt 0)

10+ Year Member



I have also been investigating this recently. If you use JavaScript you need to make sure the http:// is missing. Google can follow a JavaScript link if it contains the full http://www.mysite.com/page, but not www.mysite.com/page on its own. Or that is my understanding.

Are you sure about that? I'm using JavaScript for some of my site navigation, so I'd actually want Google to see those links. :)