Forum Moderators: open


Using robots.txt to exclude link pages

Is this a good practice?


markis00

2:00 am on Nov 19, 2003 (gmt 0)

10+ Year Member



Hello all,

This may have been covered before, but I have a question about using the robots.txt file to exclude link pages.

I have been thinking about doing this. I know that if I do, my link partners would be angry, because I'd essentially be telling them I'm linking to them while the search engines never see those links.

However, if I were to exclude my link pages using robots.txt, Google would only see incoming links to my site, giving my site a huge ranking boost (right?)
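The kind of exclusion I mean would just be a Disallow rule in robots.txt (assuming the links page lived at a path like /links/ -- that path is only an example):

```
# Hypothetical example: keep all crawlers out of a /links/ directory.
# The path is illustrative; it would be whatever the real links page is.
User-agent: *
Disallow: /links/
```

So the pages would still exist for visitors and link partners, but compliant crawlers like Googlebot wouldn't fetch them.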

I think other people have done this too, and I wonder: is it a good practice? If other webmasters found out, they would most likely be angry, but I know it happens and am wondering whether I should do it too.

skipfactor

10:28 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Google would only see incoming links to my site, giving my site a huge ranking boost (right?)

Wrong; if anything you'll take a hit. Since the Florida update, my link partners are ranking higher than I am for my own anchor text. Additionally, your link partners will probably remove your links. Bad idea.

markis00

12:36 am on Nov 20, 2003 (gmt 0)

10+ Year Member



Yeah, I thought so. Thanks.

skipfactor

12:40 am on Nov 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Forgot to mention: from what I've read, you don't lose any PR by linking out; you're just sharing it, if you will.