Forum Moderators: martinibuster


Block link pages via robots.txt

is it safe?


lakr

4:40 am on Jul 31, 2007 (gmt 0)

10+ Year Member



I am going to block my link pages via robots.txt. Is it safe, and does it mean that Google will NOT follow the outbound links within those pages?

If I use:

Disallow: /link/linkcategory1.html

Will Google stop crawling just the page /link/linkcategory1.html, or the entire folder /link/?
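One way to check how that rule is matched is Python's built-in urllib.robotparser, which implements the standard prefix-matching behavior: a Disallow path blocks only URLs that start with that exact path, not the whole folder. A minimal sketch (the robots.txt lines and example.com URLs below are hypothetical):

```python
# Sketch: checking how a robots.txt Disallow rule is matched, using the
# Python stdlib. The robots.txt content and URLs are made-up examples.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
rp.parse([
    "User-agent: *",
    "Disallow: /link/linkcategory1.html",
])

# The rule is a path prefix: only this one page is blocked...
print(rp.can_fetch("*", "https://example.com/link/linkcategory1.html"))  # False
# ...while the rest of the /link/ folder stays crawlable.
print(rp.can_fetch("*", "https://example.com/link/"))                    # True
print(rp.can_fetch("*", "https://example.com/link/linkcategory2.html"))  # True
```

To block the whole folder, the rule would instead be `Disallow: /link/`.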

I am embarrassed to ask such a basic question.

Thanks for any reply.

Lkr

new_seo

4:55 am on Jul 31, 2007 (gmt 0)

10+ Year Member



My friend, I would suggest that you not do this. It's unethical.

cnvi

3:57 pm on Jul 31, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



When you block links pages with robots.txt, you show potential link partners that you won't allow the links from your site to their sites to be crawled by search engines. This can result in lower linkback rates. If you want the best chance of success in obtaining links (through relevant link exchange), don't block your links pages with robots.txt.

dickbaker

9:46 pm on Jul 31, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I spend a lot of time trying to solicit quality, on-topic links for my site. It frankly angers me that webmasters will use various techniques to keep the search engines from following links.

It also makes the job of searching for links that much harder, because I have to look at the source code for every potential link partner's site, and also at the robots.txt file.

If you think that having a disallow: in your robots file will somehow benefit you, I think the ethical thing to do would be to mention that you're doing so on your links page(s). But I'd bet you that the number of sites wanting to trade links with you would drop to zero.
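The manual check described above — reading each potential partner's source code for hidden-link tricks — can be partly automated. A rough sketch using only the Python stdlib's html.parser, scanning a page for meta-robots directives and rel="nofollow" links (the sample HTML is hypothetical):

```python
# Sketch: scanning a potential link partner's HTML for "noindex"/"nofollow"
# signals. Stdlib only; the sample page below is a made-up example.
from html.parser import HTMLParser

class LinkPolicyScanner(HTMLParser):
    """Collects meta-robots directives and rel="nofollow" link targets."""
    def __init__(self):
        super().__init__()
        self.meta_robots = []
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.meta_robots.append(a.get("content", "").lower())
        if tag == "a" and "nofollow" in a.get("rel", "").lower():
            self.nofollow_links.append(a.get("href"))

sample = """
<html><head><meta name="robots" content="noindex, nofollow"></head>
<body><a href="https://example.com/" rel="nofollow">partner</a></body></html>
"""
scanner = LinkPolicyScanner()
scanner.feed(sample)
print(scanner.meta_robots)      # ['noindex, nofollow']
print(scanner.nofollow_links)   # ['https://example.com/']
```

This does not catch every trick (robots.txt rules and JavaScript mouseovers need separate checks), but it flags the two most common tag-level ones quickly.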

lakr

2:19 am on Aug 1, 2007 (gmt 0)

10+ Year Member



Thanks for the replies, friends. Of course, I do not block all the link pages, just the pages with partners who use the same techniques against me.

P.S.: If I block the page via robots.txt, will the PageRank of those pages disappear?

Best,

dickbaker

3:26 am on Aug 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Thanks for replies, friends, of course, I do not block all the link pages, just pages with partners using the same techniques to me."

Do I understand you correctly? Are you saying that you're only blocking links on pages containing reciprocals from sites that are blocking your links?

That makes no sense, unless you're hoping to gain traffic (but not PR) from those sites that are blocking you by using "noindex, nofollow" tags, or "Disallow:" rules in their robots.txt files, or JavaScript mouseovers to hide the hrefs to your site.

I'm sorry, but I wouldn't trust any webmaster who used such techniques. If those webmasters are abusing you now, how much worse will they abuse you in the future?

If I'm misunderstanding this thread, please accept my apologies to all concerned.