
Way for a forum to avoid being penalised for bad outbound links

     

funfun168

2:15 am on Jun 12, 2006 (gmt 0)

5+ Year Member



If a search engine such as Google or Yahoo delists or penalises a site, it can kill the site. For a forum, it is difficult to control the content and outbound links in members' posts. Is there any way to avoid being penalised by Google and Yahoo?

One case comes to mind: if a member posts a bad link (i.e. a link that has already been penalised by a search engine) or, in an extreme case, a link to a link farm, what will the search engines do to my forum?

Are there any other cases that would lead to a penalty?
May we discuss them here? Thank you.

jonrichd

11:57 pm on Jun 12, 2006 (gmt 0)

10+ Year Member



FunFun, Google and the other search engines are aware of the problems webmasters of forum and blog sites have with links from members that might not be the best quality.

They have provided a method of saying 'do not trust this link' that you could implement in your forum code.

If a normal link looks like: <a href="goodsite.com">Good Site</a>, then one that shouldn't be trusted should look like: <a href="badsite.com" rel="nofollow">Bad Site</a>.

The 'rel=nofollow' tells the bot you can't vouch for the link, and the search engines shouldn't penalize you for any of these links.
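A forum can apply that attribute automatically when it renders member posts, so individual moderators never have to think about it. A minimal sketch in Python (the function name and the regex approach are my own; a production forum would use a real HTML parser rather than a regex):

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that doesn't already have a rel attribute."""
    def _patch(match):
        tag = match.group(0)
        # Leave tags alone if the author of the markup already set a rel value.
        if re.search(r'\brel\s*=', tag, re.IGNORECASE):
            return tag
        # Drop the closing '>' and append the nofollow hint.
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\b[^>]*>', _patch, html, flags=re.IGNORECASE)

print(add_nofollow('<a href="http://example.com">Example</a>'))
# prints: <a href="http://example.com" rel="nofollow">Example</a>
```

Run on each post body just before output (or once when the post is saved), this marks every member-supplied link as unvouched-for without touching the rest of the markup.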

2by4

12:59 am on Jun 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



now if yahoo's slurpbot would just start obeying nofollow it would all be great.

theBear

1:55 am on Jun 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Gee 2by4,

You want software that works? After decades of crudware all of a sudden this stuff is supposed to work as advertised?

Let me find my hat, ah there it is, firmly on cranium.

You really should wear your aluminium beanie more. It helps protect against the harmful incorrect software (R_____d) ray.

2by4

12:05 am on Jun 15, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



theBear, I know, I know, it's unreasonable to expect software to work, that's a given.

But you'd think, somewhere, deep in the heart of yahoo, there might be one lonely programmer who could manage to execute something like:

if [[ "$link" == *'rel="nofollow"'* ]]; then
    continue   # do not index or follow
fi

I know it's unreasonable to expect this, absurd almost, but for years now slurp has routinely accessed pages blocked in robots.txt at least once or twice before giving up, and of course it has never succeeded in not following nofollow stuff.

Maybe the yahoo guys haven't been able to get any decent programmers now that google has been slurping them all up?

I can see having a buggy algo, but not being able to parse and obey a simple little line of text in a link? yahoo must really not care about its search division.

Or it's just too hard to implement it. Oh wait, I'd better patent my algo above before somebody tries to steal it.
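To be fair to the joke, obeying the attribute really is only a few lines on the crawler side. A sketch in Python using the standard library's HTMLParser (the class name and the sample markup are my own):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect hrefs from <a> tags, skipping any marked rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != 'a':
            return
        attrs = dict(attrs)
        # rel is a space-separated list of tokens; nofollow may be one of several.
        rels = (attrs.get('rel') or '').lower().split()
        if 'nofollow' in rels:
            return  # obey the hint: do not queue this link for crawling
        if 'href' in attrs:
            self.links.append(attrs['href'])

p = LinkCollector()
p.feed('<a href="/good">x</a><a href="/bad" rel="nofollow">y</a>')
print(p.links)  # prints: ['/good']
```

A crawler would feed each fetched page through a parser like this and only enqueue the links that survive the rel check.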

 
