Forum Moderators: Robert Charlton & goodroi


Dealing with outbound links

Google specific


Ept103

7:11 pm on Mar 8, 2005 (gmt 0)

10+ Year Member



My problem is that my website has an extreme number of outbound links.

Note: The most important goal here is to stay in Google's good graces (follow the webmaster guidelines so we're not penalized in any way); they send a large majority of our traffic. A close second goal is to be user friendly. Without Google our site wouldn't exist, so we have to make sure we are 100% strict (or as close as possible) with their guidelines. It sounds bad that user friendliness isn't the number 1 concern, but there wouldn't be users if it wasn't for Google.

I'm curious to hear feedback about which of the following situations will be "better" in Google's eyes:
Note: I'm trying to find a way to deal with outbound links which, because of their excessive quantity, can't be checked by hand for linking to bad neighborhoods, etc.

1) A PHP redirection script used for every outbound link; it can issue either a 301 or a 302.
Down side: Potential of linking to "bad neighborhoods"
Question:
-If a link goes through a redirection script and the script is blocked via robots.txt, does the link get crawled and open the door for a potential bad neighborhood penalty?
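For reference, blocking a redirector in robots.txt is a one-line rule. This is a minimal sketch assuming the script lives at /out.php (a hypothetical path; substitute whatever the real script is called):

```
User-agent: *
Disallow: /out.php
```

Outbound links would then point at the blocked script, e.g. <a href="/out.php?url=http://www.widgets.com/">, so compliant spiders see the link but are told not to fetch its destination via that URL.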

2) Javascript outbound links, either stored in an external .js file (blocked from search engines) or written as regular javascript links on the page (uncrawlable in some way).
Down side: Google may frown upon excessive use of javascript.
Question:
-Is this cloaking?
-Has excessive use of javascript ever been shown to cause a penalty?
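To illustrate option 2, here is a minimal sketch of keeping outbound destinations in an external .js file that search engines are blocked from fetching. The link table, the key names, and the function names here are all hypothetical, made up for the example; the page itself would contain only an onclick handler, with no crawlable href:

```javascript
// Hypothetical link table that would live in an external .js file
// blocked from search engines via robots.txt.
var outboundLinks = {
  widgets: "http://www.widgets.com/",
  gadgets: "http://www.gadgets.example/"
};

// Resolve a link key to its destination URL, or null if unknown.
function resolveOutbound(key) {
  return outboundLinks.hasOwnProperty(key) ? outboundLinks[key] : null;
}

// In the page markup, an element such as
//   <span onclick="go('widgets')">Widgets for sale</span>
// would call this to navigate; nothing crawlable appears in the HTML.
function go(key) {
  var url = resolveOutbound(key);
  if (url) {
    window.location.href = url;
  }
}
```

Whether a spider can discover these destinations depends on it not executing the script; the HTML itself carries no <a href> for it to follow.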

3) Plain text outbound links
Down side: Potential of linking to "bad neighborhoods"
Question:
-Does rel="nofollow" make Google ignore the links, i.e. get rid of the bad neighborhood problem?
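For completeness, the rel="nofollow" variant of a plain text outbound link (using the same hypothetical widgets URL as above) looks like:

```
<a href="http://www.widgets.com/" rel="nofollow">Widgets for sale</a>
```

The link remains an ordinary clickable hyperlink for users; the attribute is a hint to search engines about how to treat it.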

4) TEXT URLs, no outbound links. Example:
Title: Widgets for sale
URL: www[dot]widgets[dot]com (without the [dot]'s of course, in the form of a URL just not a hyperlink)
instead of <a href="http://www.widgets.com">Widgets for sale</a>
Down side: Usability
Questions:
-Does Google crawl text URLs?
-Will Google penalize for text URLs?
-Is there anything wrong with text URLs in this situation compared to links, aside from usability?

Note: The links are NOT paid nor necessarily reciprocal, so there's no link partner obligation.

2by4

6:20 am on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Number one is what WebmasterWorld uses, more or less. That's my preference too, although not used to excess; I don't think that's safe. If you block the redirector script in robots.txt, the spider will see the link but won't follow it, since it goes to a page blocked by robots.txt. No reason I can think of to use a 302 or 301; just set the header Location to the link URL. I think this is a good way to avoid bad neighborhood penalties etc., but I have a sneaking suspicion redirected links will have problems in the future; it's a thing done almost only by SEOs, for SEO reasons. Although there are totally legitimate uses: on a forum, for example, if you use that, you don't have to worry as much about people linking to bad neighborhoods.

larryhatch

6:49 am on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If your #1 concern is Google's good graces, I would stick with straight, honest <a href=http://www.. type outgoing links. Not even rel=nofollow or other tricks, just straight HTML hyperlinks.

IMHO this is probably the most user friendly as well, since users can hover and see the link URLs before they click. It shortens your code for faster-loading pages. All sorts of benefits accrue when you play by the rules: your code is cleaner and shorter, and you don't have to hide and keep track of URL conversion files.

The downside is converting many links back to the honest kind. One possible upside is Google's perception (later if not already) that you are white hat. That can't hurt things. - Larry