Our outbound links go through a file called link.php, which counts the clicks. It has a function called url which, when used with the syntax below, counts the click and then redirects the visitor to an outside site.
Example:
[example.com...]
This would count a click for example2.com in our script and then redirect the visitor to example2.com.
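For reference, a rough sketch of what a counting redirect like link.php might look like. The database layer, table name, and column names here are my own assumptions, not the poster's actual script:

<?php
// link.php -- hypothetical sketch of a click-counting redirect.
// Assumes a "clicks" table with "url" and "hits" columns (made up for illustration).
$url = isset($_GET['url']) ? $_GET['url'] : '';

if ($url === '') {
    header('Location: /');   // nothing to redirect to, send the visitor home
    exit;
}

// record the click (mysqli and the schema are assumptions)
$db = new mysqli('localhost', 'user', 'pass', 'directory');
$stmt = $db->prepare('UPDATE clicks SET hits = hits + 1 WHERE url = ?');
$stmt->bind_param('s', $url);
$stmt->execute();

// send the visitor on with a temporary (302) redirect
header('Location: ' . $url);
exit;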
The problem is that we have over 100,000 outbound links on our site, so it is virtually impossible to check them all manually.
What preventive measures can I take to make sure I'm not penalized for broken links, or for linking to sites involved in methods that search engines frown upon? (I'm referring to the links after the url= parameter.)
I would prefer a script that redirects in such a way that it doesn't count as an outbound link. Is that possible?
I'm seriously thinking about writing a script that puts the link in a text box, since I'm sick of seeing 404 errors every time I run a validator. I don't want to be penalized if any of these links are bad.
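If you did go the text-box route, it could be as simple as printing the target URL into a read-only input instead of redirecting, along these lines (purely illustrative, and the parameter handling is assumed):

<?php
// Hypothetical alternative to redirecting: show the destination URL in a
// text box so the visitor copies it manually and no crawlable link exists.
$url  = isset($_GET['url']) ? $_GET['url'] : '';
$safe = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
?>
<p>Copy and paste this address into your browser:</p>
<input type="text" readonly="readonly" size="60" value="<?php echo $safe; ?>">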
Any thoughts?
Since it is a "redirect" and I don't actually link to anything directly, could I still be penalized for my redirected links? (Every outside link on my site uses the link.php?url= format.)
Have you tried something like Xenu link checker to check your links and cull the bad ones? It might be a big project the first time through if you have 2000 broken links, but I would think that if you ran it monthly it would only find a few new dead pages every month.
One problem now, though, is that with some registrars, when a domain goes dormant it reverts to a parked ad page at the registrar. So although the link is no longer valid in any real sense, it won't return a 404 either, which devalues the output of automated link checking.
I'm not sure I understand why the links are broken and why you're getting 404 errors. Is it because the destination site has gone down? You should be able to send any header you want from your end using header().
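For instance, if a destination is dead you could return an error status from link.php yourself instead of forwarding the visitor. A rough illustration (the error page include is hypothetical):

<?php
// Rough illustration: send a 404 from your own script rather than
// redirecting the visitor to a dead destination.
header('HTTP/1.0 404 Not Found');
include 'custom-404.php';   // hypothetical custom error page
exit;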
Yes, the problem is with the destination sites.
I've set up a script that checks the headers on the outbound link before sending the end user to the site. If the destination returns a 404, the script sends them to a custom error page on my website instead of the broken link.
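A sketch of that kind of pre-flight check, assuming get_headers() is available and allow_url_fopen is on; the error-page path is made up, not the poster's actual setup:

<?php
// Sketch: check the destination's response before redirecting.
$url = isset($_GET['url']) ? $_GET['url'] : '';

$headers = @get_headers($url);            // false if the host is unreachable
$status  = $headers ? $headers[0] : '';   // e.g. "HTTP/1.1 404 Not Found"

if ($headers === false || strpos($status, '404') !== false) {
    // destination is unreachable or gone: show our own error page instead
    header('Location: /broken-link.php');
} else {
    header('Location: ' . $url);
}
exit;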
The problem I can't find a workaround for is non-404 sites: destinations that return a 200 but are potentially "bad neighborhoods" and so on. This directory has at least 100,000 outbound links, all passing through the script in my first post, and it's impossible to manually investigate every website to determine whether it is a "bad neighborhood" or something I would be penalized for in search engines such as Google.
I'm trying to figure out whether I'm just being paranoid, or whether there's an actual procedure directory websites use to deal with outbound links like this (other than manually investigating each and every link). Sites such as Yahoo, DMOZ, etc. surely end up listing "bad neighborhood" websites, i.e. sites using black hat optimization methods, and they aren't penalized by Google or other big search engines for linking to them. There must be some standard setup in the web directory business for a script that handles large numbers of outbound links (links that could potentially be "bad neighborhoods") without search engines penalizing you.