Forum Moderators: martinibuster
I'm probably going to try to have a script written that will check "reciprocal"/backward links ... just check, not solicit
So I'm trying to put together a list of reasonable expectations and needed some help from you folks.
What I have so far is:
====================
The links I want to check would be on a special page on the server. Something like /menu/backwardlinks.html and referenced in the robots.txt file
I think I would prefer that the script run on the server rather than on my desktop, and be initiated by typing something like
[mysite.com...]
into my browser
The script would go down the entire list on backwardlinks.html, find each URL and report the following:
-URL of page - found/not found
-Link status - found/not found
-Whether the page is actually linked to the main site - linked/not linked
-Page is actually a "frame" - frame
-Link is buried in a script - script
-Meta robots nofollow (&lt;meta name="robots" content="nofollow"&gt;) - robot nofollow
-Meta redirects Googlebot (et al.?) to another URL - redirect SEBot
-Link has rel="nofollow" - relnofollow
-Link has "arguments" - arguments, query string, etc
Any negatives, i.e., not found, not linked, etc., would be displayed in red.
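For what it's worth, the per-page checks above could be sketched with nothing but the Python standard library. This is only an illustration, not a finished script: the class name, the report field names, and the idea of passing the fetched HTML in as a string are all my own assumptions, and the fetching/reporting side is left out.

```python
# Minimal sketch of the per-page backlink checks using only the stdlib.
# check_page() takes the HTML of one remote page plus your domain and
# reports what it finds. Names and report keys are illustrative only.
from html.parser import HTMLParser

class BacklinkChecker(HTMLParser):
    """Scan one page's HTML for evidence about a backward link."""
    def __init__(self, my_domain):
        super().__init__()
        self.my_domain = my_domain
        self.link_found = False        # page links to my site at all
        self.rel_nofollow = False      # that link carries rel="nofollow"
        self.has_query_string = False  # link has "arguments" (?foo=bar)
        self.meta_nofollow = False     # <meta name="robots" content="...nofollow...">
        self.uses_frames = False       # page is really a frameset
        self.meta_refresh = False      # meta refresh redirect
        self.link_in_script = False    # link only appears inside <script>
        self._in_script = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            href = attrs.get("href") or ""
            if self.my_domain in href:
                self.link_found = True
                if "nofollow" in (attrs.get("rel") or "").lower():
                    self.rel_nofollow = True
                if "?" in href:
                    self.has_query_string = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            content = (attrs.get("content") or "").lower()
            if name == "robots" and "nofollow" in content:
                self.meta_nofollow = True
            if (attrs.get("http-equiv") or "").lower() == "refresh":
                self.meta_refresh = True
        elif tag in ("frameset", "frame", "iframe"):
            self.uses_frames = True
        elif tag == "script":
            self._in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_script = False

    def handle_data(self, data):
        # HTMLParser hands <script> contents through as raw data,
        # so a link buried in JavaScript shows up here.
        if self._in_script and self.my_domain in data:
            self.link_in_script = True

def check_page(html_text, my_domain):
    p = BacklinkChecker(my_domain)
    p.feed(html_text)
    return {
        "linked": p.link_found,
        "rel_nofollow": p.rel_nofollow,
        "arguments": p.has_query_string,
        "meta_nofollow": p.meta_nofollow,
        "frame": p.uses_frames,
        "redirect": p.meta_refresh,
        "script": p.link_in_script,
    }
```

Fetching each URL from backwardlinks.html (e.g. with urllib) and coloring the negatives red in an HTML report would sit on top of this, but the checks themselves are about this simple.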
I'm probably saying some of these terms wrong, but I think you guys get the idea.
Any suggestions for other checks that should be incorporated into the script?
Regards and appreciations
- Ah, never mind, I see you mentioned that.
What kind of user interface is this thing going to have? Are you just going to pass it a file with a bunch of links to check, or are you going to put a web front end on it?
As I envision it, I copy all the links I want checked (some of the people I link to don't link back, and that's okay) to a page called backlinks.html and store it in /menu/, a directory I use on most of my sites for storing SSI snippets. It is off limits to bots that respect robots.txt instructions.
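Keeping that page off limits to well-behaved bots would look something like this in robots.txt (the path here just follows the names used in this thread; adjust to your own layout):

```
User-agent: *
Disallow: /menu/backlinks.html
```

Note this only asks compliant crawlers to stay away; it doesn't hide the page from anyone who fetches it directly.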
Then I type a command like [mysite.com...] and that activates the script.
Then I get a report like backwardlink_report.html that tells me all these things.
I would rather evaluate the problem links individually, just as I pursue link arrangements.
I'm not a coder, so I don't know the difficulties associated with the various tasks I would like performed.