Forum Moderators: open
I have a site where all the external (outgoing) links go through a script, with the link's database id as a parameter. This script is in a folder that is disallowed for all robots.
I used this method to be able to track the number of outgoing leads I'm producing for the different sites (the counting is done in the script).
Now that I start thinking about it, this leaves the site with no visible outgoing links as far as Googlebot is concerned...
After all, the links all point to a script in the bot-forbidden folder that redirects you after logging your request.
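For the sake of discussion, the kind of script described above can be sketched like this. This is a minimal illustration, not the actual script: the SQLite file and the `links(id, url, clicks)` table are hypothetical names I'm assuming for the example.

```python
import sqlite3

# Hypothetical schema assumed for this sketch:
#   links(id INTEGER PRIMARY KEY, url TEXT, clicks INTEGER)

def track_and_redirect(db_path, link_id):
    """Log a click for the given link id, then return the target URL
    (which the web layer would send back as a 302 Location header).
    Returns None if the id is unknown."""
    conn = sqlite3.connect(db_path)
    try:
        # Count the lead for this outgoing link.
        conn.execute(
            "UPDATE links SET clicks = clicks + 1 WHERE id = ?", (link_id,)
        )
        row = conn.execute(
            "SELECT url FROM links WHERE id = ?", (link_id,)
        ).fetchone()
        conn.commit()
        return row[0] if row else None
    finally:
        conn.close()
```

The point is that the bot never sees the destination URL in the HTML; it only sees a link into the disallowed folder, and the redirect happens after the request is logged.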
Would it be a wise idea to render the links directly (plain HTML a href) when the Googlebot user agent is requesting the page?
Actually, I have some doubts about it, as the HTML produced for the bot would be quite different from the HTML served to other user agents...
In case of a manual review (or a comparison made with another user agent), could this be interpreted as cheating?
Is it actually worth the risk of upsetting Googlebot with 'special, different content' just for it, or will linking to quality sites improve my PR? (It's PR6 at the moment.)
Any ideas/comments?
Günther.
At that point you could simply set up some rules for which user agents you treat as "real" visitors and which you treat as "bots".
This also saves you the worry about whether the content change will upset them, since only a really minor change is required.
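Such a rule set can be as simple as a substring check against the user-agent string. A rough sketch, assuming an illustrative (and deliberately incomplete) list of bot signatures:

```python
# Illustrative bot signatures; a real deployment would maintain its own list.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "baiduspider")

def is_bot(user_agent):
    """Treat the visitor as a bot if any known signature appears
    (case-insensitively) in its user-agent string."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)
```

The page template would then decide, per request, whether to emit the tracking-script URL or the plain destination URL. Note that user-agent strings are trivially spoofed, so this only classifies well-behaved bots.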
- Tony