| 2:49 am on Sep 14, 2009 (gmt 0)|
Your current set-up is actually a good practice. Simply Disallow the /visit/ directory in your robots.txt and you are good. The nofollow attribute is also a good practice but there's no need for you to make all those changes because you are already in good shape.
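For reference, the combination being described (disallow the redirect directory, nofollow the links that point into it) would look something like this; the /visit/ path comes from this thread, and the link target and anchor text are made-up examples:

```
# robots.txt -- assuming the affiliate redirects live under /visit/
User-agent: *
Disallow: /visit/
```

```html
<!-- hypothetical affiliate link pointing at the disallowed redirect path -->
<a href="/visit/widget-shop" rel="nofollow">Widget Shop</a>
```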
|I heard Google lowers a page's score in the presence of affiliate links |
I think that's just a bit off the mark. Google may lower an affiliate page's ranking when the content on the pages is not unique or if the site does not offer visitors some extra value, over and above showing the same products and information that other affiliates of the same business offer. There's no automatic lowering of a page just because it serves an affiliate link.
The simplest way I've heard this said is "create a valuable site first, then it can have affiliate links". Going the other direction is what makes trouble.
| 6:54 am on Sep 14, 2009 (gmt 0)|
|There's no automatic lowering of a page just because it serves an affiliate link. |
My testing shows that sometimes it does, and sometimes it doesn't, most likely depending on other factors about the site and/or particular page.
| 7:03 am on Sep 14, 2009 (gmt 0)|
I think that's just what tedster said!
|Google may lower an affiliate page's ranking when the content on the pages is not unique or if the site does not offer visitors some extra value |
Or do you mean that, with everything else equal, adding a link will demote it and removing it will promote it? It's the promote part of the equation that would be of interest, if that's what you mean, since ANY change can demote a page these days.
| 7:15 am on Sep 14, 2009 (gmt 0)|
|I think that's just what tedster said! |
No, it isn't. In the past I have taken pages, added affiliate links, watched them drop, then removed the aff link and had the rankings return. I don't know anything about the algo for certain, but I would guess, based on my testing, that some pages do drop just because of aff links.
| 9:29 am on Sep 14, 2009 (gmt 0)|
I think everything else held equal, a page without affiliate links would weigh more than one with.
I did read that Google has gotten better with JS links, so how about doing this:
What if the script.js I mentioned earlier, which captures onclicks and forwards them to my affiliate link, were only added to the page when the user-agent is not Googlebot?
On top of that, I can also Disallow the /visit/ directory. That's a good idea.
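Server-side, the conditional include being floated here would amount to something like the sketch below (bot tokens and file name are assumptions; note that serving different markup per user-agent is UA-based cloaking, which carries real risk):

```python
# Sketch of the conditional-include idea under discussion. All names are
# hypothetical. Serving different HTML to bots than to humans is cloaking
# and against Google's guidelines -- shown only to make the idea concrete.

BOT_TOKENS = ("Googlebot", "Mediapartners-Google", "AdsBot-Google")

def script_tag_for(user_agent):
    """Return the script.js tag for human UAs, an empty string for bots."""
    if any(token in user_agent for token in BOT_TOKENS):
        return ""
    return '<script src="/script.js"></script>'
```

The catch, as noted below, is that Google also fetches pages with non-Googlebot user-agents, so the check is unreliable as well as risky.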
| 10:30 am on Sep 14, 2009 (gmt 0)|
Bear in mind that Google does visit from non-Googlebot UAs from time to time, and cloaking against the bot might be deemed inappropriate.
| 12:56 pm on Sep 14, 2009 (gmt 0)|
Call me a cynic, but if there were a way, it's unlikely the answer would be posted here, given that Google reads these threads (particularly as this thread is front-paged!)
| 1:56 pm on Sep 14, 2009 (gmt 0)|
Isn't this what nofollow is (partially) for?
Of course, you can always use a redirect script that is blocked by robots.txt.
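A redirect script of that sort is simple enough; here's a rough sketch in Python (the /visit/ slugs and affiliate URL are hypothetical, and the /visit/ path would be the one disallowed in robots.txt):

```python
# Minimal sketch of a /visit/ redirect endpoint (hypothetical names/URLs).
# Internal links point at /visit/<slug>; the script answers with a 302 to
# the real affiliate URL, and /visit/ itself is disallowed in robots.txt.

AFFILIATE_URLS = {
    "widget-shop": "http://affiliate.example.com/track?id=12345",
}

def visit(slug):
    """Return (status, headers) for a /visit/<slug> request."""
    target = AFFILIATE_URLS.get(slug)
    if target is None:
        return (404, {})
    return (302, {"Location": target})
```

This keeps the raw affiliate URL out of the page markup entirely, and well-behaved bots never fetch the redirect.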
| 2:00 pm on Sep 14, 2009 (gmt 0)|
|This, script.js, I talked about earlier, which captures onclicks and forwards them to my affiliate link, would only be added to the page when the user-agent is not a Google bot? |
That's really playing with fire. Google dislikes cloaking VERY, VERY much.
| 2:15 pm on Sep 14, 2009 (gmt 0)|
Disallow instructions in robots.txt tell bots not to index content. But does it also tell them not to crawl that content and use it to determine a score for the page? I don't think so.
I guess this can easily be tested...
You could create a secret page at a URL that's disallowed by robots.txt. Make sure that no one knows about that page. Then create an indexable page where you put a link to your disallowed page in an obscured location, where only a bot (not a human) could find it. Then wait and see if the disallowed page gets retrieved by a bot at all.
Has anyone done a test like that?
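Evaluating such a test comes down to scanning the access log for any hit on the disallowed URL; a rough sketch (the log format is common combined format, and the path and sample lines are made up):

```python
# Rough sketch: scan access-log lines for any request to a URL that
# robots.txt disallows. Path and sample log lines are hypothetical.

DISALLOWED_PATH = "/secret-page.html"

def hits_on_disallowed(log_lines, path=DISALLOWED_PATH):
    """Return the log lines whose request line fetches the given path."""
    return [line for line in log_lines if ('"GET ' + path) in line]

sample_log = [
    '66.249.66.1 - - [14/Sep/2009] "GET /index.html HTTP/1.1" 200 1234 "-" "Googlebot/2.1"',
    '65.55.0.9 - - [14/Sep/2009] "GET /secret-page.html HTTP/1.1" 200 99 "-" "msnbot/2.0"',
]
```

An empty result over a long enough window would suggest the disallow is being honored; any hit tells you which bot ignored it.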
| 2:27 pm on Sep 14, 2009 (gmt 0)|
robots.txt stops bots visiting. That's what it's for.
Stick a page on your server, disallow it, point a bunch of links at it, and watch your server logs. No bot visits (at least from reputable bots).
| 3:32 pm on Sep 14, 2009 (gmt 0)|
You can create a directive specifically telling Google AdsBot not to visit certain parts of your site.
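In robots.txt that means naming the AdsBot user-agent explicitly, since AdsBot-Google ignores the generic `*` group; the path here is just the example from this thread:

```
# robots.txt -- AdsBot-Google does not obey the "User-agent: *" group,
# so it needs its own entry (path is an example)
User-agent: AdsBot-Google
Disallow: /visit/
```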
Google does not like affiliates from the perspective that the site has been created with the affiliate business in mind. So Tedster's point stands: build a site about something you "love" or "do" first, and then figure out that you could put affiliate links onto it.
On the other hand, my experience is that sites holding a Quality Score of 10/10 for a long time get slapped with a manual review and drop to 1/10.
I take issue with this, since it's obvious the system was not being cheated; good content was provided, and yet a Google employee has determined that such content already exists (on an authoritative site) and therefore your site sucks and nobody needs it.
In some cases this is right, but in others, where an affiliate site provides unique reviews, unique coupon codes, or other discounts, the QS drop should not happen.
Overall, Google AdWords does not have much sympathy for affiliates, period.
They don't even give any kind of specific support, like a dedicated rep or anything similar. I run tens of accounts and lost my reps more than two years ago, even with spend that is extremely high on a regular basis. At that time I had two reps (a day-to-day one and a senior one) and then, BAM - no more. Since then I've been approached several times by AdWords with ideas about how they could help, but as soon as they hear about my business concept, they're gone. If I ran an agency doing service for clients, it would be OK. Now, what's the difference?
| 3:46 pm on Sep 14, 2009 (gmt 0)|
|That's really playing with fire. Google dislikes cloaking VERY, VERY much. |
If it is a domain you wouldn't want penalized, you'd better be sure any links would pass a human review.
| 4:56 pm on Sep 14, 2009 (gmt 0)|
I believe if the link is loaded into the page by Ajax, it cannot be viewed by crawlers.
And it is not cloaking. Cloaking = content shown to robots and not to humans. This is the opposite.
Otherwise all dynamic pages would be wrong.
| 5:05 pm on Sep 14, 2009 (gmt 0)|
|Google does not like affiliates from the perspective that the site has been created with affiliate business in mind. So Tedster's claim stands that you have a site about something you "love" or "do", and then you figure you could put affiliate links onto it. |
Remember when a Dutch guy leaked Google's manual for quality evaluators a few years ago? The manual used examples of sites with and without affiliate links, and some of the sites with affiliate links were used as examples of useful, non-spammy sites. That was in line with what Google spokespeople such as GoogleGuy have said here and elsewhere: "thin affiliates" don't get much respect from Google, but affiliate links per se aren't a no-no.
| 5:08 pm on Sep 14, 2009 (gmt 0)|
The argument for hiding from Googlebot is a moot point because they actually have humans manually checking sites as well, so the odds are that if you're trying to cheat the system you'll get caught eventually; it's just a matter of when.
| 7:34 pm on Sep 14, 2009 (gmt 0)|
|No bot visits (at least from reputable bots). |
MSNBot ignores robots.txt and, for a site: command, even lists pages that have NOINDEX, NOFOLLOW on them. I've been watching it for years now.
I have a 9-year-old site of 25 pages. Every page links to a badbot trap. robots.txt contains only one entry:
Slurp and MSNBot ignore it at least 10 times a month each. I've never seen Googlebot do that.
| 11:54 pm on Sep 14, 2009 (gmt 0)|
Like Tedster and others have mentioned, offer a good website that is chock-full of good content (not content all geared at converting the user to your affiliates).
This protects against manual visits from the 'team'.
For everything else, nofollow the link, then use robots.txt to disallow the script that redirects.
There was a time when it made sense, and affiliates could get away with linking out to the affiliate using redirects from every page. I find this is tougher to do now, and in general I usually send the user to a no-BS page that talks to the consumer in clear language about the products. This way, additionally, there is only a limited number of pages that actually redirect.
All that being said, there are others I see at the top of some genres in the index that appear to be there only to redirect the user directly to affiliates (and I mean banners and links galore right up the middle of the index page).
Amazing how they are still there...
| 3:59 pm on Sep 15, 2009 (gmt 0)|
|Isn't this what nofollow is (partially) for? |
nofollow takes the link out of the link map. If Google is devaluing a page based on an affiliate link being present, then nofollow does not hide the fact that an affiliate link is being used.
For what it's worth (as someone managing a lot of affiliate sites all using various methods), I've found affiliate links fall into the "is this spam?" thought process of the algorithm -- e.g. if DMOZ started showing affiliate ads, its rankings would be unchanged. If a site launched last week with 10 inbound links from dodgy-looking links pages suddenly has affiliate ads on all its pages ... it will sink like a stone.
Think "flags" not "penalties".