
Hmm, we know Google does not follow JavaScript

but what about affiliate links?


zeus

11:59 pm on Dec 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just thought about whether Google follows affiliate links from, let's say, CJ or others, because they have a "?" in the link and so don't get spidered or seen in Google search. If Google does not see/spider the links, then they will not hurt your PR on the site.

zeus

rcjordan

12:09 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yep, at times there are items on the page that we want to present to our visitors but not to the bots, document.write works great for this.
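For anyone wondering what rcjordan means: document.write emits HTML from JavaScript while the page loads, so a crawler that does not execute JavaScript never sees that markup. A minimal sketch (the URL, tracking parameter, and link text are made-up placeholders):

```javascript
// Build the affiliate link markup. A bot that does not run JavaScript
// never sees the <a> tag produced here.
function affiliateLinkHtml(url, text) {
  return '<a href="' + url + '">' + text + '</a>';
}

// In a browser, this injects the link into the page at parse time.
if (typeof document !== 'undefined' && document.write) {
  document.write(affiliateLinkHtml('http://www.example.com/?affid=1234', 'Our sponsor'));
}
```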

zeus

12:13 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, document.write, I don't know what that is. I just do the HTML/Flash stuff, that's it; I'm not that into PHP and so on.

zeus

europeforvisitors

12:14 am on Dec 29, 2002 (gmt 0)



If Google does not see/spider the links then they will not hurt your PR on the site.

Is there any evidence to suggest that Google lowers the PageRank of pages with affiliate links?

zeus

12:21 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, every link out of your site hurts your PR, that's the law, but I don't think there is any difference between affiliate links and other outbound links.

zeus

steveb

1:36 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Well, every link out of your site hurts your PR, that's the law,..."

Not in this galaxy.

willybfriendly

1:54 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Links out hurt?

I have seen this statement before, but I do not understand where the idea comes from. While I agree that an outbound link dilutes the PageRank that a particular page passes to other pages in the site, I have been unable to find any documentation suggesting a general PageRank penalty for outbound links.

Can anyone explain where this idea comes from, and whether there is any data to support it?

aka ferrari360

1:56 am on Dec 29, 2002 (gmt 0)

10+ Year Member



If the affiliate site you are linking to is a "bad neighbourhood", it may result in a penalty.

Personally, I'd check the PR bar of the site you are linking to (without your affiliate code) and see whether it has PR 0, or is indexed at all.

If all looks normal (e.g. the site has PR), then you "should" be OK.

zeus

1:58 am on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK, but that's what is stated in the paper from the university where they did the PageRank project. That's why they suggest having a page with info about the outbound link, so you don't lose PR to outside URLs.

zeus

psoares

3:08 am on Dec 29, 2002 (gmt 0)



An outbound anchor can't hurt PR, because only inbound links provide useful PR, unless it is zero or otherwise.

zeus

1:21 pm on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Links out of your site

When considering links out of your site, there is one golden rule:

Generally, you will want to keep PR within your own site.

By linking out, the total PR within your site may be lower than it could have been had you not linked out.

From "PageRank Explained" by Chris Ridings and Mike Shishigin
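The golden rule quoted above can be made concrete with the simplified PageRank recurrence PR(A) = (1 - d) + d * sum(PR(T) / C(T)) from the original Brin and Page paper. A minimal sketch, where the page names, damping factor, and link structure are illustrative assumptions:

```javascript
// Toy power iteration of the simplified PageRank recurrence, used only
// to show the "leakage" effect the quoted guide describes.
function pagerank(links, damping, iters) {
  const pages = Object.keys(links);
  const pr = {};
  pages.forEach(p => { pr[p] = 1; });
  for (let i = 0; i < iters; i++) {
    const next = {};
    pages.forEach(p => { next[p] = 1 - damping; });
    pages.forEach(p => {
      links[p].forEach(t => {
        // PR sent to URLs outside the set simply leaves the system.
        if (t in next) next[t] += damping * pr[p] / links[p].length;
      });
    });
    pages.forEach(p => { pr[p] = next[p]; });
  }
  return pr;
}

// A closed two-page loop: each page settles at PR 1, site total 2.
const closed = pagerank({ a: ['b'], b: ['a'] }, 0.85, 50);

// The same loop, but page b also links to an external site:
// the in-site total drops, which is the "leakage" in question.
const leaky = pagerank({ a: ['b'], b: ['a', 'external'] }, 0.85, 50);
```

In the closed loop both pages converge to PR 1; adding one external link from b drops the site total well below 2. How much that matters in practice is exactly what the rest of the thread argues about.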

zeus

P.S. What is document.write?

Grumpus

2:06 pm on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm going to suggest that Google has been playing around with this theory in the background for several months now. I think we're just starting to see penalties (very mild now, but penalties, nonetheless) for sites that are PR hogs - i.e. Not linking out to other sites.

We've known for a while that Google doesn't like "web pockets", where the bot goes in and gets bounced around between a small group of sites with no other way out. This is where "crosslinking" penalties come in. Three (or whatever) sites all link to each other all over the place; the bot goes in and never finds a link anywhere but among the three sites. Boom: crosslinking penalty.

It's been my theory for a while that you could eliminate crosslinking penalties by simply linking out to some other sites along with your crosslinking scheme.

Now, in recent months, I'm seeing that Google seems to be tinkering around with this theory on a site by site level. I'm starting to see what appears to be "penalties" (again, they aren't massive right now, but I'm noticing monthly changes on some sites that I don't see on ones that link generously) being applied when a site simply doesn't link out. In essence, you're creating a crosslinking penalty within your own site rather than between two or more sites.

It makes sense that they'd do so. The idea of ranking sites based on links furthers the idea of the word "web" and how it originated. But, in recent times, we've seen people keying on inbound links and not outbound links - something that is detrimental to this notion. A common SEO "trick" right now is to create "throwaway" sites that do nothing but get PR by exchanging links on deep pages, then passing that PR onto the front page and then onto the "real site" which then simply doesn't link out to anyone. It's been cranking my stones for a while now, and I'm glad I'm starting to see something being done about it.

In the end, I doubt there is much life left in sites that don't link out. You're going to have to develop a nice ratio of internal, inbound, and outbound links. I don't know what that ratio is, but we'll know sometime in the first quarter of 2003, I'd suspect.

Back to the question at hand: Google DOES follow some affiliate links. As suggested in the original post, though, the "?" has exactly nothing to do with it. I have a "?" in virtually every link on my site. I'd suspect it's a domain-level filter, but even that I can't say for sure. I also suspect that many affiliate links don't get indexed because the landing page does some funky stuff most of the time: there's a redirect from the affiliate manager to the page, and/or session information and cookie data which the bot doesn't get, etc. It's not your link that's not being followed; it's your affiliate host that's not taking advantage of the PR you're giving it.

Share the wealth, gang. If Google had never released its "PR" manifesto and people hadn't jumped onto the bandwagon, the web would be interlinked very much in the way it was in the early 90's and the "little guys" would be ranking very well today against the "big guys". The big guys don't link out because it's not worth their time to pay someone to keep up on it. If all the little guys just linked out as they did in the "olden days" there would be MASSIVE amounts of PR shuffling ALL over the web to ALL SORTS of sites. As it is now, people hog their PR and have to get little bits here and little bits there because people aren't willing to share.

I realize that the PR formula is a socialist notion, which is exactly why things are the way they are today: we (for the most part) are trying to apply capitalist techniques to a socialistic concept. Period.

G.

zeus

2:55 pm on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think they will give sites with more outbound links a plus in the ranking, and as for crosslinking, I have never found out what that is. How can you get a penalty for having good linking throughout your site?

About sharing the PR, you are right there; it's hard to let go of PR and to get PR from other sites.

zeus

Grumpus

10:37 pm on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I never said that sites that link out would get a + ranking. I'm saying that sites that DON'T link out get a minus ranking. And it's starting already.

G.

Stefan

11:10 pm on Dec 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Great stuff, Grumpus. Thanks for the info.

Hardwood Guy

12:38 am on Dec 30, 2002 (gmt 0)

10+ Year Member



Absolutely! Great post, Grumpus. I try to leave some of the PR stuff in the background and concentrate on my visitors. After all, isn't that what the web is all about? Provide the information people are looking for, even if that means outbound links. I must have ten times more of them than inbound links, because I feel they're useful to my visitors. It doesn't seem to hurt my traffic, as my repeat visitors are around 37%, and lately for some reason they're reaching 45%. Go figure.

coconutz

12:53 am on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I never said that sites that link out would get a + ranking. I'm saying that sites that DON'T link out get a minus ranking. And it's starting already.

I've been trying to figure out why 2 of my pages are now a PR6. These pages are a reciprocal links page and a resources page. Consider:

  • Links page: 1 PR3 and 1 PR4 external inbound links. 56 outbound links to other sites.

  • Resources page: 1 PR4 external inbound link. 26 outbound links to various related organizations, associations and publications and an affiliate link.

There are other internal pages that have more inbound links pointing to them than the above 2 pages, but they do not link out and have been PR5 for some time.

If outbound linking has a negative effect on your pages, neither of these pages should be a PR6.

These 2 pages had remained a PR5 for the longest time, until I started increasing the outbound links. The resources page wasn't much in the way of resources, and the links page was just that.

I wonder if there is a small plus factor for a site that does link out.

BigDave

1:14 am on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Unless you have terrible site navigation, you will not have much of a PR leak from linking out. Just the regular navigation sections of all my pages have around 20 links, and most of my higher-ranking pages have only internal links. But that is just the way that things work out.

Yeah, my only high-PR pages with lots of outbound links are my links pages. So what if I lose some PR that I could have driven back into the site? Those link pages get a lot of use and a lot of high placements in the SERPs. They also bring in a lot of traffic.

If a PR2 page on my PR5 site has 2 outbound links on the page, I am still sending 90% of that PR2 back to other pages on my own site. And if there is a penalty for not linking out of a site, I will not get that penalty. If there is a bonus for linking out, that page will get that bonus for a measly 10% of a PR2 page. How much effort would you be willing to go through to get an incoming link from a PR2 page?
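A quick sketch of the link-counting arithmetic BigDave describes. The 20-link figure is his; the function and numbers are illustrative:

```javascript
// In the simplified PageRank model, a page divides its vote evenly
// among all of its outbound links, so the fraction of its PR that
// stays inside the site is internalLinks / totalLinks.
function internalShare(internalLinks, externalLinks) {
  return internalLinks / (internalLinks + externalLinks);
}

// 18 navigation links plus 2 external links: 90% of the vote stays in-site.
const kept = internalShare(18, 2); // 0.9
```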

buckworks

3:02 am on Dec 30, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I just counted: one of my pages has over fifty links going out to other sites, besides the in-site navigation links. The majority of the links out are affiliate links. The page is #1 out of 665,000 for its main target phrase, and has held that position for months. It also ranks well for several secondary phrases.

I'm on a Mac so don't know what the page's PR is. The site is not listed in DMOZ, but that particular page has links coming in from other sites because it's useful to people interested in the product category.

My conclusions:

1) Carefully themed outbound links and the descriptions you give them add useful content that more than compensates for any "PR leakage".

2) Google does not penalize pages for having affiliate links. The fact that a site with affiliate links will likely be rejected by DMOZ causes a setback, but Google itself does not seem to mind affiliate links if it likes the page otherwise.

bnc929

3:21 am on Dec 30, 2002 (gmt 0)

10+ Year Member



Grumpus, I am 100% sure your theory is wrong. Your PR fluctuations are probably for other reasons.

I have a site that exists in a very controlled environment; it's great for testing PR behavior.

Anyway, there is not a single outgoing link on the site, not a single one, it is only 5 pages in total, and it has still managed to hang onto its number 1 spot for a very competitive keyword.

There are also technical flaws with measuring outgoing links:

1. Sometimes there is more than 1 site on a single domain (GeoCities). How can Google tell if a link is internal or external?

2. Some sites span more than 1 domain. For instance, promotionbase.com, webmasterbase.com, and ecommercebase.com are all 1 site. How is Google to know this?

The fact is, Google looks at things on a page-by-page basis. There is no such thing as an internal or external link to them; they're all external. For Google to analyze outgoing links as you propose, they'd need to look at things on a site-wide basis. They don't do that, and they can't do that. Not perfectly, anyway.

Additionally, by introducing outgoing links into the algorithm, they'd lower the quality of their search results. You see, the goal of a search engine is to provide a user with answers. Introducing outgoing links is like a search engine saying, "Instead of answering your question, I'm just going to give you the URL of a site that probably links to a site that somewhere on it has the answer to your question." It's not very intuitive.

Even if Google doesn't like sites without any outbound links, it is doubtful, for all the reasons above, that they'd do anything about it. The bottom line is that it wouldn't benefit the end user, and search engines are made for end users, not for webmasters.

ciml

3:01 pm on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> If Google does not see/spider the links then they will not hurt your PR on the site.

Nice idea, zeus, but if Google sees the link then you may give PR to a 'URL only' listing. Even if Google can't fetch the URL (robots.txt excluded, or 404 "not found" is returned), the other URLs you link to will only get the PR that they would have received if the non-functioning link had worked.

As rcjordan points out, document.write could be used to make links 'invisible'.

<added>
I agree with Hardwood Guy and BigDave; people worry too much about 'PR leakage'.

The steep curve of the Toolbar log scale means that, at most, you can lose less than 7/8ths of one notch from the maximum PR available (using the commonly believed log base of 6), or less than half of one notch (using a much higher effective log base figure, such as I believe in).
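The log-scale point ciml makes can be sketched numerically. Assuming the toolbar notch is roughly log_base(raw PR), losing a fraction of raw PR moves the display by log_base(1 / keptFraction) notches; the base is unknown, and 6 is only the commonly assumed figure:

```javascript
// How many toolbar notches does a raw-PR loss cost, if the toolbar
// is a log scale? keptFraction is the share of raw PR you keep.
function notchLoss(keptFraction, base) {
  return Math.log(1 / keptFraction) / Math.log(base);
}

// Even keeping only half the raw PR costs well under one notch on a
// log-6 scale.
const halfLoss = notchLoss(0.5, 6); // ≈ 0.39 notches
```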

Markus

3:39 pm on Dec 30, 2002 (gmt 0)

10+ Year Member



ciml, this is what Larry said about links to pages that cannot be spidered:

Because dangling links do not affect the ranking of any other page directly, we simply remove them from the system until all the PageRanks are calculated. After all the PageRanks are calculated, they can be added back in, without affecting things significantly. Notice the normalization of the other links on the same page as a link which was removed will change slightly, but this should not have a large effect.

Some time ago, I turned all my JS links into robots.txt-protected redirects and excluded internal pages with no SEO importance via robots.txt. It worked well.

I agree that some people worry too much about PR leakage, but it is real, and working on sensible linking tactics can improve rankings a lot.
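The remove-then-add-back procedure Larry describes could be sketched roughly like this. It uses the simplified PR recurrence, and the single-pass add-back is one plausible reading of the paragraph, not a confirmed detail of Google's implementation:

```javascript
// Sketch of dangling-link handling: drop links to pages we never
// crawled, iterate PageRank on the pruned graph, then give the
// dangling URLs their PR in one final pass.
function pagerankWithDangling(links, damping, iters) {
  const known = new Set(Object.keys(links));
  // 1. Remove dangling links (targets outside the crawled set).
  const pruned = {};
  for (const p of known) pruned[p] = links[p].filter(t => known.has(t));
  // 2. Iterate PR(A) = (1-d) + d * sum(PR(T)/C(T)) on the pruned graph.
  const pr = {};
  for (const p of known) pr[p] = 1;
  for (let i = 0; i < iters; i++) {
    const next = {};
    for (const p of known) next[p] = 1 - damping;
    for (const p of known) {
      for (const t of pruned[p]) next[t] += damping * pr[p] / pruned[p].length;
    }
    for (const p of known) pr[p] = next[p];
  }
  // 3. Add dangling URLs back, splitting each page's vote over its
  // original (unpruned) link count.
  const dangling = {};
  for (const p of known) {
    for (const t of links[p]) {
      if (!known.has(t)) {
        dangling[t] = (dangling[t] || 0) + damping * pr[p] / links[p].length;
      }
    }
  }
  return { pr, dangling };
}
```

As the quote notes, re-splitting over the original link count slightly changes the normalization of the surviving links, which is the "small effect" Larry mentions.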

mayor

4:08 pm on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Grumpus >> sites that are PR hogs - i.e. Not linking out to other sites.

Isn't this exactly what Google identifies as an 'authority', which is central to their web topology theories? Wouldn't penalizing a PR hog undermine the very foundation of their concepts?

Ever try to get links from a .gov site? How 'bout a .mil site? Should they be excluded from sporting the hog tag? There are plenty of .com's that won't give others the time of day (or a link) because they ARE authorities.

Also, if it is determined that Google is somehow identifying 'hogs' as distinct from 'authorities' and penalizing them, it's only a matter of one update cycle before every SEO that does his homework spatters a few links on his site. This would be one of the easiest SEO maneuvers of the year. In the process, some true authorities that don't bother with SEO could suffer collateral damage.

I appreciate your advice and analysis, Grumpus, and I do believe in offering outbound links where appropriate, regardless of what happens to PageRank, but I just don't see how Google can start penalizing PR hogs without distorting their basic concepts.

Still, I'll be watching for your superb postings to see if your vision bears out :)

ciml

6:04 pm on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Markus, I often wonder about that paragraph.

All I can assume is that the "figure to divide outbound PageRank by" includes the dangling links. Done that way, the dangling URLs and all other URLs could get the proper PageRank applied... I think.

Markus

6:31 pm on Dec 30, 2002 (gmt 0)

10+ Year Member



Hmm, since I am not a native English speaker, I hope I did not misunderstand that paragraph. I thought that changing the normalization of the other links on a page means that they continue the computation with the new figure (without the dangling links). Only in this way can the average PR of the remaining pages still equal 1. And assigning PR to the dangling links is not so much of a problem.

ciml

7:52 pm on Dec 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Markus, I never fully understood the problem with dangling links. As I see it, the PageRank should redistribute during the end-of-iteration normalisation. There must be a reason for that section, so somewhere I am missing something.

Notice the normalization of the other links on the same page as a link which was removed will change slightly, but this should not have a large effect.

Seeing as "Bringing Order to the Web" also mentions that "often these dangling links are simply pages that we have not downloaded yet", it seems sensible to assume that robots-protected pages are dangling. In that case, my results don't agree with the paper.