Forum Moderators: open


Javascript hack to funnel the PR

overzealous fool or good idea?

         

tkroll

11:14 pm on Jun 19, 2003 (gmt 0)

10+ Year Member



Apologies if this has been covered...

So I have a site that has 20 pages. Each page links to the 5 top navigation pages.

If each page linked only to my main homepage, I should expect a better PR for the main page, since each page's PR vote would not be spread among 5 pages. Correct?

Since Google does not follow JS links, especially if the code is external, I could hide each page's other 4 links in script. Thus, Google would see only one outgoing link per page--all leading to my index page.
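For concreteness, the setup being described might look something like this (the file names and link targets are made up for illustration). Only the homepage link would appear as a plain href in the HTML; the other four get assembled by script:

```javascript
// nav.js (hypothetical external file) -- builds the four "hidden" links.
// A crawler that fetches only the HTML and ignores script output would
// see just the one hard-coded homepage link.
function buildLink(url, text) {
  return '<a href="' + url + '">' + text + '</a>';
}

var hiddenNav = ['/about.html', '/products.html', '/support.html', '/contact.html']
  .map(function (url, i) { return buildLink(url, 'Section ' + (i + 1)); })
  .join(' ');

// In each page, something like:
//   <a href="/index.html">Home</a>
//   <script src="/js/nav.js"></script>
//   <script>document.write(hiddenNav);</script>
```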

Now is this a good idea? Will Google flag this as a spamming technique? Does the PR benefit outweigh the implications of non-JS browsers, etc.? So many questions...

I know this is a judgment call, but the judgment I've seen here is top-notch.

Thanks!

markdidj

11:32 pm on Jun 19, 2003 (gmt 0)

10+ Year Member



As far as I know you can have as many internal links as you wish without PR being affected.

Slade

12:03 am on Jun 20, 2003 (gmt 0)

10+ Year Member



Links don't generally affect a page's PR. The number of links on a page affects the percentage of its total PR that gets passed through its links.

tkroll

12:19 am on Jun 20, 2003 (gmt 0)

10+ Year Member



The way I understand it is that each page's outgoing links get a "vote" value of that page's PR divided by the number of links. So, if my info page has 5 outgoing links and a PR of 5 (I wish!), each page linked to from these links gets 5/5 = 1 "PR vote".

If this same page had only 1 link (and 4 hidden JS links) and PR 5, wouldn't each page linked to from this page get 5/1 = 5 "PR votes"?
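In this simplified model (which ignores the damping factor the real formula applies), the arithmetic is just:

```javascript
// Simplified "PR vote" per link: a page's PR divided by its number of
// outgoing links. Ignores the damping factor in the actual formula.
function prVote(pageRank, outgoingLinks) {
  return pageRank / outgoingLinks;
}

prVote(5, 5); // 5 links on a PR 5 page: 1 "vote" each
prVote(5, 1); // 1 visible link on the same page: all 5 "votes" to one target
```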

Please enlighten.

ukgimp

8:16 am on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sounds to me like you just need to link to these pages using standard hrefs. Don't worry about TBPR, just link, especially when it comes to internal pages.

Think of it from another perspective: if you link internally with JS, you are cutting off the number of people who will be able to navigate your site. Why would you want to do that?

If you are worried about PR, all you will do internally is spread it around; you won't lose any.

Cheers

mayor

9:18 am on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Using javascript to contain and/or direct page rank is a great theory but I've used several variations of this and IMHO the extra site design effort, loss of non-js visitors, and page rendering delays don't make js links of much value in generating search engine traffic. I really see little traffic generation difference between js-configured sites and those with no js links.

Maybe it helps for super-competitive keywords but in moderately competitive areas which I target I doubt you'll find much advantage as far as traffic generation goes.

There may be good reasons for using js links, but I haven't yet seen where page rank manipulation is one of them.

doc_z

9:23 am on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Of course, internal links influence PR (distribution). Depending on your link structure, you can distribute your PR either flat (equally) over your pages or in a hierarchical way. The total amount of PR (the sum over all internal pages) is always the same, as long as the PR transferred to other sites is unchanged.

Therefore, hiding some internal links with JS will increase the PR of your homepage and decrease the PR of the other 4 top pages. Whether you'll see any effect on the ToolbarPR depends on the details.

(Whether it makes sense at all is a different question.)
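That conservation property can be checked with a toy power-iteration sketch (the damping factor and graphs here are illustrative, not Google's actual values): funneling links raises the homepage's share, but the total over the closed set of pages stays the same.

```javascript
// Toy PageRank power iteration. outlinks[i] lists the pages page i
// links to. Rearranging internal links redistributes PR; the sum over
// all pages is unchanged (every page here has at least one outlink).
function pageRank(outlinks, iterations) {
  var n = outlinks.length;
  var d = 0.85; // conventional damping factor
  var pr = new Array(n).fill(1 / n);
  for (var it = 0; it < iterations; it++) {
    var next = new Array(n).fill((1 - d) / n);
    for (var i = 0; i < n; i++) {
      var share = (d * pr[i]) / outlinks[i].length;
      for (var k = 0; k < outlinks[i].length; k++) {
        next[outlinks[i][k]] += share;
      }
    }
    pr = next;
  }
  return pr;
}

// Pages 1-4 link only to page 0 (the "funnel"); page 0 links back to all.
var funneled = pageRank([[1, 2, 3, 4], [0], [0], [0], [0]], 50);

// All five pages fully interlinked.
var flat = pageRank(
  [[1, 2, 3, 4], [0, 2, 3, 4], [0, 1, 3, 4], [0, 1, 2, 4], [0, 1, 2, 3]],
  50
);
// funneled[0] is much larger than flat[0], but both vectors sum to 1.
```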

mayor

9:31 am on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



doc_z >> The total amount of PR (sum of all internal pages) is always the same

I agree ... there are no perpetual motion machines in cyberspace ... you can't create page rank "energy" where none exists in the first place.

sullen

10:07 am on Jun 20, 2003 (gmt 0)

10+ Year Member



In theory this should work.

However, I do remember reading a post of Googleguy's which suggested that it wouldn't be that hard for Google to at least rip the links from javascript (by looking for "http:", or presumably ".html", etc.) and that we shouldn't be surprised to see this kind of functionality soon. Actually it may not have been Googleguy but someone else - but it does make sense.
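A crude version of that "rip the links" idea needs no JavaScript parsing at all, just a scan of the raw script source for URL-shaped strings (the pattern below is a guess at the sort of thing meant, not anything Google has confirmed):

```javascript
// Extract anything URL-shaped from raw JavaScript source without
// executing or parsing it -- roughly the shortcut described above.
function ripLinks(jsSource) {
  return jsSource.match(/https?:\/\/[^'"\s)]+/g) || [];
}

ripLinks("var l = ['http://www.example.com/a.html', 'http://www.example.com/b.html'];");
// -> ["http://www.example.com/a.html", "http://www.example.com/b.html"]
```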

So you might spend all that time redesigning your site and annoy all those JS haters (yes, all 3 in 1000 of them) for nothing.

mil2k

10:19 am on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



LOL, I was going to respond after msg 4, but then thought, hey, doc_z would be a better person to respond. :)

However, I do remember reading a post of Googleguy's which suggested that it wouldn't be that hard for Google to at least rip the links from javascript (by looking for "http:", or presumably ".html" etc) and that we shouldn't be surprised to see this kind of functionality soon.

Very good point. I think it was Matt Cutts at Pubcon 4. At least that was what I read in the Pubcon 4 analysis thread.

tkroll

2:45 pm on Jun 20, 2003 (gmt 0)

10+ Year Member



Great responses!

I don't think I'll go this route. Does anyone have a dedicated site for Google testing, one where we could try out these techniques and record results? Not sure if recording can be accurate, though.

Thanks again for the info, guys!

bhartzer

2:54 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I tried this "javascript hack to influence PR" about a month or two ago as an experiment on one of my sites.

Needless to say, it didn't work and my site's PR is suffering greatly because of it. I think Google has picked up on it (like they should).

Whatever you do, don't do it; your site's PR will suffer.

ogletree

2:59 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



According to the Phil Craven article, PR for the root page changes when the internal link structure changes. He seemed to say that the best way to get the highest possible PR on one page is to point all pages to that page and have that page point back to all of them, but not have any of the other pages point to each other. This would give one page the highest possible PR. If all pages point to each other, then it is evenly spread out. Of course, this is a guess, I'm sure.

martinibuster

3:36 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The more you try to manipulate and control Google, the more Google will get away from you.

mil2k

3:58 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The more you try to manipulate and control Google, the more Google will get away from you.

I agree with this. What you can concentrate on instead is the site structure, i.e. breadth vs. depth.

doc_z

5:33 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I tried this "javascript hack to influence PR" about a month or two ago as an experiment on one of my sites.
Needless to say, it didn't work and my site's PR is suffering greatly because of it.

I would be careful about drawing conclusions from situations at different points in time, because the other parameters don't stay constant. Google can change the PR algorithm (e.g. the damping factor) or the ToolbarPR scale. Also, the PR transferred to your site will change after each update.

It's always better to compare different situations at the same time. Otherwise it's impossible to distinguish between effects caused by changes on your site and effects from other sites. Of course, in most cases it's not possible to build two sites (with the same incoming links) and compare them, but that's the only way to eliminate external factors.

Chris_1

5:39 pm on Jun 20, 2003 (gmt 0)

10+ Year Member



I used the javascript idea to help diminish our "cross linking" between two domains.

We have one company that sells products to a specific niche. However, they set up a new domain to target a new, but still somewhat related, niche market that overlaps with their current one.

To get better search engine rankings and try to establish the new site on its own, we set up the new domain but used the same overall design.

The new site has its own content (and a good amount of it, too). The drop-down menus are the same, but we modified the top header and bottom footer a bit (even though they still look similar).

By using the drop-down menus we are able to lessen the amount of "cross linking" between the two domains - while still giving our visitors access to all the products on each domain...

Chris

jimbeetle

5:59 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



wouldn't be that hard for Google to at least rip the links from javascript

Yep, mil2k, Matt Cutts did say that at PubCon. It isn't going to parse JS, but basically, if you can read something on a page as a link, then G-bot can read it.

The more you try to manipulate and control Google, the more Google will get away from you.

Will G-bot parse JS in the future? Retrieve external JS files? These are always considerations to keep in mind. We might think G is half a step behind sometimes, but with the push of a button...? I always heed mb's words.

Jim

keyplyr

6:25 pm on Jun 20, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




basically if you can read something on a page as a link then G-bot can read it.

The benefit (as described in the 1st message) can be achieved by putting all the link info in external JS and using document.writeln to display it on the page. Bots do not "read" such links. This is shown by Google not indexing a page that is orphaned from hard-coded links but linked repeatedly by this method.

Using external JS, I have successfully pushed 2 second-level pages up 1 PR over the last year, until they fell back this update. This might lead to the conclusion that new JS parsing has indeed been implemented.

tkroll

6:40 pm on Jun 20, 2003 (gmt 0)

10+ Year Member



I see. Interesting.

What about doing something like this?

In external .js in a directory with a restrictive robots.txt:


var linkArr = new Array('http://www.site1.com', 'http://www.site2.com');

function linkTo(pos) {
  document.location.href = linkArr[pos];
}

And then this in the html:


<a href='javascript:linkTo(0);'>Site1</a>
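For reference, the "restrictive robots.txt" mentioned above would just be a disallow rule for the script directory (the /js/ path here is only an example):

```
User-agent: *
Disallow: /js/
```

Note that robots.txt only keeps well-behaved bots from fetching the .js file itself; it doesn't stop a bot from scanning the HTML it already has.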

Could Google ever get these links?

This is more theoretical now. I don't think I'll be doing this.