Forum Moderators: open
So I have a site that has 20 pages. Each page links to the 5 top navigation pages.
If each page only linked to my main homepage, I should expect a better PR for the main page since each page's PR vote would not be spread among 5 pages. Correct?
Since Google does not follow JS links, especially if the code is external, I could hide each page's other 4 links in script. Thus, Google would see only one outgoing link per page--all leading to my index page.
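Concretely, the idea would look something like this (a sketch with hypothetical page names; in the browser the string would be emitted with document.writeln, so the four extra anchors never appear in the raw HTML a non-JS crawler fetches):

```javascript
// nav.js -- external file holding the four "hidden" nav links
// (hypothetical page names). A crawler that doesn't execute JS
// only ever sees the one hard-coded homepage link in the HTML.
var pages = [
  ['about.html', 'About'],
  ['products.html', 'Products'],
  ['support.html', 'Support'],
  ['contact.html', 'Contact']
];
var nav = '';
for (var i = 0; i < pages.length; i++) {
  nav += '<a href="' + pages[i][0] + '">' + pages[i][1] + '</a>\n';
}
// document.writeln(nav);  // browser-only call; commented out here
```

Each page would then hard-code only the homepage link and pull this file in with a script tag.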
Now is this a good idea? Will Google flag this as a spamming technique? Does the PR benefit outweigh the implications of non-JS browsers, etc.? So many questions...
I know this is a judgment call, but the judgment I've seen here is top-notch.
Thanks!
If this same page had only 1 visible link (and 4 hidden JS links) and PR 5, wouldn't the one page it links to get the full 5/1 = 5 "PR vote"?
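To make the arithmetic concrete: under the usual simplification that a page's PR vote is split evenly among its outgoing links (real PageRank also applies a damping factor, commonly cited as 0.85, which is ignored here for clarity):

```javascript
// Simplified PR-vote arithmetic: a page's vote is divided
// evenly among its outgoing links (damping factor ignored).
function voteShare(pageRank, outgoingLinks) {
  return pageRank / outgoingLinks;
}

// PR 5 page with 5 visible links: each target gets a vote of 1
var withFiveLinks = voteShare(5, 5);

// Same page with only 1 visible link: the single target gets all 5
var withOneLink = voteShare(5, 1);
```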
Please enlighten.
Think of it from another perspective: if you link internally with JS, you are cutting off the people who won't be able to navigate your site. Why would you want to do that?
If you are worried about PR, all you will do internally is spread it around; you won't lose any.
Cheers
Maybe it helps for super-competitive keywords, but in the moderately competitive areas I target, I doubt you'll find much advantage as far as traffic generation goes.
There may be good reasons for using js links, but I haven't yet seen where page rank manipulation is one of them.
Therefore, hiding some internal links with JS will increase the PR of your homepage and decrease the PR of the other 4 top pages. Whether you'll see any effect on the Toolbar PR depends on the details.
(Whether it makes sense at all is a different question.)
However, I do remember reading a post of Googleguy's which suggested that it wouldn't be that hard for Google to at least rip the links from javascript (by looking for "http:", or presumably ".html", etc.) and that we shouldn't be surprised to see this kind of functionality soon. Actually it may not have been Googleguy but someone else - but it does make sense.
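That kind of extraction wouldn't even require parsing the JS; a crawler could simply scan script source for URL-shaped strings. A crude sketch of the idea:

```javascript
// A rough sketch of "ripping" links from script source without
// executing it: scan for http:// URLs and .htm/.html paths.
// (Illustrative only -- not Google's actual method.)
function ripLinks(scriptSource) {
  var found = scriptSource.match(/https?:\/\/[^\s'"]+|[\w\/-]+\.html?/g);
  return found || [];
}

var js = "var linkArr = new Array('http://www.site1.com', 'page2.html');";
var links = ripLinks(js);
```

Anything matching those patterns anywhere in the script file would surface, no matter how the links are assembled at runtime.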
So you might spend all that time redesigning your site, annoy all those js haters (yes, all 3 in 1000 of them) for nothing.
However, I do remember reading a post of Googleguy's which suggested that it wouldn't be that hard for Google to at least rip the links from javascript (by looking for "http:", or presumably ".html", etc.) and that we shouldn't be surprised to see this kind of functionality soon.
Very good point. I think it was Matt Cutts at PubCon 4. At least that was what I read in the PubCon 4 analysis thread.
Needless to say, it didn't work and my site's PR is suffering greatly because of it. I think Google has picked up on it (like they should).
Whatever you do, don't do it; your site's PR will suffer.
I tried this "javascript hack to influence PR" about a month or two ago as an experiment on one of my sites.
Needless to say, it didn't work and my site's PR is suffering greatly because of it.
I would be careful about drawing conclusions from different situations in time. The reason is that the other parameters don't stay unchanged: Google can change the PR algorithm (e.g. the damping factor) or the Toolbar PR scale, and the PR transferred to your site will change after each update.
It's always better to compare different situations at the same time. Otherwise it's impossible to distinguish between effects caused by changes on your site and effects from outside. Of course, in most cases it's not possible to build two sites with the same incoming links and compare them, but that's the only way to eliminate external factors.
We have one company that sells products to a specific niche. However, they set up a new domain to target a new, but still somewhat related, niche market that overlaps with their current market.
To get better search engine rankings and to try to establish the new site on its own, we set up the new domain but used the same overall design.
The new site has its own content (and a good amount of it, too). The drop-down menus are the same, but we modified the top header and bottom footer a bit (even though they still look similar).
By using the drop-down menus we are able to lessen the amount of "cross linking" between the two domains - while still giving our visitors access to all the products on each domain...
Chris
wouldn't be that hard for Google to at least rip the links from javascript
Yep, mil2k, Matt Cutts did say that at PubCon. It isn't going to parse js, but basically if you can read something on a page as a link then G-bot can read it.
The more you try to manipulate and control Google, the more Google will get away from you.
Will G-bot parse js in the future? Retrieve external js files? These are always considerations to keep in mind. We might think G is 1/2 step behind sometimes, but with a push of a button...? I always heed mb's words.
Jim
basically if you can read something on a page as a link then G-bot can read it.
The benefit (as described in the 1st message) can be achieved by putting all the link info in external JS and using document.writeln to display it on the page. Bots do not "read" them. This is shown by Google not indexing one page that is orphaned from hard-coded links but linked repeatedly by this method.
Using external JS, I had successfully pushed 2 second-level pages up 1 PR over the last year, until they fell back this update. This might lead to the conclusion that new JS parsing has indeed been implemented.
What about doing something like this?
In external .js in a directory with a restrictive robots.txt:
var linkArr = new Array('http://www.site1.com', 'http://www.site2.com');
function linkTo(pos) {
  document.location.href = linkArr[pos];
}
And then this in the html:
<a href='javascript:linkTo(0);'>Site1</a>
Could Google ever get these links?
This is more theoretical now. I don't think I'll be doing this.
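For completeness, the restrictive robots.txt mentioned above would just disallow the script directory, along these lines (assuming, hypothetically, the .js file lives in a /js/ directory):

```
User-agent: *
Disallow: /js/
```

That blocks compliant bots from fetching the external file at all, so even a crawler that scanned js source for URLs would never see it.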