
Does anyone use JS links on purpose?

To hoard PR for their other links?


br33526

12:55 am on Apr 29, 2003 (gmt 0)

10+ Year Member



On any given website, there are main navigational links that go to the privacy policy, site terms, etc.

Most webmasters don't care if these pages rank well or not.

Does anyone out there use JavaScript links for these types of links on purpose so that Google does not follow them? And so more PR is given to their other links.

rcjordan

1:01 am on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I do --though I pay little/no attention to the effects of PR distribution (in this case). I also move some nav and even some other repetitive content off-page using external js to increase density of the remainder.
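The external-JS trick described above can be sketched like this (the file name, function name, and link data are illustrative, not from the thread): the repeated navigation lives in one external file that builds the link markup, so it stays out of each page's HTML source.

```javascript
// nav.js -- shared navigation kept out of every page's HTML source.
// Each page pulls it in with:
//   <script type="text/javascript" src="nav.js"></script>
function buildNav(links) {
  var html = '';
  for (var i = 0; i < links.length; i++) {
    html += '<a href="' + links[i].href + '">' + links[i].text + '</a> ';
  }
  return html;
}
```

The page then calls document.write(buildNav([...])) wherever the nav should appear; a spider reading the raw HTML sees neither the links nor the repeated text.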

born2drv

1:10 am on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do for all my non-essential pages like the ones you mentioned. As a result they have PR 1-2; otherwise they would be 5-6. I'm assuming it is helping my other pages.

It's also good for mailto: email links. By using JS code to break up the email address and put it back together, it's less likely to be collected by spam robots.
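A minimal sketch of that obfuscation idea (function name and address are illustrative): the address is joined only at runtime, so it never appears whole in the page source.

```javascript
// Join the address only when the script runs, so a harvester
// scanning the raw HTML never sees the complete address.
function buildMailto(user, domain) {
  var addr = user + '@' + domain;
  return '<a href="mailto:' + addr + '">' + addr + '</a>';
}
```

On the page you would call document.write(buildMailto('webmaster', 'example.com')) wherever the link should appear.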

mrguy

2:35 am on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I make those pages become valuable and optimize them for less popular phrases.

We get a lot of good traffic from some less popular phrases.

For example, I optimized our company page for the specific region we are located in. It has a PR of 5 and places in the top 5 for region defined searches for our industry.

It all adds up!

No sense wasting a good page!

4eyes

11:49 am on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



yep - works for me.

Sometimes it's the best way to provide a navigation structure that suits my client's requirements whilst still having a sensible text navigation for spidering, the way Google likes it.

FleaPit

3:36 pm on Apr 29, 2003 (gmt 0)

10+ Year Member



Just taken a look at Counter.com stats on JavaScript-enabled browsers.

JavaScript Stats
Sat Mar 1 00:05:02 2003 - Mon Mar 31 23:55:04 2003 31.0 Days

Javascript 1.2+: 318909095 (89%)
Javascript <1.2: 925397 (0%)
Javascript false: 35458114 (9%)

So it would seem roughly 10% of users don't have JS enabled which sounds quite bad. Turn the glass on its head and you can say 90% of browsers have JS enabled which sounds much better :)

I would consider using server-side script links, e.g. link.cgi?Target=sitemap, rather than lose functionality for 10% of my users. That way you can limit the haemorrhage of PR but the links work in every browser, 100% of the time.
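A common companion to that approach (not spelled out in the post; the path is illustrative) is to block the redirect script in robots.txt, so spiders never follow the links it serves:

```
# robots.txt -- keep spiders away from the redirect script,
# so links routed through it pass no PR
User-agent: *
Disallow: /link.cgi
```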

MrSpeed

3:45 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Oh great. Now Google will get a million spam reports about people using javascript links.

richardb

3:59 pm on Apr 29, 2003 (gmt 0)

10+ Year Member



If using JS causes a problem for 10%, could you not use SSI for the "offending" links and dump the SSI files into a folder blocked by robots.txt?

Less bandwidth and browser issues, or have I missed something?

Rich

Yidaki

4:46 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use CGIs, PHP or database redirects instead of JS. However, I don't like the phrase "hoarding PR". To me it's more "controlling your survival chances" AND "controlling your PR". ;)

jomaxx

4:53 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Rich, that wouldn't work. The SSI is executed by the server and the browser or robot is never aware of it happening.

Edouard_H

5:00 pm on Apr 29, 2003 (gmt 0)

10+ Year Member



I don't think placing SSI files in a folder blocked by robots.txt would help as they would already be included in the page when spidered.

<edit> jomaxx addressed SSI

killroy

5:55 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How about placing a META NOINDEX,FOLLOW on the "offending" pages? Would that work to preserve link structure, but not waste PR on non-content pages?
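The tag in question goes in each page's head section and would look like this:

```
<meta name="robots" content="noindex,follow">
```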

SN

xy123

7:13 pm on Apr 29, 2003 (gmt 0)

10+ Year Member



How do you code a JS link that both behaves and appears identical to a plain old static href? In all browsers, irrespective of users link color choices?

FleaPit

7:36 pm on Apr 29, 2003 (gmt 0)

10+ Year Member



To create a simple href link using JS...

<script type="text/javascript">

document.write( '<a href="mylink.html">My Link</a>' );

</script>

If you want it to look the same as every other link on your page then use CSS.

Forgot to add, place the JS code wherever you want the link to appear on the page.
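For the styling point, a rule along these lines (selector and color are illustrative) makes JS-written links match static ones regardless of browser defaults:

```
/* Style all links identically, whether written by JS or static HTML */
a:link, a:visited { color: #0000cc; text-decoration: underline; }
```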

jimbeetle

7:46 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Somebody is going to have to help me remember.

During Saturday's q&a, did Matt Cutts from Google state that G is improving its ability to parse on-page javascript -- especially links? If so, in order to hide links you'll have to go to an external js file.

(I should remember but there was just sooo much information thrown out.)

Jim

rogerd

7:52 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



JimBeetle, Matt implied that Google is getting better at picking up text that looks like a URL as a link. He wasn't specific about the details or the timing, but it sounded like a general move in the direction of identifying more links, even when part of some kind of script or query string.

tedster

8:10 pm on Apr 29, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Relative to javascript, Matt Cutts said that Google plans to look at javascript on the page as plain text, rather than script - they definitely can't afford to automatically run unknown javascript. If something in that text looks like a URL, then Googlebot will attempt to crawl to it.

He did not say whether or not that would include passing on any PR vote - my impression is that it will not, but you never know. It is their algorithm and they will do whatever seems best for their business.

This change is part of Google's effort to find every possible page they can for their database. But their approach is also intended to find unlinked web addresses in the content of news stories, and so on.
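A toy version of the plain-text scan described above - the pattern is purely illustrative; nobody outside Google knows what their parser actually matches:

```javascript
// Treat script source as ordinary text and pull out anything URL-like.
function findUrls(scriptText) {
  var re = /https?:\/\/[^\s'"<>]+/g;
  return scriptText.match(re) || [];
}
```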

xy123

7:20 am on Apr 30, 2003 (gmt 0)

10+ Year Member



Strikes me that killroy's suggestion - putting a META NOINDEX,FOLLOW on the pages you don't want Google to index - ought to do the trick, with the benefit that Google-indexed pages that link to such non-content pages are unchanged.

Questions:
1. How rigidly does Google (and others) take note of NOINDEX,FOLLOW meta tags?
2. Assume a page x.html links to 2 other pages a.html and b.html and that all these pages are indexed in google. Assume that a.html gets a contribution of A to its PR as a result of this link, likewise B for b.html. If you then direct Google to not index b.html using the above meta tags method, will a.html now get a contribution of A+B to its PR, or just A?
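For question 2, the published PageRank formula (Brin and Page) divides a page's vote evenly across its counted outlinks, scaled by the damping factor d - so what a.html receives hinges entirely on whether Google still counts x.html's link to b.html as an outlink. A sketch of the per-link arithmetic:

```javascript
// Per-link PR contribution under the classic formula:
//   contribution = d * PR(page) / (number of counted outlinks)
function linkContribution(pr, outlinkCount, d) {
  return d * pr / outlinkCount;
}
```

With x.html at PR 4 and d = 0.85, each of two counted links passes 1.7; if the link to b.html stopped counting, the remaining link would pass 3.4. But nothing reported from Matt Cutts says NOINDEX changes the outlink count, so A rather than A+B is just as plausible.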