
JavaScript - rankings

pnlla_rav

9:31 am on Oct 10, 2003 (gmt 0)

10+ Year Member



I am using more JavaScript code on my site.
Is this going to affect the search engine rankings?

Anybody ..

many thanks.

ciml

11:39 am on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A lot of people think that the keyword density is reduced if you have a lot of on-page HTML code, CSS or Javascript. I haven't noticed this with Google though.

Gert_Jan

11:44 am on Oct 10, 2003 (gmt 0)

10+ Year Member



You should call the script like this:
<script language="JavaScript" src="/js/script.js" type="text/javascript"></script>

Don't put all of it in your page.
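To illustrate the point, here is a small before/after sketch. Only the `/js/script.js` path comes from the snippet above; the inline function is invented for illustration:

```html
<!-- Before: the script sits inline, bloating every page it appears on -->
<script type="text/javascript">
function swapImage(id, src) {
  document.getElementById(id).src = src;
}
</script>

<!-- After: the same code moved to one external file, referenced per page -->
<script language="JavaScript" src="/js/script.js" type="text/javascript"></script>
```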

WebWalla

2:04 pm on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> A lot of people think that the keyword density is reduced if you have a lot of on-page HTML code, CSS or Javascript. I haven't noticed this with Google though.

ciml - even if it doesn't affect the keyword density, don't you think G might give your keywords less importance if it first has to spider through a whole lot of CSS or JavaScript before hitting the actual text?

claus

2:30 pm on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> using more JavaScript code on my site

Well, I bet you can't beat me there ;) I have a full section of a site running entirely on JavaScript. It's just one physical HTML page, but to a user with a JS-enabled browser it acts as anything between 1 and a lot of pages (currently around 10 pages, but I've had it running as a two-level hierarchy with more than 60 pages).

The rest of that site is indexed nicely, but this particular page/section - nope. Not at all. The page has been there for a couple of years, and it's not even indexed. All the other pages on that site are indexed - every single one. Too much JavaScript, I guess. Then again, I don't really need it indexed; it's just a service to my users.

/claus
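The one-page/many-pages setup described above can be sketched roughly like this. Section names, the `content` element id, and all code here are invented for illustration, not taken from the actual site:

```javascript
// Hypothetical sketch: one physical HTML page acting as several
// "pages" by swapping content based on location.hash.
const sections = {
  home: "<p>Welcome</p>",
  faq: "<p>Frequently asked questions</p>",
  links: "<p>Resources</p>"
};

// Pure helper: map a location.hash value to a known section name,
// falling back to "home" for unknown or empty hashes.
function resolveSection(hash) {
  const name = (hash || "").replace(/^#/, "");
  return Object.prototype.hasOwnProperty.call(sections, name) ? name : "home";
}

// Browser-only wiring: re-render whenever the hash changes, so
// #faq, #links etc. behave like separate pages under one URL.
if (typeof window !== "undefined") {
  const render = () => {
    document.getElementById("content").innerHTML =
      sections[resolveSection(location.hash)];
  };
  window.onload = render;
  window.onhashchange = render;
}
```

A crawler that ignores the script sees only the one thin physical page, which matches the indexing behaviour described above.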

ciml

2:41 pm on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



WebWalla: that's not something I've found. Personally, I don't see why the algo would work on the full source code. Too much irrelevant data.

claus, do you think that something in the JavaScript might trip the bot? Often JavaScript is implemented in such a way as to break HTML (usually the ">" character), but sometimes the bot is thrown by a correct implementation.
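For reference, the ">" breakage usually comes from a bare comparison inside an inline script, and the common defensive habit of the era was the HTML comment wrapper. The code below is an invented sketch, not taken from any page in this thread:

```html
<script type="text/javascript">
<!--
// A naive parser scanning for the next ">" can mistake the
// comparison below for the end of a tag:
if (itemsShown > itemsTotal) { resetList(); }
// -->
</script>
```

Keeping the script in an external file sidesteps the problem entirely, since the markup then contains no raw ">" from the code.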

claus

3:04 pm on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yet Another Total re-edit of this post:

That page is weird - it is in the index; I just found it. When viewed in a Toolbar-enabled browser it even has a PR. If you enter the web address in the Google search box, you will not find it unless you put www in front of the (domain) URL. Www hasn't been used on that site for months, and the server rewrites all www requests to non-www URLs. So I guessed it was an old cache when I did the first total rewrite of this post.

There are four words of plain text on this page. When you search for them, the page shows up in the SERPs. Here's the weird thing - it's not an old cache. The URL in the SERPs is the right one (the one without www, that is: the one that yields zero results when entered into the search box), and... it has a fresh tag?!

Anyway, 50% of all characters on that page are JS, and it's a small page (7k), so the rest is (literally) navigation and layout. It uses external ".js" files and validates as HTML 4.01 Transitional ;)

/claus

[edited by: claus at 4:00 pm (utc) on Oct. 10, 2003]

plasma

3:29 pm on Oct 10, 2003 (gmt 0)

10+ Year Member



Yup,
<script language="JavaScript" src="/js/script.js" type="text/javascript"></script>

This is generally a good idea, not only for SEO.
Especially if the code repeats on many pages, it speeds up loading enormously, since the browser can cache the external file instead of downloading the script with every page.

kaled

5:47 pm on Oct 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Purely as a matter of best practice, virtually all JavaScript should be kept in external files. However, I imagine it is likely that Google takes the view that pages with a low visible ratio (i.e. visible text : total page size) are poorly designed. After all, robots have to scan the whole page, scripts and all, and a low visible ratio means CPU time is being wasted.

Kaled.
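The visible ratio described above is easy to estimate yourself. This is a rough sketch: the helper name and the regex-based tag stripping are my own simplification, not anything Google is known to run:

```javascript
// Hypothetical helper: estimate the visible text : total page size
// ratio. Regex-based stripping is a crude approximation, but good
// enough for a rough audit of your own pages.
function visibleRatio(html) {
  const visible = html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, "")   // drop inline styles
    .replace(/<[^>]*>/g, "")                    // drop remaining tags
    .replace(/\s+/g, " ")
    .trim();
  return html.length === 0 ? 0 : visible.length / html.length;
}

// A page that inlines its script scores lower than the same page
// with the script moved out to an external file.
const inlined = "<script>var x = 1;</script><p>four words of text</p>";
const external = "<p>four words of text</p>";
console.log(visibleRatio(inlined) < visibleRatio(external)); // true
```

Moving scripts and styles into external files raises the ratio without changing what the visitor sees.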