| 7:26 pm on Dec 6, 2002 (gmt 0)|
There is no connection between the two. Google ignores CSS/JS. It doesn't matter where you put it. You could have made no changes and experienced the drop.
| 7:44 pm on Dec 6, 2002 (gmt 0)|
But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly.
I would look at SE rankings where possible instead of visitors, because traffic is extremely variable. Also look at whether the proportion of visitors coming from Yahoo/Google has changed.
| 7:48 pm on Dec 6, 2002 (gmt 0)|
(wrong example, that was internal js, not external) sorry
[edited by: ikbenhet1 at 7:51 pm (utc) on Dec. 6, 2002]
| 7:49 pm on Dec 6, 2002 (gmt 0)|
I'm not sure that Google indexes the JS, and if not, then it follows that removing the JS should have no effect on keyword/text ratio. The only time JS comes into play is when you have a scripting/coding problem, and GBot starts to parse JS as html.
I think the advantage in a low code/text ratio is that there is less muck for the bot to wade through. The goal being that you are making it easier to lead the bot to your content.
| 8:06 pm on Dec 6, 2002 (gmt 0)|
Thanks, now I know I'm not going completely around the bend. I started this thread a couple of weeks ago concerning an almost identical problem:
Mine was an almost across-the-board drop in traffic on one major section. I changed these pages back about 10 days ago and referrals are back up. Coincidence? With this stuff, coincidence is always, always possible, but the timing here appears to be too close.
Whether right or wrong I'm going to keep them the way they are now and start tweaking here and there and see what happens.
As always, muddled in Manhattan,
| 8:22 pm on Dec 6, 2002 (gmt 0)|
Could you give me more information on the 'direct hit' issue from MSN you were talking about? This could explain part of my problem. Sounds like it occurred about the time my traffic dropped.
| 8:40 pm on Dec 6, 2002 (gmt 0)|
Sure, it's that the MSN/Direct Hit partnership ended at the end of September (I believe). And this is where coincidence comes into play -- with an across the board drop from SEs I didn't pick up this specific change.
And more coincidence. Seasonal fluctuation in the hospitality industry? I considered that as a possibility also.
Afraid this thread isn't Google-specific any more but I'll go on for now. Since I changed the pages back I'm getting almost half the traffic from MSN that I used to, the other half had been from Direct Hit. That's after basically zero referrals for about a month.
It's hard wading through all of the possibilities that affect SERPs. You do this, he does that, somebody else tweaks an algorithm. But I do think that one of the likely kernels in both of our cases is the change in keyword density.
Would it be any fun if we ever figured it out?
| 9:18 pm on Dec 6, 2002 (gmt 0)|
|I'm not sure that Google indexes the JS |
| 9:33 pm on Dec 6, 2002 (gmt 0)|
|The only time JS comes into play is when you have a scripting/coding problem, and GBot starts to parse JS as html. |
If it's miscoded, it can get indexed.
| 5:37 pm on Dec 7, 2002 (gmt 0)|
|But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly. |
Then what about tableless sites that depend upon CSS to render properly? Their text/html ratio is very high, and search engines see almost nothing but text.
| 5:39 pm on Dec 7, 2002 (gmt 0)|
| 6:17 pm on Dec 7, 2002 (gmt 0)|
The "this issue" I was referring to in an earlier post was the coding problem. In the cited case, the JS was coded incorrectly. Therefore, it was getting parsed as html, and "seen/indexed" by the bot. I ran his code through the w3c html validator and received weird results which confirmed that the JS was coded incorrectly.
Always a good idea to run your code through the Validator to see what's getting parsed.
The larger issue for me, is this: If googlebot is programmed to crawl through the first 100k of code and ignore the rest, and you have a 60k JS Drop Down Menu with DHTML sliders, and another 40k of JS rollovers and stuff, how much of your content is going to get indexed? Nothing.
Though the JS doesn't get "indexed/seen" (when all is correctly coded), it does get crawled. It presents a barrier between the bot and the keyword rich content.
| 6:26 pm on Dec 7, 2002 (gmt 0)|
This is the first I've considered how JS might affect keyword density. I have a site that has a news headline feed that uses a whale of a lot of script. The news service is a major feature of the home page.
Now as far as prominence of the main keywords is concerned, they precede the JS for the most part. However, I'm now wondering how much detriment to ranking there may be because density is diluted.
| 11:24 pm on Dec 7, 2002 (gmt 0)|
>Though the JS doesn't get "indexed/seen" (when all is correctly coded), it does get crawled.
And in the case of stuntdubl and myself, going to external rather than on-page scripts changed the page size/word count and therefore the keyword density. Unless, of course, as WebGuerrilla always likes to point out, other folks tweaked their pages or the SE tweaked its algo...
It all becomes less murky
| 12:29 am on Dec 8, 2002 (gmt 0)|
jimbeetle and compadres,
Are page size and keyword density related? Not in the way you are linking them together.
Keyword density: Is a measurement of the percentage of keywords within the actual text of the content.
Ratio of text to code: Is a measurement of how much code there is versus the actual text (aka content).
|And in the case of stuntdubl and myself, going to external rather than on-page scripts changed the page size/word count and therefore the keyword density. |
Page size and word count do not have the connection that you are attributing to them. If you are counting page size, then you must also include your graphics. Page size is a measurement of the total kilobytes of a page, including the graphics.
In your case, by offloading the JS, you changed the text to code ratio. You also changed the page size (to the bot), in kilobytes.
But your keyword density, no.
Keyword density is relative to the text, not to the code.
In other words, you are counting the amount of times a keyword appears within your text, and expressing it as a percentage of the total text within your page.
That is the definition of keyword density. It has nothing to do with the amount of code.
So, if you have 100 words of text, and your keywords appear eight times within that page, your keyword density will always be 8 percent.
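That arithmetic is easy to sketch (a minimal illustration of the definition being discussed, not anything the engines have published; the whitespace tokenizer and case-folding are simplifications of my own):

```python
def keyword_density(text, keyword):
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()  # naive whitespace tokenizer
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# 100 words of text, 8 of them the keyword -> 8 percent, regardless of
# how much JS or other code surrounds that text
page_text = " ".join(["widget"] * 8 + ["filler"] * 92)
print(keyword_density(page_text, "widget"))  # 8.0
```

Note that the function never sees the page's markup at all, which is exactly the point: density is a property of the text alone.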
The only other way to count Keyword Density is when you include the title tag and the Meta Description. Your JS has nothing to do with Keyword Density. It only impacts the total page size and the code to text ratio.
| 6:23 pm on Dec 8, 2002 (gmt 0)|
Of course. In my earlier post I had at first written it as just "page size," then "page size or 'word count'," then for some reason deleted the "word count." Thanks for getting me back on track.
So, assuming text or actual content did not change, then the impact we saw *might* be attributed to the change in page size. Or do bots consider the text to code ratio?
I'm going to start tweaking a significant handful of these pages to see what happens. It won't answer anything definitively but might lead me in the right direction.
| 6:33 pm on Dec 8, 2002 (gmt 0)|
I think that the text to code ratio is a matter of streamlining. So, in a manner of speaking, it is not considered by the bot, but it is a variable for us to consider when optimizing a site for maximum indexing.
Putting the fewest obstacles between your content and the bot is something you want to do. The obstacle in this case is JS and CSS, but it could also be excessive <td> tags, etc.
So, in theory, by linking externally, you are creating a direct path for the bot to follow: from the <html> tag to the actual content.
| 7:07 pm on Dec 8, 2002 (gmt 0)|
This may be old news for some of you, but I still think it is an interesting online tool from holovaty.com:
It calculates the content to code ratio. The source code is available too.
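For the curious, a rough version of that calculation can be sketched in a few lines. This is a naive regex-based approximation of my own; the actual tool may count things differently:

```python
import re

def text_to_code_ratio(html):
    """Rough percentage of a page's bytes that are visible text rather than markup.

    Naive approximation: drop script/style blocks entirely, strip the
    remaining tags, then compare what's left against the full source length.
    """
    stripped = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", "", html)
    text = re.sub(r"<[^>]+>", "", stripped)
    text = " ".join(text.split())  # collapse whitespace before measuring
    return 100.0 * len(text) / len(html) if html else 0.0

page = "<html><head><script>var x=1;</script></head><body><p>Hello world</p></body></html>"
print(round(text_to_code_ratio(page), 1))  # 13.4
```

Running pages you consider "mostly content" through something like this can be sobering; even a lean page spends most of its bytes on markup.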
| 8:38 pm on Dec 8, 2002 (gmt 0)|
Great tool ruserious. Had it squirreled away in a 'miscellaneous' bookmarks folder and had forgotten all about it.
Almost too enlightening to see that pages I considered mostly content top out at about 20% to 25%. Now to go back and tweak to use external .js files, but at the same time restrike the balance that the bots had liked before.
| 6:15 pm on Dec 9, 2002 (gmt 0)|
>>>But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly.
After reading all of these, I know most of you are firmly convinced that .js and css files are not "seen", i.e. indexed, by Google. I think that this argument is logical, and there is quite a bit of evidence to support it.
Unfortunately, my traffic is still down, and while I am reducing the number of possible variables, I am not quite establishing a working hypothesis either.
If anyone has any other possible suggestions as to the cause of my drop, please let me know. The changes made sitewide affected the top 100 lines of code of every page on my site. It cut about 80 lines by linking externally. Is there some way that proximity to the top of the code could possibly have an ADVERSE effect?
Still searching for answers...
| 7:44 pm on Dec 9, 2002 (gmt 0)|
Link Rot has been my bogeyman lately.
| 6:33 pm on Dec 10, 2002 (gmt 0)|
Just in case anyone really cares, I've had a breakthrough. I realized that when I updated the template, I left out my section links at the bottom that point to the eight main pages of the website. I am unsure right now, but I think this may have decreased my linkpop and thus my overall ratings. Just a working theory, but so far the best one I have.
Cross your fingers for me.:)