Google News Archive Forum

    
Externally linked .css and .js causes drop in traffic
This doesn't seem right.
stuntdubl - msg:117603 - 7:21 pm on Dec 6, 2002 (gmt 0)

I changed a site that I maintain (not the one in my profile) to use externally linked style sheets and javascript, thinking it would improve my text/html ratio, and move my content closer to the top of the code (proximity).

In theory, I thought, that this would help increase my rankings and thus traffic.

I have seen a significant drop in traffic, and I am very unsure why. I don't know if it is just the time of season for this particular business (hospitality industry), but there was a significant change, and it seems to be since I made the code changes.

Can anyone explain this?

 

WebGuerrilla - msg:117604 - 7:26 pm on Dec 6, 2002 (gmt 0)


There is no connection between the two. Google ignores CSS and JS, so it doesn't matter where you put them. You could have made no changes at all and still experienced the drop.

jomaxx - msg:117605 - 7:44 pm on Dec 6, 2002 (gmt 0)

I tend to believe that Google would completely ignore stuff like Javascript that it doesn't index, so IMO there should be no change.

But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly.

I would look at SE rankings where possible instead of visitors, because traffic is extremely variable. Also look at whether the proportion of visitors coming from Yahoo/Google has changed.
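
A quick pass over the server's access log will show whether the Google/Yahoo share of referrals has shifted. A rough sketch in Python; the "access.log" path, the combined log format, and the engine substrings are all assumptions to adjust for your own setup:

import re
from collections import Counter

# Rough sketch: count search-engine referrals in a combined-format access log.
# "access.log" and the engine markers below are assumptions; adjust as needed.
ENGINES = {"google": "google.", "yahoo": "yahoo.", "msn": "msn."}

def referral_counts(log_path):
    counts = Counter()
    # Combined log format: ... "request" status bytes "referrer" "user-agent"
    referrer_re = re.compile(r'"[^"]*" \d+ \S+ "([^"]*)" "[^"]*"$')
    with open(log_path) as log:
        for line in log:
            match = referrer_re.search(line)
            if not match:
                continue
            referrer = match.group(1).lower()
            for engine, marker in ENGINES.items():
                if marker in referrer:
                    counts[engine] += 1
    return counts

counts = referral_counts("access.log")
total = sum(counts.values()) or 1
for engine, n in sorted(counts.items()):
    print(f"{engine}: {n} referrals ({100 * n / total:.1f}% of search traffic)")

If the Google percentage is steady while overall visits fall, the drop probably isn't a Google issue at all.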

ikbenhet1 - msg:117606 - 7:48 pm on Dec 6, 2002 (gmt 0)

--> deleted.

(wrong example, that was internal js, not external) sorry

[edited by: ikbenhet1 at 7:51 pm (utc) on Dec. 6, 2002]

martinibuster - msg:117607 - 7:49 pm on Dec 6, 2002 (gmt 0)

I'm not sure that Google indexes the JS, and if not, then it follows that removing the JS should have no effect on keyword/text ratio. The only time JS comes into play is when you have a scripting/coding problem, and GBot starts to parse JS as html.

I think the advantage of a low code/text ratio is that there is less muck for the bot to wade through; the goal is to make it easier to lead the bot to your content.

jimbeetle - msg:117608 - 8:06 pm on Dec 6, 2002 (gmt 0)

Hi stuntdubl,

Thanks, now I know I'm not going completely around the bend. I started this thread a couple of weeks ago concerning an almost identical problem:

[webmasterworld.com...]

Mine was an almost across-the-board drop in traffic on one major section. I changed these pages back about 10 days ago and referrals are back up. Coincidence? With this stuff that's always, always possible, but the timing here seems too close.

I agree with jomaxx that it might have to do with the keyword density. Whether or not spiders actually read javascript on an html page (as opposed to completely ignoring it when it's called from an external .js file), it's beginning to look like they somehow (might, maybe, possibly) take the javascript word count into consideration.

Whether right or wrong I'm going to keep them the way they are now and start tweaking here and there and see what happens.

As always, muddled in Manhattan,

Jim

stuntdubl - msg:117609 - 8:22 pm on Dec 6, 2002 (gmt 0)

Jim....
Could you give me more information on the 'direct hit' issue from MSN you were talking about? This could explain part of my problem. Sounds like it occurred about the time my traffic dropped.

jimbeetle - msg:117610 - 8:40 pm on Dec 6, 2002 (gmt 0)

Sure, it's that the MSN/Direct Hit partnership ended at the end of September (I believe). And this is where coincidence comes into play -- with an across the board drop from SEs I didn't pick up this specific change.

And more coincidence: seasonal fluctuation in the hospitality industry? I thought of that as a possibility also.

Afraid this thread isn't Google-specific any more, but I'll go on for now. Since I changed the pages back I'm getting almost half the traffic from MSN that I used to; the other half had been from Direct Hit. That's after basically zero referrals for about a month.

It's hard wading through all of the possibilities that affect SERPs. You do this, he does that, somebody else tweaks an algorithm. But I do think that one of the likely kernels in both of our cases is the change in keyword density.

Would it be any fun if we ever figure it out?

Jim

klatschaffe - msg:117611 - 9:18 pm on Dec 6, 2002 (gmt 0)

I'm not sure that Google indexes the JS

I don't think Google indexes JS. If you search Google for 'script language="javascript"', it doesn't list any pages that have proper JS within their code. If it indexed JS you'd get more than just a few hundred thousand pages.
best
klatschaffe

martinibuster - msg:117612 - 9:33 pm on Dec 6, 2002 (gmt 0)

The only time JS comes into play is when you have a scripting/coding problem, and GBot starts to parse JS as html.

If it's miscoded, it can get indexed.

if you search google for 'script language="javascript"'

If you do the search you will find javascript tutorials and people with the script stuffed into their title tags, etc. Pretty messy.

In fact, this was an issue last week for a user named something like Virgin, whose site description in Google was a snippet from his JavaScript.

turk182 - msg:117613 - 5:37 pm on Dec 7, 2002 (gmt 0)

Jomaxx
But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly.

Then what about tableless sites that depend upon CSS to render properly? Their text/html ratio is very high, and search engines see almost nothing but text.

turk182 - msg:117614 - 5:39 pm on Dec 7, 2002 (gmt 0)

martinibuster
In fact, this was an issue last week for a user named something like Virgin, whose site description in Google was a snippet from his JavaScript.

So does Google see JS code or not? If it does, then maybe I should externally link all my javascript.

martinibuster - msg:117615 - 6:17 pm on Dec 7, 2002 (gmt 0)

Ay-yai-Yai...

Perhaps we should define what is meant by "seen." To me, "seen" means "being indexed". Googlebot will not index/see JavaScript if your code is written correctly. However, it does need to crawl through the code to get to your content.

The issue I was referring to in an earlier post was the coding problem. In the cited case, the JS was coded incorrectly, so it was getting parsed as html and "seen/indexed" by the bot. I ran his code through the w3c html validator and got weird results that confirmed the JS was coded incorrectly.

Always a good idea to run your code through the Validator to see what's getting parsed.

The larger issue for me is this: if googlebot is programmed to crawl through the first 100k of code and ignore the rest, and you have a 60k JS drop-down menu with DHTML sliders, plus another 40k of JS rollovers and such, how much of your content is going to get indexed? Nothing.

Though the JS doesn't get "indexed/seen" (when all is correctly coded), it does get crawled. It presents a barrier between the bot and the keyword rich content.
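
Here's a rough way to check whether that's happening on a given page. The 100k cutoff is the figure quoted above, not a verified Googlebot number, and the file name and content phrase are placeholders for your own page:

# Rough sketch: how many bytes of code sit between the top of the page and
# the real content, versus an assumed 100k crawl cutoff (not a verified figure).
CRAWL_LIMIT = 100 * 1024

# "index.html" and the content phrase are placeholders for your own page.
page = open("index.html", "rb").read()
offset = page.find(b"a phrase from your real content")

if offset == -1:
    print("content phrase not found")
elif offset > CRAWL_LIMIT:
    print(f"content starts at byte {offset}, past the assumed 100k cutoff")
else:
    print(f"content starts at byte {offset}, inside the assumed 100k cutoff")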

Go60Guy - msg:117616 - 6:26 pm on Dec 7, 2002 (gmt 0)

This is the first I've considered how JS might affect keyword density. I have a site that has a news headline feed that uses a whale of a lot of script. The news service is a major feature of the home page.

Now as far as prominence of the main keywords is concerned, they precede the JS for the most part. However, I'm now wondering how much detriment to ranking there may be because density is diluted.

jimbeetle - msg:117617 - 11:24 pm on Dec 7, 2002 (gmt 0)

>Though the JS doesn't get "indexed/seen" (when all is correctly coded), it does get crawled.

So, we're finally all getting to be on the same page. The bot won't parse the Javascript but it knows it's there and includes it in the overall page size or word count.

And in the case of stuntdubl and myself, going to external rather than on-page scripts changed the page size/word count and therefore the keyword density. Unless, of course, as WebGuerrilla always likes to point out, other folks tweaked their pages or the SE tweaked its algo...

It all becomes less murky.

martinibuster - msg:117618 - 12:29 am on Dec 8, 2002 (gmt 0)

jimbeetle and compadres,

Are page size and keyword density related? Not in the way you are linking them together.

Keyword density: a measurement of the percentage of keywords within the actual text of the content.

Ratio of text to code: a measurement of how much code there is versus the actual text (aka content).

And in the case of stuntdubl and myself, going to external rather than on-page scripts changed the page size/word count and therefore the keyword density.

Page size and word count do not have the connection that you are attributing to them. If you are counting page size, then you must also include your graphics. Page size is a measurement of the total kilobytes of a page, including the graphics.

In your case, by offloading the JS, you changed the text to code ratio. You also changed the page size (to the bot), in kilobytes.

But your keyword density, no.

Keyword density is relative to the text, not to the code.

In other words, you are counting the amount of times a keyword appears within your text, and expressing it as a percentage of the total text within your page.

That is the definition of keyword density. It has nothing to do with the amount of code.

So, if you have 100 words of text, and your keywords appear eight times within that page, your keyword density will always be 8 percent.

The only other way to count Keyword Density is when you include the title tag and the Meta Description. Your JS has nothing to do with Keyword Density. It only impacts the total page size and the code to text ratio.
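
In code form, the calculation above looks something like this (a minimal sketch; the word splitting is deliberately crude):

# Minimal sketch of the keyword density calculation described above:
# occurrences of the keyword divided by total words of visible text.
def keyword_density(text, keyword):
    words = [w.strip('.,;:!?"()').lower() for w in text.split()]
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# 100 words of text containing the keyword 8 times -> 8.0 percent,
# no matter how much JS or other code surrounds that text.
sample = ("widget " * 8 + "filler " * 92).strip()
print(keyword_density(sample, "widget"))  # 8.0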

jimbeetle - msg:117619 - 6:23 pm on Dec 8, 2002 (gmt 0)

martinibuster,

Of course. In my earlier post I had at first written it as just "page size", then "page size or word count", and then for some reason dropped the "word count" part. Thanks for getting me back on track.

So, assuming the text or actual content did not change, the impact we saw *might* be attributed to the change in page size. Or do bots consider the text to code ratio?

I'm going to start tweaking a significant handful of these pages to see what happens. It won't answer anything definitively but might lead me in the right direction.

Jim

martinibuster - msg:117620 - 6:33 pm on Dec 8, 2002 (gmt 0)

I think that the text to code ratio is a matter of streamlining. So in a manner of speaking, it is not considered by the bot, but it is a variable for us to consider when optimizing a site for maximum indexing.

Putting the fewest obstacles between your content and the bot is something you want to do. The obstacle in this case is JS and CSS, but it could also be excessive <td> tags, etc.

So, in theory, by linking externally, you are creating a direct path for the bot to follow: from the <html> tag to the actual content.

<html>
<head>
<title>Page title</title>
<meta name="description" content="Page description">
<script type="text/javascript" src="external.js"></script>
<link rel="stylesheet" type="text/css" href="external.css">
</head>
<body>
CONTENT
</body>
</html>

ruserious - msg:117621 - 7:07 pm on Dec 8, 2002 (gmt 0)

This may be old news for some of you, but I still think it is an interesting online tool from holovaty.com:

[holovaty.com...]

It calculates the content to code ratio. The source code is available too.
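
For anyone who wants to compute roughly the same number locally, here's a sketch of one way to do it (visible text length divided by total page length; this is almost certainly not the tool's exact formula, and "index.html" is a placeholder):

import re

# Rough content-to-code ratio: visible text length over total page length.
# Not the holovaty.com tool's exact formula, just an approximation.
def content_to_code_ratio(html):
    text = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", " ", html)  # drop script/style blocks
    text = re.sub(r"(?s)<[^>]+>", " ", text)                          # drop remaining tags
    text = " ".join(text.split())                                     # collapse whitespace
    return len(text) / max(len(html), 1)

page = open("index.html").read()  # placeholder file name
print(f"content-to-code ratio: {content_to_code_ratio(page):.0%}")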

jimbeetle - msg:117622 - 8:38 pm on Dec 8, 2002 (gmt 0)

Great tool ruserious. Had it squirreled away in a 'miscellaneous' bookmarks folder and had forgotten all about it.

Almost too enlightening to see that pages I considered mostly content top out at about 20% to 25%. Now to go back and tweak to use external .js files, but at the same time restrike the balance that the bots had liked before.

stuntdubl - msg:117623 - 6:15 pm on Dec 9, 2002 (gmt 0)

>>>But many people do say that removing JS will increase your keyword ratio. If so, I guess it's possible that your ratio is now too high and you're being penalized slightly.

After reading all of these, I know most of you are firmly convinced that .js and .css files are not "seen", i.e. indexed, by Google. I think this argument is logical, and there is quite a bit of evidence to support it.

Unfortunately, my traffic is still down, and while I am reducing the number of possible variables, I am not quite arriving at a working hypothesis either.

If anyone has any other possible suggestions as to the cause of my drop, please let me know. The sitewide changes affected the top 100 lines of code of every page on my site, cutting about 80 lines by externally linking. Is there some way that proximity to the top of the code could possibly have an ADVERSE effect?

Still searching for answers...

martinibuster - msg:117624 - 7:44 pm on Dec 9, 2002 (gmt 0)

Link Rot has been my bogeyman lately.

stuntdubl - msg:117625 - 6:33 pm on Dec 10, 2002 (gmt 0)

Just in case anyone really cares, I've had a breakthrough. I realized that when I updated the template, I left out my section links at the bottom that point to the eight main pages of the website. I am unsure right now, but I think this may have decreased my linkpop and thus my overall rankings. Just a working theory, but so far the best one I have.

Cross your fingers for me.:)
