
Google News Archive Forum

This 60 message thread spans 2 pages.
Google and Javascript
HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 3:57 pm on Oct 28, 2002 (gmt 0)

I just lost my #1 Google ranking because I converted a bunch of HTML code to javascript. The site looked the same to the end user and it loaded about twice as fast (because it was twice as small). I was smart and I optimized my site to enhance the end user experience and Google cut me out. I do not even exist in Google anymore. The only results are cached older versions of the site (not the same cached versions that had #1 ranking).

When is Google going to respect Javascript? By optimizing my site, cutting down on my server bandwidth bills, and allowing 56k users to download my site twice as quick, I lost a HUGE portion of my income.

 

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 4:00 pm on Oct 28, 2002 (gmt 0)

Also, this happened to two of my sites, each with a #1 ranking. Both no longer exist in Google except for older versions of the sites, and those older versions never held the #1 ranking anyway.

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 4:04 pm on Oct 28, 2002 (gmt 0)

I wish I was allowed to post my sites, but I am not allowed because they are adult related. I would like to discuss my sites and Google if it were somehow possible. If anyone would like to contact me, please do so via ICQ:

<snip> see profile

[edited by: engine at 4:21 pm (utc) on Oct. 28, 2002]
[edit reason] Please use the profile [/edit]

Macguru

WebmasterWorld Senior Member macguru is a WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6379 posted 4:06 pm on Oct 28, 2002 (gmt 0)

Hi HQ_Webmaster,

Welcome to WebmasterWorld. [webmasterworld.com]

>>When is Google going to respect Javascript?

As far as I know, no spider from any search engine can read JavaScript. They need HTML links to follow, and text to read.

andreasfriedrich

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6379 posted 4:08 pm on Oct 28, 2002 (gmt 0)

I was smart and I optimized my site to enhance the end user experience and Google cut me out.

By optimizing your site the way you did, you also shut out everybody (and everything) using a user agent that is not capable of executing JavaScript.

As you can see, it is a rather bad idea to let JavaScript write out the content of your page. It is just as bad as having the main content in a Flash file or something similar.

Use JavaScript for some fancy enhancements (if you need to use it at all) but not for the core structure/content of your websites.

Andreas
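Andreas's point can be sketched concretely. A minimal sketch, assuming a spider that reads markup but never executes scripts (the helper function and sample pages below are invented for illustration):

```javascript
// Hypothetical helper: approximate what a non-JS-executing spider
// can index -- the HTML text with <script> blocks removed.
function indexableText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // spiders skip script bodies
    .replace(/<[^>]+>/g, " ")                   // drop remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

// Content in plain HTML: visible to spiders.
const staticPage = "<body><h1>Gallery</h1><p>85 categories of photos</p></body>";

// The same content written out by JavaScript: invisible to spiders.
const scriptedPage =
  "<body><script>document.write('<h1>Gallery</h1>')</script></body>";

console.log(indexableText(staticPage));   // "Gallery 85 categories of photos"
console.log(indexableText(scriptedPage)); // "" -- nothing to index
```

The point is simply that a page whose body is produced entirely by `document.write` offers a non-executing spider nothing to index at all.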

mediaman

10+ Year Member



 
Msg#: 6379 posted 4:12 pm on Oct 28, 2002 (gmt 0)

We have a few JSP-driven sites and they are indexed quite well with Google, so I do not think they have a problem with it. One thing I can say is that it appears to take longer to get the site listed initially, but once it is, you'll be fine.

rogerd

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6379 posted 4:16 pm on Oct 28, 2002 (gmt 0)

HQ, you can use spider simulators like SpiderSim at searchengineworld.com to see what your page looks like to typical spiders. Some SEs may see more or less, of course, but I'd recommend that as a good start. One of the biggest web design mistakes we find when we are called in to fix traffic problems is that content is invisible: the pages are accessed by scripted links that spiders can't follow, frames seal off content from some SEs, or content is buried in Flash, JS, etc. I've gotten a few surprises myself when I've "spidered" pages I thought I designed well; it's well worth a try.

gsx

10+ Year Member



 
Msg#: 6379 posted 4:20 pm on Oct 28, 2002 (gmt 0)

JSP is not JavaScript. JSP runs server side (the server does the work and sends plain HTML to the browser). JavaScript runs client side (the person's browser loads the code and then runs it to generate the HTML output).

A spider sees JSP output as 'normal' HTML because that is what it is sent, whereas it sees the raw code for JavaScript and, because of the complexities involved in processing it, usually ignores the whole lot.

andreasfriedrich

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6379 posted 4:24 pm on Oct 28, 2002 (gmt 0)

We have a few jsp driven sites

Yesterday, we had a big storm in Berlin and that didn't affect my Google ranking, so I do not think Google has a problem with storms in Germany.

JavaScript is entirely unrelated to JavaServer Pages. So proving that one works fine in Google has no bearing at all on whether the other will.

Andreas

----

<added>gsx beat me to it.</added>

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 5:16 pm on Oct 28, 2002 (gmt 0)

I realize that spiders can read JavaScript, but they do not parse it. Spiders should parse JavaScript, and THEN examine the page as if JavaScript did not even create it.

The reason I used JavaScript (if you were able to view my websites) was to make my pages smaller and faster to load. A lot of the content on my pages was similar, so I coded some JavaScript loops to take care of huge repetitive tables.

By not parsing JavaScript, Google throws away the greatness of JavaScript itself. The true power of JavaScript cannot be used. It is a pain. JavaScript was not meant only for insignificant (fancy) enhancements. This really sucks for me. I could go code all my JavaScript in Perl and make Perl output the contents of my page, but why should I have to? Google, start parsing JavaScript, PLEASE.
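The kind of loop HQ describes might look something like this. A hypothetical sketch: the function name, category names, and markup are all invented for illustration.

```javascript
// Hypothetical sketch of the technique HQ describes: a loop that
// builds the repetitive gallery tables client-side instead of
// shipping the full HTML to every visitor.
function galleryTable(categories) {
  let html = "<table>";
  for (const name of categories) {
    html += "<tr><td><a href='/" + name + "/'>" + name + "</a></td></tr>";
  }
  return html + "</table>";
}

// In the browser this string would be emitted via document.write(...).
// A 2002-era spider never runs the script, so to it none of these
// links or table rows exist.
console.log(galleryTable(["landscapes", "portraits", "macro"]));
```

The bandwidth saving is real, but so is the cost: every link produced inside the loop is invisible to a spider that skips script bodies.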

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 5:20 pm on Oct 28, 2002 (gmt 0)

Thanks rogerd, I understand what my pages look like without JS enabled. They have NO content whatsoever. The page is basically blank with a few links to counters and other sites. The content of my pages is completely 100% generated via JS. It is no wonder Google kicked me out. About 30% of my profits came from Google too. You can see why this is not good for me.

Considering I had a #1 ranking, you know that my sites were worthwhile. I did not employ any tricks (never have and never will) to get those rankings. Although I am an adult webmaster, I am 100% honest (we tend to have bad reputations for understandable reasons).

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 5:22 pm on Oct 28, 2002 (gmt 0)

Good point gsx, but I am not sure why you are telling me this? I understand that server-side processing of HTML is interpreted by all spiders as regular HTML.

hiker_jjw

10+ Year Member



 
Msg#: 6379 posted 5:29 pm on Oct 28, 2002 (gmt 0)

HQ, you got caught.

Let the truth out, HQ! The real reason you used the JavaScript was to make your pages more concise, smaller, to the point... and oh yeah... get better SERPs.

What you're serving up to the GoogleBot is not the same as what you're serving up to your regular visitors. This could easily be viewed as a form of cloaking. Were you banned from Google?

Furthermore, if you use Perl you'll end up worrying about CPU usage rather than bandwidth usage.

Cheers

rogerd

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6379 posted 5:38 pm on Oct 28, 2002 (gmt 0)

HQ, the important message here is that you must design for the limitations of the spiders. Design techniques that may be great for your visitors may be awful for search engine indexing. Doing your scripting with server-side code may get you the best of both worlds: "real" HTML for the spiders, plus reusability of components, etc. I'm sure at some point the search engines will be able to interpret more. Today, for example, many no longer choke on frames, and Google will supposedly follow Flash links. Until they advance further, though, I'd suggest using JS only for content you DON'T want indexed. It has nothing to do with the merit of your sites - I've seen sites that looked great and had tons of content, but were still impossible for spiders to index.

If I were you, HQ, I'd revert to the content that gave you your #1 in Google. Get it in there before the next major spidering, which is probably imminent. One of the cardinal rules of SEO: if a page is working well, don't mess with it!

rogerd

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6379 posted 5:47 pm on Oct 28, 2002 (gmt 0)

Hiker, I doubt if HQ was aiming for better SERPs by putting the content in a script, particularly when he was apparently #1 for his desired search phrase.

Sasquatch

10+ Year Member



 
Msg#: 6379 posted 6:07 pm on Oct 28, 2002 (gmt 0)

I hope Google never gives high rankings to sites that *require* JS, Flash, or any other optional feature. If a site doesn't degrade gracefully, I do not want to give it my business anyway, so I do not want it showing up in my SERPs.

hiker_jjw

10+ Year Member



 
Msg#: 6379 posted 6:49 pm on Oct 28, 2002 (gmt 0)

Maybe I was reading more into it than I should have.

I agree with the previous posts. You "must" design your site so that spiders can "FOLLOW" it. In some cases, you may be able to provide navigation in the main section of your page -- while still using your JavaScript to reduce bandwidth.
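That compromise can be sketched as follows, under the same assumptions as before (the builder function and names are hypothetical): the navigation stays as plain HTML links a spider can follow, while the bulky repeated markup is still produced client-side.

```javascript
// Hypothetical sketch of the hybrid approach: spider-followable
// navigation in real HTML, with only the heavy repeated tables
// left to client-side JavaScript.
function buildPage(categories) {
  // Plain <a> links: these survive with JavaScript disabled,
  // so a spider can crawl every category page.
  const nav = categories
    .map(c => '<a href="/' + c + '/">' + c + "</a>")
    .join(" | ");

  // The bandwidth-heavy part stays scripted; a spider simply skips it.
  const script =
    "<script>/* client-side loop writes the big gallery tables */</script>";

  return "<body><p>" + nav + "</p>" + script + "</body>";
}

console.log(buildPage(["landscapes", "portraits"]));
```

The design choice: the spider loses the table bodies but keeps every link, so the site stays crawlable while most of the bandwidth saving survives.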

Jump over to Search Engine World and use their SE Spider Tool to see if your pages are follow-able.

Cheers

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 7:50 pm on Oct 28, 2002 (gmt 0)

Sorry hiker_jjw, but you are wrong.

The real reason I used JS was to make my page smaller and faster to load. Plus it made my gallery modifications much, much easier. With 85 categories, I can now just go in and add one more category and the new tables are created automatically. My JS code is more advanced than your average newbie's! :)

SERPS?!

I do not know if I was banned from Google. How can I tell? What I was serving to Google is 100% THE SAME as what I serve to my visitors. Google is just not intelligent enough (yet) to realize it.

Do not worry about Perl and CPU usage. I would not create the same page via Perl code for every visitor. I would only run the Perl code whenever I updated the page.

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 7:52 pm on Oct 28, 2002 (gmt 0)

rogerd, you are exactly right. "If it ain't broke, don't fix it."

I just wish Google would parse JS code so that I can make use of it without suffering. The only problem here is Google holding me back.

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 7:54 pm on Oct 28, 2002 (gmt 0)

Sasquatch, my site requires JS to view the content. What is wrong with this? I am living in the 21st century, not the 1960s. I do not care for old and outdated visitors, because if they have JS turned off then they likely have cookies turned off and other things that make them impossible to track. In my field, visitors that cannot be tracked are useless. I trade traffic for traffic. Why send you a visitor you cannot track because of no JS?

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 7:55 pm on Oct 28, 2002 (gmt 0)

(I should add that a large portion of traffic trading scripts that I trade with depend on JS.)

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 7:56 pm on Oct 28, 2002 (gmt 0)

Is Googleguy a guy that works at Google? Assuming that is the case, can he help me out on the status of my sites in Google's database?

dcheney

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6379 posted 8:04 pm on Oct 28, 2002 (gmt 0)

Personally, I'm rather glad Google doesn't attempt to handle javascript (even though my sites use it to some degree). It would be extremely easy to add tons of spam via javascript.

gsx

10+ Year Member



 
Msg#: 6379 posted 8:17 pm on Oct 28, 2002 (gmt 0)

"Good point gsx, but I am not sure why you are telling me this? I understand that server-side processing of HTML is interpreted by all spiders as regular HTML."

Because the opposite is true: non-server-side code is not interpreted as regular HTML and is IGNORED.

To quote you:

"The site looked the same to the end user" - but completely different to a spider

"I was smart and I optimized my site" - no. If you were smart you would have read previous threads and realised that NO search engine parses JavaScript before you made any changes.

hiker_jjw

10+ Year Member



 
Msg#: 6379 posted 8:30 pm on Oct 28, 2002 (gmt 0)

Good point gsx.

If you were smart you would have read previous threads and realised that NO search engine parses JavaScript before you made any changes.

HQ, if you're so advanced... ah, forget it.

You might consider on-the-fly content generation using Perl and an Apache server. That combination works rather well. The concept is simple: the first time an HTML page is requested, the Perl CGI script is executed and the HTML page is written. From that point on, the Apache server serves up the written HTML page. But I'm sure you already know this.

hiker_jjw

10+ Year Member



 
Msg#: 6379 posted 8:31 pm on Oct 28, 2002 (gmt 0)

Oh, one more thing HQ... Go get a Google Toolbar.

It will help you determine the PR of your pages and if your site has been banned.

Cheers

dauction

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6379 posted 8:34 pm on Oct 28, 2002 (gmt 0)

Once again the problem is GOOGLE..you have to DUMB DOWN your site so Google can understand it..

Oh no, it has javascript..give me a break..

My suggestion would be to send Google an email so your site can be reviewed by someone and given back its rank, and not let it be junked because the spiders can't figure out a little javascript..

Don't settle for 2nd best..they have the talent and the money to be able to fix that problem..and it is their problem, not yours..

You are using COMMON webmaster tools..javascript has been around a long time

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 1:12 am on Oct 29, 2002 (gmt 0)

dcheney, whatever spam can be added via javascript can be added via plain HTML. I do not see what benefit Google has in not parsing JS (if it could).

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 1:15 am on Oct 29, 2002 (gmt 0)

<snip> I meant I was smart in the optimization of my site for the reasons that I listed. Most webmasters could not (or would not) do that. I understand that I cannot show you my site, but the amount of JS I used was extreme. Like I said, my site has no content at all, 0%, once you turn JS off.

Google is the problem, not my intelligence. I knew that my JS would harm my Google rating. If you are trying to say that I did not know this (?), then you are wrong.

My point is that it sucks that Google does not parse JS and I want to draw attention to it.

"Because the opposite is true: Non-server side in not interprated as regular HTML and is IGNORED."

I already know this. Nothing you have said is news to me. I think I confused you, or you did not understand what I was saying. (I was not trying to start a flame war in my first thread.) This thread has nothing to do with my lack of knowledge in this area. I 100% fully understand server-side and client-side issues. I was just asking a question because I did not understand where you were coming from and wanted you to clarify it. If you thought that I did not understand all of what you have said, then I apologize.

[edited by: HQ_Webmaster at 1:23 am (utc) on Oct. 29, 2002]

[edited by: Marcia at 6:01 pm (utc) on Oct. 30, 2002]
[edit reason] Please stay with issues without getting personal [/edit]

HQ Webmaster

10+ Year Member



 
Msg#: 6379 posted 1:18 am on Oct 29, 2002 (gmt 0)

hiker_jjw, I already have the Google Toolbar, thanks! Both sites were rated only 5/10, despite their #1 rankings. I am not familiar with how impressive or unimpressive 5/10 is, as I am new to PR ratings.

"You might consider on-the-fly content generation using PERL and an Apache Server."

Yes, I do already know this. I wrote somewhere earlier in this thread that I could do this. The problem with this? I have to use my server and my server's resources, plus more bandwidth, and thus a slower site. It is not fun. I am in a no-win situation.
