Forum Moderators: open
When is Google going to respect Javascript? By optimizing my site, cutting down on my server bandwidth bills, and allowing 56k users to download my site twice as quickly, I lost a HUGE portion of my income.
<snip> see profile
[edited by: engine at 4:21 pm (utc) on Oct. 28, 2002]
[edit reason] Please use the profile [/edit]
Welcome to WebmasterWorld. [webmasterworld.com]
>>When is Google going to respect Javascript?
As far as I know, any spider from any search engine can read JavaScript. They need HTML links to follow, and text to read.
I was smart and I optimized my site to enhance the end user experience and Google cut me out.
By optimizing your site the way you did, you also shut out everybody (and everything) using a user agent that is not capable of executing JavaScript.
As you can see, it is a rather bad idea to let JavaScript write out the content of your page. It is just as bad as having the main content in a Flash file or something similar.
Use JavaScript for some fancy enhancements (if you need to use it at all) but not for the core structure/content of your websites.
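The advice above can be shown concretely. A minimal sketch (hypothetical URL and function name): a crawler that does not execute JavaScript receives only the literal script text below, never the link it would produce, so the link is invisible to it.

```javascript
// Sketch: the same link two ways. The function builds an anchor tag that only
// exists after the script runs; a non-JS user agent never sees its output.
function navLink(path, label) {
  return '<a href="' + path + '">' + label + "</a>";
}
// In a browser the page might do:  document.write(navLink("/tour.html", "Tour"));
// A spider can only follow a link that appears verbatim in the HTML source:
//   <a href="/tour.html">Tour</a>
```

This is why content and navigation written by `document.write` are effectively missing from the page as far as a 2002-era spider is concerned.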
Andreas
A spider sees JSP as 'normal' HTML because that is what it is sent, whereas it sees the raw code for JavaScript and, because of the complexity involved in processing it, usually ignores the whole lot.
We have a few jsp driven sites
Yesterday, we had a big storm in Berlin and that didn't affect my Google ranking, so I do not think Google has a problem with storms in Germany.
JavaScript is entirely unrelated to JavaServer Pages. So proving that the one works OK in Google has no bearing at all on whether the other will work OK in Google.
Andreas
----
<added>gsx beat me to it.</added>
The reason I used JavaScript (if you were able to view my websites) was to make my pages smaller and faster to load. A lot of the content on my pages was similar, so I coded some JavaScript loops to take care of huge repetitive tables.
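A hypothetical reconstruction of the loop idea described above (category names and markup invented for illustration): the repetitive gallery tables are produced client-side, so the HTML that the server actually sends — and that a spider downloads — contains only the script, none of the table text or links.

```javascript
// Hypothetical sketch of the gallery-loop technique. In a browser this
// expands into one table row (with a link) per category; a spider that does
// not execute JavaScript downloads only this script text and finds no links
// or text to index.
var categories = ["cat01", "cat02", "cat03"];

function galleryTable(cats) {
  var html = "<table>";
  for (var i = 0; i < cats.length; i++) {
    html += '<tr><td><a href="/' + cats[i] + '/index.html">' +
            cats[i] + "</a></td></tr>";
  }
  return html + "</table>";
}
// In the page itself: document.write(galleryTable(categories));
```

The bandwidth saving is real — the loop is far smaller than 85 expanded table rows — but so is the cost: to a non-executing crawler, all 85 category links simply do not exist.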
By not parsing JavaScript, Google throws away the greatness of JavaScript itself. The true power of JavaScript cannot be used. It is a pain. JavaScript was not meant only to do insignificant (fancy) enhancements. This really sucks for me. I could go code all my JavaScript in Perl and make Perl output the contents of my page, but why should I have to do this? Google, start parsing JavaScript, PLEASE.
Considering I had a #1 ranking, you know that my sites were worthwhile. I did not employ any tricks (never have and never will) to get those rankings. Although I am an adult webmaster, I am 100% honest (we tend to have bad reputations for understandable reasons).
Let the truth out, HQ! The real reason you used the JavaScript was to make your pages more concise, smaller, to the point... and oh yeah... get better SERPs.
What you're serving up to the GoogleBot is not the same as what you're serving up to your regular visitors. This could easily be viewed as a form of cloaking. Were you banned from Google?
Furthermore, if you use Perl you'll end up worried about CPU usage rather than bandwidth usage.
Cheers
If I were you, HQ, I'd revert to the content that gave you your #1 in Google. Get it in there before the next major spidering, which is probably imminent. One of the cardinal rules of SEO is: if a page is working well, don't mess with it!
I agree with the previous posts. You "must" design your site so that spiders can "FOLLOW" it. In some cases, you may be able to provide navigation in the main section of your page -- while still using your JavaScript to reduce bandwidth.
Jump over to Search Engine World and use their SE Spider Tool to see if your pages are follow-able.
Cheers
The real reason I used JS was to make my pages smaller and faster to load. Plus, it made my gallery modifications much, much easier. With 85 categories I can now just go in and add one more category, and new tables are created automatically. My JS code is more advanced than your average newbie's! :)
SERPS?!
I do not know if I was banned from Google. How can I tell? What I was serving to Google is 100% THE SAME as what I serve to my visitors. Google is just not intelligent enough (yet) to realize it.
Do not worry about Perl and CPU usage. I would not create the same page via Perl code for every visitor. I would only run the Perl code whenever I updated the page.
Because the opposite is true: non-server-side code is not interpreted as regular HTML and is IGNORED.
To quote you:
"The site looked the same to the end user" - but completely different to a spider
"I was smart and I optimized my site" - no. If you were smart you would have read previous threads and realised that NO search engine parses JavaScript before you made any changes.
HQ, if you're so advanced... ah, forget it.
You might consider on-the-fly content generation using Perl and an Apache server. That combination works rather well. The concept is simple: the first time an HTML page is hit/requested, the Perl CGI script is executed and the HTML page is written. From that point on, the Apache server serves up the written HTML page. But I'm sure you already know this.
It will help you determine the PR of your pages and if your site has been banned.
Cheers
Oh no, it has JavaScript... give me a break...
My suggestion would be to send Google an email so your site can be reviewed by someone and given back its rank, and not let it be junked because the spiders can't figure out a little JavaScript.
Don't settle for 2nd best... they have the talent and the money to be able to fix that problem... and it is their problem, not yours.
You are using COMMON webmaster tools... JavaScript has been around a long time.
Google is the problem, not my intelligence. I knew that my JS would harm my Google rating. If you are trying to say that I did not know this (?), then you are wrong.
My point is that it sucks that Google does not parse JS and I want to draw attention to it.
"Because the opposite is true: non-server-side code is not interpreted as regular HTML and is IGNORED."
I already know this. Nothing you have said is news to me. I think I confused you, or you did not understand what I was saying. (I was not trying to start a flame war in my first thread.) This thread has nothing to do with my lack of knowledge in this area. I 100% fully understand server-side and client-side issues. I was just asking a question because I did not understand where you were coming from and wanted you to clarify it. If you thought that I did not understand all of what you have said, then I apologize.
[edited by: HQ_Webmaster at 1:23 am (utc) on Oct. 29, 2002]
[edited by: Marcia at 6:01 pm (utc) on Oct. 30, 2002]
[edit reason] Please stay with issues without getting personal [/edit]
"You might consider on-the-fly content generation using Perl and an Apache server."
Yes, I do already know this. I wrote earlier in this thread that I could do this. The problem with it? I have to use my server and my server's resources, plus more bandwidth, and thus a slower site. It is not fun. I am in a no-win situation.