| 4:00 pm on Oct 28, 2002 (gmt 0)|
Also, this happened to two of my sites, each with a #1 ranking. They both no longer exist in Google except for older versions of the site (and those older versions never had the #1 ranking anyway).
| 4:04 pm on Oct 28, 2002 (gmt 0)|
I wish I was allowed to post my sites, but I am not allowed because they are adult related. I would like to discuss my sites and Google if it were somehow possible. If anyone would like to contact me, please do so via ICQ:
<snip> see profile
[edited by: engine at 4:21 pm (utc) on Oct. 28, 2002]
[edit reason] Please use the profile [/edit]
| 4:06 pm on Oct 28, 2002 (gmt 0)|
Welcome to WebmasterWorld. [webmasterworld.com]
| 4:08 pm on Oct 28, 2002 (gmt 0)|
|I was smart and I optimized my site to enhance the end user experience and Google cut me out. |
| 4:12 pm on Oct 28, 2002 (gmt 0)|
We have a few jsp driven sites and they are indexed quite well with Google, so I do not think they have a problem with it. One thing I can say is that it initially seems to take longer to get the site listed, but once it is, you'll be fine.
| 4:16 pm on Oct 28, 2002 (gmt 0)|
HQ, you can use spider simulators like SpiderSim at searchengineworld.com to see what your page looks like to typical spiders. Some SEs may see more or less, of course, but I'd recommend that as a good start. One of the biggest web design mistakes we find when we are called in to fix traffic problems is invisible content: the pages are accessed by scripted links that spiders can't follow, frames seal off content from some SEs, or content is buried in Flash, JS, etc. I've gotten a few surprises myself when I've "spidered" pages I thought I designed well; it's well worth a try.
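The idea behind those spider simulators can be sketched crudely: strip everything a text-only crawler of this era ignored (scripts, styles, markup) and see what text survives. This is an illustrative sketch, not the actual SpiderSim tool:

```javascript
// Crude "spider view" of a page: drop <script>/<style> blocks and all
// remaining tags, roughly the way a text-only crawler did.
function spiderView(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // JS is invisible to the spider
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")                   // strip remaining markup
    .replace(/\s+/g, " ")
    .trim();
}

// A page whose content is written entirely by JavaScript:
const jsOnlyPage =
  '<html><body><script>document.write("85 gallery categories")</script></body></html>';
console.log(spiderView(jsOnlyPage)); // -> "" (the spider sees nothing)

// The same content as plain HTML is fully visible:
const plainPage = "<html><body><p>85 gallery categories</p></body></html>";
console.log(spiderView(plainPage)); // -> "85 gallery categories"
```

A page that looks rich to a JS-enabled browser can come back completely empty from a view like this, which is exactly the surprise rogerd describes.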
| 4:20 pm on Oct 28, 2002 (gmt 0)|
| 4:24 pm on Oct 28, 2002 (gmt 0)|
|We have a few jsp driven sites |
Yesterday, we had a big storm in Berlin and that didn't affect my Google ranking, so I do not think Google has a problem with storms in Germany.
<added>gsx beat me to it.</added>
| 5:16 pm on Oct 28, 2002 (gmt 0)|
| 5:20 pm on Oct 28, 2002 (gmt 0)|
Thanks rogerd, I understand what my pages look like without JS enabled. They have NO content whatsoever. The page is basically blank with a few links to counters and other sites. The content of my pages is completely 100% generated via JS. It is no wonder Google kicked me out. About 30% of my profits came from Google too. You can see why this is not good for me.
Considering I had a #1 ranking, you know that my sites were worthwhile. I did not employ any tricks (never have and never will) to get those rankings. Although I am an adult webmaster, I am 100% honest (we tend to have bad reputations for understandable reasons).
| 5:22 pm on Oct 28, 2002 (gmt 0)|
Good point gsx, but I am not sure why you are telling me this. I understand that server-side processing of HTML is interpreted by all spiders as regular HTML.
| 5:29 pm on Oct 28, 2002 (gmt 0)|
HQ, you got caught.
What you're serving up to the GoogleBot is not the same as what you're serving up to your regular visitors. This could easily be viewed as a form of cloaking. Were you banned from Google?
Furthermore, if you use Perl you'll end up worried about CPU usage rather than bandwidth usage.
| 5:38 pm on Oct 28, 2002 (gmt 0)|
HQ, the important message here is that you must design for the limitations of the spiders. Design techniques that may be great for your visitors may be awful for search engine indexing. Doing your scripting using server-side code may get you the best of both worlds - "real" HTML for the spiders, and reusability of components, etc. I'm sure at some point the search engines will be able to interpret more. Today, for example, many no longer choke on frames, and Google will supposedly follow flash links. Until they advance further, though, I'd suggest using js only for content you DON'T want indexed. It has nothing to do with the merit of your sites - I've seen sites that looked great and had tons of content, but were still impossible for spiders to index.
If I were you, HQ, I'd revert back to the content that gave you your #1 in Google. Get it in there before the next major spidering, which is probably imminent. One of the cardinal rules of SEO is if a page is working well, don't mess with it!
| 5:47 pm on Oct 28, 2002 (gmt 0)|
Hiker, I doubt if HQ was aiming for better SERPs by putting the content in a script, particularly when he was apparently #1 for his desired search phrase.
| 6:07 pm on Oct 28, 2002 (gmt 0)|
I hope Google never gives high rankings to sites that *require* JS, Flash, or any other optional feature. If a site doesn't degrade gracefully, I do not want to give that site my business anyway, so I do not want it showing up in my SERPs.
| 6:49 pm on Oct 28, 2002 (gmt 0)|
Maybe I was reading more into it than I should have.
Jump over to Search Engine World and use their SE Spider Tool to see if your pages are follow-able.
| 7:50 pm on Oct 28, 2002 (gmt 0)|
Sorry hiker_jjw, but you are wrong.
The real reason I used JS was to make my page smaller and faster to load. Plus, it made my gallery modifications much easier: with 85 categories, I can now just add one more category and the new tables are created automatically. My JS code is more advanced than your average newbie's! :)
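The table-generation approach described here might have looked something like the following sketch (the function name, category names, and markup are invented for illustration; the original code is not shown in the thread):

```javascript
// Sketch of client-side gallery generation from a category list.
// Adding a category means adding one string to the array; the table
// markup is built automatically. All names here are invented examples.
function buildGalleryTable(cats) {
  const rows = cats
    .map(c => `<tr><td><a href="/gallery/${c}.html">${c}</a></td></tr>`)
    .join("");
  return `<table>${rows}</table>`;
}

const galleryHtml = buildGalleryTable(["category-01", "category-02", "category-03"]);
console.log(galleryHtml);

// In the browser this would typically be emitted with document.write()
// at load time -- which is precisely why a spider with no JS engine
// sees an empty page: the HTML only ever exists inside the browser.
```

The convenience is real, but so is the trade-off the rest of the thread describes: none of that generated markup exists in the source a spider fetches.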
I do not know if I was banned from Google. How can I tell? What I was serving to Google is 100% THE SAME as what I serve to my visitors. Google is just not intelligent enough (yet) to realize it.
Do not worry about Perl and CPU usage. I would not regenerate the same page via Perl for every visitor. I would only run the Perl code whenever I updated the page.
| 7:52 pm on Oct 28, 2002 (gmt 0)|
rogerd, you are exactly right. "If it ain't broke, don't fix it."
I just wish Google would parse JS code so that I can make use of it without suffering. The only problem here is Google holding me back.
| 7:54 pm on Oct 28, 2002 (gmt 0)|
Sasquatch, my site requires JS to view the content. What is wrong with that? I am living in the 21st century, not the 1960s. I do not care about old and outdated visitors, because if they have JS turned off, they likely have cookies turned off too, and other things that make them impossible to track. In my field, visitors who cannot be tracked are useless. I trade traffic for traffic. Why send you a visitor you cannot track because they have no JS?
| 7:55 pm on Oct 28, 2002 (gmt 0)|
(I should add that a large portion of traffic trading scripts that I trade with depend on JS.)
| 7:56 pm on Oct 28, 2002 (gmt 0)|
Is Googleguy a guy that works at Google? Assuming that is the case, can he help me out on the status of my sites in Google's database?
| 8:04 pm on Oct 28, 2002 (gmt 0)|
| 8:17 pm on Oct 28, 2002 (gmt 0)|
"Good point gsx, but I am not sure why you are telling me this? I understand that server-side processing of HTML is interpreted by all spiders as regular HTML."
Because the opposite is true: non-server-side code is not interpreted as regular HTML and is IGNORED.
To quote you:
"The site looked the same to the end user" - but completely different to a spider
| 8:30 pm on Oct 28, 2002 (gmt 0)|
Good point gsx.
HQ, if you're so advanced... ah, forget it.
You might consider on-the-fly content generation using PERL and an Apache Server. That combination works rather well. The concept is simple. The first time an HTML page is hit/requested, the PERL CGI script is executed and the HTML page is written. From that point on the Apache server serves up the written HTML page. But, I'm sure you already know this.
| 8:31 pm on Oct 28, 2002 (gmt 0)|
Oh, one more thing HQ... Go get a Google Toolbar.
It will help you determine the PR of your pages and if your site has been banned.
| 8:34 pm on Oct 28, 2002 (gmt 0)|
Once again the problem is GOOGLE..you have to DUMB DOWN your site so Google can understand it..
Don't settle for 2nd best..they have the talent and the money to be able to fix that problem..and it is their problem, not yours..
| 1:12 am on Oct 29, 2002 (gmt 0)|
| 1:15 am on Oct 29, 2002 (gmt 0)|
<snip> I meant I was smart in the optimization of my site for the reasons that I listed. Most webmasters could not (or would not) do that. I understand that I cannot show you my site, but the amount of JS I used was extreme. Like I said, my site has no content at all, 0%, once you turn JS off.
Google is the problem, not my intelligence. I knew that my JS would harm my Google rating. If you are trying to say that I did not know this (?), then you are wrong.
My point is that it sucks that Google does not parse JS and I want to draw attention to it.
"Because the opposite is true: Non-server side in not interprated as regular HTML and is IGNORED."
I already know this. Nothing you have said is news to me. I think I confused you, or you did not understand what I was saying. (I was not trying to start a flame war in my first thread.) This thread has nothing to do with my lack of knowledge in this area. I 100% fully understand server side and client side issues. I was just asking a question because I did not understand where you were coming from and wanted you to clarify it. If you thought that I did not understand all of what you have said, then I apologize.
[edited by: HQ_Webmaster at 1:23 am (utc) on Oct. 29, 2002]
[edited by: Marcia at 6:01 pm (utc) on Oct. 30, 2002]
[edit reason] Please stay with issues without getting personal [/edit]
| 1:18 am on Oct 29, 2002 (gmt 0)|
hiker_jjw, I already have the Google toolbar, thanks! Both sites were rated only 5/10, despite their #1 rankings. I am not familiar with how impressive or unimpressive 5/10 is, as I am new to PR ratings.
"You might consider on-the-fly content generation using PERL and an Apache Server."
Yes, I already know this. I wrote earlier in this thread that I could do this. The problem? I have to use my own server's resources plus more bandwidth, and thus end up with a slower site. It is not fun. I am in a no-win situation.
| This 60 message thread spans 2 pages |