Has Google lost me for good?

Should I resubmit?


StephenRKnight

3:59 pm on Feb 1, 2003 (gmt 0)

10+ Year Member



Hi - last November I added server-side redirection code to all 400 pages of my site. The code was meant to direct all Netscape users to an incompatibility page.

The result: when Google finished with me on the last round, all searches went straight to the incompatibility page - which of course is NOT what I wanted.

I didn't pick up on this problem until 10 January 2003.

Looking through my logs, Google paid its last visit on 10/01/03 - six hours before I realised what was going on.

As soon as I found out I spent 12 straight hours removing the code.

My question:

The logs show the following:

2003-01-08
GET /robots.txt - 404 604 410 80 HTTP/1.0
GET /Default.asp - 302 448 450 80 HTTP/1.0
GET /Incompatible.htm - 304 143 466 80 HTTP/1.0
GET /html/reguksnail_compact_page2.htm - 404 604 192 80 HTTP/1.0 Gigabot/1.0 -

2003-01-09
GET /robots.txt - 404 604 406 80 HTTP/1.0
GET /mrconehead_disc1_frm.htm - 404 604 420 80 HTTP/1.0

2003-01-10
GET /robots.txt - 404 604 406 80 HTTP/1.0
GET /Default.asp - 302 448 396 80 HTTP/1.0
GET /robots.txt - 404 604 410 80 HTTP/1.0
GET /Incompatible.htm - 200 10559 416 80 HTTP/1.0

Do you think Google will index my site COMPLETELY next time round - or should I resubmit and start again?

This is my first site which Google listed in October 2002 after spending 22 hours sniffing around.

To compound the problem - most search engines that I try outside of Google have picked up on Google's results - i.e. they display Incompatible.htm.

I can't even get into DMOZ to register my site to give me a fighting chance.

Thank you in advance - Stephen

jdMorgan

2:42 am on Feb 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



StephenRKnight,

It sounds like the problem was with the redirection: it treated Googlebot as if it were Netscape, probably because the logic of the redirect did not account for Googlebot and/or other robots.
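
The fix Jim describes can be sketched roughly like this (Python for illustration only - Stephen's site was classic ASP, and the bot list and Netscape heuristic here are assumptions, not his actual code):

```python
# Hypothetical bot-aware browser detection. The point is that any
# user-agent redirect must exempt crawlers, or the search engine will
# index the "incompatible browser" page instead of the real content.
KNOWN_BOTS = ("googlebot", "gigabot", "slurp", "msnbot")

def should_redirect_to_incompatible(user_agent):
    """Redirect only genuine Netscape visitors, never crawlers."""
    ua = user_agent.lower()
    # Never redirect a search-engine robot.
    if any(bot in ua for bot in KNOWN_BOTS):
        return False
    # Netscape 4.x of that era identified as "Mozilla/4.x" without
    # "compatible" in the string (a rough heuristic, not definitive).
    return ua.startswith("mozilla/4") and "compatible" not in ua
```

With a check like this in front of the redirect, Googlebot would have received the normal pages while Netscape visitors were still diverted.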

Since the 'bot is still returning to your site, all you have to do is fix the problem (which it sounds like you've done) and then wait for the next deep-crawl. While Googlebot may not pick up all of the pages again, it will pick up the first-tier links at least, and then return for more later.

You could try resubmitting. If you have a site index page with links to all of your important pages, submit that. If not, submit the home page and possibly a few inner pages. I doubt that submitting a whole bunch of pages will help much, if at all (I could be wrong, though).

When rolling out a new function that could have disastrous effects such as blocking all 'bots, it's a real good idea to test it on a test site first, or to place it on a few relatively unimportant pages. Also, after any change to robots.txt, .htaccess, or server configuration, validate the changes: use Search Engine World's robots.txt validator [searchengineworld.com], the WebmasterWorld server header checker [webmasterworld.com], and wannabrowser [wannabrowser.com] to make sure everything works as expected.
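
Along the lines of those header-checking tools, here is a minimal sketch of verifying what a crawler would receive - request a page while sending a bot-like User-Agent and inspect the status line and Location header (Python for illustration; host and path are placeholders):

```python
import http.client

def fetch_status(host, path, user_agent):
    """Fetch one page with a chosen User-Agent; return (status, Location)."""
    conn = http.client.HTTPConnection(host, 80, timeout=10)
    try:
        conn.request("GET", path, headers={"User-Agent": user_agent})
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def still_misredirecting(status, location):
    """True if the response looks like the old bot-trapping redirect:
    a 301/302 whose Location points at the incompatibility page."""
    return status in (301, 302) and location is not None \
        and "incompatible" in location.lower()
```

A 302 with a Location pointing at Incompatible.htm would mean the 'bot is still being redirected; a plain 200 on the real page means the fix took.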

Good luck with this!
Jim