
Forum Moderators: goodroi


My Robots.txt Screw Up.

A case study in how to kill well-performing SERPs in one easy step

5:55 pm on Mar 7, 2003 (gmt 0)

Full Member
joined: Dec 6, 2002

So last month (early Feb) I updated my site. On my development box, I keep a robots.txt that disallows everyone. I accidentally copied this "disallow all" robots.txt to the production site while deepbot was crawling.
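For anyone unfamiliar with the file, the "disallow all" rule is only two lines; a generic example (not the poster's actual file) looks like this:

```
User-agent: *
Disallow: /
```

Dropping the slash (an empty Disallow) permits crawling of everything, which is why the dev and production versions are so easy to mix up.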

I didn't notice it for about a week, and by then, deepbot was gone. I had about 60,000 pages indexed with a PR5 and 1500+ referrals a day. Life was good.

Starting in the middle of the month, freshbot began crawling me. I think freshbot crawled over 100,000 pages.

But to no avail. www2 shows me with fewer than 15,000 pages indexed, many of which don't look fully indexed (URL-only listings with no title or description).

I've also disappeared completely from around 90% of my SERPs (I was at #1, #2, or #3 on several hundred different SERPs).

Now I'm silently praying that freshbot will update the main index with its 100,000 pages sitting somewhere on some server in the Googleplex...

Don't let this simple mistake happen to you! It's only 10 AM but I'm going to get a drink!
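One cheap safeguard is to check, after every deploy, that the live robots.txt still lets crawlers in. A minimal sketch using Python's standard `urllib.robotparser` (the file contents, URL, and user-agent below are illustrative placeholders, not the poster's actual setup):

```python
from urllib.robotparser import RobotFileParser

# The "disallow all" file accidentally pushed to production (generic example):
DEV_ROBOTS = """\
User-agent: *
Disallow: /
"""

# What production should serve -- an empty Disallow allows everything:
PROD_ROBOTS = """\
User-agent: *
Disallow:
"""

def crawlable(robots_text, url, agent="Googlebot"):
    """Return True if `agent` may fetch `url` under the given robots.txt text."""
    parser = RobotFileParser()
    parser.parse(robots_text.splitlines())
    return parser.can_fetch(agent, url)

# The dev file blocks the whole site -- exactly the mistake described above.
print(crawlable(DEV_ROBOTS, "http://example.com/page.html"))   # False
print(crawlable(PROD_ROBOTS, "http://example.com/page.html"))  # True
```

Run against the live file as a post-deploy check and fail the deploy if a key page comes back uncrawlable.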

11:47 pm on Mar 7, 2003 (gmt 0)

Full Member
joined: Oct 30, 2002

Ouch! I wouldn't trust myself with a robots.txt file; I've read about too many mistakes being made with it.

I stick to the old-fashioned noindex tag where required and a good spam killer on my server. Seems to suit me at the moment.
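For reference, the per-page alternative being described here is the robots meta tag, placed in each page's head; a generic example:

```
<meta name="robots" content="noindex">
```

Unlike a site-wide robots.txt, a stray copy of this tag only affects the single page it appears on, which limits the blast radius of a mistake.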


1:01 am on Mar 8, 2003 (gmt 0)

Inactive Member
Account Expired


Aaah... sorry to hear it! Just hope it will be a short month between updates!
1:43 am on Mar 8, 2003 (gmt 0)

New User
joined: Feb 28, 2003

Sorry to hear that as well.
I had a similar experience: I procrastinated on publishing a ton of new information and didn't get it up in time for the deep crawl.
Does anyone know the best timeframe after an update to have content published by? One week, two weeks? It would really help with setting publishing deadlines and soliciting backlinks.
1:45 am on Mar 8, 2003 (gmt 0)

Senior Member korkus2000
joined: Mar 20, 2002

This month Google is taking no time between the update and the crawl. It used to be a week, but someone (GG, I think) said it would start when the dance did.
