Sitemaps, Meta Data, and robots.txt Forum

    
My Robots.txt Screw Up.
case study in how to kill well-performing SERPs in one easy step
webdevsf (10+ Year Member)
Msg#: 98 posted 5:55 pm on Mar 7, 2003 (gmt 0)

So last month (early Feb) I updated my site. On my development box, I keep a robots.txt that disallows everyone. I accidentally copied this "disallow all" robots.txt to the production site while deepbot was crawling.
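For anyone who hasn't run into it, the "disallow all" file is just the standard two-line robots.txt shown first below (the generic form, not my exact file), and what production should have had is the permissive version, where an empty Disallow means "crawl everything":

    # dev box: block every compliant crawler from the whole site
    User-agent: *
    Disallow: /

    # what production should have had: empty Disallow = allow everything
    User-agent: *
    Disallow: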

I didn't notice it for about a week, and by then, deepbot was gone. I had about 60,000 pages indexed with a PR5 and 1500+ referrals a day. Life was good.

Starting in the middle of the month, I started getting crawled by freshbot. I think freshbot crawled over 100,000 pages.

But to no avail. www2 shows me with fewer than 15,000 pages indexed, many of which don't look to be fully indexed (URLs with no title or description).

I've also disappeared completely from around 90% of my SERPs (I was #1, #2, or #3 on several hundred different SERPs).

Now I'm silently praying that freshbot will update the main index with its 100,000 pages sitting somewhere on some server in the Googleplex...

Don't let this simple mistake happen to you! It's only 10 AM but I'm going to get a drink!

 

Total Paranoia (10+ Year Member)
Msg#: 98 posted 11:47 pm on Mar 7, 2003 (gmt 0)

Ouch! I wouldn't trust myself with a robots.txt file. I have read of too many mistakes being made with it.

I stick to the old-fashioned noindex tag where required and a good spam killer on my server. That seems to suit me at the moment.
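(That's just the standard robots meta tag placed in the head of each page you want kept out of the index; the snippets below are the generic syntax, nothing specific to my setup:)

    <!-- keep this page out of the index -->
    <meta name="robots" content="noindex">

    <!-- or also tell spiders not to follow links from it -->
    <meta name="robots" content="noindex, nofollow">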

daamsie (10+ Year Member)
Msg#: 98 posted 1:01 am on Mar 8, 2003 (gmt 0)

Aaah... sorry to hear it! Just hope it will be a short month between updates!

OneTooMany (10+ Year Member)
Msg#: 98 posted 1:43 am on Mar 8, 2003 (gmt 0)

Sorry to hear that as well.
I had a similar experience. I procrastinated on publishing a ton of new information and didn't get it up in time for the deep crawl.
Does anyone know the timeframe after an update by which it's best to have content published? One week, two weeks, etc.? It would really help with setting publishing deadlines and soliciting backlinks.

korkus2000 (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 98 posted 1:45 am on Mar 8, 2003 (gmt 0)

This month Google is taking no time between the update and the crawl. It used to be a week, but someone (I think GG) said the crawl would start when the dance did.
