Forum Moderators: goodroi
My robots.txt says this:
=============
User-agent: *
Disallow: sitemap.htm
=============
sitemap.htm is a nasty dynamic page that is created every time my site is published (I use e-commerce software). It's about 300k, so I built my own.
Maybe the search engine liked it; maybe that's why I'm being punished, for not wanting them to see it.
Try using Brett's robots.txt validator [searchengineworld.com...]
Your file called sitemap.htm is, I assume, a site map. A sitemap is usually a good way for search engine spiders to find their way around your site. By banning robots from reading it, you're stopping them from traversing the site easily.
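One thing worth noting: as quoted, the Disallow value has no leading slash. Under the robots.txt convention a path is matched as a prefix of the URL path and normally starts with "/", so a strict parser will treat "Disallow: sitemap.htm" as matching nothing. A quick way to see the difference is a sketch with Python's stdlib urllib.robotparser (example.com is just a placeholder host):

```python
import urllib.robotparser

# Rule written correctly, with a leading slash: blocks /sitemap.htm.
rp_good = urllib.robotparser.RobotFileParser()
rp_good.parse(["User-agent: *", "Disallow: /sitemap.htm"])
print(rp_good.can_fetch("Googlebot", "http://example.com/sitemap.htm"))  # False
print(rp_good.can_fetch("Googlebot", "http://example.com/index.htm"))    # True

# Rule as quoted in the post, without the slash: a strict prefix match
# never fires, because the URL path "/sitemap.htm" does not start
# with the string "sitemap.htm".
rp_bad = urllib.robotparser.RobotFileParser()
rp_bad.parse(["User-agent: *", "Disallow: sitemap.htm"])
print(rp_bad.can_fetch("Googlebot", "http://example.com/sitemap.htm"))   # True
```

So depending on how leniently a given crawler parses the file, that rule may never have blocked anything at all, which is another reason it is unlikely to be the cause of a penalty.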
If you changed nothing else and you had good site performance previously, it is unlikely to be anything else related to robots.txt, other than what I said above.
I assume you're talking about Google and have studied this. [webmasterworld.com]
So I built a light version. I know site maps are useful; that's why I built one under 100k with fewer than 100 links.
It's so frustrating; every site seems to have survived!
The only other thing I can think of is that last week our service provider experienced a long outage, and maybe Google came along while we were down.
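The outage theory is easy to check against the server access logs rather than guessed at: if Googlebot appears in the log during the outage window (especially next to 5xx status codes), it saw the site down. A minimal sketch, assuming a combined-format log; access.log is a placeholder path (point it at your real log), and the sample line here is fabricated so the sketch runs anywhere:

```shell
# Fabricate one sample combined-format log line (placeholder data),
# so the sketch is self-contained; normally you'd skip this step.
printf '%s\n' '66.249.66.1 - - [12/Oct/2006:06:25:24 +0000] "GET /sitemap.htm HTTP/1.1" 503 0 "-" "Googlebot/2.1"' > access.log

# Count Googlebot hits; inspect the matching lines' timestamps and
# status codes to see whether Google crawled during the downtime.
grep -c "Googlebot" access.log
```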
This, I pray, is the answer!