Forum Moderators: goodroi
[webmasterworld.com...]
[webmasterworld.com...]
Has anyone else experienced this, and are there any guidelines for managing both prevention and an urgent reinclusion with the search engines?
In general, people should realize the importance of robots.txt. Most critical robots.txt mistakes that I have come across are made out of ignorance. For example, there is a large government site that is currently blocking Yahoo and MSN but allowing Google. When they contacted me for help with another issue, I pointed this out, and no one in a decision-making position knew about it.
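A hypothetical robots.txt reproducing that kind of mistake might look like the sketch below (the actual file on that site is not shown in this thread). Yahoo's crawler (Slurp) and MSN's crawler (msnbot) are fully blocked, while Googlebot is allowed everywhere:

```
# Blocks Yahoo and MSN site-wide -- probably not what was intended
User-agent: Slurp
Disallow: /

User-agent: msnbot
Disallow: /

# Googlebot is allowed everywhere (an empty Disallow permits all paths)
User-agent: Googlebot
Disallow:
```

A file like this is easy to miss in a review because it looks tidy; the damage is in which user-agents are named, not in any syntax error.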
Keep a complete back-up of your website, including your robots.txt, and regularly review it to make sure no one changes it without you being told.
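That review can be automated. A minimal sketch in Python, assuming you keep a saved baseline copy of your robots.txt: fingerprint the live file and the baseline, and flag any mismatch for a human to investigate (the example strings here are made up for illustration).

```python
import hashlib


def fingerprint(text: str) -> str:
    """Return a SHA-256 digest of a robots.txt body."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()


def robots_changed(live_text: str, baseline_text: str) -> bool:
    """True if the live robots.txt no longer matches the saved baseline."""
    return fingerprint(live_text) != fingerprint(baseline_text)


# Hypothetical example: the baseline blocked one directory,
# but someone has since blocked the entire site.
baseline = "User-agent: *\nDisallow: /private/\n"
live = "User-agent: *\nDisallow: /\n"

print(robots_changed(live, baseline))  # True -> time to investigate
```

In practice you would fetch the live copy (e.g. with urllib) on a schedule and alert on a change rather than printing, but the comparison step is the part that catches the silent edits described above.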
For example there is a large government site that currently is blocking Yahoo and MSN but allowing Google. ..... I pointed this out and no one in a decision making position knew about it.
This is scary: how many good sites just sit there with problems and no solutions? You can read about them all over these forums, and the people here are much more switched on than the average site owner.
Is there a way to reinstate pages within the 180-day exclusion period on Google?
In my mind, this is another example of how Webmaster Central could be greatly improved with better access to notifications and remedial steps. This is a clear case of "error", which is akin to "fixing" a site after a hack attack.