The robots meta tags are there to help a SE robot find its way around your site: to point it in the right direction and to stop it from indexing a page you don't want indexed.
If you mean the robots.txt file, then that should have a similar effect.
Both are a means of control, rather than a way to get a site banned.
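For anyone unsure of the difference, here's a minimal sketch of each (the path and page used are just examples): robots.txt is a single file at the site root covering the whole site, while the robots meta tag goes in an individual page's &lt;head&gt; and covers only that page.

```
# robots.txt, served from http://example.com/robots.txt
User-agent: *
Disallow: /private/

<!-- robots meta tag, placed in the <head> of one page -->
<meta name="robots" content="noindex,follow">
```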
Banning is a very strong word and is usually associated with some specific activity which breaches the search engine's terms and conditions of submission and use. Examples include deliberately spamming the index with excessive page submissions, or falsifying the visible page content vs. what the search engine spider sees when indexing.
You can have a site with no robots.txt or robots tags at all, and it will be indexed (eventually) if it meets the criteria on the URL submission form.
I hope that has helped clarify.
I believe miles was probably asking about this article [aim-pro.com].
When AV first stopped showing the "you submitted too many urls" message, one way you could tell if you were banned was by submitting a page that blocked AV with the robots.txt file. If you WEREN'T banned, AV would read the robots.txt and reject your submission. If you WERE banned, AV would ignore the robots.txt and act like it accepted your submission (even though nothing actually happened besides a quick crawl).
To tell you the truth I don't know if this still works, I don't have any banned sites to test it with.....Anybody with a banned site wanna give it a try?
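To make the test above concrete, here's a small sketch of the check a well-behaved spider performs before fetching a page. The user-agent and URL are hypothetical; the banned-site trick relies on the engine skipping exactly this check.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Feed the rules directly rather than fetching a live robots.txt,
# so the sketch is self-contained. "Scooter" was AV's crawler name.
rp.parse([
    "User-agent: Scooter",
    "Disallow: /",          # block the crawler from the whole site
])

# A compliant crawler refuses the page; an engine ignoring robots.txt
# (as a banned site reportedly saw) would crawl it anyway.
print(rp.can_fetch("Scooter", "http://example.com/test.html"))  # False
```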
Thank you for responding Engine and Seth.
I would just go to the directory listing, copy the URL, and then submit it directly to the SE's. This should not only help you on AV, but you should see a boost on Google as well. I wouldn't stop at this, though....you can never have enough links (especially if you're in a competitive industry).