Google is no longer following my robots.txt file. It has indexed hundreds of pages that I have disallowed.
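Worth noting: robots.txt blocks crawling, not indexing, so URLs that are linked from elsewhere can still show up in Google's index even when a Disallow rule is in place. Before assuming Google is ignoring the file, it can help to confirm the rules actually parse the way you intended. A minimal sketch using Python's standard urllib.robotparser (the paths and rules here are hypothetical, not from the original post):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; substitute your own file's rules.
rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard (*) group here.
print(parser.can_fetch("Googlebot", "/private/page.html"))  # False: crawling is disallowed
print(parser.can_fetch("Googlebot", "/public/page.html"))   # True: no rule blocks this path
```

If the rules parse correctly and the pages still appear in the index, the usual fix is to allow crawling and serve a noindex directive instead, since Google cannot see a noindex on a page it is forbidden to fetch.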
Generally it's best to make any changes when the SERPs are stable, and never in the middle of an update out of panic!
[edited by: tedster at 7:43 pm (utc) on May 13, 2010]
It's tough to tell what's 'better' or 'worse' without a comparison, isn't it?

Yes, if you are just starting out with a new search engine. But Google has 10+ years of data and experience, enough to know at least which (types of) sites they want to suppress. They may not necessarily know which of the first 10 URLs that made the top cut should be #1, since those are supposedly all good pages, but intentionally letting scrapers and spam in just to stir things up - that's a stretch. I can see how an intentional SERP agitation could be useful to them, but only within the small range around the position the algo already ranks the page for.
they might have to just let it run for a while
for you the results may be junk... for us they are good results
Exactly. Now how do we diversify?