Does anyone have some insight into this? Forgive me if this has been covered somewhere but I couldn't find anything.
Google doesn't "punish" sites for having duplicate content, but in the long run it does drop duplicate pages and merges domains that it sees as being mirrors. IMO the probability of this happening so quickly is quite low.
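How Google actually detects duplicates is proprietary, but the general idea behind near-duplicate detection can be sketched with word shingling and Jaccard similarity: break each page's text into overlapping word sequences and measure set overlap. This is a minimal illustration only, with made-up page text; it is not a claim about Google's real algorithm.

```python
def shingles(text, k=5):
    """Split text into a set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical page texts: near-identical except for the last word.
page_a = "Welcome to Example Widgets the best place to buy widgets online today"
page_b = "Welcome to Example Widgets the best place to buy widgets online now"

sim = jaccard(shingles(page_a), shingles(page_b))
print(f"similarity: {sim:.2f}")
```

Two pages scoring above some threshold would be treated as mirrors and collapsed into one result, which is roughly the behavior described above.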
Depending on the timing of the monthly cycle, there's a fair chance that page could be indexed with the wrong text for a month.
Well, one site is very new and had only been crawled by Freshbot a few times. The established site is the one that mistakenly served the new site's index file for a few days. The trouble is, both now sit on the same SERP just a few slots apart, with the same cached page. I hope this doesn't mean Google will treat the new site as a duplicate; I put a lot of promotion time into it.
I'm thinking that the next time Freshbot comes around, the new content it picks up will override the previous crawl and all will be 'forgotten'. Is this correct, or does Google file everything away in a database for future comparison?