I'm developing a new site using some of the same products and text that's already on an existing .co.uk site. I then plan to point one at the UK and the other at the USA - the .com version of the same URL. While certain products are the same, the page coding, key text and spelling differ between the two versions.
As I understand it, Google doesn't like sites that duplicate the same material, and may penalise you for it. I gather I can use a robots.txt file to prevent Google from indexing the new site in the interim.
Can someone tell me what I should put in this file and where I should upload it? Do I need to create a directory called 'robots'? Can I apply it to some pages only (those with duplication) and let the others, which have new material, be indexed?
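From what I've read so far, I'm guessing the file sits in the site root and looks something like the sketch below - the paths are just placeholders for my duplicated pages, so please correct me if I've got the format wrong:

```
# Guess at a robots.txt - placeholder paths for the duplicated pages only
User-agent: *
Disallow: /products/duplicated-range/
Disallow: /duplicated-page.html
```

For blocking individual pages rather than whole directories, I've also seen mention of putting a `<meta name="robots" content="noindex">` tag in each page's head - would that be the better approach here?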
As a related issue of equal importance, I obviously want Google to recognise the site's existence (and have added it to my Google dashboard accordingly) so that the six-month incubation period ticks away. I don't want to put the clock back to day one.
Thanks in advance for your much-valued advice and help.