Is there an official way of knowing whether you're penalized for duplicate sites? Also, how "similar" does a site have to be? We did some extensive rewording on a third site, and it seems to be OK.
If I put site:www.example.com keyword1 into Google, it shows the pages, but for the description of those results it shows the meta description instead of the on-page snippet around keyword1, as it should.
If I do the same on the duplicated domain, it displays the results with the description as it should be (showing snippets from the actual page).
If I do allintitle:, allinanchor:, or allintext: searches for a range of keywords I am targeting, the site comes up in the top 5, but for normal searches it is nowhere to be seen.
I've done lots of analysis, and this is the only logical conclusion I can see. Any other suggestions are welcome.
I see one site is building up Google PageRank, while the other, despite lots of submissions to directories, remains at zero.
I have not had a problem with PageRank; it has grown as expected.
I suggest that a) you try the site:www.example.com keyword1 search mentioned above on your sites.
b) If the domains are the same, select the one you want in Google and remove the other from the index using your robots.txt.
You may even want to have one of the domains targeted at Google and the other targeted at Yahoo/MSN (block Google from one and MSN/Yahoo from the other).
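The split described above can be done with two different robots.txt files, one per domain. A minimal sketch (the user-agent tokens Googlebot, Slurp, and msnbot were the ones these engines crawled with; adjust to taste):

```
# robots.txt on the domain you want OUT of Google
User-agent: Googlebot
Disallow: /

# robots.txt on the OTHER domain, keeping Yahoo and MSN out instead
User-agent: Slurp
Disallow: /

User-agent: msnbot
Disallow: /
```

All other crawlers fall through to no rule and may index either domain, so add a catch-all `User-agent: *` section if you want tighter control.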
Although I've done this with great success on several occasions, I used it on a site earlier this week, and during the 20-30 seconds that the modified robots.txt was on the server, Googlebot picked it up and also wiped out most of the www.sitename.com site.
I'm not sure how you address this problem with a static site, but with my sites I now detect the cgi.server_name and test whether it contains "www". If not, the page includes a meta robots noindex tag.
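The cgi.server_name variable suggests the poster is on ColdFusion, but the idea translates to any server-side language. A minimal sketch in Python, assuming a CGI-style environment where the requested hostname arrives in SERVER_NAME (the function name and tag contents are illustrative, not the poster's actual code):

```python
import os

def robots_meta_tag(server_name: str) -> str:
    """Return a noindex meta tag for non-www hostnames, else an empty string.

    Only the www hostname should be indexed; any other hostname serving
    the same content gets told to stay out of the index.
    """
    if not server_name.lower().startswith("www."):
        return '<meta name="robots" content="noindex,follow">'
    return ""

# In a CGI environment the requested hostname is in SERVER_NAME:
host = os.environ.get("SERVER_NAME", "example.com")
tag = robots_meta_tag(host)  # emit this inside <head> when non-empty
```

Unlike a robots.txt swap, this decision is made per request, so there is no window where the wrong crawler can pick up the wrong rules.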
The extent of Google's desire to actually generate duplicate content is such that I have to check the script_name, case, and query_string before I deliver any pages, to ensure that it actually exists, and is not some crackpot URL dreamed up by Google.
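The validation described above (reject any URL whose path, letter case, or query string doesn't match a page that actually exists) can be sketched like this. Everything here is hypothetical, assuming a whitelist of canonical lowercase paths and the query parameters each accepts:

```python
from urllib.parse import urlsplit

# Hypothetical whitelist: canonical (lowercase) paths and the query
# parameters each page accepts. Anything else is a "crackpot URL".
KNOWN_PAGES = {
    "/index.html": set(),
    "/products.html": {"id"},
}

def is_canonical(url: str) -> bool:
    """Return True only for URLs we actually serve, in canonical form."""
    parts = urlsplit(url)
    path = parts.path
    if path != path.lower():            # reject case variants like /Index.HTML
        return False
    if path not in KNOWN_PAGES:         # reject paths that don't exist
        return False
    # Reject query strings carrying parameters the page doesn't accept.
    params = {p.split("=")[0] for p in parts.query.split("&") if p}
    return params <= KNOWN_PAGES[path]
```

A request failing this check would get a 404 (or a redirect to the canonical URL), so crawlers can't manufacture duplicate copies of a page under invented URLs.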
Due to a previous domain forwarding setup (the domains are now on separate IPs) I have a nightmare cross-domain indexing problem. I have submitted thousands of pages individually using the URL removal tool, and to Google's credit, they usually deal with the requests within 24 hours.
It is still less than a week since I embarked upon this Google clean-up operation, and I have seen no improvement in the SERPs.
The other site, although identical in content, has a .ca domain name that he's using to target Canadians. This is the site that has had all sorts of SEO efforts and been submitted to countless directories. It was actually the FIRST site done, yet is doing worse than the duplicated site, which has had NO SEO work done to it. Go figure...