We had problems over the summer with a 301 redirect.
Y.com is the older site and has been crawled regularly for the last 3 years.
We moved y.com to a different IP 6 weeks ago, removed all the content, and placed one link on every page saying "please click here" (which goes to x.com).
Y.com is still being crawled daily and x.com is not. X.com is on the old IP; y.com is on a new one.
Doing a G "site search" for X.com displays all y.com URLs.
Both sites are obviously regarded as the same thing by G, and both have 100 backlinks.
What's the most bullet-proof method of encouraging Googlebot to stop crawling y.com and crawl x.com instead?
Thanks in advance
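For reference, the "by the book" method here is a site-wide 301 redirect from y.com to x.com rather than a "please click here" link, since a 301 is the signal Googlebot treats as a permanent move. A minimal sketch, assuming y.com is served by Apache with mod_rewrite available (placeholder domains, adjust to your setup):

    # .htaccess at the root of y.com:
    # send every request to the same path on x.com with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?y\.com$ [NC]
    RewriteRule ^(.*)$ http://x.com/$1 [R=301,L]

Preserving the path in the redirect means each old y.com URL points Googlebot at its exact x.com counterpart, which is what lets the crawling (and most of the link credit) transfer over time.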
Then email all the folks that link to site y and ask them to change their links to site x. That way Google will eventually stop being referred to y, and if it does get there it won't be able to index it.
I hope that helps :) It's a really simple approach, but I find that keeping things simple often works better.
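The "won't be able to index it" part above presumably relies on a robots.txt block on site y. A minimal sketch of that file, assuming you want every compliant robot kept out of y.com entirely:

    # robots.txt at the root of y.com: disallow all compliant robots everywhere
    User-agent: *
    Disallow: /

One caveat: a full block also stops Googlebot from seeing whatever redirect or link you left on y.com, so it makes sense to add this only after the move has actually been picked up.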
Doing a G "site search" for X.com displays all y.com URLs.applies in our case as well
Both sites are obviously regarded as the same thing by G
Unfortunately, we have not yet figured out why this is happening or what to do about it. G has just now started visiting *one* page on x.com (which has very good inbound links), but apart from that Googlebot only visits the server root.
Similarly, if Google is not interpreting your robots.txt file correctly, then it's likely your robots.txt syntax [robotstxt.org] is incorrect, so check it [searchengineworld.com]. I've seen minor problems with other robots misinterpreting robots.txt, but Google's parser is one of the most sophisticated ones.
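If you want a second opinion on how a crawler reads your file, Python's standard library ships a robots.txt parser. A small sketch, using y.com as a placeholder for your own domain:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt (placeholder URL).
    rp = RobotFileParser("http://y.com/robots.txt")
    rp.read()

    # Ask whether Googlebot is allowed to fetch a given page under those rules.
    print(rp.can_fetch("Googlebot", "http://y.com/some-page.html"))

If can_fetch returns something other than what you expect, the problem is in your file's syntax, not in Google's parser.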
Do things "by the book" for best overall results and minimized headaches. Then apply work-arounds when and if necessary. Check your work and be patient -- it can take 60 days before some search engines catch on, and even longer for others. :o