No replies. Is it just me? :(
Should be OK. Make sure you've triple-checked your robots.txt ... and banned well-known user agents like Googlebot using .htaccess if you're paranoid about it.
[webmasterworld.com...] is a good start on how to ban using user agent strings.
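If it helps, here's a minimal robots.txt sketch for the duplicate site (this bars every well-behaved spider, not just Google's):

    User-agent: *
    Disallow: /

That keeps compliant bots out on its own; the .htaccess user agent ban covered in that thread is the belt-and-braces layer on top of it.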
While you should be OK, I personally wouldn't risk it in the least.
I'd much rather put my effort into deciphering those stats than risk a duplicate content penalty, especially if your livelihood depends on those free listings.
I would definitely NOT do it. To track AdWords you can perhaps use simple additions to the URL that is set as the landing page:
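Something along these lines (example.com and the parameter names are only placeholders, not anything specific to your setup):

    http://www.example.com/landing.html?source=adwords&campaign=widgets

Your log analyser or stats package can then break visits and sales out by that source parameter, with no duplicate site needed.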
What about the Alexa ranking of your site? It's not really very accurate, but a lot of business people use it anyway.
What about people who bookmark your page and from that point on come in through the PPC site? Over time that's going to throw off your stats more and more.
What about people who set up links to your PPC site? That's also going to throw off your stats, plus the beneficial value of those links will be lost.
IMO sticking to a single site but using unique tracking URLs for PPC bids is the way to go.
I've been doing exactly what you're planning for over a year with no ill effects so far.
Please tell me more.
|I've been doing exactly what you're planning for over a year with no ill effects so far. |
I've been using tracking URLs for 3 years and they are not accurate enough. Plenty of customers don't buy immediately; they leave and come back. Their 'anti-spyware' software deletes their cookies... so tracking is dead.
Excellent example that proves my point! Those people found the site via PPC, so they are part of the return on investment; if they come back next year, or 10 times a month, and buy stuff, it was only because of the AdWords ad.
|What about people who bookmark your page... |
To run the business effectively I need to know which investments pay what. SEO for free SERPs, Adwords, Overture...
Has anyone else safely created copies of their own site, keeping just one copy for the free SERPs and preventing the others from appearing in the free SERPs?
Googlebot will never see the site, so there is no risk. As I said before, I don't think it's the optimal solution, but it's safe.
Having said that, I would suggest blocking Googlebot via a .htaccess file as well. Once in a great while there can be a mixup and the site will get spidered. For example, last year my site was down for a couple of days, and Googlebot presumably tried and failed to retrieve the robots.txt file. For a short period after this, it did spider pages that it was not supposed to.
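For anyone who wants that backup block, something like this in the duplicate site's .htaccess should do it (a sketch only, assuming Apache with mod_rewrite enabled):

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
    RewriteRule .* - [F]

That returns a 403 to anything identifying itself as Googlebot, even if robots.txt is unreachable for a while.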
That's exactly what I did, for the same reasons as you.
I have three identical mirror sites (.com, .co.uk, and .tv), and robots are barred from two of them, leaving the third in the free SERPs. Whether or not it will become a problem I don't know, but it's fine at the moment.