Forum Moderators: open
I'd like to implement a script I've found that will allow split-run testing of multiple versions of a page and then track the results for each one.
The script uses CGI and Server Side Includes to serve different versions of the same page when a single URL is accessed.
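For anyone curious what that kind of rotation boils down to, here's a minimal sketch in Python. The file names and the 50/50 split are assumptions for illustration; this is not the actual script in question:

```python
#!/usr/bin/env python3
# Minimal split-run rotation sketch for a CGI handler.
# VARIANTS and the weights are hypothetical -- adjust to taste.
import random

# Candidate page versions and their traffic shares (assumed 50/50 split).
VARIANTS = {
    "page_a.html": 0.5,
    "page_b.html": 0.5,
}

def pick_variant(rand=random.random):
    """Return the file name of the variant to serve for this request."""
    r = rand()
    cumulative = 0.0
    for name, weight in VARIANTS.items():
        cumulative += weight
        if r < cumulative:
            return name
    return name  # fall back to the last variant on rounding error

def serve(path):
    """Emit a CGI response containing the chosen variant's HTML."""
    with open(path) as f:
        body = f.read()
    print("Content-Type: text/html\n")
    print(body)
```

A single URL stays in place; only the body returned for each request changes.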
I'm just worried about the possibility of negatively impacting the site in Google's eyes.
Could the use of such a script be seen as cloaking? Could it have other unforeseen negative impacts in the SEs?
If so, is there any other simple way for me to achieve the same goal in a safer manner?
If anyone needs to see the script in action to offer an educated answer, sticky me, and I'll send you the URL for the demo page of the script in question.
I'm sure you could find some hard-core, anti-cloaking zealots who would try to convince you that running that type of script means you've become an evil search engine spammer, but that doesn't make it so. :)
Contrary to popular belief, there isn't some hi-tech automated cloak buster out analysing each and every one of the 2,469,940,685 web pages in Google's index to see if any of those pages might look a bit different than what a human sees.
Cloaking penalties are almost 100% based on human review and that review process involves looking at the intent of the page.
If you are simply rotating different variations of the same page in order to test conversions, and those pages don't contain any blatant attempts to manipulate or influence ranking, then you won't have a problem.
>>Could it have other unforeseen negative impacts in the SEs?
The only potential negative impact happens if you don't track or control which version the spiders get. That can lead to fluctuations in SERP positions, which can have a big impact on conversions.
>>The only potential negative impact happens if you don't track or control which version the spiders get. That can lead to fluctuations in SERP positions, which can have a big impact on conversions.
Heh. Well here's an irony if I've ever seen one...
To control the version the spiders get, do I need to implement a cloaking script, or is there another way?
Cloaking is the one facet of SEO that I have absolutely no first hand experience with.
Time to study up?
Of course I could try to limit the split-run testing to times of the month when GoogleBot's not on the crawl, but I'd still have to reckon with the FreshBot.
I don't think you need to start with a cloaking solution right away. But if you do get a daily fresh crawl, you will want to keep track of the cache so you can see which version Google indexed on any given day, and whether or not the change in content is causing the SERP listing or descriptions to bounce around at all.
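A simple way to keep that record is to log which variant every crawler request received, so each day's cache snapshot can be matched against what was served. A rough sketch, with an assumed log location:

```python
#!/usr/bin/env python3
# Sketch: append a timestamped record of which variant a request received,
# so daily checks of Google's cache can be matched to the version served.
# LOG_FILE is a hypothetical location -- not from the script under discussion.
import datetime

LOG_FILE = "variant_log.csv"

def log_serve(user_agent, variant, log_file=LOG_FILE):
    """Record the serve time, variant file name, and requesting User-Agent."""
    stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
    with open(log_file, "a") as f:
        f.write(f"{stamp},{variant},{user_agent}\n")
```

Grepping that log for the crawler's User-Agent on a given date tells you which version was fetched, without guessing from the cache alone.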
If everything is fairly stable, I'd not mess with it. But if you see significant fluctuations, then you might want to consider serving the same page to Google.
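If it comes to that, one approach is to pin known crawlers to the control version and rotate only for ordinary visitors. User-Agent substring matching is an assumption here (it's not foolproof), and the token list and file names are made up for illustration:

```python
#!/usr/bin/env python3
# Sketch: serve the control page to known crawlers so the indexed version
# stays stable, and rotate only for ordinary visitors. Since the control
# page is one that humans also receive, this isn't deceptive cloaking.
import os
import random

BOT_TOKENS = ("googlebot", "slurp", "bingbot")  # common crawler UA fragments
CONTROL = "page_a.html"                    # the version spiders always get
VARIANTS = ("page_a.html", "page_b.html")  # hypothetical file names

def choose_page(user_agent, rand=random.choice):
    """Return CONTROL for recognized crawlers, a random variant otherwise."""
    ua = (user_agent or "").lower()
    if any(token in ua for token in BOT_TOKENS):
        return CONTROL
    return rand(VARIANTS)

def main():
    page = choose_page(os.environ.get("HTTP_USER_AGENT", ""))
    with open(page) as f:
        print("Content-Type: text/html\n")
        print(f.read())
```

The SERP listing then reflects one consistent page while the test keeps running for real traffic.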