I'm new to experimenting with robots.txt files and I want to try something.
I've found that optimizing a site's content works effectively for the most part across the search engines, except obviously not Google - not anymore. :(
I recently de-optimized a client's index page after it took a hit during the Florida update, in an attempt to make the content seem more organic and less optimized.
Unfortunately, the rankings for that particular page slipped on the other engines. So what I did was make a clone of the index page and name it index2. I plan to re-optimize index2 back to the way the original was, and then deny Googlebot access to it. Obviously, I don't want to keep Googlebot from visiting the rest of the site, though.
Is it possible to write a robots.txt file that keeps Googlebot away from just the index2 page?
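Would something like this do it? (I'm assuming here that the clone is saved as index2.html in the site root - adjust the path and extension to match the actual file.)

    # Block only Googlebot from the re-optimized clone page
    User-agent: Googlebot
    Disallow: /index2.html

From the examples I've seen, a record with "User-agent: Googlebot" applies only to Googlebot, and since no other user-agent record disallows anything, every other spider should still be free to crawl the whole site, including index2.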
Or is this just a stupid idea to begin with? Like I said, I have some room for experimenting here, so I'm curious. Thanks in advance!