In order to alleviate this problem we now serve pages differently based on the user agent. If the user agent is identified as a search engine (Googlebot, etc.) then the page is served as
This has allowed us to get indexed but we now have 2 questions....
1) Is this considered "bad" by Google even though the content on both pages is EXACTLY the same?
2) These new pages all have a PR of 0 since nobody links to them, while the old non-indexable page has a PageRank of 6...so now that we are indexed, our results never come up!
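For anyone curious what this kind of user-agent check looks like in practice, here is a minimal sketch. The crawler token list and page strings are hypothetical, and note this is exactly the pattern the replies below flag as cloaking, even when the content served is identical:

```python
# Hypothetical sketch of serving pages based on the User-Agent header.
# The token list and return values are illustrative, not a real setup.
SEARCH_ENGINE_TOKENS = ("Googlebot", "bingbot", "Slurp")

def is_search_engine(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known crawler token."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in SEARCH_ENGINE_TOKENS)

def serve_page(user_agent: str) -> str:
    # Per the post, the CONTENT is the same either way;
    # only the delivery format differs for crawlers.
    if is_search_engine(user_agent):
        return "indexable HTML version"
    return "regular version"
```

Even with identical content, an automated crawler only sees that different responses go to different user agents, which is why the advice below is to ask Google directly rather than assume intent will be inferred.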
The wisdom of these boards is much appreciated!
So, your intentions may be honorable but Google's non-human, automated process may not see it that way.
This is a question I would take to the Google Forum and present to GoogleGuy or at least put it over there for him to have a shot at.
Cloaking is cloaking and Googlebot isn't making good vs. bad judgement calls...