But here's the question I have. We have been a/b testing our index page for about 3 months now and I noticed index and indexb now both have a PR of 7. Ideally indexb should have a PR of 0 so as not to siphon off any of our PR from our links. Anyone have an idea how a page that has no inbound links could have a PR7? Does Google see these pages as the same? Should I be concerned about this at all?
However, the effect relative to other sites doesn't change, so while the toolbar PR number falls, the power of that PR doesn't change.
But here's the question I have. We have been a/b testing our index page for about 3 months now and I noticed index and indexb now both have a PR of 7.
Oh-oh, I know what's coming...
Ideally indexb should have a PR of 0 so as not to siphon off any of our PR from our links.
You are correct.
Anyone have an idea how a page that has no inbound links could have a PR7?
It sounds as though both the A and B versions have been getting indexed from the beginning. Are you absolutely sure that there are no links pointing to indexb? Are you using the Google Toolbar while browsing to indexb?
Does google see these pages as the same?
No. You said you are doing a/b testing, which usually means the content is different enough not to warrant duplication; it's just another page at the root.
If it's just a layout change, then there may be some duplication issues. But the one with the most links will win, which in this case is the main index page.
Should I be concerned about this at all?
Not really. I'd drop a robots meta element on the indexb page to prevent it from getting indexed and obtaining PR.
<meta name="robots" content="none">
As the Internet grows, each site's share of the Internet in terms of its PR must inevitably fall if the PR scale remains at 0-10. I believe at some point they will need to adjust the scale to differentiate more between too many sites having the same figure.
I believe they refer to that as a PageRank™ Iteration. It occurs during major updates and could swing either way.
What is difficult to determine for some is why a site loses PageRank™ during these Iterations. There are a lot of factors at play and trying to reverse engineer the process can be extremely tedious and not worth the time and effort. Any number of things could have happened in the "chain of command" to disrupt your PageRank™.
Also, with Google experimenting with so many different things in the algo these days, it is getting more and more difficult to pinpoint exactly where things occur. Unless of course you keep a very well documented history of every single thing that has happened with your site. And I do mean everything. ;)
Don't forget, PageRank™ is like the Richter Scale. If you were at PR7 before and dropped to PR6, that could mean you were at a very low PR7 (7.1) and you are back to a high PR6 (6.9).
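To make the Richter Scale analogy concrete, here's a minimal Python sketch of how a logarithmic bucket works. Google doesn't publish a raw PageRank score or the base of the toolbar scale, so BASE and the scores below are purely hypothetical:

import math

# Hypothetical illustration of the logarithmic-bucket idea above.
# BASE = 8 is an arbitrary assumption; Google's actual scale isn't public.
BASE = 8

def toolbar_pr(raw_score: float) -> int:
    """Map an assumed raw score onto a 0-10 toolbar-style bucket."""
    if raw_score <= 1:
        return 0
    return min(10, int(math.log(raw_score, BASE)))

# Two raw scores only ~10% apart can straddle a bucket boundary,
# i.e. a "low PR7" versus a "high PR6":
print(toolbar_pr(BASE ** 7 * 1.05))  # -> 7
print(toolbar_pr(BASE ** 7 * 0.95))  # -> 6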
I'm positive no one is linking to indexb as we just started a/b testing a few months ago. However, I just did a backlink check on alltheweb and it says we have 160k links to indexb, which is very strange as the links are pointing to the root. I am using the toolbar while browsing to indexb, but I don't understand how that would make a difference. What I failed to mention is that we are a/b testing through Google Optimizer, so could that be why Google is picking up the page?
It is primarily a layout change, so dup content could also be a factor...
Thanks for the tip on robots meta element - I had never heard of that one before.
However, I just did a backlink check on alltheweb and it says we have 160k links to indexb, which is very strange as the links are pointing to the root.
Ah, something isn't right somewhere. How about Yahoo!, what do they show for backlinks?
I am using the toolbar while browsing to indexb, but I don't understand how that would make a difference.
The Toolbar phones home. Any pages you browse to while having the Google Toolbar active are subject to get indexed unless of course they are being blocked from indexing.
I also see that Google shows the same number of links to index as to indexb, so I assume they see the pages as the same? Dup content?
Sounds like Google has effectively determined the duplication. This is normal behavior on Google's part. There was probably a 15-30 day period where there might have been some site performance issues (ranking) while those two pages were sorted out.
I'd drop the meta robots element on the indexb page now to stop that page from being indexed further.
You will now have to implement a 301 to permanently redirect those indexb pages that are indexed to the primary index page. If you don't, you're going to be splitting PR amongst those two root pages and it may cause issues.
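Once the 301 is configured (how you set it up depends on your server), a quick spot check helps confirm the duplicate page really answers with a permanent redirect. Here's a minimal Python sketch; the URLs are hypothetical placeholders, not the actual site:

import urllib.error
import urllib.request

# Spot-check that the duplicate page returns a 301 to the canonical page.
# Both URLs are hypothetical placeholders.
DUPLICATE = "http://www.example.com/indexb.html"
CANONICAL = "http://www.example.com/"

class NoFollow(urllib.request.HTTPRedirectHandler):
    """Suppress automatic redirect following so the 301 itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of follow

opener = urllib.request.build_opener(NoFollow())
try:
    response = opener.open(DUPLICATE)
    print("No redirect returned; status was", response.getcode())
except urllib.error.HTTPError as err:
    # With following suppressed, the 301 surfaces here as an HTTPError.
    print("Status:", err.code)
    print("Location:", err.headers.get("Location"))
    print("Permanent redirect to canonical?",
          err.code == 301 and err.headers.get("Location") == CANONICAL)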
P.S. I've briefly mentioned Competitive Sabotage at WebmasterWorld and this is another area prime for the picking. Anytime you do a/b testing, you need to make sure that the testing parts don't get indexed.
If I were a fierce competitor and found your a/b testing area, and also found that they were returning a 200 status, I might want to link to them from a high PR page somewhere on the web just to wreak a little indexing havoc with your site. ;)
They've taken away any link value from most sub-pages of directories... even from DMOZ.
It's just a bug in the toolbar PR reporting - the URLs still have the same value behind the scenes. We've got another thread specifically about this:
Inner Page has grayed out PR [webmasterworld.com]