Forum Moderators: open


Is Google PageRank Case Sensitive?

Two different PageRank values for the same URL with different case.


frank_v

3:38 pm on Nov 30, 2004 (gmt 0)

10+ Year Member



I just noticed that the page www.example.com/folder/page.aspx has a Google PageRank of 1, while www.example.com/Folder/page.aspx has a rank of 0. The same is true for all the pages on my site.

Does Google treat them as two different pages?
Thanks.

[edited by: ciml at 5:06 pm (utc) on Nov. 30, 2004]
[edit reason] Examplified [/edit]

ciml

5:11 pm on Nov 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome to WebmasterWorld, Frank.

Google can index two URLs that differ only in case. Although one of the major Web server vendors rewrites URLs internally to make them case-insensitive, the HTTP specification and other Web servers allow those URLs to point to different resources.

While Google makes some assumptions (e.g. that /index.html is the same as /), it does not assume that /FOO.html and /foo.html are the same. If someone links to both (or submits both, or buys AdWords for both), then both can be listed.
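The case rules actually differ by URL component: the scheme and host are case-insensitive, but the path is not. A small Python sketch illustrating the distinction (the comparison rules are from the URL spec; the helper function and example URLs are mine):

```python
from urllib.parse import urlsplit

def same_url(a: str, b: str) -> bool:
    """Compare two URLs the way the URL spec allows:
    scheme and host case-insensitively, path and query byte-for-byte."""
    ua, ub = urlsplit(a), urlsplit(b)
    return (ua.scheme.lower() == ub.scheme.lower()
            and ua.netloc.lower() == ub.netloc.lower()
            and ua.path == ub.path
            and ua.query == ub.query)

# Host case never matters:
print(same_url("http://WWW.example.com/foo.html",
               "http://www.example.com/foo.html"))   # True
# ...but path case does, so these are two distinct resources:
print(same_url("http://www.example.com/FOO.html",
               "http://www.example.com/foo.html"))   # False
```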

If, as seems likely, they both return identical content, then Google will probably merge them, including all backlinks and PageRank.
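Rather than waiting for Google to merge them, you can enforce one canonical casing yourself with a permanent redirect. A minimal sketch as Python WSGI middleware (purely an illustration; on an IIS/ASP.NET site like the one above you would do the equivalent in the server configuration or page code):

```python
def lowercase_redirect(app):
    """WSGI middleware: 301-redirect any request whose path contains
    uppercase letters to the all-lowercase form, so only one casing
    can accumulate links and PageRank."""
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if path != path.lower():
            start_response("301 Moved Permanently",
                           [("Location", path.lower())])
            return [b""]
        return app(environ, start_response)
    return middleware
```

With this in place, a request for /Folder/page.aspx answers with a 301 pointing at /folder/page.aspx, and only the lowercase URL ever serves content.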

BigDave

6:03 pm on Nov 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> then Google will probably merge them

eventually.

ciml

6:06 pm on Nov 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> eventually

Quite.

So if it can take a while to get merged, is there a chance of near-duplicate problems until the duplication has been picked up?

Hopefully not; it would make sense for the duplicate handling to be dealt with first.

frank_v

6:27 pm on Nov 30, 2004 (gmt 0)

10+ Year Member



ciml,
thank you very much for the explanation.

I was going to write to all the webmasters who backlink to my site, asking them to use the exact casing, but it looks like all I need to do is wait. Thanks again.

ciml

6:29 pm on Nov 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One other thing. If you have something that changes frequently (e.g. the date) then Googlebot may not get identical content when it fetches the two URLs. In that case, they'll stay separate.

WebWalla

11:24 am on Dec 1, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How does Google check for exact duplicates in this case? Some sort of CRC check on both pages, or do they really compare text against text?

Surely the latter method would use too much processing power?
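For exact duplicates a checksum-style approach is enough: hash each page's bytes once and compare the fixed-size digests, which is far cheaper than pairwise text comparison. A sketch of the idea (this is my illustration of how a fingerprint scheme works, not a claim about Google's actual pipeline; the URLs and bodies are made up):

```python
import hashlib

def content_digest(body: bytes) -> str:
    """Fixed-size fingerprint of a page body: equal bytes give an equal digest."""
    return hashlib.md5(body).hexdigest()

pages = {
    "/folder/page.aspx": b"<html>same content</html>",
    "/Folder/page.aspx": b"<html>same content</html>",
    "/other.aspx":       b"<html>different content</html>",
}

# Group URLs by digest; URLs sharing a group are exact duplicates.
groups = {}
for url, body in pages.items():
    groups.setdefault(content_digest(body), []).append(url)
```

Note the flip side, which matches ciml's point about dates: any single-byte difference (a live timestamp, a rotating ad) changes the digest, so this only catches byte-for-byte duplicates, and near-duplicates need a fuzzier comparison.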