
My homepage is PR5, why are inner pages PR0?



7:44 pm on Apr 6, 2006 (gmt 0)

10+ Year Member

I have a site whose home page is PR5. My inner pages (the categories of products that are linked to from the home page) are all PR0. Shouldn't they be higher?

When I search for my keywords, these pages rank fairly highly in the results, but they still show PR0.

These pages are dynamically generated from a database. Is that causing the problem?


9:09 pm on Apr 6, 2006 (gmt 0)

5+ Year Member

If there are too many variables in the URL, that could be the problem.


9:10 pm on Apr 6, 2006 (gmt 0)

5+ Year Member

You may want to try using mod_rewrite and an .htaccess file, so the pages are served under cleaner URLs.
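
For example, a rough sketch of what that could look like in an .htaccess file. This assumes an Apache server with mod_rewrite enabled, and a hypothetical dynamic URL pattern of page.asp?id=123; adjust both to the actual setup:

    # .htaccess sketch: expose dynamic product pages under static-looking URLs.
    RewriteEngine On
    # A request for /products/123 is internally served by /page.asp?id=123,
    # so visitors and crawlers only ever see the clean URL.
    RewriteRule ^products/([0-9]+)/?$ /page.asp?id=$1 [L]

(mod_rewrite and .htaccess are Apache features; a site running on IIS would need an equivalent rewriting tool.)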


9:13 pm on Apr 6, 2006 (gmt 0)

Recently, as signs of a PR update became visible, all my inner pages turned to PR0. I will wait and see...


9:32 pm on Apr 6, 2006 (gmt 0)

10+ Year Member


Listen to walkman.

If this just happened, there is another thread where several people noticed the same thing happening to their sites.

It's just part of the never-ending wonder of Big Daddy, and it is likely temporary.


11:18 pm on Apr 6, 2006 (gmt 0)

10+ Year Member

These pages have always been PR0. It's not a recent change.

I have other pages, content pages, that are PR4. It's just the dynamically created product pages that are PR0.

I don't think variables are the issue either... the pages look like this: www.mysite.com/browse.asp?id=31

Any more thoughts?


11:22 pm on Apr 6, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Have you validated your code?


11:26 pm on Apr 6, 2006 (gmt 0)

10+ Year Member

OK, showing my ignorance here... what does that mean, to validate my code?

(so, I'm not a webmaster...lol, but I learned everything I needed to know about SEO here!)


1:30 am on Apr 7, 2006 (gmt 0)

tedster - WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member

Two things. First, this is the direct advice from Google's Webmaster guidelines:

Don't use "&id=" as a parameter in your URLs,
as we don't include these pages in our index.


Now this may not be 100% true anymore, but it is still excellent advice.
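
To make that concrete with the URL posted above (just a sketch; the replacement parameter name is arbitrary, and the rewritten form avoids query parameters entirely):

    Parameter the guideline warns about:  www.mysite.com/browse.asp?id=31
    Renamed parameter:                    www.mysite.com/browse.asp?prod=31
    Rewritten static-looking URL:         www.mysite.com/products/31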

Also, here are some excellent validation resources - from the W3C, the web standards body:

W3C Validator - HTML [validator.w3.org]
W3C Validator - CSS [jigsaw.w3.org]
(Invalid CSS should not affect crawling, but I put the link here for the sake of completeness.)

Don't worry too much at first about every "Warning" you may get, but definitely look for and fix "Errors". The folks in our HTML and Browsers Forum [webmasterworld.com] can help out if you get stumped on an Error message.


1:54 am on Apr 7, 2006 (gmt 0)

10+ Year Member

Ahh... sorry danaj. I thought it was part of the new thing that's happening.

Are they also supplemental?


2:00 am on Apr 7, 2006 (gmt 0)

And remember that about 95% of sites don't validate; google.com is one of them.


3:09 am on Apr 7, 2006 (gmt 0)

10+ Year Member

This is a very basic question that you probably thought of long ago, but I assume these pages return a 200 status code when Googlebot crawls them, right?
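
You can check that yourself; here is a minimal Python sketch, with the URL taken from the example posted earlier in this thread:

    import urllib.error
    import urllib.request

    # Minimal sketch: request one of the product pages and print the HTTP status.
    # A healthy, crawlable page answers 200; a 404/500 (or an unexpected redirect)
    # could explain why Google treats these pages differently.
    url = "http://www.mysite.com/browse.asp?id=31"
    req = urllib.request.Request(url, headers={"User-Agent": "status-check"})
    try:
        with urllib.request.urlopen(req) as resp:
            # urlopen follows redirects, so this is the status of the final URL.
            print(resp.status, resp.geturl())
    except urllib.error.HTTPError as err:
        print(err.code, url)

Your server logs will show the same thing, but a small script like this makes it easy to check a whole batch of product URLs.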


9:43 pm on Apr 7, 2006 (gmt 0)

10+ Year Member

Yes, they are supplemental pages.

What do you mean about the pages returning 200 when Googlebot crawls?

