But the problem is, Google doesn't seem to index the whole website. It has currently indexed the index page, widget page one, and widget page two. It fiddles around with only those pages and sometimes drops one of them; last time it dropped the index page, and before that it dropped widget page one.
It also keeps bouncing my main keyword around. My site is www.widget-foo.com, I'm aiming for the keyword "widget foo", and sometimes I'm #18, sometimes #80.
My real question in all this is: does PR affect how many pages of a website Google indexes?
For example, in the next update, if I get a PR4 or a PR5, will Google index all the pages on my site?
I know PR is a factor in a site's update frequency: if you are linked from a PR5+ site, or are one yourself, you get updated every couple of days.
If PR is not a factor in how many pages get indexed, then why isn't Google indexing all my web pages?
Any info on this would be most appreciated.
Sid
You're not alone - but I don't think PR has anything to do with it.
There are a few (lengthy) threads running at the moment lamenting the lack of indexing by Googlebot. I have a new site of a few hundred pages that has been waiting since the start of Feb for anything other than the index page to get indexed. So far, no luck.
Just hang in there mate and keep adding content or build new sites while you're waiting.
Too odd...
The other day I realised that one of my sites is missing its doctype declaration (doh!) and it holds the no. 2 ranking for its main key phrase. Draw your own conclusions...
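For anyone unsure what that means: the doctype is the declaration on the very first line of the page, before the <html> tag. Just as an example (assuming an ordinary HTML 4.01 Transitional page), it looks like this:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head><title>Page title here</title></head>
<body> ... </body>
</html>

Leave that first line out and browsers have to guess what flavour of HTML they're looking at; the point above is just that Google apparently ranks the page fine either way.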
Add new pages and link them from the home page.
Good suggestion. I always link new pages from the homepage in a "what's new" format. It lets return visitors know there is something new, and at the same time it helps the new page get spidered right away. Sometimes I'm amazed how fast the new page shows up in the SERPs.
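For anyone who wants a concrete picture, the "what's new" block doesn't need to be anything fancy. A rough sketch (page names made up) would be:

<h2>What's New</h2>
<ul>
  <li><a href="widget-page-three.html">Widget page three - new widget specs</a></li>
  <li><a href="widget-page-four.html">Widget page four - widget foo accessories</a></li>
</ul>

Since the home page is usually the page the spider hits most often, the new URLs get picked up on its next visit.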
If a site doesn't validate, it's not search engine friendly.
If the code is good enough to allow the bot to crawl the whole page, read the text, and spot the links, it's usually friendly enough, but it's best to have decent, validated code anyway. That way you catch errors that might cause problems in Opera or the Mozilla versions etc. (some pages look fine in IE but are a mess in other browsers). Validating is always a good idea.
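If you want to check, run a page through the W3C validator at validator.w3.org. The sort of thing it flags is overlapping tags like this (made-up snippet):

<b><i>widget foo</b></i>

IE will render that without complaint, but the nesting is invalid, and it's exactly the kind of sloppiness that can trip up stricter browsers.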