
I know updating frequency depends on PR

but does indexing quantity?


sidyadav

9:25 am on Mar 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I started a site 3 months ago; it currently has 50 web pages altogether. It's linked from many PR5 and PR6 web pages to give it a decent amount of PR. (However, it looks like I was too late putting the links in - after the latest update :()

But the problem is, Google doesn't seem to index the whole website. It has currently indexed only the index page, widget page one, and widget page two; it fiddles around with just those pages and sometimes drops one of them - last time it dropped the index page, and before that, widget page one.

My ranking also keeps toggling for my main keyword. My site is www.widget-foo.com, and I'm aiming for the keyword "widget foo"; sometimes I'm #18, sometimes #80.

My real question in all this is: does PR affect the number of pages Google indexes on a website? For example, in the next update, when I get a PR4 or a PR5, will Google index all the pages on my site?

I know PR is a factor in the update frequency of a website: if you are linked from, or are, a PR5+ site, you get into the update cycle every couple of days.

If PR is not a factor in quantity, then why isn't Google indexing all my web pages?

  • I have added the:
    User-agent: *
    Disallow:

    just to allow all the bots.
  • The links in the site are plain href links.
  • However, my website doesn't validate. While talking to another fellow WebmasterWorld member, he said that if a site doesn't validate, it's not search engine friendly - does this matter?

    Any info on this would be most appreciated.
    Sid
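
    [Editor's aside on the robots.txt rule quoted above: an empty Disallow line does permit all crawling, as Sid intends, but it is one character away from the directive that blocks the entire site. A minimal sketch of the standard robots.txt syntax, for illustration only - not Sid's actual file:]

        # Allow every crawler to fetch every URL (an empty Disallow allows all):
        User-agent: *
        Disallow:

        # By contrast, a single slash blocks the whole site from all crawlers:
        User-agent: *
        Disallow: /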

  • kovacs

    1:27 pm on Mar 18, 2004 (gmt 0)

    10+ Year Member



    I have had exactly the same problem - a whole bunch of new sites have been fully and repeatedly spidered, but only the index page and a few other pages have been added to the index so far. It really has me worried.

    2oddSox

    2:02 pm on Mar 18, 2004 (gmt 0)

    10+ Year Member



    Sid,

    You're not alone - but I don't think PR has anything to do with it.

    There are a few (lengthy) threads running at the moment lamenting Googlebot's current lack of indexing. I have a new site of a few hundred pages that has been waiting since the start of Feb for anything other than the index page to be indexed. So far, no luck.

    Just hang in there mate and keep adding content or build new sites while you're waiting.

    2odd...

    HarryM

    2:13 pm on Mar 18, 2004 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    I think PR does have something to do with it, but whether it does or not, you can still take measures that may trigger Google into greater activity. Try changing something on your index page to show your site is active. Add new pages and link them from the home page. Make sure you have a site map. Add deep links. Etc.

    sidyadav

    7:23 pm on Mar 18, 2004 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Hello folks, I'm really happy to say that Google has just indexed all 50 of my pages!

    So the question is answered - it's not PR related after all. If you're stuck in my situation, just fix things up and wait for Googlebot :)

    However, I'm still #80 for my main keyword :(

    Thanks for your replies,
    Sid

    roodle

    7:46 pm on Mar 18, 2004 (gmt 0)

    10+ Year Member



    >However, my website doesn't validate - while talking to another fellow WebmasterWorld member, he said if a site doesn't validate, it's not search engine friendly - does this matter?

    The other day I realised that one of my sites has the doctype declaration left out (doh!), and it gets the No. 2 ranking for its main key phrase. Draw your own conclusions...

    annej

    11:56 pm on Mar 18, 2004 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Add new pages and link them from the home page.

    Good suggestion. I always link new pages on the homepage in a "what's new" format. It lets return visitors know there is something new and at the same time it helps the new page get spidered right away. Sometimes I'm amazed how fast the new page shows up in the serps.

    Stefan

    12:41 am on Mar 19, 2004 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    if a site doesn't validate, its not search engine friendly

    If the code is good enough to let the bot crawl the whole page, read the text, and spot the links, it's usually friendly enough, but it's best to have decent, validated code anyway. That way you catch errors that might cause problems with Opera or the Mozilla versions, etc. (some pages look fine in IE but are a mess in other browsers). Validating is always a good idea.