
Sitemaps, Meta Data, and robots.txt Forum

    
1/4 of URLs not indexed
arkain



 
Msg#: 4206076 posted 10:07 pm on Sep 23, 2010 (gmt 0)

Hello,

My sitemap submitted 100 URLs, but only 77 of them are indexed (according to Webmaster Tools). It's for a blog, and I don't understand why some URLs get indexed and others get ignored.

For example, I could write domain.com/post-title and it gets indexed. Then the next day I write domain.com/another-post-title and it is not indexed.

The blog is PR 0 and under 3 months old.

So my questions:

Is this normal, or is it odd?
What can I do besides wait?
If it is odd, is there anyone I should contact?
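
(A quick mechanical check before anything else: a minimal sketch in Python, with the sitemap URL as a placeholder, that fetches the sitemap and confirms every submitted URL resolves with a plain 200. It won't explain why Google skipped a page, but it rules out broken links and silent redirects as the cause.)

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "http://domain.com/sitemap.xml"  # placeholder; use your real sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Fetch and parse the sitemap, then pull out every <loc> entry.
with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
print(len(urls), "URLs listed in the sitemap")

# Flag anything that is not a clean 200 at its listed address.
# urlopen follows redirects, so compare the final URL against the sitemap entry.
for url in urls:
    try:
        with urllib.request.urlopen(url) as r:
            if r.status != 200 or r.geturl() != url:
                print(r.status, url, "->", r.geturl())
    except Exception as exc:
        print("ERROR", url, exc)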

 

goodroi

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month



 
Msg#: 4206076 posted 12:15 pm on Sep 24, 2010 (gmt 0)

hi arkain,

Is this normal or is this odd? - What you describe sounds normal.


What can I do besides wait? - Waiting is not going to help. You need to ensure that each page has significant & unique content. When I say content I mean text; for "significant & unique", aim for at least a few paragraphs of original text per page. Once you have original content that will pass Google's duplicate content filter, you need to work on link popularity (both quality & quantity matter). Check out WebmasterWorld's link development section for ideas on how to boost your backlinks.
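
(A rough way to sanity-check "unique" before worrying about Google's filter: a short Python sketch of word-shingle Jaccard similarity between two pages. This illustrates the idea only and is not Google's actual duplicate detection, but if your posts score near 1.0 against each other because of shared template text, that is the kind of overlap that makes thin pages look duplicated.)

def shingles(text: str, k: int = 5) -> set:
    """Set of k-word shingles from the page text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets: 0 = distinct, 1 = identical."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Example: two short posts that share template boilerplate look near-duplicate.
post1 = "Welcome to my blog about widgets. Today we take a close look at the Foo widget."
post2 = "Welcome to my blog about widgets. Today we take a close look at the Bar widget."
print(f"similarity: {jaccard(post1, post2):.2f}")  # close to 1.0 = near-duplicate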

As for your "PR": PageRank is not nearly as important as it used to be. PageRank used to be magic; that was 10 years ago, and things have changed. What you are seeing is also Toolbar PageRank (TBPR), which is only updated with real PageRank data about 3-4 times a year. Accurate & current PageRank data is not released to the public. You might actually be a PR10 (highly doubtful :)) and not know it until months from now, with the next TBPR update.

Even in a perfect situation, it is rare for Google to index 100% of a website, and there are many reasons for that. The simplest is that even the greatest websites contain some very low quality pages (a copyright notice page, terms & conditions, a privacy policy, etc.). Google is trying to gather the best set of pages for quality search results, not to index any & every file.
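
(On that last point, since this forum covers robots.txt and meta data: you can keep those housekeeping pages out of the index yourself. A sketch with placeholder paths. One caveat worth knowing: a robots.txt Disallow only stops crawling, and a disallowed URL can still appear in the index as a bare URL; the meta robots noindex tag is the more reliable exclusion, and it needs the page to stay crawlable so Google can read the tag.)

# robots.txt (stops crawling; not a guaranteed index exclusion)
User-agent: *
Disallow: /privacy-policy
Disallow: /terms-and-conditions

<!-- in the <head> of each page you want excluded; the page must stay crawlable -->
<meta name="robots" content="noindex, follow">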

good luck
