Forum Moderators: Robert Charlton & goodroi
I have real problems indexing new pages at the moment. Since the beginning of the summer, it seems like Google refuses to follow internal links on my site. I have a site map, a good site structure, a Google Sitemap, and a proper robots.txt. Furthermore, one directory has only its homepage indexed, even though it is pretty well linked from outside the site.
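For anyone comparing setups: a minimal robots.txt that allows full crawling and points crawlers at a sitemap looks like the fragment below. The paths and hostname are examples, not the poster's actual files, and the Sitemap line assumes your crawler supports that directive.

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked, so a robots.txt like this shouldn't be the reason pages stay out of the index.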
Anyway, I didn't have any trouble indexing before.
Any ideas of what's going on?
Thanks
tedster :
Googlebot is crawling the pages, but it's not following the links. If I do a search on the anchor text of a link, it will come back with a result, but the page which is linked under that anchor text is not coming up in the index (some of them for the past 4 months). The problem occurs even if I put a link on the index page. I'd also add that the site is 4 years old.
g1smd :
Xenu works fine for my site. I barely have any broken links, especially internal ones.
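The internal-link side of what a checker like Xenu does can be sketched in a few lines. The names here (`extract_internal_links`, `SITE_HOST`) are my own for illustration, not Xenu's: pull the `<a href>` values out of a page and keep the ones that stay on the site, then you can fetch each one and flag any that 404.

```python
# Sketch of internal-link extraction, the first step of a broken-link check.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

SITE_HOST = "www.example.com"  # hypothetical site being checked

class LinkCollector(HTMLParser):
    """Collects every href from <a> tags while parsing HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_internal_links(page_url, html):
    """Return absolute URLs of links on the page that stay on SITE_HOST."""
    parser = LinkCollector()
    parser.feed(html)
    internal = []
    for href in parser.links:
        absolute = urljoin(page_url, href)  # resolve relative hrefs
        if urlparse(absolute).netloc == SITE_HOST:
            internal.append(absolute)
    return internal

html = '<a href="/about.html">About</a> <a href="http://other.com/">Out</a>'
print(extract_internal_links("http://www.example.com/index.html", html))
# → ['http://www.example.com/about.html']
```

Running something like this over your own pages is a quick sanity check that the text links Googlebot should be following actually resolve.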
piatkow : I'm not talking about days or weeks, but months without being able to get those pages into the index.
It seems Google is no longer crawling through the site and following links like it used to.
All our pages are connected by text links, and none are more than 2 levels deep. However, it seems that any that aren't listed in our main navigation menu (shown on every page) have dropped out of the index entirely, and new pages aren't being picked up.
Google never used to be like this. You used to be able to count on it to thoroughly crawl and index your site easily if you had good internal linking. Weird.
It does seem to be picking up pages rather erratically, and has found low-priority pages flagged as "yearly" ahead of high-priority pages flagged as "monthly".
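For reference, those "yearly" and "monthly" flags come from the changefreq element in a Sitemap entry, alongside priority. A minimal two-entry file looks like this (URLs and values are examples; both elements are hints that Google is free to ignore, which fits the erratic behavior described):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/key-page.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive/old-page.html</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```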
I can tell you exactly what this issue is:
Since the roll-out of the new infrastructure, a page has to have good PR before Google will bother to index anything off it. It's that simple.
So if you put another site map page on your site with a list of pages on it, Google is not going to spider all of those pages and rank them in the SERPs unless that page has good PageRank.
Likewise, crawl depth is restricted based on the page's PR value. For example, on a PR7 site you may see pages three levels deep ranking; on a PR5 site it's unlikely to go more than two deep; and a PR4 or lower site is going to struggle to get anything spidered off it, IMO, unless it's in a non-commercial sector.
I realize there are exceptions to this, but based on a number of the sites we work on and what I have seen in the SERPs, this is typical.
So if you want more content indexed, deeper content crawled, and to be taken more seriously by Google, the bottom line (an obvious one) is: get more links to your site, and over time your PR will increase along with your SERP positions.
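For readers less familiar with PR: PageRank is Google's link-based score, computed by repeatedly letting each page pass a share of its rank along its outgoing links. The toy power iteration below (graph, damping factor, and function name are all illustrative, not Google's actual setup) shows why a page buried behind a low-rank page receives little rank itself.

```python
# Toy PageRank power iteration, to illustrate how link equity flows.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base share (the 1 - damping term).
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Home is linked from everywhere; "deep" is reachable only via one page.
site = {
    "home": ["a", "b"],
    "a": ["home", "deep"],
    "b": ["home"],
    "deep": ["home"],
}
ranks = pagerank(site)
```

In this toy graph "home" ends up with the most rank and "deep" with the least, which is the shape of the claim above: the less rank flowing into a page, the less likely anything linked off it gets crawled.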