Forum Moderators: Robert Charlton & goodroi


Google not following my internal links anymore

         

thickparasite

10:03 am on Sep 28, 2006 (gmt 0)

10+ Year Member



Hi all,

I'm having real problems getting new pages indexed at the moment. Since the beginning of the summer, it seems like Google refuses to follow the internal links on my site. I have a site map, good site structure, a Google Sitemap, and a proper robots.txt. On top of that, one directory has only its homepage indexed, even though it is pretty well linked from outside the site.
Anyway, I never had any trouble getting indexed before.
Any ideas of what's going on?
Thanks

tedster

6:35 pm on Sep 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Which is the situation:

1. Googlebot is not requesting those URLs
2. Googlebot is requesting the URLs, but they do not show up in a site: search
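A quick way to tell the two cases apart is to scan the server access log for Googlebot requests. This is just a sketch, assuming a combined-format access log; the sample lines and the `googlebot_urls` helper are illustrations, not anything from the thread.

```python
import re

# Matches the request portion of a combined-format access log line,
# e.g. "GET /index.html HTTP/1.1" 200
LOG_LINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d{3}')

def googlebot_urls(lines):
    """Return the set of URLs that Googlebot actually requested."""
    urls = set()
    for line in lines:
        if "Googlebot" in line:
            m = LOG_LINE.search(line)
            if m:
                urls.add(m.group(1))
    return urls

# Hypothetical sample log lines
sample = [
    '66.249.66.1 - - [28/Sep/2006:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [28/Sep/2006:10:00:05 +0000] "GET /about.html HTTP/1.1" 200 999 "-" "Mozilla/4.0"',
]
print(googlebot_urls(sample))  # {'/index.html'} — only that page was fetched by Googlebot
```

If a missing page never shows up in this set, you are in case 1; if it does but still fails a site: search, you are in case 2.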

g1smd

6:37 pm on Sep 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Have you run your site through something like Xenu's Link Sleuth to see what it thinks about your site?

piatkow

8:29 pm on Sep 28, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Also look at the update frequency you have put in the site map for those pages.
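For reference, update frequency and priority are set per URL in the Sitemaps XML. A minimal fragment, with placeholder URLs and values chosen only for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive/2005/</loc>
    <changefreq>yearly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Note that per the Sitemaps protocol, `changefreq` and `priority` are hints to the crawler, not commands — Google may crawl pages marked "yearly" more often, or ignore the values entirely.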

thickparasite

11:41 pm on Sep 28, 2006 (gmt 0)

10+ Year Member



Thanks for the replies,

tedster :
Googlebot is crawling the pages, but it's not following the links. If I search on the anchor text of a link, the page containing the link comes back as a result, but the page which that anchor text links to is not in the index (some of them for the past 4 months). The problem occurs even if I put a link on the index page. I should also add that the site is 4 years old.

g1smd :
Xenu works fine for my site. I barely have any broken links, especially internal ones.

piatkow : I'm not talking about days or weeks, but months without being able to get those pages into the index.

tedster

12:01 am on Sep 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sorry, that's still not clear to me. Is any Google spider (Googlebot, Mediabot, etc.) requesting those URLs that are still not showing up in the index?

thickparasite

10:36 am on Sep 29, 2006 (gmt 0)

10+ Year Member



Sorry, tedster (English is my second language). Actually, I don't track Googlebot requests on every page (I use an include on each page); therefore, I'm not so sure, but I will put the tracking tag back to find out more.
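Instead of a per-page tracking include, the same question can be answered offline by cross-checking the sitemap against the access log: which sitemap URLs has Googlebot never requested? A hypothetical sketch — the file contents, helper names, and the simple regex parsing are all assumptions about a typical setup:

```python
import re
from urllib.parse import urlparse

def sitemap_paths(xml_text):
    """Pull the path of every <loc> entry out of a sitemap file."""
    return {urlparse(loc).path for loc in re.findall(r"<loc>(.*?)</loc>", xml_text)}

def crawled_paths(log_lines):
    """Paths Googlebot requested, from combined-format access log lines."""
    paths = set()
    for line in log_lines:
        if "Googlebot" in line:
            m = re.search(r'"(?:GET|HEAD) (\S+) ', line)
            if m:
                paths.add(m.group(1))
    return paths

# Hypothetical sitemap and log data
sitemap = "<loc>http://www.example.com/</loc><loc>http://www.example.com/widgets/blue.html</loc>"
logs = ['66.249.66.1 - - [28/Sep/2006:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"']

never_fetched = sitemap_paths(sitemap) - crawled_paths(logs)
print(never_fetched)  # {'/widgets/blue.html'}
```

Any page left in `never_fetched` after a few weeks is one Googlebot is genuinely not requesting, which pins down tedster's case 1.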

Baraccus

5:19 pm on Sep 29, 2006 (gmt 0)

10+ Year Member



The same thing has been happening to me - I have a site map linked from my homepage too, but only half of my pages are now in the index.

It seems Google is no longer crawling through the site and following links like it used to.

All our pages are connected by text links, and none are more than 2 levels deep. However, it seems that any pages that aren't listed in our main navigation menu (shown on every page) have dropped out of the index entirely, and new pages aren't being picked up.

Google never used to be like this. You used to be able to count on it to thoroughly crawl and index your site easily if you had good internal linking. Weird.

thickparasite

9:36 pm on Sep 29, 2006 (gmt 0)

10+ Year Member



Thanks Barracudas ;)
I was starting to feel like a weirdo with my problem. I'm not the kind to complain around forums about my stuff, but the fact that my internal linking no longer works is troubling me.

piatkow

11:48 pm on Sep 29, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The bot no longer follows my links either. It seems to take the site map and then drop by for a page or two. I assumed this was because I have given different pages different priorities and update frequencies in the site map.

It does seem to be picking up pages rather erratically, and has found low-priority pages flagged as "yearly" ahead of high-priority pages flagged "monthly".

RichTC

12:04 am on Sep 30, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi all,

I can tell you exactly what this issue is:

Since the roll-out of the new infrastructure, a page has to have good PR before Google will bother to index pages linked off it - it's that simple.

So if you put another site map on your site with a list of pages on it, Google is not going to spider all of the pages off it and rank them in the SERPs unless that page has good PageRank.

Likewise, crawl depth is restricted, again based on the page's PR value. For example, with a PR7 site you may see pages three levels deep ranking; with a PR5 site it's unlikely to go more than two deep; and a PR4 or lower site is going to struggle to get anything spidered off it, imo, unless it's in a non-commercial sector.

I realize there are exceptions to this, but based on a number of the sites we work on and what I have seen in the SERPs, this is typical of what I see.

So if you want more content indexed in the SERPs, deeper content, and to be taken more seriously by Google, the bottom line (an obvious one) is: get more links to your site, and over time your PR will increase along with your SERPs positions.