Forum Moderators: Robert Charlton & goodroi


Internal links - Is there such a thing as too many?


cazzzk

2:05 pm on Dec 15, 2008 (gmt 0)

10+ Year Member



On the site where I work, many pages have a right-hand column containing 'top 5' lists that link to members' profiles - things like 'most recent member' or 'most recent appeal', etc.

Members' profiles that have appeared in these lists are then recorded by Google as having hundreds or thousands of internal links - I suppose because the list appeared on many different site pages as Googlebot spidered them.

This has resulted in, e.g., one page having 3155 internal links, another 2453, another 879, and so on.

Are so many internal links going to individual pages a problem, does anyone know?

We try to have good internal linking but have noticed these excessive links for individual pages. The right-hand column containing the lists was taken off the site a couple of weeks ago, but Google still has all the links recorded at the moment (OK, I hadn't actually noticed them until now).

If they are a problem, should I get Google to drop the pages with the large number of links from their cache and wait until they are indexed again?

tedster

8:10 pm on Dec 15, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, thousands of internal links on the same page can be a significant problem. Google's recommendation is to keep pages under 100 links - they used to not even look past that amount, although today they do. But thousands of links splits your PageRank into very small packets and wastes it on links to pages that don't matter in the search results. I also doubt that Google, even today, would handle 3000 links on a page very well.

One answer, if you want to retain the convenience of those links for your users: add the attribute rel="nofollow" to all the links you don't want to see in the search results.
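For example, a list item in one of those 'top 5' blocks might look something like this (the URL structure and class names here are made up for illustration - adjust to your own markup):

```html
<!-- Illustrative only: the profile URL and class names are assumed, not from the actual site -->
<ul class="top-5-recent-members">
  <li><a href="/members/profile/12345" rel="nofollow">NewestMember99</a></li>
  <li><a href="/members/profile/12344" rel="nofollow">SecondNewest</a></li>
</ul>
```

The links still work for visitors, but Googlebot is told not to pass PageRank through them.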

should I get Google to drop the pages with the large number of links from their cache and wait until they are indexed again?

No - the issue is not the pages that used to hold the links; they should be spidered again rather quickly. The issue is all the URLs that you used to link to and that Google now knows about.

Since you've already removed those links from the pages, you've already addressed the PageRank dilution issue. But Google will still keep those URLs in its memory for an indefinite amount of time and continue to spider them. For this reason, I'd also suggest adding such URLs to a robots.txt disallow rule. Help googlebot focus its resources where they do you the most good.
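Assuming the profile pages all live under a common path like /members/profile/ (your actual URL structure will differ), the robots.txt rule could be as simple as:

```
# Hypothetical path - substitute the real directory your profile pages live under
User-agent: *
Disallow: /members/profile/
```

One caveat: a Disallow rule stops crawling, not indexing, so URLs Google already knows about may linger in the index for a while even after you add it.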