|Do too many internal links to the same page trigger penalties?|
I have a website with several category pages, which are in fact HTML sitemaps to facilitate navigation. There are no duplicate content issues among these pages.
However, these pages do contain many links (sometimes more than 100) to a single target page, with a different URL parameter value and anchor text for each link.
The target page displays different information for each parameter value, but since each variation is lean, I have canonicalised them all to a single URL to avoid potential thin/near-duplicate content issues.
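Concretely, each parameterized version of the target page carries a canonical link element pointing at one URL, roughly like the sketch below (page.php, param1 and example.com are placeholders, not my real URLs):

```html
<!-- Hypothetical <head> of page.php?param1=B: every parameter variant
     canonicalises to the same URL -->
<link rel="canonical" href="https://www.example.com/page.php">
```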
I am having some issues getting that site indexed by Google, and I am wondering whether my HTML sitemap/category pages are to blame.
Could it be that Google interprets these pages as an attempt to manipulate bots (by pointing excessively to the same page with too many keywords)?
It is not my intention to do so, but since there is no feedback from Google, I am trying to understand what is really happening.
I am considering setting 'noindex,follow' on these pages. What do you guys think?
I think you are confusing what a "page" is. If a single script returns different content based on different URL parameters, then the result is that you have "different pages" with "different URLs".
Here I mean truly different content, not filtered/sorted content that is a subset of another page.
If you have canonicalised pages with different content to a single URL, then Google will ignore the non-canonical URLs and will not follow links from the non-canonical pages. And unless a page linked from a non-canonical page is also linked from elsewhere, that page may not be crawled.
For example, suppose page.php?param1=A links to one set of pages, and page.php?param1=B links to page4 and page5. If page.php?param1=B has a canonical link element pointing to page.php?param1=A, then page4 and page5 may not be crawled.
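As a rough sketch (URLs and page names are illustrative placeholders, not taken from your site), page.php?param1=B would look something like this, and it is exactly these links that risk being skipped:

```html
<!-- Hypothetical page.php?param1=B -->
<head>
  <!-- B canonicalises to A, so Google may drop B as a non-canonical duplicate -->
  <link rel="canonical" href="https://www.example.com/page.php?param1=A">
</head>
<body>
  <!-- If B is dropped, these links may never be followed unless page4 and
       page5 are also linked from somewhere else -->
  <a href="https://www.example.com/page4">Page 4</a>
  <a href="https://www.example.com/page5">Page 5</a>
</body>
```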
What you should do is make (at least) page.php?param1=B noindex,follow; page.php?param1=A may or may not be worth indexing depending on its content (if it is all links, I would noindex it too).
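For illustration only (placeholder markup again, not from your site), the noindex,follow version of page.php?param1=B would drop the canonical link element and instead carry a robots meta tag in its head:

```html
<!-- Hypothetical <head> of page.php?param1=B: keep the page out of the index,
     but still let its links be followed -->
<meta name="robots" content="noindex, follow">
```

The idea is that noindex then replaces the canonical as the way of handling these link-heavy pages, so their links to page4 and page5 can still be discovered.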
|Could it be that Google interprets these pages as an attempt to manipulate bots (by pointing excessively to the same page with too many keywords)? |
As said above, if URL parameters are different, these are different pages (which may or may not have the same content).
aakk9999: Thanks. I agree that I have narrowed the concept of a page a little too much in my question.
Tell me if I understand you correctly:
You are telling me that I should "noindex,follow" my pages with lots of links because Google will not find them to be interesting content, not because Google thinks I am trying to manipulate bots? Is that it?
For the record, using "noindex,follow" on my category pages AND removing them from the index with GWT's remove URL feature lifted the silent indexation penalty I was facing.
Thank you for reporting back, and glad it worked for you :)
So now we have something called a "silent indexation penalty"?
I'm sorry, and correct me if I'm wrong, but this sounds like poor performance caused by a poorly structured site, in no possible way a penalty.
Sometimes it's us that shoots ourselves in the foot and not Google. (Been there, done that. Rinsed and repeated.)
@jimbeetle, you are right, let's make it clear that this was not a penalty.
In this case the problem was caused by having a canonical link element where it should not be.
|Sometimes it's us that shoots ourselves in the foot and not Google. |
Yep, most of us have done this at one time or another!