I have a list of approximately 150 links on one of my subpages. This specific subpage will not rank in Google (not even for obscure 20-word phrases unique to this subpage; the rest of the site is healthy and ranks well). I think the reason is that Google considers the subpage a link farm.
The problem is that this is not the case. The page with the links is the most visited on the whole site, and people love it, since it's a great resource page.
I have added nofollow tags to all the links, but the page still does not rank in Google.
Now I am thinking about making all the links JavaScript links, so Google can't see them (I don't care about PR etc.).
Would this be a good idea, and do you guys know any JavaScript that hides the links from Google?
I am not trying to cheat Google; I just want to keep my resource page.
Thanks
Cromagnon
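For reference, a "JavaScript link" in the sense described here usually means moving the destination URL out of the href and into script, so that crawlers which don't execute JavaScript have nothing to follow. A minimal sketch (the URL and function name are invented for illustration; note that Google may still parse URLs out of onclick attributes, so this is no guarantee):

    <script type="text/javascript">
    // Navigate via script instead of a plain href, so crawlers
    // that don't execute JavaScript see no followable link.
    function go(url) {
        window.location.href = url;
    }
    </script>

    <!-- href="#" carries no destination; it lives only in the onclick -->
    <a href="#" onclick="go('http://example.com/some-resource/'); return false;">Some resource</a>

As the replies below point out, visitors with JavaScript disabled would lose the links entirely.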
1. 150 links is beyond the Google guideline of 100 (and I usually aim lower), so breaking it up into two or three pages might help.
2. Be wary of looking like a page of scraped content; be scrupulous about avoiding duplicate content.
3. Does the page show any toolbar PR? If it's gray-barred, then you will need to do something major.
0. There is text at the top of the list and at the bottom, and ALL the links have a title tag each, so people can read what a link is about before they click (see the markup example after this list). This means there is plenty of text on the page along with the links.
1. Yes, I know 150 is beyond Google's guidelines, but breaking it up into two or three pages will really make it more difficult for visitors to navigate.
2. There is no scraped content, and no one is scraping me :-)
3. The page still has PR.
Is it really such a bad idea to put the links in JavaScript?
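For clarity, the "title tag" on each link presumably means the title attribute, with nofollow added as described in the opening post. A hypothetical entry might look like:

    <!-- The title attribute shows visitors a description on hover;
         rel="nofollow" asks Google not to pass PR through the link. -->
    <a href="http://example.com/widget-guide/"
       rel="nofollow"
       title="A step-by-step guide to assembling widgets">Widget guide</a>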
A lot of people block Java, and an increasing number block JavaScript using tools such as NoScript in Firefox, because JavaScript is now (or was very recently) a major malware vector via hacked sites.
I visit a dozen or so sites a week, checking them for a directory I run, and it's not uncommon for one or two a month to be un-navigable because all the links are in JavaScript. Such a site doesn't get listed, because I'm not turning on JavaScript just for navigation.
How old is the site? How old is the particular page?
How competitive is the topic?
How long did it take to accumulate the links on the page? Did you put up the fully built and loaded page in a single shot, or did the number of links grow over time?
It's not impossible for links pages to rank, but I wonder if age and build speed may play a role.
It's now called widgets-resources.html, having previously been widget-links.html: renamed, 301'd, waited six months, and it showed up in the SERPs.
It's just over a year old (as is the site), and it's currently alternating between PR4 and greyed out every second week or so at the moment.
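For anyone wondering how the rename was 301'd: the poster doesn't say what server is involved, but assuming Apache, the usual one-liner in .htaccess would be:

    # Permanently redirect the old file name to the new one (mod_alias)
    Redirect 301 /widget-links.html /widgets-resources.html

The file names are the ones from the post; adjust the paths if the pages live in a subdirectory.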
Yes, I know 150 is beyond Google's guidelines, but breaking it up into two or three pages will really make it more difficult for visitors to navigate.
It'll probably make it easier; 150 links, with some descriptive text, is one long page.
Whatever your topic, there is always a way to subdivide logically, which will help your visitors and the SEs.
If the page has done you no harm so far, then it is unlikely to start now.
I'd look for a way to move 30-40 links out to a second page, linking to it from the first, and increase the descriptive text a little as you go (see the sketch below).
If this does no harm - and it won't - then repeat the operation with another sub-category.
There's no need for Java/JavaScript or anything technical at all; just use nofollow if you are really concerned about Google problems.
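As a sketch of the split, the first page only needs ordinary internal links down to the new sub-pages (the file names and categories are invented for illustration; there is no reason to nofollow links to your own pages):

    <h2>More widget resources</h2>
    <!-- Plain internal links from the main resource page -->
    <a href="/widget-resources-tools.html">Widget tools</a>
    <a href="/widget-resources-guides.html">Widget guides</a>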
A better way to keep the resource safe and valuable is to check the links at least every couple of months and weed out any that no longer work; Xenu is your friend.
Having your own niche directory can be great for your visitors and is not a problem for Google (or other SEs), so long as you maintain high quality. In my experience, it can be quite helpful in SEO terms, not harmful at all (I 'nofollow' selected links, e.g. to new sites).
Xenu (specifically) has several levels of link checking and can go deeply into external sites. Because of this, some of us block Xenu to avoid having all sorts of pages sucked in.
This does not mean the site is dead, merely that it has to be checked by hand. If you get an error, find out why before ditching the target site.
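If you'd rather script the periodic check than run a GUI tool like Xenu, a rough sketch in Node.js (assuming Node 18+ for the built-in fetch; the URL list is invented) could look like this, saved as check-links.mjs and run with node check-links.mjs:

    // check-links.mjs - report links that appear to be broken.
    const urls = [
        'http://example.com/widget-guide/',
        'http://example.com/widget-tools/',
    ];

    for (const url of urls) {
        try {
            // HEAD keeps the check light; some servers reject HEAD,
            // so verify failures by hand before ditching a link.
            const res = await fetch(url, { method: 'HEAD' });
            if (!res.ok) console.log(`${res.status} ${url}`);
        } catch (err) {
            console.log(`FAILED ${url} (${err.message})`);
        }
    }

As noted above, an error is only a prompt to check by hand, not proof that the target site is dead.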
Thanks for your help.
I decided to split the page into three pages, each with a third of the links. It seems to have helped! The page is now indexed in the top 15. Much, much better than before.
There is a lesson to learn here: even with nofollow tags on every link on the page, Google does not like too many links on a page.
Thanks again.