Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

How can I make my link page healthy?

         

Cromagnon

2:34 pm on Dec 11, 2008 (gmt 0)

10+ Year Member



Hi,

I have a list of approx. 150 links on one of my subpages. This specific subpage will not rank in Google (not even for obscure 20-word phrases unique to this specific subpage; the rest of the site is healthy and ranks well). I think the reason for this is that Google considers the subpage a link farm.

The problem is that this is not the case. The page with the links is the most visited on the whole site, and people love it since it's a great resource page.

I have added nofollow tags to all the links, but the page still does not rank in Google.
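For reference, each link in the list is marked up along these lines (URL and text are placeholders):

```html
<!-- rel="nofollow" asks Google not to pass PageRank through the link;
     the title attribute gives visitors a short description on hover -->
<a href="http://www.example.com/" rel="nofollow"
   title="Short description of the destination">Example Site</a>
```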

Now I am thinking about making all the links javalinks, so Google can't see the links (it doesn't care about PR etc.).

Would this be a good idea, and do you guys know any javascript that hides the links from Google?
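To show what I mean - a rough sketch only, and I can't say for certain that Google won't parse it:

```html
<!-- The href goes nowhere; the real URL exists only inside the onclick
     handler, so a crawler that ignores JavaScript sees no link target -->
<a href="#" onclick="window.location='http://www.example.com/'; return false;">
  Example Site
</a>
```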

I am not trying to cheat Google, I just want to keep my resource page.

Thanks

Cromagnon

tedster

9:31 pm on Dec 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Do you have a significant amount of descriptive text on the page for each resource - along with all the other standard on-page signals? That's one thing you can do - and still Google doesn't always appreciate such a page. A couple other thoughts:

1. 150 links is beyond the Google guideline of 100 (and I usually aim for lower), so breaking it up into 2 or 3 pages might help.

2. Be wary of looking like a page of scraped content - be scrupulous about avoiding duplicated content.

3. Does the page show any toolbar PR? If it's gray barred, then you will need to do something major.

Cromagnon

9:58 pm on Dec 11, 2008 (gmt 0)

10+ Year Member



Thanks.

0. There is text at the top of the list and at the bottom. And ALL links have a title attribute, so people can read what each link is about before they click. This means that there is lots of text on the page along with the links.

1. Yea, I know 150 is beyond the Google guidelines. But breaking it up into 2 or 3 pages will really make it more difficult for visitors to navigate.

2. There is no scraped content and no one is scraping me :-)

3. The page still has PR.

Is it really such a bad idea to place the links in java?

dstiles

10:46 pm on Dec 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Java is not the same as Javascript.

A lot of people block Java.

An increasing number block Javascript using such things as NoScript in Firefox because Javascript is now (or was very recently) a major virus source from hacked sites.

I visit a dozen or so sites a week, checking them for a directory I run, and it's not uncommon for one or two a month to be un-navigable because all the links are in Javascript. It doesn't get listed because I'm not turning on Javascript just for navigation.

ken_b

11:35 pm on Dec 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What do you mean by not ranking? Not on the first page, second page, not in the top 1,000?

How old is the site? How old is the particular page?

How competitive is the topic?

How long did it take to accumulate the links on the page? Did you put up the fully built and loaded page in a single shot, or did the number of links grow over time?

It's not impossible for links pages to rank, but I wonder if age and build speed may play a role.

kevsta

12:15 am on Dec 12, 2008 (gmt 0)

10+ Year Member



We have a links page that ranks in the 80s for widgets resources.

It's now called that (widgets-resources.html), having previously been widget-links.html - renamed, 301'd, waited 6 months, and it showed up in the SERPs.
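For anyone wondering, the 301 itself was just a one-liner in .htaccess (assuming an Apache server here; adjust for your own setup):

```apache
# Permanent (301) redirect from the old links page to the new name
Redirect 301 /widget-links.html /widgets-resources.html
```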

It's just over a year old (as is the site), and it's also currently alternating between PR4 and greyed out every second week or so at the moment.

Quadrille

11:27 am on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yea, I know 150 is beyond the Google guidelines. But breaking it up into 2 or 3 pages will really make it more difficult for visitors to navigate.

It'll probably make it easier; 150 links with some descriptive text is one long page.

Whatever your topic, there is always a way to subdivide logically, which will help your visitors and the SEs.

If the page has done you no harm so far, then it is unlikely to start now.

I'd look for a way to move 30-40 links out to a second page, linking from the first, and increase the descriptive text a little as you go.

If this does no harm - and it won't - then repeat the operation with another sub-category.

There's no need for java / javascript or anything technical at all - just use nofollow, if you are really concerned about Google problems.

A better way to keep the resource safe and of value is to check the links at least every couple of months and weed out any that no longer work - Xenu is your friend.

Having your own niche directory can be great for your visitors, and is not a problem for Google (or other SEs), so long as you maintain high quality. In my experience, it can be quite helpful in SEO terms, not harmful at all (I 'nofollow' selected links; eg to new sites).

dstiles

11:30 pm on Dec 13, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A warning about Xenu and other link checkers.

Xenu (specifically) has several levels of link checking and can go deeply into external sites. Because of this, some of us block Xenu to avoid all sorts of pages being sucked in.

This does not mean the site is dead, merely that it has to be checked by hand. If you get an error, find out why before ditching the target site.

tedster

12:45 am on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good warning about Xenu. Also appreciate that 404 link rot is not nearly the troublemaker that a target domain changing its character into a "bad neighborhood" can be. Xenu can't give you that kind of report. So it's a good tool, but not the final word.

Quadrille

2:43 am on Dec 14, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Xenu tells you that ;)

I don't think any link checker can do more than highlight a potential issue (even Robozilla!).

But it's a darn sight quicker than doing the whole thing manually!

Cromagnon

10:01 pm on Dec 27, 2008 (gmt 0)

10+ Year Member



Hi guys

Thanks for your help.

I decided to split the page up into three pages, each with 1/3 of the links. It seems to help! Now the page is indexed in the top 15. Much, much better than before.

There is a lesson to learn here: even though there are nofollow tags on every link on the page, Google does not like too many links on a page.

Thanks again.

helensimons

1:14 pm on Jan 10, 2009 (gmt 0)

10+ Year Member



I have a problem following the last PR update, where my links page (resources page) has gone from PR5 to PR0 (white bar). The links page contains links to sites related to my site, and there are no paid links or reciprocal links. Some other internal pages on my site have become PR0 as well. What is the best way to structure a links page so as not to be seen as spam by Google?

jbinbpt

1:57 pm on Jan 10, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I decided to split the page up into three pages, each with 1/3 of the links. It seems to help! Now the page is indexed in the top 15. Much, much better than before.

Less than a month to do that. That's great. Are all three pages indexed now?

piatkow

8:15 pm on Jan 11, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I had the same problem a few years back and solved it in a similar way. Created a directory index page with links to the subject areas and a block of content (mostly below the fold) about what the directory contained.

Creating a directory is a frustrating business.