Forum Moderators: martinibuster

Going all wikipedia with my links

8:50 pm on Feb 2, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 11, 2003
votes: 11

I'm hopefully launching a new site this week. Part of the launch involves taking a site from 21 pages to 5,000 pages of content. The content is all good stuff, not what you're thinking.

Because I've got so much content, it's hard for me to sort through it all to build internal links. My intention is to let the content get indexed first. Then I'll search my site on Google for the juicy terms I want to rank for, and order the results by whatever Google feels is the order of relevance.

Then I'll take a few of those top-ranking pages for the term, find the term on those pages, and link it to another page on my site that I want to rank for that term. Maybe even link to the homepage in some instances.

Now, I'm no PR distribution wizard. Does this seem like a good way to approach this? Is there a different technique I should use? Any limits I should be careful about?
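For anyone wanting to script the sorting step of that plan, here's a minimal sketch. It assumes you've compiled the site: search results into a CSV of (term, position, URL) by hand, since there is no supported API for scraping Google results; all the term names and URLs below are invented for illustration.

```python
import csv
import io
from collections import defaultdict

# Hypothetical hand-made export of site: search results for each juicy
# term: the term, Google's ranking position, and the URL returned.
raw = """term,position,url
widgets,1,/articles/widget-history
widgets,2,/articles/widget-care
widgets,3,/articles/widget-buying
gadgets,1,/articles/gadget-intro
gadgets,2,/articles/gadget-repair
"""

def link_candidates(csv_text, top_n=2):
    """For each term, return the top-N ranking pages: the pages where,
    under the plan above, you would place an internal link pointing at
    the page you actually want to rank for that term."""
    by_term = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_term[row["term"]].append((int(row["position"]), row["url"]))
    return {term: [url for _, url in sorted(pages)[:top_n]]
            for term, pages in by_term.items()}

candidates = link_candidates(raw)
```

From there, the manual part is finding the term on each candidate page and adding the link.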
11:10 pm on Feb 2, 2010 (gmt 0)

New User

5+ Year Member

joined:Sept 17, 2008
votes: 0

I'm not very clear on what you're trying to achieve, but what you do depends on your SEO and SMO strategy and the time you have.

The only thing I would watch out for in your case: you might be better off creating your site structure before letting Google index the new pages.

A simple example of how I would go about building links to your 5000+ pages and structuring the site:

Let's say that you have 100+ pages that talk about "widget" and this is a juicy keyword too.

1. Create a category/tag page that contains teasers of 100+ pages that are related to widget. Optimize this page so that it has proper density for "widget."

2. Place a link to the widget category page on all 100+ pages (with "widget" in the anchor text, of course).

3. Place the widget category page one or two clicks away from the homepage... for PR. That way, your individual pages are no more than three clicks from the homepage.

You can do a web search for "flat site architecture" to find out more.
11:20 pm on Feb 2, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 11, 2003
votes: 11

Thanks for the detailed response. I appreciate the time you put into that.

The nature of the content has dictated the site architecture. The individual pages are 'bound together' in groups. Think of it as a collection of discrete articles: these 10 pages belong to article A, these 10 pages to article B. And the various articles don't have a whole lot in common thematically, other than fitting the overall theme of the site. So my architecture is kind of like 'Articles', then a list of articles, then the individual pages.

What I'm trying to achieve is circulating PR, or something like that: using new internal links to get other pages to rank for search terms. Wikipedia does it, so I suspect it's a valid ranking/link-building technique.
12:15 pm on Feb 3, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 27, 2003
posts: 1642
votes: 0

I always post large chunks of content slowly. I have a newish site I'm still releasing content to; I have 8000 entries for it and am at 2500 live so far after 5 months (I really slacked off over Christmas - must get back on it).
Post 'em one by one and look at each individually to see what linking it needs. Tedious, but it returns the most value.
10:47 am on Feb 4, 2010 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 31, 2006
posts: 69
votes: 0

One comment on this: the site: operator can return some funky results. In essence, some processing goes on when Google determines what to return, and the results may not be indicative of what Google really sees/knows about your site.

It's an interesting idea, but it might be unreliable as Google degrades the site: operator, just as they have link: in the past. If you'd like to see what I'm talking about, compare the site: operator's count against the number of pages indexed from your XML sitemaps in GWT. The variance can often be in the high multiples.
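To put a number on that variance, you can count the URLs in your XML sitemap yourself and compare against what the site: operator reports (the latter read off by hand, since there's no API for it). A minimal sketch with an invented three-URL sitemap:

```python
import xml.etree.ElementTree as ET

# Namespace used by standard sitemap files (sitemaps.org protocol).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_url_count(xml_text):
    """Count <url> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url"))

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/a</loc></url>
  <url><loc>http://example.com/b</loc></url>
  <url><loc>http://example.com/c</loc></url>
</urlset>"""

submitted = sitemap_url_count(sample)
site_operator_count = 1  # whatever site:example.com showed, entered by hand
variance = submitted / site_operator_count
```

If the ratio is well above 1, the site: operator is showing you far less than what you submitted.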
9:55 pm on Feb 4, 2010 (gmt 0)

New User

5+ Year Member

joined:Jan 19, 2010
posts: 10
votes: 0

I don't think he's wasting time. I think this is a brilliant idea.
I do agree he should release the articles maybe 50 at a time, in distinct periods.

Famous story: a man sees an archer in the woods, his quiver empty, resting at a tree. The man sees hundreds of arrows embedded in the trees, all perfect bullseyes. "How did you hit every bullseye?" he asks in amazement. The archer answers, "Simple: I fired the arrows, then drew the bullseyes around them!"

Let Google draw the bullseyes on the trees, then stick your arrows (relevant keyword links) into them!

I look forward to seeing the results of Wheel's test. Please keep us informed.

8:30 pm on Feb 5, 2010 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
votes: 170

I'm wary of the accuracy of doing that for pages that are not well established. Maybe give it a try with Bing and Yahoo for a second and third opinion, because they seem to focus on content a little better. Google is heavily weighted by links, sometimes ranking a site with zero content, just 404s, meaning that Google is overlooking content altogether for some sites and ranking solely on links.
7:01 pm on Feb 11, 2010 (gmt 0)

Preferred Member

10+ Year Member

joined:Feb 18, 2003
votes: 0

I think your biggest flaw is assuming all 5000 pages are going to get indexed [right away]. When you run your site: commands, you could have hundreds of pages that might be best for getting ranked but just haven't been indexed yet.
7:27 pm on Feb 11, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member wheel is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 11, 2003
votes: 11

I agree with the establishment/time issue. I'm going to do this after a period of time. And for new content going forward, I may drop the links in right from the start.
6:18 am on Mar 18, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:July 29, 2007
votes: 64

A better approach would be to rank all of your own pages in order of potential traffic for their main keyword phrase (which is hopefully part of the title).

Next, identify pages which compete against each other for the same term and make sure to link to the primary article from each of the extra articles (Google will only rank one of them for any given term).

And for good measure, have the pages least likely to receive traffic (because the term is obscure or not searched for often) link to related, more highly targeted pages.

You'll be well on your way with just this, adjust fire with GWT as necessary.
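The sorting and consolidation steps above could be sketched like this. The page data, traffic estimates, and the "obscure" threshold are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical pages: each with its main keyword and an estimated monthly
# search volume for that keyword (numbers invented).
pages = [
    {"url": "/widget-guide", "keyword": "widget", "est_traffic": 900},
    {"url": "/widget-notes", "keyword": "widget", "est_traffic": 120},
    {"url": "/widget-extra", "keyword": "widget", "est_traffic": 40},
    {"url": "/rare-sprocket", "keyword": "left-handed sprocket", "est_traffic": 5},
]

def plan_links(pages, obscure_below=10):
    """Return (source_url, target_url, anchor_keyword) tuples: extra pages
    competing for a term link to that term's primary page, and obscure
    pages link out to the most highly targeted page overall."""
    by_kw = defaultdict(list)
    for p in sorted(pages, key=lambda p: p["est_traffic"], reverse=True):
        by_kw[p["keyword"]].append(p)
    links = []
    for kw, group in by_kw.items():
        primary = group[0]  # highest-traffic page owns the term
        for extra in group[1:]:
            links.append((extra["url"], primary["url"], kw))
    top = max(pages, key=lambda p: p["est_traffic"])
    for p in pages:
        if p["est_traffic"] < obscure_below and p is not top:
            links.append((p["url"], top["url"], top["keyword"]))
    return links

links = plan_links(pages)
```

The "link obscure pages to better-targeted ones" rule is simplified here to point at the single top page; in practice you'd pick a genuinely related target.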

Now, care to tell me how you can go from 30 to 5000 articles that quickly? I write a couple per night, max, so it would take me over a decade!
6:36 am on Mar 22, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2004
votes: 0

Good suggestions, JS. I use this approach as well.

However, I pursue the research from a niche-vertical sense and research 'top down'.

Then articles within that vertical support 'upward' using breadcrumbs.

If you are able to sit down with a good programmer and brainstorm the concept of 'link/search keys', you can develop a system for your articles that:

1. Allows any article to reference other articles by keyword.

2. Allows you to setup potential anchor text for any given keyword.

3. Allows the system to randomly 'pull' one variation and permanently write it, each time said article is referenced from any piece of content.

4. In admin, for each article, shows you potential matches 'in content' for any article you publish, and lets you edit, delete, or add your own.

This way, articles interlink automatically in a very logical way (since you set the relevance and anchor-text variations), yet the logic and programming take care of the heavy lifting :)
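A toy version of that 'link key' logic, assuming the simplest possible storage (in-memory dicts; all names and data are my own invention): a keyword maps to a target article and a set of anchor-text variations, and the first time a given source article references the keyword, one variation is pulled at random and frozen, per step 3.

```python
import random

# Step 1: keyword -> the article it should reference (illustrative data).
targets = {"widget": "/articles/widget-guide"}
# Step 2: potential anchor-text variations per keyword.
anchors = {"widget": ["widget", "best widgets", "widget guide"]}
# Step 3's "permanently write": (source_url, keyword) -> chosen anchor.
written = {}

def link_for(source_url, keyword, rng=random):
    """Return the HTML link a source article should use for a keyword,
    choosing an anchor variation at random on first use and then reusing
    that same variation forever after."""
    key = (source_url, keyword)
    if key not in written:  # pull once, then freeze
        written[key] = rng.choice(anchors[keyword])
    return f'<a href="{targets[keyword]}">{written[key]}</a>'

first = link_for("/articles/intro", "widget")
again = link_for("/articles/intro", "widget")
# The chosen variation is permanent per (source, keyword) pair.
```

The admin-side match suggestions (step 4) would sit on top of this: scan each new article's text for known keywords and propose `link_for` calls for the editor to approve.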
