Link Development Forum

Going all wikipedia with my links
wheel - msg:4073020 - 8:50 pm on Feb 2, 2010 (gmt 0)

I'm hopefully launching a new site this week. Part of the launch involves taking a site from 21 pages to 5,000 pages of content. The content is all good stuff, not what you're thinking.

Because I've got so much content, it's hard for me to sort through it to build internal links. My intention is to let the content get indexed first. Then I'm going to search my site using Google for the juicy terms I want to rank for, and order the pages by whatever Google feels is the order of relevance.

Then I'll take a few of those top-ranking pages for the term, find the term on those pages, and link to another page on my site that I want to rank for that term. Maybe even link to the homepage in some instances.
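
A minimal sketch of that workflow, approximated locally rather than by scraping Google's results (which is fragile and against their terms): score your own pages by how heavily each uses the target term, then treat the top scorers as link sources. The page URLs and contents below are made up.

import re

pages = {
    "/articles/widget-basics": "Widget basics: a widget is ... widget ... widget",
    "/articles/widget-repair": "How to repair a widget when it breaks ...",
    "/articles/gadgets":       "Gadgets are not widgets, but ...",
}

def term_score(text, term):
    """Crude relevance proxy: occurrences of the term per 1,000 words."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == term.lower())
    return 1000.0 * hits / len(words)

def suggest_link_sources(term, target_url, top_n=5):
    """Return the pages most 'about' the term, excluding the target itself."""
    ranked = sorted(
        ((url, term_score(text, term)) for url, text in pages.items() if url != target_url),
        key=lambda item: item[1],
        reverse=True,
    )
    return [url for url, score in ranked[:top_n] if score > 0]

# Pages that should carry a link to the page you want to rank for "widget"
print(suggest_link_sources("widget", "/widget-landing-page"))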

Now I'm no PR distribution wizard. Does this seem like a good way to approach this? A different technique I should use? Any limits I should be careful about?

 

KHWeb - msg:4073104 - 11:10 pm on Feb 2, 2010 (gmt 0)

I'm not entirely clear on what you are trying to achieve, but what you do depends on your SEO and SMO strategy and the time you have.

The only thing I would watch out for in your case is that you might be better off creating your site structure before letting Google index the new pages.

A simple example of how I would go about building links to your 5000+ pages and structuring the site:

Let's say that you have 100+ pages that talk about "widget," and that this is a juicy keyword too.

1. Create a category/tag page that contains teasers of the 100+ pages related to widget. Optimize this page so that it has proper density for "widget."

2. Place a link to the widget category page on all of the 100+ pages (with "widget" in the anchor text, of course).

3. Place the widget category page one or two clicks away from the homepage... for PR. This way your individual pages are no more than three clicks away from your homepage.

You can do a web search for flat site architecture to find out more.
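
A rough sketch of that hub-and-spoke structure, assuming a simple list of article records (the URLs, titles, and tag field are hypothetical; in practice this would live in your CMS):

articles = [
    {"url": "/widgets/blue-widgets",  "title": "Blue Widgets",  "teaser": "Everything about blue widgets...", "tags": ["widget"]},
    {"url": "/widgets/widget-repair", "title": "Widget Repair", "teaser": "Fixing a broken widget...",        "tags": ["widget"]},
    {"url": "/gadgets/intro",         "title": "Gadget Intro",  "teaser": "Gadgets 101...",                   "tags": ["gadget"]},
]

def build_category_page(tag):
    """Step 1: a category page made of teasers for every article with this tag."""
    items = [a for a in articles if tag in a["tags"]]
    body = "\n".join(
        f'<h2><a href="{a["url"]}">{a["title"]}</a></h2><p>{a["teaser"]}</p>' for a in items
    )
    return f"<h1>{tag.title()}</h1>\n{body}"

def category_link(tag):
    """Step 2: the link each member article carries, keyword in the anchor text."""
    return f'<a href="/category/{tag}">{tag}</a>'

# Step 3 is structural: link /category/widget from the homepage (or one hop
# below it) so every article stays within ~3 clicks of the home page.
print(build_category_page("widget"))
print(category_link("widget"))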

wheel - msg:4073112 - 11:20 pm on Feb 2, 2010 (gmt 0)

Thanks for the detailed response. I appreciate the time you put into that.

The nature of the content has dictated the site architecture. The individual pages are 'bound together' in groups. Think of it as a collection of discrete articles: these 10 pages belong to article A, these 10 pages to article B. And the various articles don't have a whole lot in common thematically, other than fitting the overall theme of the site. So my architecture is roughly 'Articles', then a list of articles, then the individual pages.

What I'm trying to achieve is circulating PR, or something like that, using new internal links to get other pages to rank for search terms. Wikipedia does it, so I suspect it's a valid ranking/link-building technique.

leadegroot - msg:4073405 - 12:15 pm on Feb 3, 2010 (gmt 0)

I always post large chunks of content slowly. I have a newish site I am still releasing that I have 8,000 entries for; I am at 2,500 live so far after 5 months (I really slacked off over Xmas - must get back on it).
Post 'em one by one and look at them individually to see what linking they need. Tedious, but it returns the most value.
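
A small sketch of that drip-publishing idea: spread a backlog over a fixed daily quota so each entry can be reviewed for internal links before it goes live. The numbers are illustrative, not a recommendation.

from datetime import date, timedelta

def publish_schedule(total_entries, per_day, start=None):
    """Yield (publish_date, batch_size) pairs until the backlog is exhausted."""
    start = start or date.today()
    day = 0
    remaining = total_entries
    while remaining > 0:
        batch = min(per_day, remaining)
        yield start + timedelta(days=day), batch
        remaining -= batch
        day += 1

# e.g. 8,000 entries at 20 per day takes roughly 400 days
schedule = list(publish_schedule(8000, 20))
print(len(schedule), "days; last batch on", schedule[-1][0])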

RedCardinal - msg:4074215 - 10:47 am on Feb 4, 2010 (gmt 0)

One comment on this - the site: operator can return some funky results. In essence some processing goes on when Google determines what to return, and the results may not be indicative of what Google really sees/knows on your site.

It's an interesting idea, but it might be unreliable as Google degrades the site: operator, just as they have with link: in the past. If you'd like to see what I'm talking about, check the site: operator against the number of pages indexed from XML sitemaps in GWT. The variance can often be in the high multiples.
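
One way to eyeball that variance is to count the URLs in your own XML sitemap and set that against whatever figure the site: operator (or GWT) reports. A self-contained sketch, using a tiny inline sitemap in place of your real file:

import xml.etree.ElementTree as ET

# In practice you'd load your real file with ET.parse("sitemap.xml"); a tiny
# inline sitemap keeps this example self-contained.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/articles/a</loc></url>
  <url><loc>https://example.com/articles/b</loc></url>
  <url><loc>https://example.com/articles/c</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(xml_text):
    """Count <loc> entries in a standard (non-index) XML sitemap."""
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url/sm:loc", NS))

submitted = count_sitemap_urls(SITEMAP_XML)
reported_by_site_operator = 1   # whatever the site: query showed, entered by hand
print(f"sitemap: {submitted} URLs; site: operator reported {reported_by_site_operator}")
print(f"variance: {submitted / reported_by_site_operator:.1f}x")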

saerchengineman - msg:4074618 - 9:55 pm on Feb 4, 2010 (gmt 0)

I don't think he's wasting time. I think this is a brilliant idea.
I do agree he should release the articles maybe 50 at a time, in distinct batches.

Famous story: a man sees an archer in the woods, his quiver of arrows empty, resting at a tree. He sees hundreds of arrows embedded in the trees, all perfect bull's-eyes. "How did you hit every bull's-eye?" the man asks in amazement. The archer answers, "I simply fired the arrows, then drew the bull's-eyes around the arrows!"

Let Google draw the bull's-eyes on the trees, then stick your arrows (relevant keyword links) into them!

I look forward to seeing the results of Wheel's test. Please keep us informed.

Searchengineman

martinibuster - msg:4075134 - 8:30 pm on Feb 5, 2010 (gmt 0)

I'm wary of the accuracy of doing that for pages that are not well established. Maybe give it a try with Bing and Yahoo for a second and third opinion, because they seem to focus on content a little better. Google is heavily weighted by links, sometimes ranking a site with zero content, just 404s, meaning that Google is overlooking content altogether for some sites and ranking solely on links.

BradleyT - msg:4078568 - 7:01 pm on Feb 11, 2010 (gmt 0)

I think your biggest flaw is assuming all 5,000 pages are going to get indexed [right away]. When you do your site: commands, you could have hundreds of pages that would be best for ranking but just haven't been indexed yet.

wheel - msg:4078586 - 7:27 pm on Feb 11, 2010 (gmt 0)

I agree with the establishment/time issue. I'm going to do this after a period of time. And for new content going forward, I may drop the links in right from the start.

JS_Harris - msg:4100149 - 6:18 am on Mar 18, 2010 (gmt 0)

A better approach would be to rank all of your own pages in order of potential traffic for each page's main keyword phrase (which is hopefully part of the title).

Next, identify pages that compete against each other for the same term and make sure to link to the primary article from each of the extra articles (Google will only rank one of them for any given term).

And for good measure, have the pages least likely to receive traffic (because the term is obscure or not searched for often) link to related, more highly targeted pages.

You'll be well on your way with just this; adjust fire with GWT as necessary.
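
A rough sketch of that ordering, assuming each page record carries a main keyword and an estimated search volume (both fields are hypothetical stand-ins for whatever keyword data you have):

from collections import defaultdict

pages = [
    {"url": "/widgets/guide",   "keyword": "widget guide",            "est_traffic": 900},
    {"url": "/widgets/guide-2", "keyword": "widget guide",            "est_traffic": 150},
    {"url": "/widgets/obscure", "keyword": "left-handed widget shim", "est_traffic": 5},
]

# 1. Rank everything by traffic potential for its main keyword phrase.
ranked = sorted(pages, key=lambda p: p["est_traffic"], reverse=True)

# 2. Where several pages compete for the same term, pick one primary and have
#    the rest link to it.
by_keyword = defaultdict(list)
for p in ranked:
    by_keyword[p["keyword"]].append(p)

links = []
for keyword, group in by_keyword.items():
    primary, extras = group[0], group[1:]
    links += [(extra["url"], primary["url"], keyword) for extra in extras]

# 3. Point the low-traffic pages at the most highly targeted page overall.
top_target = ranked[0]["url"]
links += [(p["url"], top_target, ranked[0]["keyword"])
          for p in ranked if p["est_traffic"] < 50 and p["url"] != top_target]

for src, dst, anchor in links:
    print(f"{src} -> {dst}  (anchor: {anchor})")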

Now, care to tell me how you can go from 30 to 5,000 articles that quickly? I write a couple per night at most, so it would take me over a decade!

CainIV - msg:4102153 - 6:36 am on Mar 22, 2010 (gmt 0)

Good suggestions, JS. I use this approach as well.

However, I approach the research from a niche-vertical sense and research 'top down'.

Then articles within that vertical support 'upward' using breadcrumbs.

If you are able to sit down with a good programmer and brainstorm the concept of 'link / search keys', then you can develop a system for your articles that:

1. Allows any article to reference other articles by keyword.

2. Allows you to set up potential anchor text for any given keyword.

3. Allows the system to randomly 'pull' one anchor variation and permanently write it, each time, when referencing said article from any piece of content.

4. In admin, for each article, the logic can be set to 'show' you potential matches 'in content' for any article you publish, and allow you to edit or delete them (or simply add your own).

This way, articles interlink automatically in a very logical way (since you set the relevance and anchor text variations), yet the logic and programming take care of the heavy lifting :)
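
A bare-bones sketch of such a 'link / search key' system, covering points 1-3; all names are illustrative, and a real build would add the admin review step from point 4 before anything is written permanently:

import random
import re

# Points 1 and 2: each key maps a keyword to a target article plus the
# anchor-text variations allowed when linking to it.
link_keys = {
    "widget": {"target": "/articles/widget-guide",
               "anchors": ["widget", "widgets", "widget guide"]},
}

def interlink(article_html, chosen=None):
    """Point 3: find which anchor variations actually appear in the content,
    pick one at random, wrap its first occurrence in a link, and record the
    choice so re-running the job never swaps the anchor ('permanently write')."""
    chosen = chosen if chosen is not None else {}
    for keyword, key in link_keys.items():
        candidates = [chosen[keyword]] if keyword in chosen else key["anchors"]
        present = [a for a in candidates
                   if re.search(rf"\b{re.escape(a)}\b", article_html, re.IGNORECASE)]
        if not present:
            continue
        anchor = random.choice(present)
        chosen[keyword] = anchor
        pattern = re.compile(rf"\b{re.escape(anchor)}\b", re.IGNORECASE)
        article_html = pattern.sub(
            lambda m: f'<a href="{key["target"]}">{m.group(0)}</a>',
            article_html, count=1)
    return article_html, chosen

html, picks = interlink("<p>Choosing the right widget takes research.</p>")
print(html)
print(picks)  # persist this alongside the article so the anchor never changes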
