
Google SEO News and Discussion Forum

Experiments in keyword rich links to Home
CainIV




msg:3756887
 6:21 pm on Oct 1, 2008 (gmt 0)

Hi guys, I am doing an interesting experiment on two of my more throwaway domains. The experiment is designed to determine more about how linking to the homepage affects rankings. The testing involves various controls - linking to the root domain from the nav only using 'home', linking from the nav using the 'main keyword', linking from the nav using 'variations' of the keyword, linking from content only (while the nav links say 'home') to home using 'keywords', etc.

First, I should mention some points about the domain.

4 years old
Owned by me
Dedicated IP
Canonical issue handled ('condomized')
HTML only
Ranks top 5 in Google.com for main, second and third keyword phrases.
Total of 90 pages, all unique content (written by me)

Testing was done over a 3 month period, with grace periods in between testing.
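
For anyone wanting to replicate this kind of controlled test, here is a minimal sketch of one way the variants and observed rankings might be logged. Everything in it - file name, field names, dates and rank values - is hypothetical, chosen only to mirror the controls described above, not my actual data.

```python
import csv
from datetime import date

# Hypothetical log: one row per test window, recorded once Google has
# re-cached the changed pages. All values below are made up.
FIELDS = ["date", "variant", "pages_changed", "anchor_text", "observed_rank"]

observations = [
    {"date": date(2008, 7, 1), "variant": "nav 'home' only",
     "pages_changed": 90, "anchor_text": "home", "observed_rank": 5},
    {"date": date(2008, 8, 1), "variant": "nav main keyword",
     "pages_changed": 90, "anchor_text": "main keyword", "observed_rank": 65},
    {"date": date(2008, 9, 1), "variant": "in-content, top 10 pages",
     "pages_changed": 10, "anchor_text": "main keyword", "observed_rank": 3},
]

with open("linking_experiment_log.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(observations)
```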

Here is what I have found so far. It might tell us a little about the threshold and re-ranking filters.

1. Linking home from every page in content using the same keyword caused a 6-page drop in rankings.

2. Linking home using keyword in nav on all pages caused the same drop.

3. Linking home from every page in content using variations caused a 3-page drop.

4. Linking home from the first 10 pages listed in google.com for site:domain.com/* brought increased ranking (from 5th to 3rd)

What is really interesting is that I have gotten this down to the 'by page' factor. When I *slightly* cross the threshold and add links to two extra pages, and then wait until they are cached, I tip the scales and drop to page 6.

What is further interesting is that linking home from content using variations of keywords WAS quite effective to a point, after which the site plummeted.

As well, this might point to a 'hard line' being crossed in terms of threshold: at one point I had the website flipping back and forth between position 4 and 51-60 for the same keyword every second day.
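
Purely as an illustration of that 'hard line' idea (and not a claim about how Google actually works), the behaviour can be sketched as a simple step function: below some unknown, site-specific count of keyword-anchored links to home the ranking holds, and at or above it the site drops into the filtered band. Every number here is a placeholder.

```python
def predicted_rank(keyword_links_to_home, threshold=12, base_rank=5, filtered_rank=55):
    """Toy threshold model of the behaviour described above.

    The threshold value is unknown and site-specific; 12 is a placeholder.
    Below it the site keeps its base rank; at or above it the site falls
    into the 'filtered' band (here, around position 55, i.e. page 6).
    """
    return base_rank if keyword_links_to_home < threshold else filtered_rank

# Adding keyword links to "two extra pages" is enough to tip the scales:
print(predicted_rank(11))  # 5  -- still page 1
print(predicted_rank(13))  # 55 -- roughly page 6
```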

My next test will be about trying to -950 the website by being ridiculously deliberate in nav linking, and then seeing if I can reverse the results by removing those links (and how long it takes for trust to be reinstated to the website).

Any thoughts?

 

spadilla




msg:3757001
 8:50 pm on Oct 1, 2008 (gmt 0)

Linking home from the first 10 pages listed in google.com for site:domain.com/* brought increased ranking (from 5th to 3rd)

I was wondering about this statement. Would this only be useful on the first 10 results in this query? Are the first 10 pages shown in the results more important than 11-20 etc.? Are those top 10 pages regarded as more trustworthy or important by Google?

Thanks a bunch for sharing your experiment, I think it will be interesting to see the end result.

Robert Charlton




msg:3757055
 10:15 pm on Oct 1, 2008 (gmt 0)

CainIV - Thanks for sharing. Several thoughts come to mind.

One is that Google might be reacting to the change rather than to just the keyword situation. After the drop, what happened when you removed the keyword phrase and changed it back to "home"?

I should note that tedster has mentioned numerous times in the forum that you might be OK with an anchor text home page link if you'd had it from the start... but that he'd seen that a change might well cause a drop.

Secondly, I've observed a similar effect when linking from paragraph content within pages to other pages (apart from home)... a rise with the first, but then a drop if I added too many. If I used a link from within paragraph content that was the same as nav anchor text, the drop was likely to happen sooner.

I haven't been as well-organized as you've been about keeping records. Yes, though, a drop of roughly three pages does seem about right. In some cases, if I waited long enough, the page crept or jumped back up.

Again, I'm wondering what happens if you go back to the old link. Does Google remember original status, or do they treat it as another round of changes?

Even as I'd written about this, btw, there was one thing gnawing at me that didn't make sense... which is, how could you be penalized for global navigation, something which a great many sites have?

Might there be other factors that accompanied your changes? Eg, had you already reached a threshold of optimization (which is what I'd posited), where anything further was overoptimization, unless supported by a sufficiently diverse combination of relevant external inbounds?

CainIV




msg:3757072
 10:51 pm on Oct 1, 2008 (gmt 0)

I was wondering about this statement. Would this only be useful on the first 10 results in this query? Are the first 10 pages shown in the results more important than 11-20 etc.? Are those top 10 pages regarded as more trustworthy or important by Google?

Excellent question. I was positing that those pages were what Google deems the most powerful inner pages of the website. I will also experiment with other sets as well once I can put some kind of certainty to the results I have seen already.

Hi Robert.

Specific changes I made that tipped the scales, when reversed, brought the website back to where it was. This brings us into the realm of a decaying trust value for changes. If a trusted website makes sudden changes and drops, and those changes are immediately reversed, does the website always bounce right back? In my research so far, the answer is yes. I have yet to confirm it, but one of my tests will be leaving the over-optimization changes in place for a longer period and then seeing how close the site gets back to its original position.

In fact, the phenomenon I discussed was precisely about reaching such a fine threshold point that Google seemed to want the site both in and out of the 'filter' dataset, which is why I believe I bounced back and forth between 60 and 5 until the original site without keywords was reinstated, whereupon the rankings went back to 5 and stopped fluxing (out to 60).

I was thinking along those same lines - is it that ANY over-optimization is simply enough to push a site out, or is it in fact that specific changes do more damage more quickly than others?

I have the benefit of being able to test on these two websites, so the next round of testing will be:

1. In-content primary keyword linking from the top 10 pages to home VS the effects of in-nav linking to home using the primary keyword.

2. Thoroughly (ridiculously) overlinking internal pages to each other while not linking to the root domain with any term other than 'home'.

3. Not related, but I am also going to change the domain registration to a friend's name, in an entirely different town, and see what changes occur.

Essentially, I am trying to define the current rough rule set (although it fluxes daily) to get a better sense of what is truly current on a day-to-day basis.

Robert Charlton




msg:3757121
 1:29 am on Oct 2, 2008 (gmt 0)

CainIV - Quick thoughts about some factors that may skew your tests. The effects you're testing, as I've observed them on inner pages, are related both to how competitive a query is, and to how "good" your inbound links are. Some terms are competitive enough that they absolutely depend on inbound links... and some don't.

There are also, of course, onpage factors, which have gotten increasingly complex... but my feeling is the algo now includes an overall statistical view of a page.

Some measure of user satisfaction, as I've mentioned before, also appears to be part of the algo... and again that seems to vary by how competitive a phrase is, where it is in the serps, and apparently also the market area.

g1smd




msg:3757208
 6:38 am on Oct 2, 2008 (gmt 0)

Just a quick side-note for readers new to the topic...

Make sure your links to the root page are pointing to "/" or to "www.domain.com/". That is, make sure the links are not pointing to "/index.html" or any such named file.
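
One way to audit a small static site for this is to scan the HTML files for anchors whose target is a named index file rather than the root. A rough sketch, assuming a local copy of the site and the BeautifulSoup library (neither of which is part of g1smd's note):

```python
import glob
import os

from bs4 import BeautifulSoup  # pip install beautifulsoup4

# "site/" is a hypothetical local copy of the pages; adjust to your layout.
for path in glob.glob("site/**/*.html", recursive=True):
    with open(path, encoding="utf-8") as fh:
        soup = BeautifulSoup(fh, "html.parser")
    for a in soup.find_all("a", href=True):
        target = a["href"].split("#")[0].split("?")[0]
        if os.path.basename(target) in ("index.html", "index.htm"):
            print(f"{path}: links to '{a['href']}' -- point it at '/' instead")
```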

idolw




msg:3757224
 7:25 am on Oct 2, 2008 (gmt 0)

3. Linking home from every page in content using variations caused a 3-page drop.

Do you mean you were linking to the home page from every page, but with various anchor texts?

rise2it




msg:3757293
 10:08 am on Oct 2, 2008 (gmt 0)

CainIV, these are some great experiments - thanks for sharing the results you've seen so far.

When you changed the link text to the homepage of those two sites on ALL pages, how many pages did each of those sites have?

CainIV




msg:3757906
 4:57 am on Oct 3, 2008 (gmt 0)

Hi idolw:

"do you mean you were linking to home page from every page but with various anchor texts?"

Yes, they were variations of the primary, second and third keyword phrases.

Hi rise2it:

"When you changed the link text to the homepage of those two sites on ALL pages, how many pages did each of those sites have?"

Both sites have approximately 90 pages. I never added any content because I want to keep as much control as possible and avoid skewing the results.

rise2it




msg:3757947
 6:50 am on Oct 3, 2008 (gmt 0)

Thanks CainIV,

I'm working on doing the same thing, changing all the links (several versions) from every page linking back to the homepage of a site of similar size.

When everything gets re-crawled in a week or so I'll come back and post my results here for comparison.

econman




msg:3758159
 2:40 pm on Oct 3, 2008 (gmt 0)

For several years we have used the site name (which is made up of keywords) in the breadcrumb navigation links from every page back to the home page.

Does anyone have a feel for whether this could be a problem, or whether the apparent "overoptimization" penalty being discussed in this thread is limited to new sites, and/or recent changes to an existing site?

CainIV




msg:3758224
 4:56 pm on Oct 3, 2008 (gmt 0)

Hi rise2it. Keep in mind that the experiment I am doing is about intentionally 'walking the line' to redefine where the line actually is. I wouldn't suggest making large scale internal linking changes like that for successful ranking websites :)

The experiment is simply meant to shed some light on thresholds that are out there now (this changes daily, and is different for each website)

"Does anyone have a feel for whether this could be a problem, or whether the apparent "overoptimization" penalty being discussed in this thread is limited to new sites, and/or recent changes to an existing site?"

This is definitely what I am trying to find out. It seems, as tedster pointed out earlier, that websites that historically started with those types of internal anchors pointing home got over that assessment hump. Other websites that add it later in the lifecycle of the website often get hammered in the results.

What I am trying to do is develop theories as to the 'why' of all of these thresholds.

whitenight




msg:3758460
 11:31 pm on Oct 3, 2008 (gmt 0)

4. Linking home from the first 10 pages listed in google.com for site:domain.com/* brought increased ranking (from 5th to 3rd)

Hmm, this is a huge discovery if one extrapolates it fully.
The very vague question I have (so I don't give away how powerful this info could be) is:
Is it a ratio of total pages or just a max of 10?

idolw




msg:3759506
 7:05 am on Oct 6, 2008 (gmt 0)

Has anyone tried this on other internal pages, and not just the home page?

Lorel




msg:3778693
 6:35 pm on Nov 2, 2008 (gmt 0)

2. Linking home using keyword in nav on all pages caused the same drop (6 pages).

Hmmmmm. Just realized I did this in the footer months ago. I just removed it and will report back in a few weeks.

CainIV




msg:3778709
 7:15 pm on Nov 2, 2008 (gmt 0)

Hi Lorel. I have found that if a website is launched with those footer to home keyword links, then it is almost as if Google ignores those. However, if they are added or removed as the website begins to rank, in general the rankings go south.

Be careful when adding or removing those footer links! :)

My site is a domain that I can afford to test with and do not rely on it for any income at the moment.

idolw




msg:3779043
 10:08 am on Nov 3, 2008 (gmt 0)

Hi Lorel. I have found that if a website is launched with those footer to home keyword links, then it is almost as if Google ignores those. However, if they are added or removed as the website begins to rank, in general the rankings go south.

Did you notice that only with links to the home page, or with links to other internal pages too?

CainIV




msg:3779318
 5:05 pm on Nov 3, 2008 (gmt 0)

Hi Idolw. So far I only notice it with links to home. I will be setting up a separate experiment that tests this too.

BradleyT




msg:3779379
 6:55 pm on Nov 3, 2008 (gmt 0)

Quick question and I assume the answer is NO. Do you have a header image link to the homepage that shows up (in HTML content) before any text link to the homepage?

[edited by: BradleyT at 6:56 pm (utc) on Nov. 3, 2008]

CainIV




msg:3779387
 7:06 pm on Nov 3, 2008 (gmt 0)

Hi Bradley. For this particular website I did, but that control stayed in place throughout the entire test (in other words, the website stayed exactly intact throughout the process).

Best,
Todd

wingslevel




msg:3779417
 7:34 pm on Nov 3, 2008 (gmt 0)

I have long had the theory (alas, no proof) that link text is viewed in its entirety - that is to say, internal and external links together. Let's say the home page has 50 IBLs, with maybe 60% using the main keyword combo as anchor. If you added 10 internal links with that anchor it is probably OK - but add 200 and now the ratio of inbound to internal gets skewed and trips a filter.

I think the issue is not so much what % of total pages on the site use anchor to link to home, but what % vs inbounds....
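
Working through wingslevel's own hypothetical numbers makes the idea concrete. This is just the arithmetic behind the theory, not a known Google formula:

```python
# wingslevel's hypothetical figures: 50 inbound links, ~60% of them using the
# main keyword combo as anchor text.
inbound_total = 50
inbound_keyword = round(inbound_total * 0.60)  # 30 keyword-anchored inbounds

def internal_to_inbound_ratio(internal_keyword_links):
    """Keyword-anchored internal links vs. keyword-anchored inbound links."""
    return internal_keyword_links / inbound_keyword

print(internal_to_inbound_ratio(10))   # ~0.33 -- "probably ok"
print(internal_to_inbound_ratio(200))  # ~6.67 -- heavily skewed toward internal
```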

Robert Charlton




msg:3796775
 12:05 am on Nov 30, 2008 (gmt 0)

...ratio of inbound to internal...

wingslevel - An intriguing idea, but I haven't been able to get my head around how they'd allow for an extremely lopsided ratio while the site is getting known and still building its inbounds. A time-based requirement of some sort would probably also be unworkable.

Be careful when adding or removing those footer links!

CainIV - Could you comment about the "removing" part of this statement. I've been looking at a site where I'm thinking about removing the footer links, because they're a little too specific. The drops you report are all for adding links. What would your thoughts be about removing them?

youfoundjake




msg:3796845
 5:35 am on Nov 30, 2008 (gmt 0)

As a result of reading this, and a nudge from g1smd, I went ahead and changed the link from index.htm to the FQDN, although I do have a 301 in place as well.

So if I am reading this correctly, if a site has been up for a while, I shouldn't adjust the anchor text to use synonyms? If a site is not performing well yet, could adjusting the anchor text help it more than hurt it?

Next I get to work on content negotiation. weee....

Quadrille




msg:3796960
 1:04 pm on Nov 30, 2008 (gmt 0)

Interesting experiment!

CainIV, as the one closest to this; did any of your results surprise you or disappoint you?

Do you feel confident enough to make any recommendation as a result?

Do your findings suggest any follow up research to clarify your interpretation of the results?

I'm a born sceptic, but I'm impressed with your set up; so much SE research makes no attempt at controls, for example!

I am concerned a little with the suggestion by some that it's difficult to be sure if it's 'the changed items' having the effect, or 'the change itself'. Do you think your reversions and the restoration of rankings have answered that point?

And, of course, we'd need to replicate before being too definite.

Thanks for doing it - and sharing it.

Marcia




msg:3797496
 2:50 pm on Dec 1, 2008 (gmt 0)

anyone tried this problem on other internal pages and not just home page?

I won't usually cross-post to two threads, but in this case posting one from the yo-yo thread is appropriate, since it applies to excessive anchor text and interior pages on a site rather than the homepage, as was asked about:

I've just seen a site "yo-yo" up to #15-16 from languishing down around 60-ish (and lower) for a long time, after it had been at #15-20 for a couple of years. It's unmistakably because of an offending factor being removed from the site, and it's been holding fast at its current 16-ish position for a couple of weeks now.

It was an internal anchor text issue that I fought long and hard to have changed; and when it finally was, it took around a week (give or take a little, not 100% certain since I didn't check daily) to rebound.

Clarification:

Excessive anchor text had been added preceding the drop - and it was an interior page, not the homepage, that was affected.

Excessive internal anchor text started to appear to be an issue around the time of the Florida update, and IMHO it's one of the primary over-optimization factors to look at; I'm still convinced of it, and it's not at all uncommon among DIY SEO'ing, from what I've seen.

That site ranks super-fine for its "star" keyword phrases - for the homepage - but does not rank for interior pages and other keyphrases that are anywhere near halfway competitive. I remain convinced that it's because of site-wide interior linking: a mega-menu (fully meshed navigation with 105+ links on the homepage), additional instances of problematic anchor text on pages, more than one link to the same page(s) with different anchor text in the same menu and on the same pages, and way excessive keyword repetition, to boot.

[edited by: Marcia at 3:01 pm (utc) on Dec. 1, 2008]

Quadrille




msg:3797506
 3:11 pm on Dec 1, 2008 (gmt 0)

Yo-Yo Effect - Observations and Understandings [webmasterworld.com]

Marcia




msg:3797551
 4:14 pm on Dec 1, 2008 (gmt 0)

I am concerned a little with the suggestion by some that it's difficult to be sure if it's 'the changed items' having the effect, or 'the change itself'. Do you think your reversions and the restoration of rankings have answered that point?

Quadrille, per the publicly published historical factors we're privy to, changes (in anchor text) themselves are tracked and can have an effect - though we can only guess by how much, and/or what the thresholds might be. However, in the case I mentioned I have no doubt it's the number of occurrences. Three times per page of identical anchor text to a page is overboard, and 4 times is definitely overkill. And this wasn't our normal yo-yo up-down-up-down; this was over a protracted period of time within well observable parameters.
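
That 'three is overboard, four is overkill' observation is straightforward to check for on your own pages: count how often the same anchor text points at the same URL within one page. A minimal sketch, assuming BeautifulSoup and a locally saved copy of the page (both assumptions, not part of the post above):

```python
from collections import Counter

from bs4 import BeautifulSoup  # pip install beautifulsoup4

def repeated_anchor_pairs(html, limit=3):
    """Return (anchor text, href) pairs appearing `limit` or more times."""
    soup = BeautifulSoup(html, "html.parser")
    pairs = Counter(
        (a.get_text(strip=True).lower(), a["href"])
        for a in soup.find_all("a", href=True)
    )
    return {pair: count for pair, count in pairs.items() if count >= limit}

# Hypothetical usage on a locally saved page:
with open("page.html", encoding="utf-8") as fh:
    for (text, href), count in repeated_anchor_pairs(fh.read()).items():
        print(f"'{text}' -> {href}: {count} identical links on this page")
```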

[edited by: eelixduppy at 8:46 pm (utc) on Dec. 2, 2008]

CainIV




msg:3797605
 5:40 pm on Dec 1, 2008 (gmt 0)

Hi guys:

Wingslevel -
I think the issue is not so much what % of total pages on the site use anchor to link to home, but what % vs inbounds....

I would agree, and would add that if I am interpreting my results correctly then this means that:

a) Linking home too many times poses a threshold penalty, especially when this element is changed and was not historically present from the onset.

b) Removing or adding these types of links can trigger a performance hit on keywords for the domain.

c) Linking from content to the homepage using keyword rich links was much 'fuzzier' and led to much less of a penalty.

d) Linking to home using the primary keyword in content from the top 10 site: pages, while keeping all other pages historically the same, created the biggest increases overall.

Robert C -
The drops you report are all for adding links. What would your thoughts be about removing them?

I would think the same process works for removing footer links, especially if they are keyword rich, and most especially if any of them link to the homepage. My suggestion would be to remove links one at a time, starting with the least important pillar pages in the website, then evaluate rankings / traffic from there.

Hi Quadrille -
CainIV, as the one closest to this; did any of your results surprise you or disappoint you?

The biggest result that surprised me was that linking from the top ten pages in content to the homepage had no negative effects on ranking, only positive effects, and did not trigger any penalty or drop of the kind I saw when adding sitewide, footer, nav, or higher amounts of bulk links pointing home.

Do you feel confident enough to make any recommendation as a result?

I am certainly comfortable enough to suggest that linking naturally from content on the pillar or support pages of the website to the home page, and across to other related topical themes, may be something you should consider IF it also makes sense for the visitor. What I am not suggesting is for members to arbitrarily add links JUST for the sake of ranking improvements.

I am also comfortable in suggesting that overuse of keyword links to prominent pages in the footer and nav sections can quickly cause an overall drop in rankings, as this was the case with my control website.

As well, it is important to note that in response to what Marcia commented on, this website did experience the Yo-Yo effect when links were changed, especially when the number of links added, and the location of those links over time, conveyed an unnatural profile. This might suggest that the Yo-Yo effect is a phenomenon which can describe a tipping point in trust that is not fully realized.

I was around for Florida, but I can clearly see both sides of the story here.

Historical information, patents and insights are always important in our work. However, the biggest piece of the puzzle that many webmasters do not have access to is live testing on a ranking website. Therefore, it's very difficult to justify experimental changes when weighed against the risk of dropped rankings.

I think for the sake of clarity my controlled experiment simply suggested some of what we already know:

Rankings are strongly influenced by changes to elements of a website which historically have not changed often.

Reverting changes to long-standing elements of an aged website, where those changes have been causing a filter and loss of rankings, can return the website to better rankings within a surprisingly short amount of time.

Keyword rich nav and footer links are dangerous to mess around with if they have been there historically and the website currently has good rankings on most pages.

The more important the pillar page - be it based on linking hierarchy, age or citation by other websites - the more pronounced the changes were.

Finally, linking from content to the top ten pillar pages in the website caused improvements in rankings, and these links seemed to suggest a 'fuzzy logic' about how Google views editorial links as compared with nav and template links.

Feel free to suggest any sweeping tests you want me to make on my domain for the purpose of further experiments.

tootricky




msg:3798112
 9:26 am on Dec 2, 2008 (gmt 0)

A very interesting test.

I have seen results similar to yours on a site that had a footer link to the homepage on each page of the website, with optimised anchor text. This link was added about 2 months ago and the site dropped into the 70s for that keyword. The template link was removed and the site bounced back to page 1 again.

I would love to see this test expanded to include variations with the nofollow attribute, to see whether the value of anchor text is COMPLETELY nullified by it or not.

Shaddows




msg:3798137
 10:08 am on Dec 2, 2008 (gmt 0)

I would love to see this test expanded to include variations with the nofollow attribute, to see whether the value of anchor text is COMPLETELY nullified by it or not.

Agree. If you started with no site-wide footer, then added one that was nofollowed, does it
1) Have precisely zero effect
2) Have the same (negative) reaction as adding a normal link
3) Have a different reaction.

Alternatively, if you had a site-wide link, and nofollowed it
1) Precisely zero effect
2) Same negative results as removing link
3) Something else

Finally, the effect of a two-stage addition/subtraction of a site-wide link (giving time to settle down between), i.e.
normal -> nofollow -> removed
none -> nofollow -> normal
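
Written out as a small planning sketch (the stages come straight from the lists above; the test names and the settling period are placeholders):

```python
# Each test is a sequence of states for the site-wide footer link, applied one
# stage at a time with a grace period between changes so each effect can be
# observed separately.
TEST_MATRIX = {
    "add nofollowed footer":    ["none", "nofollow"],
    "nofollow existing footer": ["normal", "nofollow"],
    "two-stage removal":        ["normal", "nofollow", "removed"],
    "two-stage addition":       ["none", "nofollow", "normal"],
}

SETTLE_WEEKS = 3  # hypothetical grace period between stages

for name, stages in TEST_MATRIX.items():
    print(f"{name}: {' -> '.join(stages)} (wait ~{SETTLE_WEEKS} weeks between stages)")
```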

Note: a thread about the rel=nofollow attribute has been split
off. Please continue any discussion of that second topic here:

[webmasterworld.com...]

This could also shed light on SERP effects due to 'change' (non-sameness) as opposed to THE CHANGE IMPLEMENTED.

So Cain, any chance I can take you up on your unbelievably kind offer?
Feel free to suggest any sweeping tests you want me to make on my domain for the purpose of further experiments.

[edited by: tedster at 10:27 am (utc) on Dec. 10, 2008]
