Google SEO News and Discussion Forum

Experiments in keyword rich links to Home
CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 6:21 pm on Oct 1, 2008 (gmt 0)

Hi guys, I am running an interesting experiment on two of my more throwaway domains. The experiment tries to determine more about how internal linking to the homepage affects rankings. The testing involves various controls: linking to the root domain from the nav only using 'home', linking from the nav using the 'main keyword', linking from the nav using 'variations' of the keyword, linking to home from content only (while the nav links say 'home') using 'keywords', etc.

First, I should mention some points about the domain.

4 years old
Owned by me
Dedicated IP
Canonicalized (www/non-www resolved)
HTML only
Ranks top 5 in Google.com for main, second and third keyword phrases.
Total of 90 pages, all unique content (written by me)

Testing was done over a 3-month period, with grace periods in between tests.

Here is what I have found so far. It might tell us a little about the threshold and re-ranking filters:

1. Linking home from every page in content using the same keyword caused a 6-page drop in rankings.

2. Linking home using keyword in nav on all pages caused the same drop.

3. Linking home from every page in content using variations caused a 3-page drop.

4. Linking home from the first 10 pages listed in google.com for site:domain.com/* brought increased ranking (from 5th to 3rd)

What is really interesting is that I have gotten this down to a 'by page' factor. When I *slightly* cross the threshold by adding links to two extra pages, and then wait until they are cached, I tip the scales and drop to page 6.
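(For anyone who wants to reproduce the counting side of this, here is a minimal sketch in Python using requests and BeautifulSoup; the example.com domain, keyword, and page list are placeholders, not my actual test site.)

```python
# Hypothetical audit sketch: count how many pages link to the homepage
# with an exact-match keyword anchor. Domain, keyword, and page list are
# placeholders; adjust for a real site.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

HOMEPAGE = "http://www.example.com/"
KEYWORD_ANCHOR = "main keyword"   # the anchor text variant being tested
PAGES = [f"{HOMEPAGE}page-{n}.html" for n in range(1, 91)]   # ~90-page site

pages_linking_home_with_keyword = 0
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # links whose destination is the homepage and whose visible text is the exact keyword
    hits = [a for a in soup.find_all("a", href=True)
            if urljoin(url, a["href"]).rstrip("/") == HOMEPAGE.rstrip("/")
            and a.get_text(strip=True).lower() == KEYWORD_ANCHOR]
    if hits:
        pages_linking_home_with_keyword += 1

print(f"{pages_linking_home_with_keyword} of {len(PAGES)} pages link home "
      f"with the exact anchor '{KEYWORD_ANCHOR}'")
```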

What is also interesting is that linking home from content using variations of the keyword WAS quite effective up to a point, after which the site plummeted.

As well, this might point to a 'hard line' being crossed in terms of the threshold: at one point I had the website flipping between position 4 and positions 51-60 for the same keyword every second day.

My test will be about trying to -950 the website by being ridiculously deliberate in nav linking, and then seeing if I can reverse the results by removing those (and how long it takes for the trust to be reinstated to the website)

Any thoughts?

 

Shaddows

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 10:12 am on Dec 2, 2008 (gmt 0)

Also, speaking as web users as well as webmasters, who feels that not being able to click through to the homepage would be a major annoyance when casually surfing? After all (the nofollow test notwithstanding), that would be the policy implication of Cain's work.

Webmeister

10+ Year Member



 
Msg#: 3756885 posted 4:39 pm on Dec 2, 2008 (gmt 0)

I see lots of guesses here, but has anyone tested out any of these theories other than the author of the thread? If so, please post your results.

whitenight

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 4:58 pm on Dec 2, 2008 (gmt 0)

I would love to see this test expanded to include variations with the nofollow attribute, to see whether the value of anchor text is COMPLETELY nullified by it or not.

THIS IS VERY IMPORTANT
Nofollow is like cooking with salt.
You have to know where to place it and the effects of it.
There have been enough TESTS done that one can SEARCH for that explain its effects,
but ONE HAD BETTER MAKE SURE they know where they are placing it on the page and why.
It can make your dish simply perfect or completely inedible.

Also as important, you'll have to TEST IT on your pages to see how it affects YOUR site.

[edited by: eelixduppy at 9:00 pm (utc) on Dec. 2, 2008]

JS_Harris

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 5:04 am on Dec 3, 2008 (gmt 0)

Don't overdo it with keywords as a link to "home". You trigger review filters when you hit page 1 but are too heavy with any keyword. The drops are temporary but may become permanent if a human editor agrees with the flags.

CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 7:05 am on Dec 3, 2008 (gmt 0)

Shaddows, the experiment in post #:3798137 makes sense and is worth trying. I have always wanted to know this too.

Just like everyone else here, I am trying to find where the frayed end of the rope is :)

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 7:20 am on Dec 3, 2008 (gmt 0)

>>the experiment in post #:3798137 makes sense and is worth trying

Also, speaking as web users as well as webmasters, who feels that not being able to click through to the homepage would be a major annoyance

I'll cast my vote right now. As a user, I would HATE it. In fact, I re-did an information site for a client who wants the homepage link to lead to their ecom site instead of the homepage of the info site (related target audience), and even after I gave arguments against it, their last email on the topic was that they WANT it going to the ecom site anyway.

I absolutely will not; I would hate it as a user, and so would the other users (99.9% of whom come in on the homepage) who would no longer be able to navigate that site. That's 1,500 uniques a month now, which would go right down the loo and IMHO jeopardize the "trust" of the ecom site.

In this case it would be dishonest and deceptive, hijacking visitors unwittingly. I will NOT, and if they absolutely insist, I will tell them to get someone else to work on their sites altogether, no ifs ands or buts about it.

CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 7:32 am on Dec 3, 2008 (gmt 0)

To me, not being able to click to the homepage would be deceptive as well.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 8:08 am on Dec 3, 2008 (gmt 0)

>nofollow test

On the above-referenced ecom site, I had the people use nofollow on the links to the info pages and the shopping cart (the cart link was creating tons of dup errors in the cart). That's what was done on all the static site pages, BUT the "web coder" reversed that on the cart pages (over 1,500 of them) and left the info and cart links intact and put nofollow on the link to the homepage - the one that said "Home." However, the top logo graphic was still linked normally with the anchor text phrase intact.

The nofollow on the top navbar "Home" links from the cart had no effect in this case, but I wouldn't like to see what could happen if there were no other links back to the homepage at all on over 1,500 shopping cart pages.

[edited by: Marcia at 8:10 am (utc) on Dec. 3, 2008]

whitenight

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 8:14 am on Dec 3, 2008 (gmt 0)

nofollow test

grrr. Any and every nofollow test is COMPLETELY site dependent for internal links.
(if you don't understand why, then you need to RESEARCH why)
So anyone who says "yay" or "nay" to whether rel=nofollow works or not is missing the point and giving false info.

<snip>

Any test that CAIN conducted regarding THIS TEST and rel=nofollow would only apply to HIS SITE.

THIS TEST, however, on its own merit, is "multi-site" valid. And would apply for most sites. (of course, individual sites may experience anomalous results)

[edited by: Receptional_Andy at 10:29 am (utc) on Dec. 3, 2008]
[edit reason] ToS violation [/edit]

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 9:08 am on Dec 3, 2008 (gmt 0)

Re: rel="nofollow"
but ONE HAD BETTER MAKE SURE they know where they are placing it on the page and why.

I can state the major reason why I use it on sites. FOR SURE.

Given a certain number of inbound links - and most important, the PageRank - of a site's homepage, which is then distributed throughout site pages (notwithstanding links to internal pages), if a site homepage has good, strong PR, then a lot of pages will be indexed, and be in the main index. However, if there aren't too many inbound links and/or the homepage PageRank is low to mediocre, then depending on the number of total pages in the site, a good number may be either excluded from indexing or relegated to the Supplemental Index (or partitions), depending on the aggregate link strength.

In the latter case, I "nofollow" unimportant links to pages such as About Us or Shipping so that whatever link juice there is will be distributed to the desired product pages on the site. And that isn't site dependent; it's a matter of record (published patents) regarding multiple partitions, and observing crawl frequency and index refresh dates; and given a choice of whether to have About Us or Best Widgets have PR passed to them and be indexed in the Main Index with decent PR and rankings, well - it's a hands-down decision.
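To illustrate the arithmetic behind that choice, here is a toy sketch in Python. It assumes the simple "followed links split the page's passable PageRank" view that was common at the time; it is not Google's actual math, and the PR value and filenames are placeholders.

```python
# Toy illustration of the link-sculpting idea above, under the simplifying
# (and at the time widely assumed) model that a nofollow'ed link passes
# nothing and the followed links split the page's passable PageRank.
# This is NOT Google's actual math; values and filenames are placeholders.
DAMPING = 0.85
homepage_pr = 4.0   # hypothetical amount of PageRank the homepage can pass

# outlinks on the homepage: True means the link carries rel="nofollow"
links = {
    "best-widgets.html": False,
    "blue-widgets.html": False,
    "about-us.html":     True,
    "shipping.html":     True,
    "privacy.html":      True,
}

def passed_per_link(page_pr, outlinks):
    """PR passed to each followed destination under the toy model."""
    followed = [url for url, nofollowed in outlinks.items() if not nofollowed]
    if not followed:
        return {}
    share = page_pr * DAMPING / len(followed)
    return {url: round(share, 3) for url in followed}

print("With nofollow sculpting:", passed_per_link(homepage_pr, links))
print("Without any nofollow:   ",
      passed_per_link(homepage_pr, {url: False for url in links}))
```

Under that assumption, the two product pages receive a much larger share when the housekeeping links are nofollow'ed; remove the nofollows and the same amount is split five ways.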

Granted, it isn't for everyone, and neither are a lot of other possible choices recommended for the "totally wet behind the ears" (which incidentally, doesn't include most folks here); but I do have confidence that most folks know when they're clueless and should proceed with caution, and for those who aren't, they don't need to get slapped around to know whether something is for them or not.

I see lots of guesses here, but has anyone tested out any of these theories other than the author of the thread? If so, please post your results.

Florida update, November 2003. One site of mine dropped like a rock (as did others). After much reading, discussion and pondering, I decided that (along with another improvement* a member made by stickymail, unsolicited and offered as a kindness), I decided that there was a problem with excess occurrences of identical anchor text in internal navigation links. I made the correction and the site bounced right back. So did others.

Timeline:
Florida update: November, 2003
Historical Data Patent Applications: December, 2003

Right now I know for certain, I'd bet on it, that a couple of sites of mine are not ranking better at Google because of anchor text repetition; however, I choose not to make changes because they do well at MSN and Yahoo and I prefer not to rock that boat. However, I have other sites that are set up virtually the same way and on very close topics, but are doing fine at Google - but I've avoided the anchor text issue on those.

No, it isn't iron-clad but I'm as sure of it as can be - and have been since 2003.

*The other improvement, which I now know and understand but didn't then, was related to sitewide topical relevancy and keyword co-occurrence - which is why I tend to get a little fanatical on the subject of semantics and co-occurrence, which is a concept as old as IR itself, and a lesson worthy of learning well.

BTW CainIV, thank you SO MUCH for starting this thread. It's a big step forward toward what I like to call search engine de-spamification. ;)

[edited by: Marcia at 9:22 am (utc) on Dec. 3, 2008]

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 11:21 am on Dec 3, 2008 (gmt 0)

I'd like to clarify what now seems somewhat cryptic in my last post.

Back in the November, 2003 Florida update (when the site in question dropped drastically), my main phrase was a compound phrase: (phrase1)(phrase2). But... what a kind fellow member pointed out was that there was nothing on the site to "back up" the (phrase2) part of it. So what I did was add several content pages related to (phrase2). Sometimes just making mention of a phrase on one page simply isn't enough, is what I learned then; the site itself has to be relevant. It was too barebones, so I added some relevant "meat" to the barebones site content.

In addition, the task was eliminating the excessive exact sitewide keyphrase repetition for (phrase1) in both the side and bottom navigation - and the top graphic anchor text linking to the homepage.

While discussing this on IM with a fellow member and good friend (much smarter than I; he was then and still is), he sent me a link to a paper on TF/IDF - Term Frequency/Inverse Document Frequency - which was a lot to wrap my head around (and still is), but it is a long-established IR concept and has proven totally helpful in figuring out which words to include and choose in so many ways - particularly in choosing page names and designing internal navigation and site architecture.

Using TF-IDF to Determine Word Relevance in Document Queries (PDF) [cs.rutgers.edu]
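For anyone who wants to play with the idea, here is a minimal sketch of the textbook TF-IDF calculation in Python; the toy documents are placeholders, and this is the standard formula, not anything Google-specific.

```python
# Textbook TF-IDF sketch: score how characteristic a term is of one
# document relative to a small corpus. The toy strings are placeholders.
import math
from collections import Counter

docs = [
    "best widgets and widget accessories",
    "blue widgets for the home",
    "shipping and returns information",
]

def tf_idf(term, doc_index, corpus):
    tokens = [d.lower().split() for d in corpus]
    tf = Counter(tokens[doc_index])[term] / len(tokens[doc_index])
    docs_with_term = sum(1 for t in tokens if term in t)
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

print(tf_idf("widgets", 0, docs))   # common across the corpus -> low score
print(tf_idf("shipping", 2, docs))  # rare in the corpus -> higher score
```

Terms that appear everywhere score low even if they are frequent on the page, which is why it helps when choosing page names and navigation wording.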

Since then, what I've tried to do when it seems necessary to link to pages in multiple spots from the same page(s) is to link with the prominent navigation using "Best Widgets" and from secondarily important navigation using just "Widgets" and/or longer phrases, in the case of text area, inline links - only to avoid the kind of excessive repetition a lot of sites (including mine) had at the time.

tootricky

5+ Year Member



 
Msg#: 3756885 posted 3:40 pm on Dec 3, 2008 (gmt 0)

Link sculpting using nofollows is a great technique for onsite SEO. I strongly disagree with other posters when they say that basic assumptions cannot be learned about nofollows by conducting a test that includes their use. Of course link sculpting changes from site to site, but there are for sure things we can learn about how they affect RANKING (not PageRank).

Link sculpting with nofollows is one of the first things I change when optimising a site, and on its own, without any other onsite changes, it can make a huge difference to rankings.


I can state the major reason why I use it on sites. FOR SURE.

Given a certain number of inbound links - and most important, the PageRank - of a site's homepage, which is then distributed throughout site pages (notwithstanding links to internal pages), if a site homepage has good, strong PR, then a lot of pages will be indexed, and be in the main index. However, if there aren't too many inbound links and/or the homepage PageRank is low to mediocre, then depending on the number of total pages in the site, a good number may be either excluded from indexing or relegated to the Supplemental Index (or partitions), depending on the aggregate link strength.

In the latter case, I "nofollow" unimportant links to pages such as About Us or Shipping so that whatever link juice there is will be distributed to the desired product pages on the site. And that isn't site dependent; it's a matter of record (published patents) regarding multiple partitions, and observing crawl frequency and index refresh dates; and given a choice of whether to have About Us or Best Widgets have PR passed to them and be indexed in the Main Index with decent PR and rankings, well - it's a hands-down decision.

Granted, it isn't for everyone, and neither are a lot of other possible choices recommended for the "totally wet behind the ears" (which incidentally, doesn't include most folks here); but I do have confidence that most folks know when they're clueless and should proceed with caution, and for those who aren't, they don't need to get slapped around to know whether something is for them or not.


This is almost identical to how I approach it. About us (unless the content is relevant), contact us, privacy policies, terms and conditions, etc. all nofollowed. I also nofollow links from the homepage that try to "skip" the natural data structure of the site: so, for example, that means no linking from the homepage to individual product pages on an ecommerce site without adding a nofollow.

I would also add that unless the anchor text is passing value with keywords, I also nofollow it. That means all links to "home" and the like get the treatment too.

I have seen rankings for pretty generic keywords rise 1-5 pages after this nofollow treatment on templated sites (1,000 pages or so)

For Webmeister, I quote from the last page:


I have seen results similar to yours on a site that had a footer link to the homepage on each page of the website with optimised anchor text. This link was added about 2 months ago and the site dropped into the 70s for that keyword. The template link was removed and the site bounced back to page 1 again.

What Cain is seeing is what I experienced: I overcooked the homepage links and got punished for that keyword until I corrected the over-optimisation.

James_WV

5+ Year Member



 
Msg#: 3756885 posted 4:33 pm on Dec 3, 2008 (gmt 0)

Hi Everyone,

Question to ask on the same lines: I have a site that specialises in widgets by location. Page structure is roughly:

Home Page --> location widgets --> widget list --> individual widgets

There are hundreds of individual widget pages - some being indexed and ranked, some not. The main phrases I'm after are 'location widgets' (i.e. same as the secondary pages), and the location widgets pages are the landing pages I'm aiming for.

Would it help or hinder me if, after the description of the individual widget product, we had a link on every individual widget page - something along the lines of "Sorry, this widget isn't for you? To find more, please go back to location widgets" (where 'location widgets' would be a text link back to the landing page I want to rank highly)?

What do you reckon - too spammy or would it help?

tootricky

5+ Year Member



 
Msg#: 3756885 posted 8:14 pm on Dec 3, 2008 (gmt 0)

I am working on a site at the moment similar in structure to the one you describe. In my opinion I would be inclined to pass value back from the individual widgets to the location widgets pages using a breadcrumb navigation or something similar with optimised anchor text: This is not only a user feature but good for link juice sculpting too: As well as link juice from the top of the site pyramid, the location pages will also have link juice coming from the bottom; this should increase the relative importance of the location pages compared to the rest of the site.

Just make sure that as much value from your homepage passes to the location pages as possible: Assess which other pages are receiving link juice from the home page and weigh up whether they need to or not: Are they "leaking" your link juice?

[edited by: tootricky at 8:31 pm (utc) on Dec. 3, 2008]

[edited by: Robert_Charlton at 8:43 am (utc) on Dec. 4, 2008]
[edit reason] fixed typo [/edit]

potentialgeek

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 8:22 pm on Dec 3, 2008 (gmt 0)

It seems, as tedster pointed out earlier, that websites that historically started with those types of internal anchors pointing home got over that assessment hump. Other websites that add it later in the lifecycle of the website often get hammered in the results.

Likely, yes, because what's the point of making the changes if it's not for "optimization"? Google is always trying to detect optimizers, the "Communists" of the internet, like Joseph McCarthy. :)

Linking home from the first 10 pages listed in google.com for site:domain.com/* brought increased ranking (from 5th to 3rd) ... Hmm, this is a huge discovery if one extrapolates it fully. The very vague question I have (so I don't give away how powerful this info could be) is: Is it a ratio of total pages or just a max of 10?

I have occasionally wondered what the algo uses to determine the ranking of pages within a site on a clean, clear site search (just the domain name, no keywords). I don't see this topic discussed much and my own observations are no better than reading tea leaves. I haven't seen enough data to support any conclusions. I see no pattern and they may actually be randomized for all I know.

Total of 90 pages

I like case studies--we don't see enough here--and I wish we had at least one each month. So I appreciate your initiating the discussion. My question, though, is: if you have a site of under 100 pages, why aren't all the pages listed in the navigation menu on every page? Google has a limit of 100 links/page and you're within the limit.

Google's guidelines indicate that it likes good navigation systems, so any effort to make the most simple and logical sitewide navigation complete could be rewarded. At the very least each page gets the maximum number of internal links to it. The algo could/should/does give more ranking points with more internal links (it does for my sites).
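As a rough illustration, here is a minimal Python sketch of that 100-link sanity check; the example.com page list is a placeholder.

```python
# Hypothetical sanity check for the "100 links per page" guideline
# mentioned above: count the links on each page and flag pages over
# the limit. The example.com page list is a placeholder.
import requests
from bs4 import BeautifulSoup

PAGES = [f"http://www.example.com/page-{n}.html" for n in range(1, 91)]
LIMIT = 100

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    count = len(soup.find_all("a", href=True))
    if count > LIMIT:
        print(f"{url}: {count} links (over the {LIMIT}-link guideline)")
```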

To the questions about anchor text: if there's a binary analysis by Google of every website, "Is this site optimizing?", automatic scrutiny of the anchor text would likely be one of the first checks Google does. It is one of the most common methods webmasters use in hopes of juicing their site's ranking. It is often abused.

If Google sees one site using lots of repeat anchor text in links, but that's how it was set up 10 years ago (when it wasn't known to help ranking), and never changed, that is not a flag of optimization or overoptimization. But any changes to the anchor text, Google is going to guess were made to optimize. Why else does anyone do it? For fun? :/

Having said that, and having previously had long problems with 950 penalties, I will concede it's possible there's a very fine line between getting high ranking juice for internal anchor text (even repeats) and getting arrested by the 950 Police.

I don't play that game any more. I don't mess around with anchor text. I am very conservative in that respect. I don't want to be at the mercy of the Anchor Text Dial. Don't forget probably each site has a Trust Rank, and presumably trust erodes with fiddling and other unjustified internal changes geared to optimization.

p/g

internetheaven

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3756885 posted 12:43 pm on Dec 6, 2008 (gmt 0)

What concerns me most about threads like this, which seem to have plenty of research involved, is that they tend to out-and-out call Google misleading.

Google, and every other search engine for that matter, has always stated "build sites for users, not search engines".

I've built mine for users and users like a good navigation panel. They like the logo at the top of the page to link back to the home page. They like a "contact us" link at the bottom of every page etc. etc. etc.

If Google does have a filter that actually penalises sites for having too many internal pages linking to another internal page, then they need to remove the advice "build websites for users, not search engines" from their webmaster guidelines.

If my analytics research tells me I need to make adjustments to my nav panel to help my users, and Google penalises me for making the change, then Google are betraying their own mantra. I wouldn't be surprised if a lawsuit popped out of this thread if what you're saying is true. Telling webmasters to build sites for users and then penalising them for doing so seems a little off to me ...

Seb7

5+ Year Member



 
Msg#: 3756885 posted 1:22 pm on Dec 6, 2008 (gmt 0)

Interesting post. There's obviously a fine line between good linking and spamming. Though with any Google experiment it's hard to tell if the results are actually caused by making too many changes at once.

I think over linking must have a limited effect as some of the bigger websites (like wikipedia) would be penalised.

maximillianos

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 2:36 pm on Dec 6, 2008 (gmt 0)

What is the point of spending time on this? How does it add value to your visitors?

Of course, maybe you are just bored. In which case, have at it. ;-)

johnnie

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3756885 posted 3:08 pm on Dec 6, 2008 (gmt 0)

Great thread. Thanks for taking the effort in publishing this.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 5:25 pm on Dec 6, 2008 (gmt 0)

Really? Adding nofollow and doing nothing else will make a huge ranking difference?

It will make a difference in crawls and indexing, which for purposes of this discussion is the point of using it in internal navigation.

But Google still follow the links. And, the destination pages obtain PR.

Nope, not according to Matt Cutts, in numerous quotes and interviews he's done. What Matt says:

1) For Google, nofollow'ed links are completely dropped out of their link graph.

2) Google doesn't even use nofollow'ed links for discovery.

3) The nofollow meta tag does the same thing as rel="nofollow", but at the page level instead of the per link level.

Added:

Matt discusses it in some detail in a video at the Webmaster Central Blog [googlewebmastercentral.blogspot.com].

[edited by: Marcia at 6:12 pm (utc) on Dec. 6, 2008]

pageoneresults

WebmasterWorld Senior Member pageoneresults us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 5:40 pm on Dec 6, 2008 (gmt 0)

1) For Google, nofollow'ed links are completely dropped out of their link graph.

I'd like to see that link graph they are referring to so "I" can confirm that. :)

2) Google doesn't even use nofollow'ed links for discovery.

There are way too many external factors that can upset the use of rel="nofollow" at the link level, so the above statement is moot.

3) The nofollow meta tag does the same thing as rel="nofollow", but at the page level instead of the per link level.

And that is where it needs to happen, at the destination, not along the way to the destination. If I were a competitor and I wanted to upset your link sculpting routine and I see that you are not using a robots directive for the destination page to prevent it from being indexed, guess what? One link to the destination page from an external resource will upset your entire micro management of internal links.

I'll bet that all of those destination pages that do not contain a noindex directive have visible PR, don't they? So all you've done with rel="nofollow" is tell the SEs that you don't vouch for your own internal links. That seems a bit counterintuitive to me. I'd rather prevent the destination page from being indexed and/or followed, a simple process that doesn't involve upsetting the internal linking structure of one's website.
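For what it's worth, a minimal Python sketch of that kind of check might look like this; example.com and the page list are placeholders, and it only flags the combination described above (a nofollow'ed internal link whose destination page is not noindex'ed).

```python
# Hypothetical audit: list internal links carrying rel="nofollow" whose
# destination page does NOT carry a robots noindex directive, i.e. pages
# that one external link could still get indexed. Placeholders throughout.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "http://www.example.com/"
PAGES = [SITE, SITE + "widgets.html"]        # pages whose links we audit

def has_noindex(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        if "noindex" in meta.get("content", "").lower():
            return True
    return False

for page in PAGES:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        if "nofollow" not in (a.get("rel") or []):
            continue
        dest = urljoin(page, a["href"])
        if urlparse(dest).netloc != urlparse(SITE).netloc:
            continue                         # only internal destinations
        if not has_noindex(dest):
            print(f"{page}: nofollow'ed link to {dest}, which is still indexable")
```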

Experiments in keyword rich links to Home

What about footer links that contain a copyright notice with the company's name linked to the home page? How does that come into play, in addition to "default" Home or Home Page links? It's usually the last homepage link in the source, due to where footers naturally fall in the order of things.

rainborick

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3756885 posted 5:50 pm on Dec 6, 2008 (gmt 0)

I'm convinced that rel="nofollow" does prevent the flow of PageRank based on my own efforts in PageRank sculpting. Worked as expected for me with no apparent problems. But I'm not convinced that it prevents the link from being "discovered" or included in the "link graph". I've seen nofollow'ed links appear in the Webmaster Tools link report, so they're in the system at some level. It may be that the system changed since Matt made his comment about that, or the link report gets data at a level that's technically distinct from the "link graph". But they're in there somewhere.

tedster

WebmasterWorld Senior Member tedster us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 7:30 pm on Dec 6, 2008 (gmt 0)

My test will be about trying to -950 the website by being ridiculously deliberate in nav linking, and then seeing if I can reverse the results by removing those (and how long it takes for the trust to be reinstated to the website)

I haven't seen keyword-filled "Home" links cause a -950. It seems to be more about 1) many pages being linked to using the same keyword in many phrases and 2) too many co-occurring terms appearing - words that are thematically or semantically related.

If this experiment can trigger a -950 for just keywording the "Home" links, that will be a very interesting result. My prediction is you can depress rankings, but not trigger the -950 penalty. If you stuff many DIFFERENT nav links, then you can trip it.

But the challenge is that the experiment tries to reverse engineer a moving target. The threshold for the -950 keeps getting recalculated, and the calculations accommodate common practices within each thematic niche, too.

[edited by: tedster at 10:10 am (utc) on Dec. 10, 2008]

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 7:59 pm on Dec 6, 2008 (gmt 0)

many pages being linked to using the same keyword in many phrases

In Spring of last year, at the height of the abundance of reported -950 penalties, I had occasion to work with some sites that were hit, and then, after looking at other contributing factors, started to focus on identifying and studying keyword densities using the Firefox browser add-on. It wasn't for density as such, but to identify numbers of raw occurrences of the primary keyword for the phrases the sites were hit for.

I ended up honing in on what seemed to be an outstandingly common feature in many sites - excessive repetition of that primary word in navigation - mostly "mega-menus". After counting and assessing a good number of pages/sites, I arrived at a threshold figure (which I've never specifically mentioned before) of exactly 16.8%. That's what I found it was - at that time, anyway. Repeatedly.

Clarification: The figure represented 16.8% of the total number of occurrences of links, mostly in phrases, on the homepages of the sites looked at - many of them, across a number of keyword sectors, primarily consumer goods. That pointed to not only a raw occurrence overage and over-use in anchor text, but also validated the phrase-based "spam" possibility - overboard co-occurrence - referred to in the patent applications.

New Patent Application - Spam Detection Based on Phrase Indexing [webmasterworld.com]

Google's 950 Penalty (part 4) - or is it Phrase Based Re-ranking? [webmasterworld.com]

Phrase Based Multiple Indexing and Keyword Co-Occurrence [webmasterworld.com]
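For anyone who wants to measure their own pages the same way, here is a minimal Python sketch of the raw ratio described above (occurrences of the primary keyword in anchor text as a share of all links on a page); the keyword and HTML file are placeholders, and the 16.8% figure is an observation from those sites, not a constant this script will reproduce.

```python
# Hypothetical measurement sketch: what share of a page's links contain
# the primary keyword in their anchor text. The HTML file and keyword
# are placeholders; the threshold itself is whatever you observe.
from bs4 import BeautifulSoup

PRIMARY_KEYWORD = "widgets"
html = open("homepage.html", encoding="utf-8").read()   # page to audit

anchors = [a.get_text(" ", strip=True).lower()
           for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)]
with_keyword = sum(1 for text in anchors if PRIMARY_KEYWORD in text)

ratio = with_keyword / len(anchors) if anchors else 0.0
print(f"{with_keyword} of {len(anchors)} link anchors contain "
      f"'{PRIMARY_KEYWORD}' ({ratio:.1%})")
```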

[edited by: Marcia at 8:11 pm (utc) on Dec. 6, 2008]

CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3756885 posted 8:10 pm on Dec 6, 2008 (gmt 0)

The drop happened when, during the experiment, I linked home excessively from the nav using one version of the keyword. However, I was unable to cause a -950 per se.

For the purpose of this thread, and not to get off topic, I will next try linking to home as many times as possible in navigation and try to get the site to -950.

This is something that has always been a bone of contention for me. Some websites that, for lack of a better word, excessively link home using keywords from all pages fare perfectly well. Others that do this historically cross a threshold one day almost out of nowhere and those are often the 'Help' posts we see here.

Still others change their navigation and add those types of links and then get bombed.

My experiment, both for fun and for some personal learning, is really about establishing *some* understanding about penalties. While they are being recalculated daily, and do change in their thresholds, there certainly are some hard and fast facts I can see, even over the last 4 month span.

I do like the idea for a massive nofollow test thread though :)

Rosalind

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3756885 posted 9:59 pm on Dec 6, 2008 (gmt 0)

But any changes to the anchor text, Google is going to guess were made to optimize. Why else does anyone do it? For fun? :/

For clarity, and possibly to reduce the size of the navigation bar in order to fit more in. Or you might want to distinguish between the blog home and main site home.

I think the home page should be a special case, because search engines have to differentiate between sites that are about "home", or even "index", and everyone else. A glance at today's serps shows they aren't getting it right. There's a high proportion of off-topic results for this word in Google, Live and Yahoo.

Bilbo



 
Msg#: 3756885 posted 1:26 am on Dec 7, 2008 (gmt 0)

I experimented with this a few years ago, dropping keyword text links onto an image alt saying "go to home from 'keyword'" and a home page icon.

It seemed to have little effect.

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 9:04 am on Dec 7, 2008 (gmt 0)

I experimented with this a few years ago, dropping keyword text links onto an image alt saying "go to home from 'keyword'" and a home page icon.

It seemed to have little effect.


Bilbo, was the alt attribute the only anchor text link back to the homepage on pages in that case?

(Also, please keep in mind that "go to home from 'keyword'" in the alt attribute of a graphic is using the anchor text as just part of a longer phrase, including other words in the phrase. That makes a difference.)

Bilbo



 
Msg#: 3756885 posted 3:44 am on Dec 8, 2008 (gmt 0)

I thought the reason it didn't work was deeper than that; I thought Google and others were using tests to identify images with alt tags, and that using the same image with many different alt tags caused a problem.

Be interested in thoughts...

Hissingsid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3756885 posted 8:56 am on Dec 8, 2008 (gmt 0)

2. Linking home using keyword in nav on all pages caused the same drop (6 pages).

I have the keyword in the nav on all pages pointing back to home, and my home page has been #1 (occasionally dropping to #2) for many months for that 2-KW term.

The competitor who shuffles with me is #2, occasionally #1, and their site has an internal page of the form www.domain.com/key-word.html, which has at least two nav links back to it (one in a drop-down list) on every page of their site.

This competitor does not use keyword links back to the ranked page within other text on site pages. I do.

FWIW I don't think that the experiment detailed in the first post of this thread is valid because:

1. There are far too many other variables. For example, it may be that an authority page with many external backlinks using a particular set of keywords may gain a positive effect from keyword-rich internal links, but a lesser site may see a negative effect if it did the same. In effect you might spoil the cake by putting in too much of one ingredient.

2. I agree with others that the result may be a measure of Google's evaluation of change rather than some other absolute.

Still it is very interesting and the debate created is very valuable.

Cheers

HS

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3756885 posted 3:05 am on Dec 9, 2008 (gmt 0)

There's specific mention made in several published patents of both anchor text and changes, but here we have what Jake found on the topic back in 2004:

[webmasterworld.com...]
