
Google SEO News and Discussion Forum

Google's 950 Penalty - Part 5
steveb
msg:3250638
11:44 pm on Feb 12, 2007 (gmt 0)

< continued from [webmasterworld.com...] >
< related threads: -950 Quick Summary [webmasterworld.com] -- -950 Part One [webmasterworld.com] >

"That's exactly the sort of sites I'm referring to"

Unfortunately, some commenters on this issue apparently can't be bothered to actually, horrors, look at the serps. Authority has a specific meaning with Google, and it's plain that authority sites are what are commonly, mistakenly hit by this penalty. I don't think this is a good summary of the effect, but one simplistic way to look at it would be to say that authority sites with a volume of quality in-links are sometimes being confused with spam sites with a volume of rotten-quality in-links.

One of the most interesting phenomena is how an authority site can be #1 for red topic, blue topic, green topic and purple topic, but 950 for orange topic, even though the linking, page structure and keyword usage are basically the same for all of them. Clearly a ranking mistake is being made (either the 950 result, or all those #1's).

[edited by: tedster at 9:17 pm (utc) on Feb. 27, 2008]

 

Martin40
msg:3268404
10:44 pm on Mar 1, 2007 (gmt 0)

A page can rank number 950 for a commercial 3-word phrase, but number 1 when you add another simple word to that phrase like "in" or "at." IOW, there appears to be something about that particular commercial phrase that sends the page to 950.

You observed well, but be careful to note that #1 positions tend to stick. What I'm seeing is that downward tendencies are site-wide and key-phrase-wide, because the homepage is affected and the internals rely on the homepage for.....whatever...PR, link juice, relevance, you name it.
IMO it's not a site-wide penalty, it's just the homepage that has a problem and internals with external IBLs are far less affected.
So this thing only looks key-phrase based because #1 positions stick. If you never had any #1 positions, then all your key-phrases are in trouble. But sites without #1 positions are lesser sites that don't seem to be affected, so there you go.

I'm making such a big point of the key-phrase based thing because I think it has this forum's top dogs barking up the wrong tree.

For me the question is becoming: why can we find neither rhyme nor reason to this Google update? Until June 2006, SEOs were able to dissect an update.

[edited by: Martin40 at 10:46 pm (utc) on Mar. 1, 2007]

trakkerguy
msg:3268470
11:56 pm on Mar 1, 2007 (gmt 0)

The site I've been helping on has exhibited all the different symptoms you've all described over the last 3 months.

In the beginning, most of the domain was affected and for most keyphrase searches the pages were EOS - end of serps. I greatly reduced the on-page keyword density, in particular in the internal link text, and have climbed back above #50 for various searches.

For this domain it varies between a page-specific and a domain-wide "penalty", depending on how badly Google views the main page (which is where most of the external links point).

The depth the pages are buried has also varied.

When I add more than one or two internal links with a particular keyphrase for the anchor text, the pages drop back to the end for that particular search. Too many internal links with the main keyword, from the index page, and most of the domain goes to the end of serps.

By avoiding the main keyword in internal links the whole domain regains rank. By avoiding a particular keyphrase in internal links, that keyphrase regains rank. Haven't gotten any real competitive phrase back in the top ten yet, though...
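
If you want to audit your own site for this kind of internal anchor-text repetition, here is a rough sketch in Python. The parser, the test for what counts as "internal", and the idea that repetition at some level is what matters are all assumptions for illustration, not anything Google has confirmed.

from collections import Counter
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects (href, anchor text) pairs from one HTML document."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.anchors = []  # list of (href, anchor_text)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append((self._href, " ".join(self._text).strip()))
            self._href = None

def internal_anchor_counts(pages, own_domain):
    """Count how many internal links use each exact anchor phrase."""
    counts = Counter()
    for html in pages:
        parser = AnchorCollector()
        parser.feed(html)
        for href, text in parser.anchors:
            # crude "internal link" test: same domain or root-relative URL
            if own_domain in href or href.startswith("/"):
                counts[text.lower()] += 1
    return counts

# Example usage (file names are hypothetical):
# pages = [open(name, encoding="utf-8").read() for name in site_files]
# for phrase, n in internal_anchor_counts(pages, "example.com").most_common(10):
#     print(n, phrase)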


thedigitalauthor
msg:3269113
4:03 pm on Mar 2, 2007 (gmt 0)

trakkerguy >>

So, should we be adding a hypertext-linked "(click here for more information)" after each of our internal references? On my main site, the subjects of the articles are the key terms. If I write an article about Widget A and it references Widgets B and C, wouldn't it be "proper" to just hyperlink them? So if I happen to have a lot of other internal pages that also mention them, now I will be penalized? It just does not make sense (not that any of this does). Also, I believe that one of the reasons Wiki-P became so "authoritative" was that it has a huge number of internal links to other pages that include key words.

[I think I am going to change my user name to "frustrated"]

thedigitalauthor
msg:3269134
4:16 pm on Mar 2, 2007 (gmt 0)

This is strange. Only a few minutes ago, one of my key terms was penalized. I wrote the previous posting, did a few other things, then went back to Google to do some unscientific research on the term. Strangely and happily, the term is back at #2. Other terms are also back in the top 10 of the serps. I do not know how long this will last, but let's hope it stays. I also hope everyone else is coming back to life.

To quote someone in a much earlier post "switch on"

randle
msg:3269207
5:14 pm on Mar 2, 2007 (gmt 0)

By avoiding the main keyword in internal links the whole domain regains rank. By avoiding a particular keyphrase in internal links, that keyphrase regains rank.

This has been our experience. BUT we are a long way from calling anything conclusive; lots and lots of other factors, including just leaving things alone, could rescue you. However, scaling this back coincided with multiple “sites” coming back up. Not to what they once were, but top 20, which beats the heck out of 950.

Whatever this thing is, they are definitely playing around with it, and many sites must sit right on the line and get hit with the slightest adjustment G makes - and vice versa; affected sites come back.

It does seem like we have entered a new era where their tolerance for collateral damage has dramatically risen.

trakkerguy
msg:3269301
7:01 pm on Mar 2, 2007 (gmt 0)

Yes, I have seen the ranks fluctuate when we have done nothing and G adjusts things. And have seen rankings change in response to on page changes. Sitting and hoping may be smart if you've just been hit, but for how long?

I know there are several factors, but a part of it seems to be having too much competitive, keyword rich anchor text on internal links.

Unfortunately, you have to reallllyyyy strip down internal links (and keywords throughout the page?), which can hurt your rank in MSN and Y - a fine line to walk.

It seems that "trusted" or "authority" sites can spread their reach with keyphrase dense internal links, but others get sent to end of serps. How is the difference determined?

[edited by: trakkerguy at 7:20 pm (utc) on Mar. 2, 2007]

randle
msg:3269320
7:17 pm on Mar 2, 2007 (gmt 0)

It seems that "trusted" or "authority" sites can spread their reach with keyphrase dense internal links, but others get sent to end of serps.

Agreed, sites with lots and lots of really high-quality back links seem immune from this thing. Perhaps it’s some sort of ratio relative to where all of this “anchor text” for these key word phrases is coming from. If the internal ones off your own site are a tiny percentage of the overall links with those keywords, perhaps they figure it’s legit.

We found that the majority of our sites that got hit did not score very highly when searching with allinanchor:
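
To put that ratio idea into concrete terms, a tiny sketch (the "share" measure and the example numbers are invented purely for illustration; nothing here is a known Google threshold):

def internal_anchor_share(internal_links_with_phrase, external_links_with_phrase):
    # what fraction of all links using this anchor phrase come from your own site?
    total = internal_links_with_phrase + external_links_with_phrase
    return internal_links_with_phrase / total if total else 0.0

# 40 internal links say "blue widgets" but only 5 external links do:
print(internal_anchor_share(40, 5))    # 0.888... - nearly all the "votes" are self-supplied
# 40 internal links against 400 external ones looks far more natural:
print(internal_anchor_share(40, 400))  # 0.0909...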

northweb
msg:3269325
7:21 pm on Mar 2, 2007 (gmt 0)

Unfortunately, you have to reallllyyyy strip down internal links (and keywords throughout the page?), which can hurt your rank in MSN and Y - a fine line to walk.

Yes, that's the move to make, and it could affect your MSN and Yahoo traffic. That's the problem: even if we strip down the sites, change navigation, and make the site unique, there is still no guarantee Google will let us back in.

yep, it's a fine line. crazy!

northweb

trakkerguy
msg:3269327
7:22 pm on Mar 2, 2007 (gmt 0)

Yes, that is what I was going to ask/suggest. I know the inbounds for the domain I am working with are very uniform for the main keyword, and mostly to the main page.

jk3210
msg:3269338
7:30 pm on Mar 2, 2007 (gmt 0)

With the reports of this thing beginning as early as 2005, does anyone get the impression that it's something that is being phased-in, rather than an all-at-once algo change?

trakkerguy
msg:3269370
7:56 pm on Mar 2, 2007 (gmt 0)

Every few weeks or so G tweaks things, or does a data refresh, or whatever, and more sites report being hit with it. And occasionally some come back.

From the reports in this thread, it seems like it is continuing to expand and more sites are hit than recover, which is why I don't think G is going to scrap this filter/penalty.

thedigitalauthor
msg:3269500
10:29 pm on Mar 2, 2007 (gmt 0)

A few hours ago I said that I was back; well, I am back in the penalty box. This really did not last very long.

Switch off.

annej
msg:3269744
4:42 am on Mar 3, 2007 (gmt 0)

In my main site, the subjects of the articles are the key terms. If I write an article about Widget A and it references Widgets B and C, wouldn't it be "proper" to just hypertext link them?

It would make good usability sense. It always seemed logical to me to have the pages linked to each other in a related section. Assuming the visitor was interested in one page they might want to explore other pages in the same section.

I've had to rethink all this. I've subdivided some topic sections on my affected site so that the navigation on each page has fewer links. In one large section I took out the navigation to all other pages in the section. Instead I picked 3 or 4 of the closest related links for each page.

Decreasing the internal navigation seems to have really helped, as it decreases the repetition of key words in the anchor text. I have almost all my pages back, and they have been back for a while.
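
A minimal sketch of how the "3 or 4 closest related links" could be picked programmatically, assuming each page can be reduced to a set of keywords; the Jaccard overlap measure and the limit of four are arbitrary choices for illustration, not anything from this thread or from Google:

def related_links(this_page_words, other_pages, limit=4):
    # other_pages: dict of url -> set of keywords; returns the most similar urls
    this_words = set(this_page_words)
    def overlap(words):
        return len(this_words & words) / len(this_words | words)  # Jaccard similarity
    ranked = sorted(other_pages.items(), key=lambda kv: overlap(kv[1]), reverse=True)
    return [url for url, _ in ranked[:limit]]

# Hypothetical example pages:
pages = {
    "/history-of-widgets.html": {"widget", "history", "origin"},
    "/widget-materials.html":   {"widget", "materials", "wood"},
    "/blue-gadgets.html":       {"gadget", "blue", "paint"},
}
print(related_links({"widget", "wood", "materials"}, pages))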

I agree we need to look at other possibilities but I believe a great deal of this is phrase based. Remember a phrase in this definition can be one word.

In most cases the pages that disappeared earlier had one or more key words that are also used in advertisements. In one case the popular product is totally unrelated to how the term is used in my article but the spider doesn't know that.

MHes
msg:3269865
10:06 am on Mar 3, 2007 (gmt 0)

annej

How can you tell when a link has been acknowledged?

I ask because we have changed internal navigation and the cache has shown the new linking pattern for 3 weeks.... but a link: search for a page shows the linking pattern as of late January. I realise a link: search does not show all links, but removed links are still showing.

anax
msg:3269893
11:16 am on Mar 3, 2007 (gmt 0)

I've had to rethink all this. I've subdivided some topic sections on my affected site so that the navigation on each page has fewer links. In one large section I took out the navigation to all other pages in the section. Instead I picked 3 or 4 of the closest related links for each page.

See, this is the sort of thing lots of us are now starting to do because of Google's chaos. We are saying things like, maybe I should not have links, or maybe I should degrade the navigation, or maybe I should stop making internal links, or maybe I should make it less clear what the page is about. This is insanity. We *were* making good, white hat sites and thinking about our users, but now we're only thinking about what Google might have done to destroy our traffic that *was good* and we're making our sites worse for users. (I'm not talking about new sites, I'm talking about established sites that Google has knocked down overnight to 20% of their former traffic.)

annej
msg:3270070
4:22 pm on Mar 3, 2007 (gmt 0)

How can you tell when a link has been acknowledged?

I've been using the cache date as it is all I have to go on. I have also seen the variations depending on the search phrase.

This is insanity. We *were* making good, white hat sites and thinking about our users

I agree. But I'm willing to be a bit insane at least temporarily to get the pages back. They aren't doing anyone any good if they can't be found. What really got me started on this was the loss of a page with some original research that is not anywhere else on the internet. The information was from another historian who doesn't have a website.

I've already started a gradual process of putting a few key words back here and there in cases where they are important for the user. I'm waiting for the pages to get spidered to see how that goes.

On the other hand this mess has made me organize my sites better. We know that it's best to keep contents pages short for usability.

MHes
msg:3270122
5:34 pm on Mar 3, 2007 (gmt 0)

annej

I would be surprised if the cache was a signal that the links between pages were being acknowledged. I would guess this is done offline, but when and how often is unknown. I think there are two things going on here:

1) the calculation of N (phrase count) and linking pattern between pages.
2) Ranking based on cached words of a page.

Thus your page may rank well with recent changes picked up by the cache, but it may still be subject to the older N value and link pattern. When the new N value is calculated with the recent changes, the rankings may then change dramatically. In other words, you may add 'phrases' and see improvement, only to be hit when N is re-evaluated and you have gone over the top. Then no changes will make a difference, unless when N is evaluated again the phrase count is lower. We have compensated for this situation by utilising redeeming pages that override N values. The result is in fact a much better structure and quality to the site..... Google is clever!
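
A toy model of that two-stage idea, purely to make the timing point concrete; the threshold, the phrase list and the whole mechanism are speculation from this thread, not known Google internals:

THRESHOLD = 20  # invented number, purely for illustration

def compute_n(page_text, watched_phrases):
    # "offline" pass: count occurrences of watched phrases on the page
    text = page_text.lower()
    return sum(text.count(phrase) for phrase in watched_phrases)

def rerank(base_position, n_value):
    # if the stored N exceeds the threshold, the page is thrown to the end of the serps
    return 950 if n_value > THRESHOLD else base_position

# You edit the page today: the cache updates, but the stored N is still last month's.
stored_n = 12
print(rerank(3, stored_n))  # still ranks #3
# The next offline pass recomputes N from the new text and the page goes over the top.
stored_n = compute_n("widgets " * 25, ["widgets"])
print(rerank(3, stored_n))  # now 950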

trakkerguy
msg:3270127
5:39 pm on Mar 3, 2007 (gmt 0)

maybe I should stop making internal links, or maybe I should make it less clear what the page is about. This is insanity

I agree it seems ridiculous to go back and take out or modify internal links that aid navigation, but hopefully you can find a way to do it without making it less clear to users. For instance, instead of "keyword1 keyword2" in the anchor text, putting keyword1 just before the link and using only keyword2 as the anchor text has worked for me. In this case, "keyword1" is the more commonly searched term and the one abused by spammers.
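
In markup terms, the change described above looks roughly like this (the keywords and URL are made up for the example):

# before: both words carry the anchor text
before = '<a href="/blue-widgets.html">blue widgets</a>'
# after: the more competitive word sits just outside the link
after = 'blue <a href="/blue-widgets.html">widgets</a>'
print(before)
print(after)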

It also seems ridiculous (and financially painful to some) to send a top-ranked page back to the last page, but if they just dropped it down a little we would be wondering why. At least it's a clear signal.

Annej - I'm glad to hear you've got most of your pages "back". Does that mean they are near their previous rank?

madmatt69
msg:3270151
6:22 pm on Mar 3, 2007 (gmt 0)

It just surprises me that some Google rep must be following these threads from time to time, yet rather than being pro-active and saying they're testing some new algo, and maybe asking if they can take a look at some example serps we're seeing so they can improve their quality, they just don't acknowledge anything.

Yeah we can use the link in the serps "Dissatisfied?" but who knows if that does anything at all. Google's quickly becoming that nameless, faceless corporation :(
</rant>

Back on topic - annej, I've been keeping track of my cache date too and it hasn't changed since Feb 28 - and Webmaster Tools says that's the last time they accessed my homepage. However, I see googlebot crawling every day.

I know of a few sites that have linked to me in the past week or so as well, but I haven't noticed any change in my serps.

Miamacs
msg:3270160
6:37 pm on Mar 3, 2007 (gmt 0)

Okay I know, since even I do it, that most of us skip long posts.

But please do read this one.
It is a quite complicated theory that I came up with and need feedback on. I myself couldn't find where I could be wrong anymore. It took me a lot of time and research, but without more data on individual instances it just can't be proven. Please try it on your sites for a moment; I don't want to regret putting it together.

Overall, everything comes down to the anchor text used, and how the internal navigation passes relevancy-based parameters, making internal pages rank high for things they don't even have inbounds for ( but the homepage has, and passes/narrows it down ). Or it gets the site penalized, when not a single page has on-topic inbounds, or the internal navigation fails to deliver the relevancy around.

Seems that the penalty is for a combination of overusing anchor text ( assigning relevance to subpages in a repetitive manner ) and misusing it ( trying to create internal anchor texts that the homepage itself is not relevant for, or that are WATCHED by the phrase filters ). There is no "fine line" between ranking top 10 and 950. It is more like a "zone", a different one for each site, that gets calculated based on relevancy, trust, and the sheer number of links.

The NEW parameter however, is the phrase/relevancy based reranking.
The rest was there since last year already.

It's Google trying to aim at trusted sites which target money terms they are "not relevant" for, with the collateral damage being articles that mention these terms, or internal navigation that uses them, in an excessive amount.

From what's been experienced:

- Some people reported pages penalized for their key terms.
- Some people reported pages penalized altogether.
- Some people reported sitewide effects for certain terms.
- Some people reported the homepage gone while internal pages still alive.
- And then THE penalty, a sitewide 950 for everything.

The reasons:

These are both different stages of the same problem, and proof that there are several filters at work that cause this effect overall. If you trip only a few of them, or don't have too many instances, the penalty may not be sitewide. Yet.

Then it turns over once the "irrelevant" or "filtered" anchor text links reach the number needed to get the homepage penalized for "being spam", which then spreads the problem to the entire site, even to those subpages that were relevant to the overall theme. For the homepage that is the foundation... is now gone to 950 because of a penalty.

What happens:

- Pages that used to rank even without an inbound with the keyword to the entire site ( or the navigation not passing the relevancy ) suddenly don't.
- The homepage not passing weight to subs for terms that it itself is "irrelevant" for. These pages now don't rank for anything.
- All links containing the problematic keyphrases are devalued.
- Too many "irrelevant" links appear on a page and it gets a penalty. Homepage penalized.
- No links pass any weight with the homepage gone. Navigation devalued, entire site goes 950.

Which means it's the homepage that's getting a penalty, because the links that appear on it are "irrelevant" in Google's eyes, or are tripping the "phrase based filter" by aiming at "neighbouring" money terms, even if not on purpose.

The filters:

- Phrase based filter checking if trusted sites are trying for competitive terms without having inbounds passing them relevancy.
- Only certain terms are affected, not everything is monitored.
- The filter is pretty dumb, in that it works with quite limited sets of associations. Doesn't know that "Gadgets" and "Widgets" are essentially the same, because no one ever told it.
- It checks authorities, for a site without any weight or trust won't appear in the primary index no matter what it does. Thus it's not a potential host of spam.
- However authorities could do as they please, for they already have passed the thresholds. Whatever they put up, it appears in the index.
- To add a counter effect, if a site is trusted for "A" but not trusted for "B" and it adds 200 links of articles about "B", and A is not relevant to B ( according to Google ), these get discounted, and worse, the homepage gets penalized.
- Otherwise trusted pages without being absolutely relevant for a money term get demoted...
- Pages having links on them with such anchors get demoted. Extremely high parameters would make a site immune. Being top 10 is not an indication of this.
- Pages with enough parameters but all too many such phrases/links get demoted.

Seems important:

- This became an issue since trust is now relevancy based.
- Meaning a page that is the super authority for "A" doesn't have a "super parameter" that would allow it to compete anywhere it wants to. About time btw.

Are your homepages trying to pass any links to subpages that they themselves don't have a single inbound for... but which are competitive terms? Phrases that the homepages aren't "relevant for" semantically. Ie. you have 1000 inbounds for "Widget Store" and we all know that "Gadgets" and "Widgets" are the same. But Google doesn't know that yet; it only knows that trying for "Gadgets" means trying to be relevant for a money term, and if a page that doesn't have ANY inbounds for it adds 100 links with this term, it "has to be" spam.

With no control over how Google should sort out its priorities, all you can do is...

- Get inbounds for every theme and distribute it properly with the navigation.
- Don't use an excessive number of links with watched phrases you're not relevant to.
- Edit the navigation and/or article titles/descriptions not to include any.

If I'm right with at least 50% of the above, we'd still need to know what excessive usage is.

Things like: the number of links in the navigation, the number of keywords/phrases repeated in most of them, and, out of these, how many are phrases you don't have inbounds for and could not "create" your own relevancy for by narrowing down the broad relevancy of the homepage either.
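
As a starting point for that kind of audit, a short sketch that flags navigation anchor phrases with no external inbound support. The data you feed it, and the idea that this is what the filter looks at, are assumptions taken from this thread, nothing confirmed:

def unsupported_nav_phrases(nav_anchor_phrases, inbound_anchor_phrases):
    # navigation phrases that have zero external anchor-text support
    inbound = {p.lower() for p in inbound_anchor_phrases}
    return sorted({p.lower() for p in nav_anchor_phrases} - inbound)

# Hypothetical example data:
nav = ["widget store", "cheap gadgets", "widget history", "gadget insurance"]
inbound = ["widget store", "widget history", "widgets"]
print(unsupported_nav_phrases(nav, inbound))
# ['cheap gadgets', 'gadget insurance'] - the phrases this theory would call risky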

I've been using the cache date as it is all I have to go on. I have also seen the variations depending on the search phrase.

I'd say add some days for the first few links to be accounted for, and weeks for the rest. Once cached, the link data is fed to an offline algorithm that checks the available data for many things, including linking patterns ( frequency ) and the target pages ( relevancy ). A higher PageRank or trust can give it some boost, but the last time I checked, no link was accounted for sooner than 4 days after being cached. And if there are many, their weight gets calculated gradually, not all links at the same time.

trakkerguy
msg:3270166
6:49 pm on Mar 3, 2007 (gmt 0)

Miamacs - thanks for the long post. It explains it well; almost exactly what I've been experiencing and thinking about this.

Do you think MHes' post could also fit your theory? Meaning, a lot of internal links for a phrase, not enough inbounds to support them, AND not enough related content on the target page to justify all the internal links.

madmatt69
msg:3270174
6:59 pm on Mar 3, 2007 (gmt 0)

Very good theory; it makes a lot of sense of what I've been seeing.

I'll try and get some more inbounds to the penalized sub pages.

What's interesting to note is that in the last few days, the penalized pages have been ranking for other variations of the main keyphrase - but not the keyphrase itself.

annej
msg:3270207
8:07 pm on Mar 3, 2007 (gmt 0)

you may add 'phrases' and see improvement, only to be hit when N is re-evaluated and you have gone over the top

I'll just have to give it time to be sure, then. I understand it's a lot more complicated than just the cache, but that's all I have to go on. The only key words I'll bring back are the ones where it really makes a difference for the user.

Annej - am glad you hear you've got most of your pages "back". Does that mean they are near the previous rank?

Yes, very close to where they were before. Mostly ranked first to fifth.

Miamacs, I printed it out so it will be easier to study. I'll get back to you later.

steveb
msg:3270234
9:20 pm on Mar 3, 2007 (gmt 0)

"All links containing the problematic keyphrases are devalued."

Just an example of why trying to logically understand why someone drops a plate on the floor is a hopeless exercise.

If I have a page 950-penalized for a term, I have other pages ranking for the term, like the same content in two different languages with exactly the same linking... one is #950, the other is #1 in their respective languages (the search term is the same in both languages).
===

Aiyah, looks like this page is broken...

Martin40
msg:3270283
10:40 pm on Mar 3, 2007 (gmt 0)

Thanks to trakkerguy for bringing this thread to life, and Miamacs' post clinched it for me. I still don't fully understand this thing, but it has made me replace my site-wide menu with a site map.

Miamacs, how in the world do you know all this?

trakkerguy
msg:3270306
11:13 pm on Mar 3, 2007 (gmt 0)

Uh oh, I think the big G is hacking the server - afraid we're making progress...

annej
msg:3270405
2:51 am on Mar 4, 2007 (gmt 0)

I don't know what I did in my last message but I messed up the formatting of this thread and now I can't get in to change it. I wrote Tedster so hopefully he will be able to fix it.

Miamacs, your message is very interesting. I agree there are many factors at work. One of my problems is that very few people link to my individual articles. If they like an article they like the whole site and link to the homepage.

Otherwise trusted pages without being absolutely relevant for a money term get demoted

That helps me a lot in understanding what is going on. One problem phrase on my site is the name of a certain war. Yet my site is about a hobby, so Google probably sees things like the materials needed for the hobby, techniques for the hobby, and specific kinds of widgets made in my hobby as relevant. The war would never be related to the hobby unless you are talking about the history of the hobby, which only a few sites do. So Google probably sees the war as irrelevant.

Another term that has had problems refers to a popular stuffed animal, not the hobby my site addresses. Another often refers to a part of jewelry or to building construction, not to the product of my hobby. This theory does not fit all my problem phrases, but it does fit several of them.

MHes
msg:3270523
8:53 am on Mar 4, 2007 (gmt 0)

Miamacs

We were lucky insofar as we have a site, and are in a sector, where finding the cause and effect of 950 was relatively clear. Having looked at other sites, the pattern fits. Based on what we know and have applied, my comments are as follows:

>Overall, everything comes down to the anchor text used

Phrases on the page are more important, BUT linking can compensate for a high phrase count. Phrases on a page cause suspicion but not certain death. The N value as per the patent is not always called for every search phrase.

>The NEW parameter however, is the phrase/relevancy based reranking.
Agree

> For the homepage that is the foundation...
Disagree. A home page is treated the same way as any other page. If your home page is hit, others can rank normally.

>- All links containing the problematic keyphrases are devalued.
They affect N (the phrase count) for a page. They can thus help or hinder. The N value is calculated with no 'search phrase' and is done offline, but it is referred to according to a phrase if that phrase is seen in the search query.

>Too many "irrelevant" links appear on a page and it gets a penalty. Homepage penalized.
Irrelevant links can do no harm unless they contribute to making a high N value offline.

> Only certain terms are affected, not everything is monitored.
Agree

>Doesn't know that "Gadgets" and "Widgets" are essentially the same, because no one ever told it.
Disagree. Predictive phrases as per the patent are a factor.

>authorities could do as they please, for they already have passed the thresholds.
Disagree. All sites are treated the same.

>To add a counter effect, if a site is trusted for "A" but not trusted for "B" and it adds 200 links of articles about "B", and A is not relevant to B ( according to Google ), these get discounted, and worse, the homepage gets penalized.
This is about 'local rank' and is dealt with in that way. A different filter kicking in.

> if a page that doesn't have ANY inbounds for it adds 100 links with this term, it "has to be" spam.
A page can rank fine without relevant inbounds if N is not triggered... if it is, then inbounds become relevant for it to still rank.

anax
msg:3270739
4:31 pm on Mar 4, 2007 (gmt 0)

Thanks to everyone who is contributing to this thread. I'm still trying to grasp the complexity of this.

So, is a basic recommendation for average webmasters like me to reduce the number of times a phrase might be repeated on a page? (Even if I think it's legitimate?) And would this apply to the meta element and page title also?

Nick0r
msg:3270770
5:45 pm on Mar 4, 2007 (gmt 0)

Yep, titles and metas also play a part.

annej
msg:3270781
6:00 pm on Mar 4, 2007 (gmt 0)

MHes

If your home page is hit, others can rank normally.

Are you saying that even though the homepage has been affected, its PR is still there, so that deep pages are not affected?

Predictive phrases as per the patent are a factor.

Yes, but how extensive is the predictive factor? It would predict the common words and phrases. But would it predict phrases that are rarely seen online, as in the war and hobby example in my last message?

All sites are treated the same.

I think you may be right there. My site is 11 years old and I suspect it is viewed as an authority, given the gov and edu links plus many more related links. But I think only the homepage is protected. It is the one with all the links; the individual pages have very few. It's the links that give protection, not any authority or age designation.

anax

So, is a basic recommendation for average webmasters like me to reduce the number of times a phrase might be repeated on a page? (Even if I think it's legitimate?) And would this apply to the meta element and page title also?

First, I wouldn't change anything unless the page is affected. It's not the number of times a phrase is used; it only becomes a problem if there are phrases that have been designated as possible spam phrases.

If the page has dropped way down in the serps it won't hurt to reduce repetition as long as the page and internal links still make sense.
