As many have said here, we see sites with H1s, heavy link trades, lots of keywords and much more at the top, and that's what we've discussed could be wrong with those sites in the 950 filter. But how about this:
You are an AdSense member, with AdSense on all your sites. You have several sites because you need a safety net: if Google plays around, at least one or two sites still rank and you can pay your bills. But Google uses their AdSense data to see which sites you own. If you own more than one in the same category, then only one of those sites ranks with all its pages, and it doesn't matter if one is about all widgets and the other is specifically about blue widgets.
|Problem is it's not that long ago that reciprocal/exchanged links were being lumped together with bought links by both Google and MC. |
IMO there's a difference between reciprocal and exchanged links, the former can be accidental, the latter not.
Absolutely in agreement. There's a big difference between mass production and selectivity. Distinguishing and defining differences is an important thing to do - and a lot depends on individuality, editorial judgment, theme relevancy, quantity, and rate of accrual over time.
There is a BIG difference between a link exchange scheme and high quality, appropriate reciprocal links, done selectively and editorially chosen, between like-minded quality sites in the same/similar niche catering to the same demographic audience.
|Would Adam endorse exchanged links? |
glengara, I'm not a mind-reader and can't speak for Adam, but I can't see anyone from any search engine endorsing a mass-produced, indiscriminate link exchange program/scheme. But I believe the post done in the Google group (previously quoted here) adequately addressed the issue of Google's attitude toward the traditional practice of legitimate reciprocal links, as traditionally done, in proper balance.
We can all make our own choices for ourselves about whether we want to listen to and trust folks who have been around since link-gaming started, and/or what's publicly stated by someone who's designated and trusted as an official spokesman, versus some person(s) stating different, contrary opinions and handing out misinformed, irrelevant misconceptions.
|before I post what I did I would like to look a few affected sites and try the theory on those |
Good to hear you've had a result, but why the secret squirrel act? There are other posters here who have published their theories after getting their positions back, in an attempt to find what's causing this.
You're right M, but you've been reading the Runes for a while now :-)
A lot of people seem to assume reciprocal and exchanged are the same thing, so when they read of a G spokesman broadly endorsing reciprocals....
Cautious good news here...
My site appears to have stabilized, and has stopped going from top 15 to 965 every few days.
I made some changes 7 days ago, and it hasn't done a 950 on me since. I cleaned up the spamminess of my breadcrumb links, added a few pages, changed a few internal links and consolidated my reciprocal links "directory" into a single short "resources" page of only recip links of quality (plus a few non-recip's).
The biggest change I made, I think, was to the breadcrumb links.
I'll post here again if I start getting the 950 hit again, but I think I may have gotten myself off the edge of this filter.
That's great news - can you just clarify what you mean by cleaning up the spammiest breadcrumb links? Were these just main keywords that pointed back to the previous page, or were they 'stuffed', or what? Without giving too much away, can you give an example using the widget factory :) Thanks
I'm not clear on how breadcrumbs could be spammy. Please explain. I'm concerned about setting up my navigation so that it is not considered at all spammy but is still helping my visitors find what they are interested in.
|thinks that the same page is spam for certain keywords |
I don't mind if Google thinks a page that is just slightly related belongs at 950 (though I really don't understand why they would rank a page like that at all). But it's a serious problem when they send a page that is exactly on the topic the searcher is looking for, and they still 950 it.
Spammy breadcrumbs -- sound like a recipe, doesn't it? I think I saw it in the Adwords in my gmail once.
Widgets Home > Widget Types > Big Widgets > Big Yellow Widgets
That kind of pattern might do it, no? Just as the same labels might hurt in a regular menu.
Here's how it's done:
I've got a site like that; I never changed it because it did well in other engines, first one and then another. But "demerits" for excessive use of internal anchor text, with repetitions on the same page, reared their head at the time of the Florida update, which is when that site got hit. Another site was corrected (with another modification or two to raise "hub" and topical relevancy for the site) and came back, but the one that kept the spammy navigation never came back at Google.
It's nothing new, I've seen it and have been mentioning it for four years.
We can sometimes forget that with well over 100 factors in the algo, any factor - or combination of factors - can cause a rankings drop. There's little that's new under the sun, but we are seeing something now that's "new and different", and re-hashing same ole' same ole' that's been going on for years - excesses, link stuff, etc. - only detracts from digging in and figuring out what this strange new thing is.
Here's an illustration:
That wasn't the first instance either, nor the first "link exchange program." How long ago was that, how many years ago?
Tedster pretty much nailed it...
|Widgets Home > Widget Types > Big Widgets > Big Yellow Widgets |
Mine were like this...
Blue Widgets > Blue Widget Programs > Blue Widget Courses > Blue Widget Book
A few were really as bad as that.
I changed it to...
Blue Widgets > Programs > Courses > Book
Having the repeated double keyword seems to be the key offense (as Marcia pointed out).
I have some other subdomains which have had no problems. They do have breadcrumbs, and they do link back to this main domain, but they don't have any phrase repeated in the breadcrumb like the above.
These subdomains are like this...
Blue Widgets > Blue Doodles > Blue Weird Doodles
... and that seems to be OK. I haven't touched those.
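For what it's worth, the "repeated double keyword" theory above can be sketched as a quick self-check. This is purely illustrative Python - the function name, the two-word "head phrase" heuristic, and the idea of a ratio are all my own invention, not anything Google has confirmed:

```python
# Hypothetical check for the "repeated double keyword" breadcrumb pattern.
# NOT Google's algorithm -- just a sketch of the theory that repeating
# the same phrase in every breadcrumb anchor may look spammy.

def repeated_phrase_ratio(anchors):
    """Fraction of breadcrumb anchors that repeat the leading phrase
    of the first anchor (e.g. 'Blue Widgets' -> 'Blue Widget ...')."""
    if not anchors:
        return 0.0
    # Use the first two words of the first anchor as the candidate phrase,
    # with a trailing 's' stripped so 'Widgets' also matches 'Widget'.
    head = " ".join(anchors[0].split()[:2]).lower().rstrip("s")
    hits = sum(1 for a in anchors if a.lower().startswith(head))
    return hits / len(anchors)

spammy = ["Blue Widgets", "Blue Widget Programs",
          "Blue Widget Courses", "Blue Widget Book"]
cleaned = ["Blue Widgets", "Programs", "Courses", "Book"]

print(repeated_phrase_ratio(spammy))   # 1.0 -- every anchor repeats "blue widget"
print(repeated_phrase_ratio(cleaned))  # 0.25 -- only the first anchor carries it
```

On the cleaned trail, only the home link carries the phrase, which matches what posters here report as the safe pattern.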
|Blue Widgets > Blue Doodles > Blue Weird Doodles ... and that seems to be OK. I haven't touched those. |
That makes sense. I would expect problems to come from over-using the more semantically specific words, and not from repeating the more general ones.
If you're 950'd and you search for your page, do you see it at the end of the results or do you have to click "repeat the search with the omitted results included" in order to see it?
And if you click it, does your page show up back in its original position, or is it at the end of the results?
I believe having to click for omitted results is a duplicate content problem, and not related to -950.
Agreed - the affected result shows up near the end of the results without clicking on an "Omitted Results" link. So if there are only 450 results before that link in a particular search, then the affected result might show up at 430 or so. It's near the end of results, and not always at #950.
[edited by: tedster at 12:20 am (utc) on April 23, 2007]
Here's where Andrei Broder first talked about "shingles" (first as far as I know, anyway). US Patent # 6119124
Method for clustering closely resembling data objects [patft.uspto.gov]
He does make mention of duplicates, but there is a difference between being duplicate, near-duplicate, and closely resembling other documents - within the same site. The "click for more results" link takes off the "clustering" filter, which then shows the pages first omitted or "clustered out" - but the patent talks about documents being similar, which is resembling others.
I have sites with pages that are "similar", but there's nothing duplicate about them. It isn't duplicate content, it's redundancy. I've been suspecting for a long time (and have been mentioning it in threads, too) that what's first picked up - at the beginning of the page/code - is what gets looked at for similarity during the process of returning results for queries. Hence the importance of unique page titles and descriptions, and/or what's first in (top of) the code that'll be picked up for snippets. I still believe it, even more so now.
I do believe there's a close correlation between the process for snippet generation and clustering of "similar" documents. According to some papers they're both done at query time and are query dependent.
<edited for clarification>
[edited by: Marcia at 12:00 am (utc) on April 23, 2007]
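A toy Python sketch of the w-shingling resemblance idea from Broder's patent may help make the "similar but not duplicate" distinction concrete. Real systems fingerprint shingles with hashing and sampling for scale; this version compares raw shingle sets, and the two example pages are invented:

```python
# Minimal w-shingling sketch in the spirit of Broder's resemblance
# measure (US Patent 6,119,124). Production systems hash and sample
# shingles; this toy compares the raw shingle sets directly.

def shingles(text, w=3):
    """Set of all w-word windows ('shingles') in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def resemblance(a, b, w=3):
    """Jaccard overlap of the two shingle sets: 1.0 = identical text."""
    sa, sb = shingles(a, w), shingles(b, w)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page1 = "blue widgets are the best widgets for every blue widget fan"
page2 = "blue widgets are the best widgets for every yellow widget fan"
print(resemblance(page1, page2))  # 0.5 -- near-duplicate, not identical
```

One changed word knocks out every shingle that overlaps it, which is why near-duplicates score well below 1.0 yet far above unrelated pages - "resembling", not "duplicate".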
|Method for clustering closely resembling data objects |
Reading the patent again suggests that these pages are grouped based on the similarity of the 'fingerprint' that is associated with the clustering. Do you believe that by modifying and changing pages (so they are completely different) where similarities have occurred, the pages can be de-clustered?
There are two main choices to "bust out" pages from this penalty.
A) To de-optimise. You need to get your pages so that they don't hit the N phrase count needed to throw your pages to -9xx. This is NOT a good fix, as you will find your pages ranking significantly lower than before due to keyword density being thrashed.
B) To use the alternative remedying factors to save your pages from being phrase based re-ranked. There are a few ways a page can be saved from being evaluated by this system, and a lot of it is to do with the themes of your pages, as Miamacs has explained.
About the whole directories being dropped - that is usually to do with one page (on a specific theme) carrying the rest of the directory down. If that one is successfully fixed, the rest should be fine too.
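To make that "N phrase count" idea concrete, here's a hedged Python sketch. The threshold is a made-up number - nobody outside Google knows the real value, or even whether a simple count like this is what's actually used:

```python
# Sketch of option A above: count how often a target phrase occurs on a
# page and compare it to some threshold N. Both the counting method and
# the threshold value are illustrative assumptions, not known facts.

import re

ILLUSTRATIVE_THRESHOLD = 5  # invented number, purely for the sketch

def phrase_count(text, phrase):
    """Non-overlapping, case-insensitive occurrences of phrase in text."""
    return len(re.findall(re.escape(phrase.lower()), text.lower()))

page = ("Blue Widgets | Blue Widget Programs | Blue Widget Courses | "
        "Buy blue widgets here. Our blue widget shop has blue widgets.")
n = phrase_count(page, "blue widget")
print(n, "over threshold" if n > ILLUSTRATIVE_THRESHOLD else "under threshold")
```

De-optimising means editing the page until the count drops under N, which is why option A works but costs you the ranking weight the phrase was earning.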
|B) To use the alternative remedying factors to save your pages from being phrase based re-ranked. There are a few ways a page can be saved from being evaluated by this system, and a lot of it is to do with the themes of your pages, as Miamacs has explained. |
Care to elaborate a bit more on this?
Whoa... about that structure thing again.
My directories on one of my problem sites aren't nearly as deep. However the structure would be like this -
Would something like this easily trip the filter?
I understand what you mean by spammy bread crumbs now. That is something I've learned to watch myself on. If your site is about say, "old widgets", it's easy to write your navigation with a lot of repetition of the phrase simply because that is what the site is about.
These are two different things that have been discussed though, breadcrumbs and navigation, aren't they?
1. Aren't the basics that have been discussed here that breadcrumbs need to be very concise, following the exact user route without repetitive use of keywords/phrases, and
2. Navigation links can contain keywords that point to pages that talk specifically about the subject relating to those keywords in the navigation, ensuring that phrase-based and co-occurrence filters are not tripped by the destination pages?
Or have I read too much on this topic now and just confused myself :-)
My mistake was in setting up my navigation as
I realize it really wasn't necessary to repeat the word widgeting as the visitors know that's what the site is about.
BUT in the case of the phrase-based aspect of the 950 situation, I suspect that some phrases can be damaging whether repeated or not. I think this is another issue.
The problem is that you can never be sure if the changes you make brought the page or pages back or if Google just shifted the filter. But a while ago I took a phrase out of my navigation that occurs a lot in MFAs and it seemed to bring the page back. Luckily there was another phrase that described this topic.
I think in some cases the navigation can very accurately point to a page about a topic, and the topic has the same phrase in it, even in the title, but you still trigger a phrase-based flag. This is why this is all so frustrating.
added - Go back in this thread to read and reread Miamacs message #3311620. It shows the different levels of how complicated this puzzle can be.
MFA = Made For AdSense
I'm sure you've seen those sites. It's nothing more than a simple nav that reloads adsense data. In the end, there is no information.... just adsense links to click on.
Got it - thanks :-)
I've been monitoring this thread since the start and although being virtually white hat I was affected by the sudden drop in rankings for certain pages and phrases.
I've made some adjustments to the site but nothing seemed to work enough that would make a big difference.
I would say only one page has started to appear in the top 3 pages of Google, but hey, it's a start.
Today I took a fresh look and realised that on some of the pages I had written a php script that was pulling in data from a database and outputting links to internal pages, not external web sites.
At the top of the page I was building anchor links:
Category Keyword Phrase 1 (#Category Keyword Phrase 1)
Category Keyword Phrase 2 (#Category Keyword Phrase 2)
Category Keyword Phrase 3 (#Category Keyword Phrase 3)
Then further down the page relevant content links would appear under each category like this
Category Keyword Phrase 1 -
Back to top of Page link
Category Keyword Phrase 2 -
Back to top of Page link
Now this was done 3 years ago and all was fine then but I'm betting this is where my problem exists.
As for breadcrumbs, I think the same thing could happen, and that's why we're all experiencing different problems and the logic seems to be a bit cloudy.
So another solution I'm trying: remove the keyphrases in the #anchor links, as it's this specific phrase that has dropped in rankings.
I also checked on competitors and they also had similar #anchor keyphrase links and they've dropped as well.
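The before/after fix described above can be sketched like this - a tiny Python helper that renders the same in-page table of contents, first with keyword-stuffed fragment ids and then with neutral ones. The category names and ids are invented for illustration:

```python
# Sketch of the #anchor-link fix: same table of contents, but the
# fragment ids no longer repeat the keyphrase. Names/ids are invented.

def toc(entries):
    """Render (label, fragment_id) pairs as in-page anchor links."""
    return "\n".join(f'<a href="#{frag}">{label}</a>' for label, frag in entries)

before = [("Category Keyword Phrase 1", "Category-Keyword-Phrase-1"),
          ("Category Keyword Phrase 2", "Category-Keyword-Phrase-2")]
after = [("Category Keyword Phrase 1", "section-1"),
         ("Category Keyword Phrase 2", "section-2")]

print(toc(before))  # keyphrase repeated in both the anchor text and the href
print(toc(after))   # keyphrase appears once per link, only as visible text
```

The visible headings stay descriptive for users; only the repeated keyphrase in the fragment ids goes away, which is the change the poster is betting on.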
As I said before, most of the site is white hat - no hidden text, no misleading web pages - but it's about avoiding mistakes these days.
If we find a solution then it could be a positive thing for the SEO business, as webmasters will have to be more careful than ever before to build good web sites that avoid being dropped through careless mistakes.
Given what others have experienced and reported we can be grateful or happy that it appears to be a temporary downgrade and that rankings could recover quickly.
Not like a couple of years back when you had to wait 3 months!
Now with all this, can you remember the good old days when you did not have to make your site for Google, but had free hands to create the site you wanted? This breadcrumb stuff is just pure cr.. - you can't even place your company info on every page, because Google says so. I don't use breadcrumbs, but I still think it's cr..
Can anyone write a very brief summary of this huge 7 part thread?
What things cause the "950 penalty"?
|What things cause the "950 penalty"? |
Unfortunately, that's what we're all still trying to work out.
It does look like excessive keyword phrases in links throughout a website - including links to external websites - is the cause, in breadcrumb navigation or not.
Some people mention that breadcrumbs are to blame, but I would say it makes no difference. Google just sees links with keyphrases; perhaps if it's too much, it could be tagged as a method of Google bombing.
Just keep trying to find the answers.
There's an interesting report from errorsamac in this thread:
I'm in the process of testing this out, but I think I have found a flaw in how Google
is counting links and/or preventing Google bombs. If you have a high PR site
(PR7 for example) and you link to a normal, non-authority, PR3 site, you can take
that site out of the SERPs for the particular keyword or phrase. I just tested it on
one of my sites that I don't care about...
...did the url you targeted go to "end of results" -- as in what people
call the "-950 penalty"?
Yes, I just checked and both sites are hit with the -950 penalty....
...as far as I know, there is no known way to recover from this
(other than to get good links to offset the bad links)
| This 194 message thread spans 7 pages: < < 194 ( 1 2 3 4 5  7 ) > > |