Forum Moderators: Robert Charlton & goodroi
trinorthlighting
Annej,
I read in another thread that you wrote that you have a recip links page. That is probably what is causing your site some grief.
On the contrary, having reciprocal links (or a recip links page, or even a whole directory with links) is NOT what causes this phenomenon. There are sites with reciprocal link pages, and even directories with a percentage of recips, that are untouched and have top-notch rankings. And that is a verifiable fact.
Remember, the algo is completely automated with very little human input. You probably need to take a long hard look at whom you're linking to and whether they are spamming.
Remember, Google guidelines state not to have your site link to bad neighborhoods. If one of the sites you are linking to is spamming Google, it can have a drastic effect on your site. Check to see if all the sites you link to are following Google guidelines. If they are not, you might want to drop that particular link.
If a site is SPAMMING with a pattern of linking out to bad neighborhoods, it'll cause a problem with the SITE - not with individual content pages that are simply not ranking. That is not what's happening here, not by any means.
I don't know how many times it has to be repeated: please don't accuse anyone affected by this phenomenon of somehow spamming. There's no basis for it in reality, and it causes unnecessary, unjustified stress. Trying to help is always appreciated, but this is serious; it's no place for folks to be tilting at windmills.
[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]
You are an AdSense member, you have AdSense on all your sites, and you have multiple sites because you need a safety net: if Google plays around, at least one or two sites still rank and you can pay your bills. But Google uses its AdSense data to see which sites you own, and if you own more than one in the same category, then only one of those sites ranks with all its pages. It doesn't matter if one is about all widgets and the other is specifically about blue widgets.
Problem is, it's not that long ago that reciprocal/exchanged links were being lumped together with bought links by both Google and MC. IMO there's a difference between reciprocal and exchanged links: the former can be accidental, the latter not.
There is a BIG difference between a link exchange scheme and high quality, appropriate reciprocal links, done selectively and editorially chosen, between like-minded quality sites in the same/similar niche catering to the same demographic audience.
Would Adam endorse exchanged links?
We can all make our own choices about whom to listen to and trust: folks who have been around since link-gaming started, and what's publicly stated by someone designated and trusted as an official spokesman, versus some person(s) stating contrary opinions and handing out misinformed, irrelevant misconceptions.
Before I post what I did, I would like to look at a few affected sites and try the theory on those.
Good to hear you've had a result, but why the secret squirrel act? Other posters here have published their theories after getting their positions back, in an attempt to find what's causing this.
My site appears to have stabilized, and has stopped going from top 15 to 965 every few days.
I made some changes 7 days ago, and it hasn't done a 950 on me since. I cleaned up the spamminess of my breadcrumb links, added a few pages, changed a few internal links, and consolidated my reciprocal links "directory" into a single short "resources" page of only quality recip links (plus a few non-recips).
The biggest change I think that I made was the breadcrumb links.
I'll post here again if I start getting the 950 hit again, but I think I may have gotten myself off the edge of this filter.
thinks that the same page is spam for certain keywords
I don't mind if Google thinks a page that is just slightly related belongs at 950 (though I really don't understand why they would rank a page like that at all). But it's a serious problem when they send a page that is exactly on the topic the searcher is looking for to 950 anyway.
Here's how it's done:
[webmasterworld.com...]
I've got a site like that; I never changed it because it did well in other engines, first one and then another. But "demerits" for excessive use of internal anchor text, with repetitions on the same page, reared their head at the time of the Florida update, which is when that site got hit. Another site was corrected (with a modification or two to raise its "hub" value and topical relevancy) and came back, but the one that kept the spammy navigation never came back at Google.
It's nothing new, I've seen it and have been mentioning it for four years.
We can sometimes forget that with well over 100 factors in the algo, any factor - or combination of factors - can cause a rankings drop. There's little that's new under the sun, but we are seeing something now that's "new and different," and re-hashing the same ole' same ole' that's been going on for years - excesses, link stuff, etc. - only detracts from digging in and figuring out what this strange new thing is.
[google.com...]
That wasn't the first instance either, nor the first "link exchange program." How long ago was that, how many years ago?
Widgets Home > Widget Types > Big Widgets > Big Yellow Widgets
Mine were like this...
Blue Widgets > Blue Widget Programs > Blue Widget Courses > Blue Widget Book
A few were really as bad as that.
I changed it to...
Blue Widgets > Programs > Courses > Book
Having the repeated double keyword seems to be the key offense (as Marcia pointed out).
I have some other subdomains which have had no problems. They do have breadcrumbs, and they do link back to this main domain, but they don't have any phrase repeated in the breadcrumb like the above.
These subdomains are like this...
Blue Widgets > Blue Doodles > Blue Weird Doodles
... and that seems to be OK. I haven't touched those.
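For what it's worth, the cleanup described above (dropping from each later crumb any words already used in the first one) can be sketched mechanically. This is only an illustration of the editing rule being discussed, not anything Google-specific; the crumb labels are the widget examples from this thread:

```python
def trim_breadcrumbs(crumbs):
    """Drop words already present in the first crumb from each later crumb,
    so 'Blue Widgets > Blue Widget Programs' becomes 'Blue Widgets > Programs'."""
    if not crumbs:
        return []
    norm = lambda w: w.lower().rstrip("s")      # crude singular/plural fold
    stop = {norm(w) for w in crumbs[0].split()} # words from the section root
    trimmed = [crumbs[0]]
    for label in crumbs[1:]:
        kept = " ".join(w for w in label.split() if norm(w) not in stop)
        trimmed.append(kept or label)           # keep original if nothing is left
    return trimmed

trail = ["Blue Widgets", "Blue Widget Programs", "Blue Widget Courses", "Blue Widget Book"]
# trim_breadcrumbs(trail) -> ["Blue Widgets", "Programs", "Courses", "Book"]
```

The fallback on the last line avoids emptying a crumb whose label is made up entirely of the repeated keyword.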
[edited by: tedster at 12:20 am (utc) on April 23, 2007]
Method for clustering closely resembling data objects [patft.uspto.gov]
He does make mention of duplicates, but there is a difference between duplicate, near-duplicate, and closely resembling other documents within the same site. The "click for more results" link removes the "clustering" filter, which then shows the pages first omitted or "clustered out" - but the patent talks about documents being similar, i.e. resembling others.
I have sites with pages "similar" but there's nothing duplicate about them. It isn't duplicate content, it's redundancy. I've been suspecting for a long time (and have been mentioning it in threads, too) that what's first picked up, or at the beginning of the page/code for documents during the process of returning the results for queries is looked at for similarity - hence the importance of unique page titles and descriptions, and/or what's first in (top of) code that'll be picked up for snippets. I still believe it, even more so now.
I do believe there's a close correlation between the process for snippet generation and clustering of "similar" documents. According to some papers they're both done at query time and are query dependent.
<edited for clarification>
[edited by: Marcia at 12:00 am (utc) on April 23, 2007]
Method for clustering closely resembling data objects
Reading the patent again suggests that these pages are grouped based on the similarity of a 'fingerprint' associated with the clustering. Do you believe that by modifying and changing the pages where the similarities have occurred (so they are completely different), the pages can be de-clustered?
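The patent doesn't disclose how its 'fingerprints' are computed, but near-duplicate detection is commonly illustrated with word shingles and Jaccard similarity. A hypothetical sketch of the idea, not Google's actual method:

```python
def shingles(text, k=3):
    """All k-word shingles of a document, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard overlap of two pages' shingle sets: 1.0 identical, 0.0 disjoint."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0
```

Under a scheme like this, pages whose similarity exceeds some threshold get clustered together; rewriting one page so its score drops below the threshold would, in principle, de-cluster it. Whether the real fingerprints behave that way is pure speculation.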
A) To de-optimise. You need to get your pages so that they don't hit the N phrase count needed to throw them to -9xx. This is NOT a good fix, as you will find your pages ranking significantly lower than before due to the keyword density being thrashed.
B) To use the alternative remedying factors to save your pages from being phrase based re-ranked. There are a few ways a page can be saved from being evaluated by this system, and a lot of it is to do with the themes of your pages, as Miamacs has explained.
About whole directories being dropped - that is usually down to one page (on a specific theme) carrying the rest of the directory down with it. If that one is successfully fixed, the rest should be fine too.
These are two different points that have been discussed, though - breadcrumbs and navigation - aren't they?
1. Aren't the basics discussed here that breadcrumbs need to be very concise, following the exact user route without repetitive use of keywords/phrases, and
2. Navigation links can contain keywords that point to pages specifically about the subject those keywords describe, ensuring that phrase-based and co-occurrence filters are not tripped by the destination pages.
Or have I read too much on this topic now and just confused myself :-)
widgeting
widgeting antiques
widgeting patterns
etc
I realize it really wasn't necessary to repeat the word widgeting as the visitors know that's what the site is about.
BUT in the case of the phrase-based aspect of the 950 situation, I suspect that some phrases can be damaging whether repeated or not. I think this is another issue.
The problem is that you can never be sure if the changes you make brought the page or pages back or if Google just shifted the filter. But a while ago I took a phrase out of my navigation that occurs a lot in MFAs and it seemed to bring the page back. Luckily there was another phrase that described this topic.
I think in some cases the navigation can very accurately point to a page about a topic, and the topic has the same phrase in it, even in the title, but you still trigger a phrase-based flag. This is why this is all so frustrating.
added - Go back in this thread to read and reread Miamacs message #3311620. It shows the different levels of how complicated this puzzle can be.
I've been monitoring this thread since the start, and although my site is virtually white hat, I was affected by the sudden drop in rankings for certain pages and phrases.
I've made some adjustments to the site, but nothing seemed to work well enough to make a big difference.
I would say only one page has started to appear in the top 3 pages of Google, but hey, it's a start.
Today I took a fresh look and realised that on some of the pages, I had written a PHP script that was pulling in data from a database and outputting links to internal pages, not external web sites.
At the top of the page I was building anchor links:
Category Keyword Phrase 1 (#Category Keyword Phrase 1)
Category Keyword Phrase 2 (#Category Keyword Phrase 2)
Category Keyword Phrase 3 (#Category Keyword Phrase 3)
Then further down the page relevant content links would appear under each category like this
------------------------------
Category Keyword Phrase 1 -
page A.htm
Page B.htm
Back to top of page link
------------------------------
Category Keyword Phrase 2 -
page C.htm
Page D.htm
Back to top of page link
------------------------------
Now this was done 3 years ago and all was fine then but I'm betting this is where my problem exists.
As for breadcrumbs, I think the same thing could happen, and that's why we're all experiencing different problems and the logic seems a bit cloudy.
So another solution I'm trying: remove the keyphrases in the #anchor links, as it's this specific phrase that has dropped in rankings.
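A sketch of that fix: generate neutral fragment IDs (cat-1, cat-2, ...) instead of repeating the keyword phrase in both the id and the visible link text. The function and category names are hypothetical placeholders, shown in Python rather than the original PHP:

```python
def build_toc(categories):
    """Build in-page anchor links whose fragment ids are neutral (cat-1, cat-2, ...)
    rather than repeating the keyword phrase in both the href and the id."""
    toc, headings = [], []
    for i, name in enumerate(categories, start=1):
        frag = f"cat-{i}"  # neutral id: the keyword phrase is not repeated here
        toc.append(f'<a href="#{frag}">{name}</a>')
        headings.append(f'<h2 id="{frag}">{name}</h2>')
    return toc, headings
```

The phrase still appears once as visible link text and once as the heading, but no longer a third and fourth time inside every href and id attribute.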
I checked on competitors too; they had similar #anchor keyphrase links, and they've dropped as well.
As I said before, most of the site is white hat - no hidden text, no misleading web pages - but it's about avoiding mistakes these days.
If we find a solution, it could be a positive thing for the SEO business, as webmasters will have to be more careful than ever to build good web sites that avoid being dropped through careless mistakes.
Added - Given what others have experienced and reported, we can be grateful that this appears to be a temporary downgrade and that rankings can recover quickly.
Not like a couple of years back when you had to wait 3 months!
What things cause the "950 penalty"?
Unfortunately, that's what we're all still trying to work out.
It does look like excessive keyword phrases in links throughout a website - including links to external websites - is the cause, whether in breadcrumb navigation or not.
Some people mention that breadcrumbs are to blame, but I would say it makes no difference. Google just sees links with keyphrases; perhaps if there are too many, it could be tagged as an attempt at Google bombing.
Just keep trying to find the answers.