I read in another thread that you wrote that you have a recip links page. That is probably what is causing your site some grief.
In addition, having reciprocal links (or a recip links page, or even a whole directory with links) is NOT what causes this phenomenon. There are sites with reciprocal link pages and even directories with a percentage of recips that are untouched and have top-notch rankings. And that is a verifiable fact.
Remember, the algo is completely automated with very little human input. You probably need to take a long hard look at who you're linking to and whether they are spamming.
Remember, Google guidelines state not to have your site link to bad neighborhoods. If one of the sites you are linking to is spamming Google, it can have a drastic effect on your site. Check to see if all the sites you link to are following Google guidelines. If they are not, you might want to drop that particular link.
If a site is SPAMMING via a pattern of linking out to bad neighborhoods, it'll cause a problem with the SITE as a whole - not with individual content pages that simply stop ranking. That is not what's happening here, not by any means.
I don't know how many times it has to be repeated and requested: please do not accuse anyone hit by this phenomenon of somehow spamming. There's no basis for it in reality, and it can cause unnecessary stress that's unfounded and unjustified. Trying to help is always appreciated, but this is serious; it's no place for folks to be tilting at windmills.
[edited by: tedster at 9:16 pm (utc) on Feb. 27, 2008]
There are tons and tons of Ebay, Amazon, MSN Shopping, Bizrate, etc. pages down there (as well as some normal ecom sites) that have clusters of pages in the 900's, 400-500's, 700's - all from the same site, clusters of pages with the same (or very close) phrase targeting. Some of those same sites have page one rankings, in some cases #1 rankings, for the same keyphrase. It isn't links in those cases.
There's only so much storage they're gonna use for a site, and for all practical appearances it looks like redundancies are being demoted to get them out of the way, yet still kept in the index.
I have seen some cases that seem to be link related, one in particular that's been link spamming since before Florida. I've always watched and wondered when they'd get hit - they're the very last site listed in the main index for the primary keyphrase.
IMHO it's not all for the same reason, but there's an elusive "something" hitting some (many) sites that's different, and we haven't quite wrapped our heads around it yet. There's a LOT in those patents that seems to apply in a lot of cases and explain a lot, but we're still skirting around the edges.
>>not your traditional links scheme penalty....
It's very easy with Google to assume most everything is due to linking, but I really do think semantic, phrase-based and co-occurrence factors are coming into play, and that's something the individual webmaster really doesn't have control over, or at best it would be very little, because some of the factors are derived from figures that involve the whole index.
[edited by: Marcia at 10:32 am (utc) on April 17, 2007]
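To make the phrase-based / co-occurrence idea a bit more concrete, here's a toy sketch (entirely my own illustration with made-up sample text, not anything confirmed about Google's actual implementation) of the kind of statistic the indexing patents describe: counting which phrases tend to appear together across documents. A page loaded with far more "related phrases" than such statistics would predict is the sort of thing a phrase-based system could flag.

```python
from collections import Counter
from itertools import combinations

# Toy corpus; a real system would compute this over the whole index.
docs = [
    "green widgets are the best widgets online",
    "buy green widgets and free widgets online",
    "widget bowling league schedule",
]

def phrases(text, n=2):
    """All n-word phrases in a text."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def cooccurrence(docs, n=2):
    """Count how often each pair of phrases appears in the same document."""
    pairs = Counter()
    for doc in docs:
        for a, b in combinations(sorted(phrases(doc, n)), 2):
            pairs[(a, b)] += 1
    return pairs

counts = cooccurrence(docs)
print(counts[("green widgets", "widgets online")])  # co-occur in 2 documents
```

The point of the sketch is only that such scoring works phrase by phrase and page by page, which matches the symptom being discussed: individual pages dropping for individual phrases rather than whole sites being penalized.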
I hope this term doesn't become another "sandbox", where a common, similar symptom was thought to be a single filter. No wonder it took a year and a half (!) for the sandbox's different possible causes to become widespread knowledge.
There's clearly no SINGLE problem behind the phenomenon of seeing a site go -150, -350, -950. This is just a new kind of penalty box / warning that is applied. The trouble is that it doesn't give away WHAT it is applied for.
It'd make sense ( to confuse us all ), and again everyone is right. Let's try this on for a change:
- There's a phrase-based penalty/filter/reranking that sends URLs (pages) "-950" for specific queries
- There may be a new, stricter linking-profile check that sends SITES (domains) -XYZ for everything
We should've renamed the thread or started two different threads so as not to confuse the two. We've been reaching the same conclusions for the third time this month, from thematic trustrank to too many colliding themes to anchor text that's too generic for a niche site. ( Oh wait, did I mention that before? )
I've set myself to read-only mode because I didn't want to anger those who aren't interested in the phrase-based stuff. I'm such a sensitive person.
I think a lot has to do with how external links within web pages are treated and the use of rel="nofollow".
I have a block of pages that did well for long keyphrases which have disappeared.
At first I decided it was overuse of keyphrases within internal links, but after toning it down a bit there's still not much change.
Then I noticed I had a link to a certain similar web site "Not Duplicated" on all these web pages which I've now removed.
My home page and others still rank in the top 10 results for some phrases, so it's not a complete penalty.
Perhaps it's just a penalty against particular keyphrases on the affected pages; that would make much more logical sense.
We'll just have to wait and see if anything improves.
I found the Google Guy post in this thread. [webmasterworld.com...]
I also found this on the topic. [cs.cornell.edu...]
I'm mostly referring to the phrase-based filter, not other possible causes. I really do think it is at the base of the penalty, because it works page by page. That could be why many sites that were never touched before are having pages drop to the end of the results. Before, our overall site strength protected us, but no more.
In my case my site is strong on both inbound and outbound links, including some from edu and gov sites. BUT many of my individual pages are weak on inbound links, because other sites generally link to one of the two or three top pages on the site.
For example I had a subsection in which all pages dropped to the 950 region. A single good inbound link to the contents page of the section brought the whole section back.
So I think we are looking at a situation where the page's strength (authority, hub or both) can give some protection from the phrase-based problem. Since it is sometimes almost impossible to sort out which words or phrases are causing the problem, strengthening the page with even one or two good links might make the difference.
I'm wondering if one or two strong outgoing links might help as well but I haven't tried that.
As of now, there is no pattern of occurrences that can be connected up in a meaningful way that makes any sense. Sure, we know WHAT occurs, but the occurrences still appear to be unconnectable. Consider...
Some sites lose one directory...
Some sites lose 90% of everything...
Some sites lose 90%, but keep the big-money terms...
Some sites lose the big-money terms, but keep the long-tails...
Some sites recover by reducing phrase density...
Some sites recover by reducing internal links...
Some sites recover by reducing phrase density within internal links...
Some sites recover with new inbounds...
Some sites don't recover with new inbounds...
Some sites recover with no changes at all...
Some sites cycle in and out of recovery...
Some sites never recover at all...
Some sites never get hit at all...
Where's the pattern in all that?
<<A single good inbound link to the contents page of the section brought the whole section back>>
Notice that word "good."
I've been *leaning toward* the idea that this -950 is being caused by a change in how trustrank values are distributed within a site via internal linkage, because that would seem to be a broad and reasonable goal that Google would want to pursue, i.e. how much trust should be placed on an internal link to a page that either has no external links (no "second opinions"), or only has untrusted external links.
And the word "good" in annej's post fits right in with prior observations that some pages recover with new externals and some don't. The difference would be the level of trust a new external link brings. It fits the pattern.
At some point, Google definitely made the transition from a "links, links, links" approach, to a "link trust, link trust, link trust" approach. But the flaw in that concept was that trusted authority sites appeared to have too much ranking power for terms they really had no business ranking for. Maybe Google is just now getting around to fixing that flaw and the effect of that fix is now working its way across the net.
Just an idea to consider.
And all that is based on the vague assumption that distributing trustrank values via internal links is even their goal in the first place. Which I'm far from sure of.
1. I get a "penalty" and lose a certain %
2. I keep adding content
3. The old natural growth from the new content just stops
Shouldn't I be seeing some signs of recovery from the new content?
It's not like I have a website that I update from time to time, this is updated daily.
[edited by: Biggus_D at 6:08 pm (utc) on April 17, 2007]
I think the difference between site-wide and page-specific is how the external links come in. I've seen a few sites, but the smallest, easiest example had almost all the incoming links pointing to the main page, for the main keyword. If that keyword/main page is hit, all the subpages dependent on it are hit too, unless very specific terms are searched. So in the beginning it acted "site-wide", but wasn't really.
The one page with good external backlinks came out of penalty long before the main page and all its dependents.
Too many and too varied keywords/phrases on a page, not properly supported by external backlinks, is what I see. And if a page is hit, it doesn't pass on trust to other pages.
[edited by: trakkerguy at 6:33 pm (utc) on April 17, 2007]
Decreasing the scope
Sorry if this is a basic question, but when you say decreasing the scope, are you saying that a page needs to cover just one subject / keyword/phrase? And if you have a page that already does that, should you reduce the optimisation level on that page, thereby taking a more natural approach to the language used, as viewed by Google?
But add "hairy widgets", "free widgets", "best free widgets", "online widgets", and you may have a problem if your incoming links don't support that you are an authority that should rank for that expanded scope.
The more competitive the term, the better your links need to be to validate that your site should rank. And if you "claim" to be knowledgeable for all the highly searched "#*$!x widget" phrases, you need better links to support that broader claim of authority about widgets in general.
I'm talking about having the terms in the anchor text. Really not sure though if just having the terms on the page cause the problem.
This is about inbound/outbound links, the downgrading of exchanges, and also the nofollow that was recently placed on wiki pages.
While linking is, and has long been, a factor in other respects, I don't believe it's got anything to do with this particular 950 phenomenon we're trying to home in on. There's been no evidence that the 950 drop is related, despite your observation, which has been effectively countered with logic by more than one person.
But if you insist, then explain for us how linking factors, recips and wikipedia links/no follow can affect only specific interior pages and more importantly - certain specific phrases that relate to and appear on specific interior pages - not at all the whole site.
If you insist, tell us how that works. We can tell you how phrase-based indexing can affect individual phrases on individual pages, now you explain for us how links and recips can affect individual phrases on individual pages.
Go for it, we're all ears!
[edited by: Marcia at 8:30 pm (utc) on April 17, 2007]
"big green widgets" might be ok as it is less scope,
but that "green widget bowling" or "download green widgets" might be trouble if they change it to a different theme that your page is not qualified for?
That's kind of how I see it...
[edited by: trakkerguy at 8:37 pm (utc) on April 17, 2007]
Why do you think quite a few people have come out of this by getting some "good" external links?
You keep saying phrase based. What advice are you giving? Telling people to sit and wait?
I say go find a "good" and trustworthy link to your -950 page and bam, you will be out of the penalty.
Trust me on this: a member here had a page, emailed me, and I linked to his site from a very trusted site of ours that has always ranked top 10. It was a very relevant topic to our site and I am sure it will add value to our customers. Guess what, his page is no longer -950. As a matter of fact his page is number 12 now. Our page has not budged an inch. It took two days for the page to come out of -950.
[edited by: trinorthlighting at 8:59 pm (utc) on April 17, 2007]
I see one spammy looking site in the top five with the keyword phrase twice in the title.
Some informational, tightly focused authority sites are the collateral damage of this new algo tweak.
Let's say you ("Honest Abe") have a mom and pop site with link exchanges to "Slim Shady Site". Well, "Slim Shady" has just recently started a bunch of doorway pages, is sending out massive amounts of email spam, and is doing sneaky redirects. Google flags the site as spam and looks at who links to "Slim Shady Site". Now Google bot has no idea that your site is run by "Honest Abe". Sure, Google bot can look at the whois and see "Honest Abe" and "Slim Shady" and even the different IP addresses. But spammers figured out long ago that they can register many sites on many IPs, even under different names, so Google bot really cannot tell the difference. So now, since "Slim Shady" is spamming, Google bot flags "Honest Abe's" website as a possible link scheme. Bam, penalty hits, fully automated and very stealthy.
Look at your outbound links. Look for hidden script linking your site to bad neighborhoods as well. Look for hackers who might have hit your site and dropped some hidden text links.
Right from Google Guidelines here:
In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.
It's all right there in front of you, and all written in very plain English. Those mom and pop link exchanges can kill a site; put nofollows on them so Google knows you aren't endorsing the site or participating in link schemes. Be very careful who you link out to.
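As a sketch of that advice (my own illustration; the domain names and the whitelist are hypothetical), outbound links could be emitted with rel="nofollow" unless the target domain is on a list of exchange partners you've actually vetted and endorse:

```python
from urllib.parse import urlparse

# Hypothetical whitelist of link partners you've vetted and endorse.
TRUSTED = {"example-partner.com"}

def anchor(url, text):
    """Emit an <a> tag, adding rel="nofollow" for unvetted domains."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    rel = "" if host in TRUSTED else ' rel="nofollow"'
    return f'<a href="{url}"{rel}>{text}</a>'

print(anchor("http://example-partner.com/widgets", "Partner"))   # followed
print(anchor("http://slim-shady.example/", "Link Exchange"))     # nofollowed
```

A default-deny approach like this errs on the side of caution: a link only passes without nofollow when you've deliberately added its domain to the vetted list.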
I haven't heard many theories about what causes this penalty that make sense when you see large sections of a site go to the last 50 results, then back to the top 10, then back to the last 50, then back to the top 10... over and over again.
I too once thought that a few good links to an internal page that was pushed to the bottom of the results could get that page to come back. And I even saw it "work" once. But then a week or so later it was back at the bottom. In the end I believe it was pure coincidence: what I thought was something I had done to bring the site back turned out to be false, since within a few days it went back to where it was.
Some people have only been "cycled" down to the bottom once or twice.... others have been up and down with this phenomenon over a dozen times.