| 7:46 pm on Feb 3, 2009 (gmt 0)|
Whether it's algorithmic or manual is a moving line. Many types of ranking drops are first introduced as manual but later moved to an automated system.
The most common causes for across-the-board ranking drops involve linking that seems manipulative or involves a "bad neighborhood". An undetected server hack that cloaks for Googlebot is often a culprit these days.
An iframe might trigger a penalty in some cases - but that would be a Google bug, because there's certainly no prohibition about using an iframe if the framed page is legitimate. One publicized case of an incorrect Google penalty for an iframe happened last year. According to Matt Cutts, Google does run an automated routine looking for "suspicious" large blank areas in a page's layout. In that case, the algorithm incorrectly identified one site's iframe as such a suspicious area and an automatic penalty was put in place.
I've been looking for that reference, but it eludes me right now. I believe it was a case on the Google Webmaster Help forums, and either Matt Cutts or JohnMu got involved. We discussed it here, too, but that discussion may have been just part of a larger thread.
These ideas I just listed are far from an exhaustive list of possibilities, just the first few things I thought of. If your ranking problem just happened today, I'd wait a couple of days before taking action. It may be a Google bug that clears up on its own. But do use the time to scrutinize your site for technical issues, undetected hacks and so on.
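One quick way to scrutinize for the kind of hack mentioned above - one that cloaks spam for Googlebot - is to fetch the same URL with a browser user-agent and a Googlebot user-agent and compare what comes back. Here's a rough sketch in Python (standard library only); the 20% length threshold is an arbitrary assumption, and dynamic pages will vary between fetches, so treat a flag as a hint to investigate, not proof:

```python
import urllib.request

def fetch(url, user_agent):
    """Fetch a URL with the given User-Agent header and return the body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(browser_html, bot_html, threshold=0.20):
    """Heuristic: flag if the two versions differ in length by more
    than `threshold` (default 20%). A hacked server that injects spam
    links only for the crawler typically produces a much larger page."""
    longer = max(len(browser_html), len(bot_html), 1)
    return abs(len(browser_html) - len(bot_html)) / longer > threshold

def check_url(url):
    """Compare what a browser sees vs. what Googlebot sees."""
    browser = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    bot = fetch(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)")
    return looks_cloaked(browser, bot)
```

Many hacked-server cloaks also check the visitor's IP range, which a user-agent swap alone won't catch, so a clean result here doesn't fully rule out a hack - Google's "Fetch as Googlebot" tool in Webmaster Tools is the more reliable check.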
If rankings don't come back quickly, then you may need to send a reconsideration request -- and in that case, it's always good to have a few things to say about clean-up efforts you've made.
| 8:06 pm on Feb 3, 2009 (gmt 0)|
I did do some needed cleaning that may have triggered something, but I doubt it.
I wonder if this has anything to do with our minus 50... I had an irate customer who claimed to have gotten tons of spam after visiting our site (not from us, possibly from one of our partners). What if they filled out a spam report with Google? Would that trigger a filter/penalty? I would imagine if it did, we would be gone altogether... thoughts?
| 8:08 pm on Feb 3, 2009 (gmt 0)|
A spam report, on its own, will not cause a penalty - not even a flood of spam reports. However, if a Google person investigates and finds a problem on your site, THEN you might have trouble.
| 8:09 pm on Feb 3, 2009 (gmt 0)|
|SPAM report with Google? Would that trigger a filter/penalty |
Not a chance this is the cause of the filter.
| 8:34 pm on Feb 3, 2009 (gmt 0)|
That is good to know...
Hopefully it is a glitch because the site is pretty squeaky clean.
| 9:47 pm on Feb 3, 2009 (gmt 0)|
The -50 across the board is usually algorithmic; when you say the site is squeaky clean, I assume you mean both on-page and off-page? No purchased links?
| 10:07 pm on Feb 3, 2009 (gmt 0)|
I'll never forget a few years ago when "squeaky clean" Google itself had a very spammy outbound link on one of their Help pages for a few days. It was apparently placed by a not-so-honest employee at the time.
We've had many, many threads here about "squeaky clean" sites that eventually discovered either a hacked server, a shady employee/contractor, or a major technical oversight. So definitely, take nothing for granted.
You know what you intend, but that may not be what's actually happening on your server.
| 10:47 pm on Feb 3, 2009 (gmt 0)|
I have had some of my sites in the -50 box for a few months ... there is no chance to recover without a reconsideration request (possibly no chance to recover at all) ... I think it's a manual penalty.
| 10:57 pm on Feb 3, 2009 (gmt 0)|
No purchased links...
| 11:28 pm on Feb 3, 2009 (gmt 0)|
I have checked hundreds of sites in the -50 box and the common issue is not enough value for the visitor (poor design, thin URLs, no valuable content) ... that's all I can say.
| 11:51 pm on Feb 3, 2009 (gmt 0)|
Well... that's not us. We add unique content regularly...
| 12:18 am on Feb 4, 2009 (gmt 0)|
When you type only your domain name as a search without the extension, is your site appearing on the first page?
| 12:50 am on Feb 4, 2009 (gmt 0)|
Would there be some sort of identifier in my logs for a human review?
Also, for the past week or so up until we got hit, one of our internal pages was piggybacking on our home page for our money term in rankings. That interior page has little content... it's a calculator. I was surprised it was ranking for the money term. Perhaps that tripped something?
| 9:20 am on Feb 4, 2009 (gmt 0)|
textex, you definitely got the penalty/filter. There have been other threads about this, and possible solutions from some people who got out of it. Please search.
| 7:01 pm on Feb 4, 2009 (gmt 0)|
|...its a calculator. I was surprised it was ranking for the money term. |
Did you do any intentional link building for that calculator page?
| 8:27 pm on Feb 4, 2009 (gmt 0)|
Not at all...
| 1:32 pm on Feb 9, 2009 (gmt 0)|
textex check your navigational menus.
a) Usually I see the "megamenu" problem; it has spread, and for me it's now an automated penalty. Think about your users and ask yourself if you can simplify the navigational logic inside your website.
b) reduce external links with the same anchor text or with similar anchor texts.
c) check linking homepage from internal pages.
| 3:39 pm on Feb 9, 2009 (gmt 0)|
|I've been looking for that reference, but it eludes me right now. I believe it was a case on the Google Webmaster Help forums, and either Matt Cutts or JohnMu got involved. We discussed it here, too, but that discussion may have been just part of a larger thread. |
Matt Cutts on Usenet [groups.google.com]
(Feb 22, 2008)
P.S. It's interesting Matt says, "It's definitely possible to extricate your site, but I would make an effort to contact the sites with your sponsored links and request that they remove the links, and then do a reconsideration request. Maybe in the text of your reconsideration request, I'd include a pointer to this thread as well."
I didn't know Google actually read those Recon Reqs. :)
| 6:50 pm on Feb 9, 2009 (gmt 0)|
In this thread (Feb 22, 2008) it becomes clear that a link devaluation is always connected to some sort of negative flag added to the domain. Since MC suggests a reconsideration request, it means to me that just removing those links will not remove the link devaluation for the whole site.
| 11:20 pm on Feb 12, 2009 (gmt 0)|
When I see someone say they have a penalty and they swear the site is "squeaky clean", I have doubts. What they often mean is that there is nothing they have done that Google could ever detect, on-page or off-page. Google can and will penalize your site for off-page factors. If you are buying links or using linking schemes like counters and WP themes, they will get you. The whole "you can't get a penalty from external factors" idea is false. If you do something in bulk, Google will find it.
It won't be from just doing a few things. You have to be doing a lot of it. Google assumes that nobody would take that big of a risk and expense to get somebody penalized. Large authoritative websites are immune to this kind of penalty. Large authoritative websites don't have the same rules as most websites.
| 12:40 am on Feb 13, 2009 (gmt 0)|
We don't buy links, nor do we exchange links. Period.
I have noticed that many of the links to our site use our company name, 'Blue Widget Center', as the anchor text. I think that lack of variation may come across as spammy, even though we had never followed who links to us and how until now.
| 3:15 am on Feb 13, 2009 (gmt 0)|
I'm seeing and hearing enough reports about minus-50 drops that I'm wondering whether Google has turned up some knobs, and/or whether this might be the -950 in different packaging.
I've also just taken a look at a blue-chip site that I wouldn't expect to see dropping, and misterjinx appears to be right on. Megamenu was right there.
Additionally, at least with this site, there seems to be an over-optimization component... perhaps triggered by too much use of target keywords in anchor text from the home page. It's not what I'd call intentional over-optimization... just collateral damage from thinking about ranking for certain terms too much. The site appears to be at a threshold level that Google might adjust, but it's hard to say which way they'll go.
| 11:39 am on Feb 14, 2009 (gmt 0)|
We just had 2 sites out of 4 released from the -50.
The penalty was definitely for incoming links. We were using link schemes to get loads of crap links to the sites, and that was paying off for a long, long time until we got busted.
We stopped using the crap link networks for all 4 sites immediately. We filed reinclusion for the sites.
2 of the 4 sites were released after exactly 90 days.
The other 2 seem stuck in there for some reason.
Each of the 4 sites is a different site. All are in one vertical, but covering various niches and offering various products/services.
Any ideas anyone?
| 7:14 pm on Feb 14, 2009 (gmt 0)|
I was convinced my client's penalty was based on how we got the links. After reading a bunch of threads on the subject, I'm now thinking it was how they were linked to. I think Robert is correct. Google turned a few knobs on their penalty algo and then ran it back in October or so. I think they have tuned it down a little very recently.
What Google likes to do every once in a while is something drastic to clean out the system. They can learn a lot from that. They get tons of reinclusion requests. They now know which sites are flagged as having an SEO involved. They really like to do it right before Thanksgiving. Google likes to learn as they go along. They do something and see how many people scream. They can then take all the reinclusion requests and study those sites. They can also take Google Analytics data and see who they knocked out. There is no doubt in my mind that they use GA data.
They always say that the departments don't talk, but I don't believe that for a minute. If you get a $10,000 check from AdSense, your website is going to get a once-over from the spam team. It is very easy to sift through all the sites that get big AdSense checks because there are not that many.
| 11:25 pm on Feb 16, 2009 (gmt 0)|
"c) check linking homepage from internal pages."
Could you explain how problems could arise from linking to homepage from internal pages?
| 12:36 am on Feb 17, 2009 (gmt 0)|
-> linking homepage from internal pages
There are lots of threads on this. Basically, if you use link text other than "home" on links back to your homepage - for example, if you linked "blue widgets" from the footer of all 10,000 of your pages, that would be some pretty serious link text - Google tends to take a dim view of this tactic.
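If you want to see how heavy that footprint is on your own site, you can tally the anchor text used on links pointing back to the homepage in a page's HTML. A rough sketch using only Python's standard library; the set of homepage hrefs (and example.com) is an assumption you'd adjust for your own site:

```python
from collections import Counter
from html.parser import HTMLParser

class HomeLinkTally(HTMLParser):
    """Count the anchor texts used on links pointing at the homepage."""

    # Hypothetical homepage URLs - adjust for your own site.
    HOME_HREFS = {"/", "/index.html", "http://www.example.com/"}

    def __init__(self):
        super().__init__()
        self.counts = Counter()
        self._in_home_link = False
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href") in self.HOME_HREFS:
            self._in_home_link = True
            self._text = []

    def handle_data(self, data):
        if self._in_home_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_home_link:
            self._in_home_link = False
            # Collapse whitespace so "blue  widgets" and "blue widgets" match
            anchor = " ".join("".join(self._text).split())
            if anchor:
                self.counts[anchor] += 1

def tally_home_anchors(html):
    """Return a Counter of anchor texts on homepage links in `html`."""
    parser = HomeLinkTally()
    parser.feed(html)
    return parser.counts
```

Run this over a sample of your pages: if the tally shows the same keyword-rich phrase ("blue widgets") on virtually every page instead of "Home", that's the sitewide footprint described above.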
| 1:30 am on Feb 18, 2009 (gmt 0)|
Linking to the homepage from all pages with keyword-rich anchor text works wonders in some other search engines. I don't think it causes enough negative effect in Google to justify not doing it.
| 2:45 am on Feb 18, 2009 (gmt 0)|
In the worst case it will cause a -950, but this is a different topic.
| 5:47 am on Feb 18, 2009 (gmt 0)|
SEOPTI, that is so wrong. I do it all the time on sites and rank very well in MSN, and do not have any penalties in Google.
[edited by: tedster at 6:03 am (utc) on Feb. 18, 2009]