|Basically, if it was intended to be fair, they would then go and manually check all the other 30 sites and, to be sure, at the very least 25 in my segment would have gotten the same penalty which would have moved me up to #5 spot. But they will never do that ‘cause they’ll never have enough human reviewers for that. |
You don't know that it was a manual penalty, because--if there was a penalty--it could have been the result of multiple factors that just happened to trigger an automated penalizer. Maybe you were doing X, Y, and Z, while the other guys were doing only X and Y, Y and Z, or X and Z. Only Google knows.
But even if the penalty were applied by hand, so what? What does "being fair" have to do with anything? Google isn't refereeing a football game; it's trying to discourage behavior that has a negative influence on its search results. All it needs to do is penalize enough offenders to make the others reconsider their behavior. Think of a Google penalty as the equivalent of a traffic ticket: When drivers on the highway see a car pulled over, they tend to slow down (at least until the cop is out of sight).
One can argue that, if Google is playing traffic cop, the strategy isn't working. After all, Webmasters are still spamming, just as drivers are still speeding. But the situation won't be improved by eliminating penalties until such time as all offenders can be caught--just as speeding won't be stopped by taking cops off the street until all speeders can be caught.
|You are apparently missing the whole point of this thread. When this penalty is imposed onto your site, all your pages slide 30 positions down, regardless of the keyword. |
The logic to correct the problem is usually the same whether it is one page, 100 pages, or a whole site. The people who work for Google are pretty clear on what kinds of pages they favor in their index and which ones they don't, so there are usually some pretty standard things to check. Compared to getting banned, dropping to page 3 is often recoverable - it means you are still in the ballpark.
This is an interesting thread...I have about a hundred active sites that I've built up over the years. I have one site that ranks #31 for two ultra competitive terms....I've NEVER gotten past #31...I'll bounce around #31 to #38 and always come back to #31. Pretty sure that's an anchor text issue.
Another site I own has always been #11 to #14 for another large term, with unique content, varied anchor text, 15,000 pages....old domain...and thousands of natural back links... After reading this thread about keyword density I realized that I had put a large table in my footer with about 75 links to internal pages, with the primary keyword repeated in each link. Anyway, it was redundant and unnatural and there was really no need for it...so I removed it completely... my guess is that in 1-2 months I'll be bouncing around #2-#5 for my primary term, and it will probably boost all the other pages for other terms as well.
I can't wait!
Thanks for all the good advice!
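For anyone wanting to audit their own pages for that kind of repeated-anchor footer block, counting duplicate anchor texts is a quick check. A minimal sketch in Python - the `AnchorTextCollector` helper and the sample HTML are made up purely for illustration, not anything Google publishes:

```python
# Count repeated anchor texts in an HTML page, to spot footer-style
# blocks where one keyword is repeated across many links.
from collections import Counter
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collects the visible text of every <a> element."""
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.current = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.current = []

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.in_anchor = False
            text = "".join(self.current).strip().lower()
            if text:
                self.anchors.append(text)

    def handle_data(self, data):
        if self.in_anchor:
            self.current.append(data)

def repeated_anchors(html, min_count=2):
    """Return anchor texts that appear at least min_count times."""
    parser = AnchorTextCollector()
    parser.feed(html)
    return {text: n for text, n in Counter(parser.anchors).items() if n >= min_count}

# Invented sample page with a keyword-stuffed footer:
sample = (
    '<div id="footer">'
    '<a href="/a">red widgets</a> <a href="/b">red widgets</a> '
    '<a href="/c">about us</a> <a href="/d">red widgets</a>'
    '</div>'
)
print(repeated_anchors(sample))  # {'red widgets': 3}
```

A high count for one phrase doesn't prove a penalty, of course - it just flags the sort of block the poster above removed.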
There's what looks like a +10 report -- thanks. And also thanks for mentioning footer links. I have definitely seen troubles in that area. In fact, I've seen domains banned from Yahoo for going "over the top at the bottom", so it only makes sense that Google looks at it closely, too.
Does anyone else with a new +30 see extreme crawling?
[edited by: tedster at 5:20 pm (utc) on Oct. 15, 2006]
There was another thread where a link seller had the same problem.
To all: Could it be that more than one thing is in play here? Meaning links+something else? Could this be the "commit 2+ offences" penalty?
Ted, no crawling here.. one visit a day.. hoping the Googlebot takes a bite soon.. it's been months since a deep crawl for me.
No additional crawling noted here, just the typical handful of pages daily.
Jane, one can spend countless hours, days, weeks attempting to make "fixes", but if this penalty is manually imposed on one's domain - not pages, the domain - then all is for naught.
[edited by: tedster at 9:54 pm (utc) on Oct. 15, 2006]
Let's clarify: does everyone have problems with getting Google to spider the site (i.e. not enough Googlebot visits)? Can you post how many visits you get and how many pages you have in total?
Also, how often does your index get refreshed?
Googlebot makes about 100-200 visits a day and I have ~1400 pages (I have blocked it via robots.txt from outgoing and internal links such as "comment on this," "send page," etc.)
The index page is picked up daily but is refreshed on the cache every 5-6 days.
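For anyone unsure of their own numbers, Googlebot visits can be counted straight from the server access log. A rough sketch in Python, assuming a combined-format Apache/Nginx log - the sample lines below are invented, and a real audit should also verify the requesting IP rather than trust the user-agent string:

```python
# Count Googlebot requests per day from a combined-format access log.
# Matching on the "Googlebot" user-agent string alone is an assumption;
# fake bots spoof it, so verify IPs for anything serious.
import re
from collections import Counter

LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')  # captures e.g. 15/Oct/2006

def googlebot_hits_per_day(lines):
    """Return a Counter mapping date string -> number of Googlebot requests."""
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_DATE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

# Invented sample log lines:
sample = [
    '66.249.66.1 - - [15/Oct/2006:09:12:01 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [15/Oct/2006:09:15:44 +0000] "GET /page.html HTTP/1.1" 200 800 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.2 - - [15/Oct/2006:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
]
print(googlebot_hits_per_day(sample))  # Counter({'15/Oct/2006': 2})
```

Running this over a month of logs gives the visits-per-day figure being asked for above.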
|Jane, one can spend countless hours, days, weeks attempting to make "fixes", but if this penalty is manually imposed on one's domain - not pages, the domain - then all is for naught. |
I obviously don't know exactly what is in the algo, but I don't know why you guys think dropping 30 positions has to be a manual penalty. Out of the billions of web pages out there, only a pretty small percent most likely have manual penalties, and I suspect those penalties are reserved for sites caught doing something particularly egregious.
I would agree with Jane (and others) that this is most likely an automated penalty, but it certainly is significant when sites stop ranking for their own brand / domain name (or at least drop to 31st place) - while I can understand keyword searches benefiting from such a move, I can't see how it benefits relevance to remove a site from searches for the site name.
I also have a site affected by this - 5 year old site, haven't touched it in terms of SEO or link building in well over two years. DMOZ and Yahoo directory listing, some old reciprocal links, some affiliate links (a few low content pages). It's optimised at a very basic level - not even close to being considered "over optimised".
Not really that bothered about the penalty - slightly bemused more than anything (first site I've ever had penalised in 5 years of SEO!). :)
I'm looking into it in a bit more detail though and I'll happily share any findings. So far from various forum threads, I've seen a variety of theories, which collectively suggest (as has been mentioned) that it isn't a specific penalty for x, y or z but more a slap on the wrists for committing a minor infraction of quality guidelines.
[edited by: tedster at 11:51 pm (utc) on Oct. 15, 2006]
|haven't touched it in terms of SEO or link building in well over two years. |
That might be your problem right there.
SEO to rank for your own name is not needed. His site is 5 years old and he said that he has enough links to rank better than #31. If he were at #4 or #5 then we could argue that with some SEO he'd be on top.
[edited by: tedster at 12:47 am (utc) on Oct. 16, 2006]
|That might be your problem right there. |
Lol classic - sorry but perhaps I didn't clarify the situation. I have many sites that haven't needed a finger of work for years - they all grow naturally and easily maintain or improve their rankings. Certainly "no SEO" doesn't warrant a penalty - worst case scenario it might mean a drop in rankings, but not throwing absolutely everything to 31st place.
The name of the site is fairly unique (not a keyword) - I've used it for usernames in the past and now those profile pages outrank the site for a search for "sitename". Those aren't "quality" results regardless of how the content on my site is perceived. Ban the site completely - fine, I don't care - at least those end up being relevant results (based on what Google has in its index).
I've spent a lot of time dealing with sites that have had penalties slapped on them (clients coming from dodgy SEO agencies) - never seen anything like this. I've no doubt that I'll find my site's particular problem, but that really isn't the issue. This is affecting a lot of sites, many of which have only very grey or borderline infringements of quality guidelines. Perhaps better than an outright ban, but a significant change in SEO for Google (for better or worse) and certainly worthy of further research.
Google bot crawling:
About 10K/day on a 150,000-page site that was hit by the +30 penalty.
After more careful analysis on my data I would have to say that it's not all that different from what it was before the penalty. Well, maybe about 1K/day increase or less, so I might have called undue attention to an unrelated fact.
I remember MC saying this is applied when you become 30 years old (just kidding :)
I have been in 31st place since 23rd June 2006. Before this, the site was no. 1 for a competitive word in my language. Now, even for a search on ourcompanyname.com, it is in 31st place.
I have to say that the site was part of the Digital Point Coop and that it had links on every page (quite often to some low quality sites).
The site was fixed in June but is still in 31st place for every query that has more than 30 results.
Funny thing, at the last PageRank update it went from PR4 to PR6.
I have a site that has the same penalty for its domain name, BUT the site ranks well for keywords.
If I search for my domain I get a ton of scrapers and supplemental results in the top 30 then my site shows at #31.
If I search for keywords then the site ranks just fine, and for some very competitive terms as well.
Could this be some sort of keyword-overuse filter? I used to have the domain name in the title tag, so the many scrapers would have copied this in their links to me. They would have also copied other keywords from the title, so I am not clear why this happened. Just thought I would throw this out there, as I am only getting the penalty for the domain name and not the actual keywords.
This site has been around since 2001 so it is well established on the net. Anyone else have this happen to their site?
|while I can understand keyword searches benefiting from such a move, I can't see how it benefits relevance to remove a site from searches for the site name. |
Google probably feels that making the user dig deeper in such cases is a reasonable tradeoff for deterring bad behavior.
True, but when it affects sites that have no bad behaviour and have been punished unjustly for close to 7 months now.. things that make you go hmmmmmmmm.
[edited by: AustrianOak at 4:26 pm (utc) on Oct. 16, 2006]
|True, but when it affects sites that have no bad behaviour and have been punished unjustly for close to 7 months now.. things that make you go hmmmmmmmm. |
With automated penalties, some level of collateral damage is unavoidable. I doubt if Google has a policy of "let's give a -30 penalty to clean sites for no good reason just because it's fun."
It's a red herring talking about 'clean' sites or even 'dirty' sites. This is not about cheating or balls-out spamming with scraper sites by the 1000's. We are talking about thresholds for filters, and combinations of filters, that are looking for classic old-time 'optimised' sites. This is done to avoid predictable manipulation of the SERPs by any 12 year old, as was becoming the case. It's now about two things: sweet spots for total overall optimisation (just like keyword density was the IN thing for sweet spots) and authority/TrustRank scores. Get more of the latter and you can do more of the former. It is unfair to label sites that got tanked as 'not clean'. More than that, for the majority it's simply untrue. It's simply the state of the algo in the current filter.
"With automated penalties, some level of collateral damage is unavoidable. I doubt if Google has a policy of "let's give a -30 penalty to clean sites for no good reason just because it's fun.""
NO. I am sure they don't. But they do have a "We could care less.. we are gods" policy ;)
Unfortunately, there are no facts on whether there has been more collateral damage than objective achieved.
[edited by: AustrianOak at 6:37 pm (utc) on Oct. 16, 2006]
I have had the +30 penalty for 10+ months now. However, unlike some posters, my 30-place SERP penalty is on my targeted search terms - search terms that I ranked #1 on for all of 2005 - not my domain name.
Since I have been in this business for 8 1/2 years, you can imagine how much analysis, content changing, content refreshing, testing, etc. I have done. Nothing has worked.
After all my analyses, I am convinced this penalty is manually applied, not algorithmic. Whatever triggered the 30+ penalty, and I really have no idea what it could have been, has been removed now, given all the changes I made. Let me say that to my knowledge I have never spammed Google. Therefore, if it were algorithmic I would have either risen or fallen in the rankings, not stayed static at #31.
I made a post regarding this at [webmasterworld.com...] and only got 3 replies, from newbies. So either no one has had much luck getting this penalty removed, or they aren't saying.
So I am really at Google's mercy. And that is not a comfortable position to be in. They applied the penalty manually. After 10 months, it is time they at least reviewed it and took it off manually. Or tell me what is wrong and I will gladly fix it.
And all the "notifying webmasters of penalties" BS they have put out is just that. I have filed reinclusion requests, volunteered for an evaluation, etc. I check my Webmaster Tools daily and no one at G has contacted me.
I think it is time for Google Guy, Adam or some of the other G knowledge base to address this 30+ penalty issue. What about it guys?
This thread is very interesting, as my site is also EXACTLY #31 in Google searches, down from #1.
I gave a couple of my little sites a text anchor link last year; since then, the pages that the dup linked to do not appear in the G index and my site has dropped to #31 - interesting, would you say. I am still fighting the battle, but am running out of ideas.
How on earth can you possibly say it was applied manually? Given that Google likes to automate as much as possible, and that 1000's of sites have suffered in the same way, how do you conclude it was manual?
|NO. I am sure they don't. But they do have a "We could care less.. we are gods" policy ;) |
Sure, just like we're gods in our own little heavens. :-)
|Unfortunately there is no facts if there has been more collateral damage then their objective achieved. |
Collateral damage is likely to be highest when a change takes place, with a decline as bugs are identified and fixed. So trying to guess at a figure would be like trying to zero in on a moving target.
One of my sites got hit October last year. It used to rank high for virtually whatever I targeted, and had about 5000 strong pages; then overnight it was all gone. I always thought it was only a matter of time, as I was doing some pretty competitive stuff at the time and was pushing the boundaries a bit. It was also a site that I had used in the various SEO competitions.
As I was expecting it I didn't even bother about trying to do anything about it and just left it as it was, then back in January I deleted all but about 50 pages and tidied everything up.
It is now crawled every 3 or 4 days, is PR5, but will not rank for anything.
I've just checked the obvious search phrases,
<Sorry, no specific search terms.
See Forum Charter [webmasterworld.com]>
and I am 31 for all of them.
Looks like you guys hoping it is only a temporary thing are in for a rough ride.
[edited by: tedster at 9:10 pm (utc) on Oct. 16, 2006]
Blimey... you deleted 4950 pages of a 5000 page site and expected to come back?
|zero in on a moving target |
I believe this is exactly how a patriot missile works!
I left it as it was for at least 3 months, but with no sign of recovery.
A couple of thousand weren't particularly good pages and were out of date anyway, and a lot of the content I moved sideways to another site, so I didn't want to leave it up on the original for fear of a duplicate content penalty.
Basically, I just put the site in to mothballs and concentrated on another site. I was of the opinion that whatever I did the site would not recover.
"Collateral damage is likely to be highest when a change takes place, with a decline as bugs are identified and fixed. So trying to guess at a figure would be like trying to zero in on a moving target."
Exactly.. now you get it!
I am equally 100% sure this is NOT a manual penalty. Why manually penalise 30 places? If it's worth a manual penalty, it's worth removal. I've yet to see a single site that's been hit by the plus-30 penalty that does not have some or all of these problems:
duplicate content caused by bad content management system
big links campaign
errors in html
keyword anchor spam
Usually, you need at least 3 of these going on big time. I HAVE heard of people recovering, and it's by fixing these issues.
If your site is one big affiliate site and has a massive duplicate-anchor site map, e.g.:
eg red widgets
greeny blue widgets
bluey green widgets
widgety widgida widgets
then you've tripped a filter.
[edited by: tedster at 1:16 am (utc) on Oct. 17, 2006]
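A crude way to test a site map for that "widgety widgets" pattern is to check what fraction of its anchor texts share one keyword. A quick sketch in Python - the anchor list and the 0.5 threshold are arbitrary illustrations, not a known Google cutoff:

```python
# Flag a list of anchor texts where most entries are minor variations
# on a single keyword (the duplicate-anchor site-map pattern above).

def keyword_share(anchors, keyword, threshold=0.5):
    """Return True if `keyword` appears in more than `threshold` of anchors.

    The 0.5 default is an arbitrary illustration, not a known filter value.
    """
    hits = sum(1 for a in anchors if keyword.lower() in a.lower())
    return hits / len(anchors) > threshold

# Invented example mirroring the site map above:
anchors = [
    "red widgets",
    "greeny blue widgets",
    "bluey green widgets",
    "widgety widgida widgets",
    "contact us",
]
print(keyword_share(anchors, "widgets"))  # True (4 of 5 anchors contain it)
```

If most of a site map's anchors trip a check like this, varying or pruning them is the obvious first fix.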