
Google SEO News and Discussion Forum

The "Minus Thirty" Penalty?
#1 yesterday and #31 today
1script
msg:3119217 - 2:36 am on Oct 13, 2006 (gmt 0)

Hello everyone,

My site just dropped to #31 for a search on its own domain name, and a bunch of keywords/phrases I usually watch were bumped from #1 to precisely #31. The rankings that were at #2 through #10 are sort of all over the map, but generally within the first 60 results.

Does anyone have experience with this? What does the respected audience here think the most likely reason for such a penalty is? What do you suggest as the best strategy for fixing it?

There hasn't been any major redesign recently, just routine adding of pages here and there. Some unique, some syndicated industry-related content.

Thanks for any idea or comment!

D~

 

Jane_Doe
msg:3128879 - 5:42 pm on Oct 20, 2006 (gmt 0)

"It would be interesting to hear if anyone who has had the -31 for domain name has ever recovered."

Sure, all the time, and from worse than -30. See Ted's post above.

I've only had one penalty that I thought was really undeserved; I wrote to Google about it, and my site got put back in the index.

[edited by: Jane_Doe at 5:46 pm (utc) on Oct. 20, 2006]

hvacdirect
msg:3128899 - 6:03 pm on Oct 20, 2006 (gmt 0)

Nice post, tedster.

I may be completely off base on this, but here are my thoughts. The -30 penalty seems different from most penalties in that it's an obvious adjustment of a page's rank. Whatever has tripped the filters isn't bad enough to ban the page, and isn't bad enough to move it to -300, but it is bad enough that the page cannot be on the first page of the SERPs. Perhaps Google should have added some randomness to it, dropping a page anywhere from 25 to 50 spots, and no one would have noticed the pattern :)

To me it looks like a determination that the page is:

1) Not offering users the experience they expected when they clicked the SERP result back at its original spot (30 positions higher).

or

2) By all other indications (PR, keyword content, links, body text, etc.) deserving of that original spot, but it has been moved down due to something else.

Could this be the result of user-experience monitoring via click-backs? In other words, Google places the page in a spot and monitors all of the pages above and below it. It then notices that when people click on said page, they often click back and then go to a page below it, while the pages above and below it keep performing within a prescribed range. For example, let's say Google has statistics (and they would have a lot of data to back this up) showing that for the top 5 positions, the normal click-back percentage looks like this:

1: 10 %
2: 15 %
3: 16 %
4: 18 %
5: 22 %

But a page is sitting at #1 and its click-back rate is actually 45%, so they drop it to the 3rd page of the results, where it's normal to see a 45% click-back rate. (A toy sketch of this mechanism follows below.)
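To make the arithmetic of that hypothesis concrete, here is a minimal Python sketch. The baseline table comes from the figures above; the tolerance, the function names, and the assumed linear climb in normal click-back rates further down the results are invented for illustration, not anything Google has confirmed.

# A toy model of the click-back demotion hypothesis. All constants and
# names here are hypothetical; only the idea comes from the post above.

# Hypothetical "normal" click-back rate for each of the top 5 positions.
NORMAL_CLICK_BACK = {1: 0.10, 2: 0.15, 3: 0.16, 4: 0.18, 5: 0.22}

def assumed_normal_rate(pos):
    """Assume the normal rate keeps climbing further down the results:
    roughly 10% at #1 plus about 1.2 points per position (made up)."""
    return 0.10 + 0.012 * (pos - 1)

def demote_to_matching_position(current_pos, observed_click_back, tolerance=0.05):
    """If a page's click-back rate is well above normal for its spot,
    move it down to the position where that rate would look normal."""
    expected = NORMAL_CLICK_BACK.get(current_pos, assumed_normal_rate(current_pos))
    if observed_click_back <= expected + tolerance:
        return current_pos  # behaving as expected: leave it alone
    pos = current_pos
    while assumed_normal_rate(pos) < observed_click_back and pos < 100:
        pos += 1
    return pos

# The example from the post: a #1 page with a 45% click-back rate ends
# up around the third page of results.
print(demote_to_matching_position(1, 0.45))  # -> 31 with these numbers

With these invented constants the demotion happens to land on exactly #31, but that is an artifact of the numbers chosen, not evidence for the theory.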

If this is true, then how can one fix it?

1) You could improve the page so that users are more satisfied; assuming Google is still monitoring it, it should move back up.

or

2) You could remove some of the SEO, and the page will naturally rank in the lower spot.

To me it seems like this may be an indication that the page is already optimized, actually optimized so well that Google is expecting more out of it. So why not improve the page?

This could be another tool in their arsenal to fight off the super SEOs and MFA sites that know how to get the right link juice, keyword optimization, etc. to get a page high in the SERPs, only to give the user a bunch of ads to follow further.

Then again, I could be all wrong. Thanks for reading my ramblings.

tedster
msg:3128914 - 6:19 pm on Oct 20, 2006 (gmt 0)

Nice post, hvacdirect.

"The -30 penalty seems different from most penalties in that it's an obvious adjustment of a page's rank."

Bingo -- that is the factor that makes many feel this may be manually applied.

I hadn't specifically thought about Google's user data -- the clicks back to the SERP for a different choice. That could be one of many tip-offs. Now I doubt some of the affected sites I've looked at would suffer from such an affliction, but many of them certainly might.

The "get a good rank and then switch out the page" crowd is another factor I wasn't considering. It's certainly possible, especially since Google has an extremely long memory for historical changes these days. But again, I haven't spotted anything quite like this.

MrStitch
msg:3128948 - 6:37 pm on Oct 20, 2006 (gmt 0)

hvacdirect: Now THAT'S a good post.

I've been preaching the same thing for quite some time. I've seen lots of pages rank extremely well when the page is nothing but garbage.

For example: a particular site is #1 for its competitive term. If you look at the stats, it has 50,000 unique visitors a month but only 85,000 page views.

This tells me that a major percentage of visitors don't see anything useful immediately and just leave. The portion that stays gets to the second page and leaves.

Looking at the site, you can definitely see why... it sucks, in both functionality and appearance.

I was hoping that at some point, Google would start tracking these types of things, and make it part of their algo.

theBear
msg:3128964 - 6:53 pm on Oct 20, 2006 (gmt 0)

"For example: a particular site is #1 for its competitive term. If you look at the stats, it has 50,000 unique visitors a month but only 85,000 page views."

Another alternative view: 35,000 people found what they wanted on the first page, and 15,000 people had to go on to other pages. (A quick arithmetic check is sketched below.)

hvacdirect
msg:3129008 - 7:23 pm on Oct 20, 2006 (gmt 0)

"Another alternative view: 35,000 people found what they wanted on the first page, and 15,000 people had to go on to other pages."

That's the hard part about user data: it can all mean different types of satisfaction.

If I'm searching for the density of gold, click a page, and find it, I'm done; I click back and search for something else, a perfectly satisfied customer. On the other hand, if I'm searching for the Declaration of Independence, find it, and it's broken up into 10 different pages, I'd have to be on the site for a while, and might then notice an ad to buy a copy of it and go there... Both are different user actions, but both are good results.

theBear
msg:3129044 - 7:44 pm on Oct 20, 2006 (gmt 0)

Is it not the point of a search engine to land you on the page that has what you want [not that it happens in all cases]?

Would not that fact result in a lot of one-page visits?

Even if one treats going back to the search results and trying another one as an indication of failure to find what was wanted on the first page, there is no way to tell for certain that the user didn't find what they were looking for on the first page and is simply surfing to see if another page offers additional information.

It is all a wonderful statistical exercise, but possibly one that starts out with an incorrect assumption.

[edited by: theBear at 8:09 pm (utc) on Oct. 20, 2006]

AndrewSlk
msg:3129327 - 12:06 am on Oct 21, 2006 (gmt 0)

Hello All,

I saw this topic a few days ago, but I didn't come back to it until today, when my site was moved from the #1, #2, and #3 positions for different keywords to exactly the top position on the 4th page for ALL of those keywords. BUT the rankings for all the other keywords, for which my site was on any page other than the first, did not change.

Let's say I have the site spectacularwidgets.com. My site was on page #12 for the search keyword (SK) "widgets" before the -30 penalty, and now it's on the same page. But for the SK "spectacularwidgets.com" it's on page #4, after 3 pages of supplemental spam sites (it was #1), as it is for other SKs such as "spectacular widgets", "awesome widgets", and "widget beautification", which had also shown my site on page #1.

My site is 6 years old, consists of 4 pages, has PR4, has no duplicate content at all, and I didn't make any changes to the site layout or content before the penalty. (Google has the same cache for my site now, after the penalty, as it did just 5 days ago, before the penalty.)

All this would not be interesting except for one thing.

One year after I created my site (let's say it's spectacularwidgets.com again), someone bought the domain spectacular-widgets.com and also made it about "spectacular widgets". After a few years, my site stood at #55 on Google for the SK "widgets" and his was #1 for the same SK. Both of our sites held those places for a very long time, until the summer updates, when my site was dropped to page #11 for the SK "widgets" and his dropped from #1 to #5. And yesterday his site was also dropped, from page #1 to page #4 for the SK "widgets", while I'm still at my previous position.

So both of our sites received a -30 penalty on the same day, despite having different content, different inbound and outbound links, and different site layouts. Our sites are different except for one thing: they both have the same subject AND they are both resource DIRECTORIES with links to many highly related resources about "spectacular widgets", each with a unique description. I'll say more: our link databases are quite unique and highly related to our narrow subjects, and there are only a few such resources on the web, each with its own links and texts.

So to me it looks like Google decided that directory sites are all well optimized and naturally high-ranking, but not so interesting for surfers, and perhaps that was the reason for the penalty.

Look at your sites affected by the "-30 penalty". Do they look like directories or link-resource sites? If yes, perhaps we have found one more cause of this penalty.

And in general, this penalty looks very strange to me. For 5 years our sites were highly ranked in Google, and they were good enough for Google until yesterday. If this is a Google algorithm, it is a very human-like algorithm. :-/

[edited by: tedster at 12:23 am (utc) on Oct. 21, 2006]
[edit reason] widgetize the language [/edit]

Reilly
msg:3129669 - 9:27 am on Oct 21, 2006 (gmt 0)

Here is the answer to all the questions:

[groups.google.com...]

[edited by: tedster at 3:10 pm (utc) on Oct. 21, 2006]

AustrianOak
msg:3129833 - 3:07 pm on Oct 21, 2006 (gmt 0)

Thanks Reilly, but that link has already been posted in this thread.

It's a good thread... but it doesn't answer everything... nothing will, apart from a direct note/message from Google, perhaps via the webmaster tools site, stating the exact reasons for the ban/penalty. The guidelines are obviously not enough, and haven't been for years.

[edited by: AustrianOak at 3:34 pm (utc) on Oct. 21, 2006]

wackybrit
msg:3129898 - 4:00 pm on Oct 21, 2006 (gmt 0)

In my case the affected site is a long-running blog with no black-hat SEO that's a PR7. Its rankings were excellent until the late August update... then crashed until late September... then were great again until today. So now it has crashed down the listings again. It's crazy; Google are having a real play about.

At least I know it's losing them money as well, since my AdSense income crashes 90% whenever this happens :)

wackybrit
msg:3129903 - 4:03 pm on Oct 21, 2006 (gmt 0)

Actually, based on some comments above... I wonder if they use AdSense to track it as well.

My AdSense CTR on my blog is a very healthy double digits (less than 50 but higher than 10)... so I wonder if they think "users aren't finding relevant stuff, so let's cut out the middleman"... and then apply a blanket drop to the WHOLE site.

In my case, the posts that get the good CTRs are the blog posts where I've reviewed a certain group of products, and the people in the AdSense ads sell those products... so I figured it was a *good* thing for everyone... but perhaps Google doesn't see it that way.

tflight
msg:3129937 - 4:43 pm on Oct 21, 2006 (gmt 0)

"My AdSense CTR on my blog is a very healthy double digits (less than 50 but higher than 10)... so I wonder if they think 'users aren't finding relevant stuff, so let's cut out the middleman'... [...] In my case, the posts that get the good CTRs are the blog posts where I've reviewed a certain group of products, and the people in the AdSense ads sell those products."

You know, I wonder if you're onto something here. I can't help thinking that, because so many people have gone up and down in these data refreshes without changing their sites, the underlying factor must not be something "on-page".

Also, since the data refreshes used to happen only once every few months, then monthly, and now seemingly every 2-3 weeks, the "factor" we are looking for must be something aggregate about a site that needs a bunch of calculation and then gets pushed out in these weekly/monthly batches.

One of my sites impacted by these data refreshes is alarmingly similar to what you have described. For each page view, the CTR to a paying ad (from a variety of sources, not just AdSense) is in the same range you mentioned.

I also noticed a small trend in which pages were more impacted in a data refresh than others. I just realized something those pages all have in common... a high CTR to advertisers.

So in short, I'm almost willing to go along and think that perhaps visitor behavior accumulated by the Google Toolbar is being tallied, and if you look "like a middleman", as you say, that isn't a good thing in Google's eyes.

Perhaps this is all part of a crackdown on MFA sites... get rid of the sites where people arrive from the search engine results but don't stay very long.

Maybe this could even explain why we see such a "yo-yo" effect from data refreshes... I know from my visitors that people who come from Google search results are more likely to click on ads. So when my site is ranking well, my CTR goes up. Then a data refresh knocks me out and my CTR goes down. With the lower CTR, thanks to fewer click-happy Google visitors, my site no longer looks as bad, and it then returns in the next data refresh. (A toy simulation of this loop is sketched below.)
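To see why such a loop would oscillate rather than settle, here is a toy simulation in Python. All of the numbers (the two CTR levels, the threshold, the refresh count) are invented; only the feedback structure comes from the description above.

# A toy simulation of the "yo-yo" feedback loop described above.
# Every constant is invented; only the loop's structure is from the post.

CTR_WHEN_RANKING = 0.30   # click-happy Google visitors inflate CTR
CTR_WHEN_DROPPED = 0.10   # without Google traffic, CTR falls back
PENALTY_THRESHOLD = 0.20  # hypothetical "looks like a middleman" cutoff

ranking_well = True
for refresh in range(1, 9):
    # CTR observed since the last refresh depends on where the site ranked...
    observed_ctr = CTR_WHEN_RANKING if ranking_well else CTR_WHEN_DROPPED
    # ...and the next data refresh judges the site on that observed CTR.
    ranking_well = observed_ctr <= PENALTY_THRESHOLD
    print(f"refresh {refresh}: ctr={observed_ctr:.0%} -> "
          f"{'ranking' if ranking_well else 'demoted'}")

# The output alternates demoted/ranking forever: the penalty removes the
# very traffic that triggered it, so the next refresh lifts it again.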

I can say, too, that I can rationalize this theory by looking at my historical CTR data. In the weeks leading up to a loss in rank, my CTR is a little higher than average. In the weeks leading up to a recovery, my CTR is lower than average.

And one more thing... this might also explain why some pages on my site don't seem to be impacted as much by the data refresh. Popular pages with a low CTR never seem to be impacted as much as popular pages with a high CTR.

Perhaps this is what the Google people are cryptically referring to when they keep telling us to make sure we are providing something useful to visitors. Perhaps they rate the "usefulness" of a page by how often a visitor views a page on your site and clicks through to another page on your site, versus clicking away to another site.

wackybrit
msg:3130169 - 10:03 pm on Oct 21, 2006 (gmt 0)

"Maybe this could even explain why we see such a 'yo-yo' effect from data refreshes... So when my site is ranking well, my CTR goes up. Then a data refresh knocks me out and my CTR goes down. With the lower CTR, thanks to fewer click-happy Google visitors, my site no longer looks as bad, and it then returns in the next data refresh. [...] In the weeks leading up to a loss in rank, my CTR is a little higher than average. In the weeks leading up to a recovery, my CTR is lower than average."

This is a compelling theory, since I've noticed the same thing.

I also think PageRank plays a small part. My PR7 homepage has not fallen thirty places, only from #1 to #8 on a search for the name of the site (my real name)... whereas the high-CTR pages, on the searches they ranked well for, have fallen from #1 to about #95-#100 in every case (way more than a minus thirty!).

tedster
msg:3130182 - 10:26 pm on Oct 21, 2006 (gmt 0)

wackybrit -- from your comments, it seems that you are affected by something other than this "minus thirty" phenomenon. Just to keep this thread on target, let me again focus on the symptoms: EVERY search that previously returned a top rank for a URL from one of these affected domains now shows that URL at position #31. This is even true for a search on the domain name itself, which normally returns the domain in the first position.

The fact that many of your results have shifted downward by different amounts (depending on the search involved) is a different picture. Not a pretty one at all, but not the specific thing we are discussing here.

wackybrit
msg:3130281 - 12:27 am on Oct 22, 2006 (gmt 0)

From all I've read, though, it seems like slightly different symptoms of the same disease (the bobbing up and down on refresh days, the arbitrary drop of most rankings). It may be that, because the site is PR7, it gets slightly different treatment.

In any case, I haven't got any more to add, so I shouldn't be muddying the waters here any more :) but I do await other reports, in case they share some of the same hallmarks.

walkman
msg:3130440 - 4:32 am on Oct 22, 2006 (gmt 0)

Being dropped past #31 seems to be a worse version of it. Maybe a certain number of the bad signals have to match before a site gets relegated there.

shredder
msg:3131163 - 1:15 am on Oct 23, 2006 (gmt 0)

Hi,
I'm from Germany. Three of my websites have been dropped to #31 since August 11. The possible reason is unclear. Today I registered with Sitemaps and filed a reinclusion request. For easier navigation I had text anchors on my sites; I have removed them. I hope it will work.

walkman
msg:3131242 - 2:14 am on Oct 23, 2006 (gmt 0)

Just tried an allinurl:domain.com search, and I am at the bottom, with a parked e-domain.com (with ads) and some domain.com.co.tlds ahead of me.

With &filter=0 I am on top, and the rest of my pages are there as well. Does this tell you anything? I know I am filtered, but does the allinurl result hint at something more specific?
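For anyone who wants to repeat that check, here is a minimal sketch, assuming the 2006-era Google parameters (allinurl: and &filter=0) behave as described in this thread; example.com is a hypothetical stand-in for the real domain.

# Build the two SERP URLs for the filtered-vs-unfiltered comparison.
from urllib.parse import quote_plus

domain = "example.com"  # hypothetical stand-in for the poster's domain
base = "http://www.google.com/search?q=" + quote_plus(f"allinurl:{domain}")

print("Filtered:  ", base)                # default SERP, duplicate filter on
print("Unfiltered:", base + "&filter=0")  # same query with the filter disabled

# Compare by eye where the domain sits in each result set: buried with
# the filter on but on top with filter=0 suggests a filter is suppressing
# the site rather than an outright ban.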

DXL
msg:3133675 - 10:19 pm on Oct 24, 2006 (gmt 0)

"For example: a particular site is #1 for its competitive term. If you look at the stats, it has 50,000 unique visitors a month but only 85,000 page views. This tells me that a major percentage of visitors don't see anything useful immediately and just leave. The portion that stays gets to the second page and leaves. Looking at the site, you can definitely see why... it sucks, in both functionality and appearance."

I have some sites that contain song lyrics. It's not uncommon to see 35k people a month entering a particular song's page via Google, and then 32k people exiting the site from that same page. That page was listed #1 in Google; they found what they needed and left. There's no way for Google to determine how useful the page really is, short of human editing.

As a side note, I found it funny how a search engine submission company whose newsletter I still subscribe to (the "we'll get you listed on powerhouse SEs" sort) pointed out this thread in their last mailout. Their article starts by mentioning that the 30-rank drop is possibly the result of black-hat SEO techniques, and then immediately asks people to consider an "SEO solution that won't get you penalized" that they offer. Gotta love spin.

1script
msg:3135883 - 6:36 pm on Oct 26, 2006 (gmt 0)

Just to follow up on the issue:

I think someone here suggested that this may be a sign of greater troubles to come, and it does look that way now, two weeks later. The site has continued sliding down, way past -30, for anything but its own domain name. Basically, I can no longer find it for any of the keywords I used to track.

Additionally, based on this thread, I decided to remove site-map pages that looked like keyword-stuffed anchor text. The pages had healthy PR, so I 301-ed them to the homepage. Now I wonder if that was a smart thing to do...

walkman
msg:3136159 - 10:41 pm on Oct 26, 2006 (gmt 0)

After seeing many sites with dupes still ranking high, I have to believe that high TrustRank will make up for thin-affiliate pages and other "guidelines" violations.

mairusx
msg:3137665 - 12:33 am on Oct 28, 2006 (gmt 0)

Hi. My site is affected by this disease too, so I will tell its story:

1. Six months ago I created a site that is a job search engine for the major sites in my country. I spoke with all of those sites' admins/owners, and almost all agreed that I could use their "jobs". Most of the jobs come in XML format.

For SEO purposes I didn't target the top keywords, because the competition there is stronger (5,620,000 results); instead I targeted the content of my site, and the results were good.

After 2 months I saw that for the top keywords the rankings came not from the content pages but from the first page, which faces the stronger competition. I was at about page 15 for those keywords. After another 2 months those keywords went to positions 50-60 (pages 6-7). Then, starting in October I think (maybe September, but I'm not sure), I saw that I was at position #31. On October 25 and 26 I was at positions 8-9 on the first page (strange, because it changed from hour to hour). Now I'm at position #31 for all my keywords. For one set of keywords I started with (no competition for me) I was #1; now I'm #31 :D

So if the "Minus Thirty" penalty is real, does that mean I had reached the first position after only 6 months? I have stats that confirm this (not from my own site, but from public traffic sites).

[edited by: tedster at 12:43 am (utc) on Oct. 28, 2006]

TravelMan
msg:3139570 - 10:48 am on Oct 30, 2006 (gmt 0)

Working on the assumption that the work of the EVAL* team may be contributing to the demise of some sites...

Disclaimer: I'm not suggesting that it's the reason for everybody, just that it may be for some.

What I'd like to know is this: if you have been identified as some kind of thin affiliate, either recently or in the past, IS there a way to remove the penalty and rank on your merits, and if so, by what route?

The point being that a site could have been a borderline thin-affiliate case a year or so ago, and has subsequently added all sorts of user value, yet is still locked in the doldrums of page-3 hell.

Would it not be fair to give sites that fell into this category a fresh chance to shine? Maybe Matt Cutts or Adam Lasnik could be kind enough to offer up a suggestion? Perhaps the webmaster console could be used to help people out here.

* http://www.searchbistro.com/index.php?/archives/19-Google-Secret-Lab,-Prelude.html

joel2280
msg:3139585 - 11:24 am on Oct 30, 2006 (gmt 0)

This is interesting on Google's secret labs:
[searchbistro.com...]

guessme
msg:3139628 - 12:06 pm on Oct 30, 2006 (gmt 0)

Hi guys, I am 100%, or say 110%, sure that this is a manual review penalty...

I am working for a company whose website was hit by this penalty yesterday. We monitor around 1000+ keywords in a particular section. Every single one of them went to page 4 or below.

Please note that the WHOLE website is not hit by this penalty, only the folders and files under one sub-folder.

Let's say the website is www.xyz.com and there is a sub-section www.xyz.com/sub-section/ containing many sub-folders and files. This particular folder, /sub-section/, is facing the -30 penalty.

If we search for www.xyz.com in Google we are at the #1 position, but if we search for www.xyz.com/sub-section/ we are at exactly #31, which was not the case before.

The website www.xyz.com still ranks where it did for most of its keywords. But all the pages and keywords under /sub-section/ have gone to page 4 (position 31+) or below.

The good thing is that the site is not banned entirely; the bad thing is that 80% of the site's revenue was coming from this SUB-SECTION alone... which hurts.

OK, now to argue that it was a manual review: the main website has a lot of anchor-text repetition and extra-long titles, descriptions, and keywords, but it was not penalized, BECAUSE it is a very old and trusted site and delivers exactly what its pages promise.

Whereas the SUB-SECTION that was hit is totally white-hat and full of original content, with each page having at least 500 words of original content. No anchor-text spamming, no anchor-text repetition, no big titles and all that.

But the content is not exactly what a user may want to see. For example, a page named xyz-pictures has no pictures in it; instead it has content that talks about xyz-pictures, etc. There are hundreds of such examples where an algorithm cannot determine whether the page is really useful to a visitor or not, because the page has really good content optimized for that keyword, but it's not doing exactly what a user wants.

To make it more specific: let's say a page is about halloween gifts, but it's content only, not offering any gifts to buy or a guide on where to buy them. This cannot be caught algorithmically in the SERPs; only a manual, HUMAN review can judge that the site is not appropriate for that keyword. Don't bother about backlinks, as this site has loads of them, from Wikipedia, DMOZ, Yahoo, etc.

Moreover, if this were an automated ban, why would we need to send a manual re-inclusion request? Why couldn't the system automatically detect the changes made and lift the ban?

I have laid out a lot of facts and even bashed my own company's site, but this is the truth: it seems to be a manual ban. So sites with the -30 penalty are more likely to be those that do not offer exactly what a user might want to see for a given keyword...

Replies... suggestions... and even more bashing are awaited :)

AustrianOak
msg:3139768 - 2:38 pm on Oct 30, 2006 (gmt 0)

guessme, excellent post.

It's an extremely scary thought (just in time for Halloween) that the "-30 penalty" is 110% manual. What hope is there?

MATT, ANDY or GOOGLEGUY! Can you answer just this one question for us: is there hope of recovery from this -30 penalty, AND what steps need to be taken? (Page changes, obviously, and re-inclusion requests; is there a penalty time-out period? etc.)

CALLING MATT, ANDY & GOOGLEGUY!

MrStitch
msg:3139831 - 3:24 pm on Oct 30, 2006 (gmt 0)

"But the content is not exactly what a user may want to see. For example, a page named xyz-pictures has no pictures in it; instead it has content that talks about xyz-pictures, etc. There are hundreds of such examples where an algorithm cannot determine whether the page is really useful to a visitor or not, because the page has really good content optimized for that keyword, but it's not doing exactly what a user wants.

To make it more specific: let's say a page is about halloween gifts, but it's content only, not offering any gifts to buy or a guide on where to buy them. This cannot be caught algorithmically in the SERPs; only a manual, HUMAN review can judge that the site is not appropriate for that keyword. Don't bother about backlinks, as this site has loads of them, from Wikipedia, DMOZ, Yahoo, etc."

Based on what you're saying, I would have to AGREE with Google on the adjustment. I'm not sure what your site sells, but someone is beating me for my search term because of tactics just like yours. Their product is something that can be "applied" to my product. When a user searches for my term, I can personally guarantee that they are NOT looking for this "other thing". So I give props to Google on this one.

The person in question in my sector has only dropped to #2, but the site that has taken their place is a well-known company that offers a great product. I have absolutely no beef with that, and to them I say... congratulations.

jwc2349
msg:3139838 - 3:29 pm on Oct 30, 2006 (gmt 0)

I agree wholeheartedly. It is time Google gave us some guidance. How about it, guys?

I have worked my tail off for 10 1/2 months and am still at #31 for my targeted searches, after being #1 for eleven-plus months of 2005. I keep tweaking and testing, but it is like running down a mountain in the fog at 100 mph without any headlights on.

Come on, shed a little light, Google. We'll think a lot more of you!

MrStitch
msg:3139950 - 4:28 pm on Oct 30, 2006 (gmt 0)

Perhaps this is a content-versus-natural-search function. Possibly, in some key sectors, Google could be looking at what a person is searching for based on the keyword.

If someone is looking for Playstations, G could now be assuming that they are looking to BUY them, not read a bunch of useless information about them.

Then they are probably adding more weight to the natural search terms, i.e. "playstation vs. xbox", "playstation graphics", or "parts inside playstation".

Which, believe it or not, I've seen here recently. I've dropped a little in rank for my primary search term but have gotten replacement traffic for the natural searches, since I also offer content on the product in question.

If you're looking at the -30 penalty, check whether you offer strictly content for those terms, or also a product that people are buying. Can you get any results from natural-search variations instead of the core keyword?

Looking at everything that's happened and what everyone is saying, I'm guessing we'll see and hear more about this in the future, so don't give up yet.

guessme
msg:3140109 - 6:50 pm on Oct 30, 2006 (gmt 0)

I know where my company's site falls short, and if I were one of the Google people handing out -30s for keywords in my sector, I would have given the penalty to MY COMPANY'S SITE, just like some guy at Google did... I am trying to be realistic and think like a general user.

But where my company's site falls short CANNOT be determined by any SERP or ALGO; only humans can judge that...

Like I said, they may find the keyword on a page with proper content, but they will not find exactly what they are looking for, like real products to buy or images to download, etc.

So, all in all, this is a MANUAL BAN... and a manual ban is unlikely to have any expiry date, unless the person who banned the site monitors those sites for some time to give them another chance...

The best you can do is correct all the mistakes, improve where the site falls short, and then file a re-inclusion request. Sending a re-inclusion request without finding and correcting the cause will be of no use.

We will be improving and filing a re-inclusion request :)

But I'll say it once more: this is a manual process (ban)... so next time you optimize your site or write content, make it for people, not for SERPs, just as Google keeps saying...

Regards!
