
Google SEO News and Discussion Forum

The "Minus Thirty" Penalty?
#1 yesterday and #31 today
1script




msg:3119217
 2:36 am on Oct 13, 2006 (gmt 0)

Hello everyone,

My site just dropped to #31 for a search on its own domain name, and a bunch of keywords/phrases I usually watch were bumped from #1 to precisely #31. The ones that were #2 through #10 are sort of all over the map, but generally within the first 60 results.

Does anyone have experience with this? What does the respected audience here think the most likely reason for such a penalty is? What do you suggest as the best strategy to fix it?

There has not been any major redesign recently, just routine adding of pages here and there: some unique, some syndicated industry-related content.

Thanks for any idea or comment!

D~

 

nippi




msg:3123420
 10:44 pm on Oct 16, 2006 (gmt 0)

I am equally 100% sure this is NOT a manual penalty. Why manually penalise a site 30 places? If it's worth a manual penalty, it's worth removal. I've yet to see a single site that's been hit by the minus 30 penalty that does not have some or all of these problems:

duplicate content caused by a bad content management system
thin affiliates
big link campaigns
errors in HTML
keyword anchor spam

Usually, you need at least 3 of these going on big time. I HAVE heard of people recovering, and it's by fixing these issues.

If your site is one big affiliate site and has a massive duplicate-anchor site map, e.g.:

red widgets
blue widgets
green widgets
pink widgets
purple widgets
greeny blue widgets
bluey green widgets
widgety widgida widgets

...then you've tripped a filter.

[edited by: tedster at 1:16 am (utc) on Oct. 17, 2006]

1script




msg:3123505
 12:27 am on Oct 17, 2006 (gmt 0)

I've just made a discovery that you guys in the "minus 30" club will like:

You actually have to have a kick-a.. site to get hit with just 30. If your site is of lesser Google-perceived value (younger, lower PR, a lower ratio of unique to syndicated content), you will get hit with "minus 54". I just had three more of my sites bite the dust with "minus 54" laid upon them. It's actually even harsher than that: "minus 54" is where you land when you search for your own domain name; none of the actual keywords are to be found anywhere in the first 100.

mtlnx




msg:3123668
 5:06 am on Oct 17, 2006 (gmt 0)

My site is 8 years old.
Basically, I implemented mod_rewrite, which then meant googlebot crawled thousands of pages per day (e.g., 5,000), mostly duplicate content.
About ten days later I noticed a drop of about 30 spots (~30 to ~60) in the ranking for a one-word search term. Other search terms weren't really affected.
I then excluded those pages via robots.txt.

Changing URLs will drop your site in the rankings. It seems to me that it would be better to 301-redirect the old URLs to the new ones, not block the old URLs with robots.txt.
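
For reference, that 301 approach can be handled by mod_rewrite itself. Here is a minimal .htaccess sketch, assuming (purely for illustration, not anyone's actual setup) that the old dynamic URLs looked like /page.php?id=123 and the new rewritten ones look like /page/123:

# 301 the old dynamic URL to its new rewritten form, so the duplicate
# URLs are consolidated rather than merely blocked from crawling
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^page\.php$ /page/%1? [R=301,L]

A robots.txt Disallow on the old paths only stops them being crawled; the 301 tells Google that the old and new URLs are the same page.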

HocusPocus




msg:3123773
 7:01 am on Oct 17, 2006 (gmt 0)

For evidence towards a -30 penalty -

Try an "almost unique" search from your site on Google that returns Less than 30 results. For any such term your site should be always BOTTOM or thereabouts for this Serp?

kazisdaman2




msg:3123921
 10:28 am on Oct 17, 2006 (gmt 0)

For evidence towards a -30 penalty -

Try an "almost unique" search from your site on Google that returns Less than 30 results. For any such term your site should be always BOTTOM or thereabouts for this Serp?

--------------

That's a good idea for you guys. What are the results?

This thread is also very interesting to me. I had no IDEA about this kind of junk, and I actually did some promoting with anchor text like this over the weekend, from maybe 10 sites to mine. I hope that's not excessive.

I had no clue, although I can't say I felt completely legitimate doing it.

My site is #2 on Google, with only 2 competing sites. So this will be interesting; if $#@$ hits the fan, at least I've learned something.

UK_Web_Guy




msg:3123945
 10:59 am on Oct 17, 2006 (gmt 0)

This type of "filter" has been going on for years now; this is nothing new.

I've seen sites come in and go out while the owners never change anything on them.

I actually think it's something along the lines of Google having placed a question mark over the site, and it needs to build up some trust to be allowed back up.

[edited by: UK_Web_Guy at 11:01 am (utc) on Oct. 17, 2006]

monster88




msg:3124282
 4:34 pm on Oct 17, 2006 (gmt 0)

I have a similar situation on my site. It's a PR 8 site that used to rank for just about anything I wanted. Then I added several hundred thin affiliate pages and all my rankings dropped from page 1 to page 4. I still rank #1 for my site name though. I also still rank high for specific searches in "quotes". I'm not sure what caused the penalty. I think I got a manual penalty but I'm hoping it's an automated filter of some sort. From other people's experiences on this thread, it doesn't look good for a revival.

nuevojefe




msg:3124538
 7:28 pm on Oct 17, 2006 (gmt 0)

For phrases where you've been affected, are you noticing many URL-only listings which previously weren't URL-only? In one search where we see a drop of about 30 spots, there are lots of URL-only results for other sites that have never been URL-only before (and shouldn't be now).

cabbagehead




msg:3124702
 9:25 pm on Oct 17, 2006 (gmt 0)

> "For phrases where you've been affected "

.. this is exactly what I wanted to ask! Let's please clarify the effects for a moment:

Scenarios:

1. Are you >30 for every single phrase or just some phrases?

2. If your domain is xyz.com and you search for "xyz" ... are you #31 for that phrase as well?

3. Is it only for competitive phrases?

4. Or is it only for phrases you've specifically targeted and perhaps over-optimized for?

5. Is this the same thing as "the sandbox" everyone loves so much?

sim64




msg:3124738
 9:43 pm on Oct 17, 2006 (gmt 0)

I know the question wasn't specifically aimed at me, but here are my answers.

1. Of the 10 phrases I have just searched for where I would expect to be on the first page, I was number 31 all 10 times.

2. If my site were xyz.com, I would be number 31 for "xyz".

3. It is any phrase, even obscure ones; if there are fewer than 30 results returned, I am last.

4. No, all phrases

5. No, nothing to do with the sandbox. Mine is a 5-year-old site that used to be PR6 (now 5). It is still crawled regularly, and PR hasn't dropped in over 12 months.

nippi




msg:3124773
 10:19 pm on Oct 17, 2006 (gmt 0)

UK_Web_Guy

It's difficult to quantify "did nothing", as you cannot actually "do nothing".

50 sites may have added links to you, which tipped you back over the not trusted/trusted limit.

Then, in the "did little" category:

A slight content change may have fixed things.
A slight update may have removed a hidden link orphaned in code you were not aware of.

Yes, it's been going on for years. I had a site tank for 3 months (minus 30) 2 years ago; the issues then were:

a. added 300 reciprocal links.
b. home page sitemap with 400 links in it, all with one word the same.
c. hiddenish text.

Fixed them all, added 100 more links, removed the ones with no link back, and the site recovered.

cabbagehead




msg:3124799
 10:45 pm on Oct 17, 2006 (gmt 0)

> It's difficult to quantify "did nothing", as you cannot actually "do nothing"

Well... I would have to agree with the person above who said that suddenly something happened, and they've tweaked and analyzed endlessly and nothing changed. That's clearly a penalty, not an algorithmic shuffle. Now, the argument, I guess, is whether it was manually imposed or not. I believe it is Google's *INTENT* to automate everything, but I can speak from personal experience on my own sites and tell you that when my algos are not ready or not yet up to snuff, I often fill the gap with manual edits... otherwise my community will falter. So I do not think manual edits are out of the realm of possibility, regardless of what the Google press releases may say.

...this is very unfortunate. I really do wish Google would provide better webmaster relations on such things. Such a heavy-handed approach needs to be used very delicately, or it can ruin what people have worked very hard to create. Same complaint as always about Google in that regard.

jwc2349




msg:3124813
 11:14 pm on Oct 17, 2006 (gmt 0)

"...I really do wish Google would provide better webmaster relations on such things. Such a heavy handed approach needs to be used very delicately or it can ruin what people have worked very hard to create..."

That is exactly the point. Many of us webmasters have worked for years to build the "trust factor" with Google, only to be nailed with a penalty, and we have no idea what triggered it. Yet Google keeps saying that they are trying to open up lines of communication with legitimate webmasters, notifying them of the reasons for penalties. Well, I have been waiting 10 months to hear from that higher authority, and not one tidbit. I certainly have tried to communicate with them, but they have not reciprocated.

It seems to me that Google could easily identify legitimate websites that have incurred a penalty and then communicate with them. For example, a website that 1) is at least 5 years old, 2) was never before assessed a penalty, 3) has a true PR5 or higher, and 4) has a presently-assessed penalty would be an ideal candidate for communication from Google. That would screen out all the whiners and provide a tremendous service to the webmaster.

If the webmaster then did not remove the reasons for the penalty within a set time, they shouldn't just be penalized; they should be banned.

Seems easy and fair to me. I certainly volunteer to be the guinea pig.

photopassjapan




msg:3124853
 11:58 pm on Oct 17, 2006 (gmt 0)

Hahaa... here's automation for you :D

[images.google.com ]

Even as one of the people lobbying for hybrid results...
I honestly... not making it up... I wholeheartedly Laughed Out Loud X)

I missed this on Sept 1st when it was news (original thread: [webmasterworld.com ])... for I had barely joined and was preoccupied with our site's problems... but now it's on Webmaster Tools, asking whether we'd like to participate with the images of our site, allowing others to decide... well, basically, the fate of the pictures. (Okay, I'm exaggerating; it's only a beta and all. But you see, pictures are pretty much THE things we have on our website, so it makes me wonder.)

I mean, I for one have always thought there should be a fine-tuning mechanism involving a board of people...

- Who don't know each other
- Who have no commercial interest in corrupting the data
- Who see the pages popped up at random for them
- Who rank pages for design, ethics, and usability
- And who are reviewed by each other, at random, in at least three rounds...

I've been wondering what it would cost to hire such personnel, and how employment could be dealt with to achieve full anonymity... and I never thought how EASY the real answer was.

It was right before my eyes; how come I never thought of this?

There're tens of thousands of BORED people surfing Google every other second! Let THEM decide! :)

( call them googlaholics )
( btw, read the Sept 1st thread here on WW, it's funny )

I just really hope now that they make the same system for websites in general. Perhaps there already is such a system... perhaps Google has always been building its database manually ;)

Hehee... this is fun. Sorry, this is news to me; I just noticed it, and I don't know how I could've missed it so far...

It works 100% the way I envisioned the system for reviewing spammy sites, only it's in reverse drive! And I never would have thought it would be us, the visitors, doing the job... I mean, wow. I can't tell a good business model even when it has "good business model" stamped all over it.

There should be a back-end to Google. Half of the world would be typing in search queries on one side, while the rest would be typing in the meta keywords on the other!

( though the Sept thread mentions this as well... oh okay, I'll shut my mouth now. But really... )

This is... the best idea... and at the same time...
...the best joke I've seen in years.
[wipes tears from eyes]
Sniff.

I've seen such stuff enough times on photo and art sites... always loved to play...

Sniff.

...I just wonder how long I have to click PASS until I see our pics come up.

kidder




msg:3124879
 12:11 am on Oct 18, 2006 (gmt 0)

Yeah, and webmasters have to push harder because they get beaten out by cheats and scraper sites. When they push the limits, bad things happen. It's only natural to push back.

lfgoal




msg:3125080
 3:43 am on Oct 18, 2006 (gmt 0)

Whenever I'm hungry for paranoia, I'll always drop by here to see what's on the "daily specials" menu.

No offense to the originator of this thread or any others suffering from the 31 dilemma. I'm just curious about this whole internal-link-anchor-text-paranoia-thingy.

Let's say you have a site with 500 pages of content and let's say the site is categorically about red cars. Let's further say that there's a page on the site that is about "cleaning tips for red cars". Lastly, let's say that every page on the site has a link to this one page and that the link says "cleaning tips for red cars".

Are there webmasters out there so paranoid as to believe that there's a possible penalty for this? (That was rhetorical; I know there are.)

Let's apply logic here. In this example, the site is about red cars. If you have a page on such a site that is about...the cleaning of red cars, why WOULDN'T you place a link TO this one page on every page of your site? If it's a well written page that offers lots of tips on red car cleaning and you think your visitors would like to read the page, what would be the prob?

And, in this example, why SHOULDN'T you use the anchor text "cleaning tips for red cars" in every link if, in fact, the page is ACTUALLY ABOUT cleaning tips for red cars?

Should a person worry about offending googlebot because it may get bored reading the same anchor text over and over and, thus, lash out at the site with a boredom-inspired penalty?

Perhaps one should appease the bot by "varying" the anchor text in the internal links to this page? I mean, it won't change what the page is about, but perhaps googlebot would like to see "red car cleaning tips" or "tips for cleaning red cars" or "cleaning cars the red tips way" just to break up the monotony of endless crawls.

Here's what I say: if I have a page about cleaning tips for red cars and I think the page offers good content to my site's visitors, I'm going to include a link to it from every page of my red car website, and I'm going to do it so they can EASILY FIND the page.

And I'm going to do more than that. I'm also going to make every link say "cleaning tips for red cars" because...that's what the page is about.

And lastly, I'm not going to vary my stinking anchor text because...why should I? Who am I trying to fake out?

kidder




msg:3125128
 4:48 am on Oct 18, 2006 (gmt 0)

Bravo to that

cabbagehead




msg:3125141
 5:11 am on Oct 18, 2006 (gmt 0)

> And lastly, I'm not going to vary my stinking anchor text because...why should I? Who am I trying to fake out?

... spoken like someone who hasn't (yet?) been burned. ;-)

europeforvisitors




msg:3125183
 6:10 am on Oct 18, 2006 (gmt 0)

And I'm going to do more than that. I'm also going to make every link say "cleaning tips for red cars" because...that's what the page is about.

Makes perfect sense. If I have a major section of my site devoted to widgets, I'll have a link with the anchor text "widgets." What the heck else am I supposed to call that section?

Links from external sources are a different matter. If I talk 100 Webmasters into linking to my Acme-widgets.com site with the word "widgets," that's going to look unnatural. To a human being, at least, it would make more sense for most of those links to read "Acme Widgets" or "Acme-widgets.com."

It's also worth remembering that factors such as anchor text aren't likely to be considered in isolation. (I think someone else may have pointed this out earlier in the thread.) Profiling doesn't say "fat guy is likely to be a drunk." It says "fat guy in a Bud Light t-shirt with an open beer bottle in his hand, a Jimmy Buffett CD in his car stereo, and alcohol on his breath is likely to be a drunk."

M_Bison




msg:3125247
 7:39 am on Oct 18, 2006 (gmt 0)

Changing URLs will drop your site in the rankings. It seems to me that it would be better to 301-redirect the old URLs to the new ones, not block the old URLs with robots.txt.

I did 301 the old pages.

I only used robots.txt when I found out I had been hit with this "minus 30" penalty.

UK_Web_Guy




msg:3125325
 9:47 am on Oct 18, 2006 (gmt 0)

Has anyone who has been hit with this contacted Google via the webmaster console?

To the guy with the PR8 site who added loads of thin affiliate pages: you obviously know why your site has been hit. Have you removed those pages and contacted Google?

norton radstock




msg:3125352
 10:29 am on Oct 18, 2006 (gmt 0)

Does anyone with a definite minus 30 penalty not have AdSense on their site?

With two sites hit, and having corresponded with others in a similar position, I am convinced it is a thin affiliate issue. As people have suggested, Google cannot hand-review every site, but they could feasibly start with AdSense publishers...

My response is to steadily add more content and reduce the number of AdSense/affiliate links. It would be great if GoogleGuy could let people know the re-evaluation process for when sites are ready.

1script




msg:3125444
 12:01 pm on Oct 18, 2006 (gmt 0)

Add AdSense into the mix? Their mantra so far has been that AdSense and search are completely separate. Well, it turned out that they share indexed data, but that's about as far as Google wanted to go in admitting any links between the two systems.

To me it would only make sense if they used AdSense to profile sites, because with the JavaScript they have SO MUCH data about the traffic for a site! For example, they can see whether the visitor stays on the page for longer than three seconds on average. If not, then maybe it isn't a good read, so let's whack it with a penalty. I can think of many more creative uses for the data they gather, via the JS code we put on our sites, to profile those sites.

However, I would not hold my breath waiting for Google to leak more data about the AdSense / Google search correlation. Scare enough webmasters into removing the AdSense code from their sites and watch your revenue drop? I don't think so.

SEOPTI




msg:3125493
 12:43 pm on Oct 18, 2006 (gmt 0)

Their search monopoly is a big pain for all of us. I hope Yahoo and MSN will kick their a** one day.

[edited by: SEOPTI at 12:43 pm (utc) on Oct. 18, 2006]

bkleinhe




msg:3125507
 12:54 pm on Oct 18, 2006 (gmt 0)

Per the advice from this forum, I removed an unnecessary table from the footer of my website which contained over 50 instances of my primary keyword. This was done 3 days ago. It was redundant, but I had thought it would help the search engines catalogue the content of each page better.

While I am still at #12 for my primary keyword on Google... for the first time ever, I jumped 109 spots on MSN to #5 for a very competitive keyword (38,000,000 results in the travel sector), and I jumped 15 spots on Yahoo to #40. Both are the highest spots in 18 months.

Google takes a lot longer to react.

We shall see; good things so far!

indigojo




msg:3125522
 1:08 pm on Oct 18, 2006 (gmt 0)

We also have this penalty. PR8 for years and still PR8, and we ranked well across the board until last November. Now, and for almost a year, you can take our unique copyright statement and search for it to find dozens of scrapers and PDF copies of our site's articles, whilst we languish on page 4. This is a manual edit, I have no doubt. We have spent countless hours fixing every issue that could possibly trigger this, to no avail. We even emailed MC and AL directly, as well as Google; zip on that count! The only positive is that we have found new traffic sources and concentrated on growing our traffic by other methods we probably would never have bothered with otherwise. Yahoo and MSN still rank us appropriately. See [webmasterworld.com...]

nippi




msg:3125575
 1:42 pm on Oct 18, 2006 (gmt 0)

lfgoal

Logic says that if Google says to only go for about 100 links per page, and you place 500 links on one page with one word common to all, it's going to raise a flag.

If your site is about red cars, you would not put 'red cars' in every anchor for site visitors. That much repetition can only possibly be for search engines; your visitors know your site is about red cars, no need to tell them in every anchor.

Yes, I believe Google is sophisticated enough to see what it's for, and it's for search engines, not people.

This is not an idle opinion. I've now done an analysis of over 30 separate, unrelated search terms, checked out the top 50 sites for each, and am seeing minuscule numbers of sites with large anchor repetition in the top 10; the ones that are there are big authority sites.

I am not bothering to question the logic of Google overly; I am testing whether there is a realistic cause and effect, and I am seeing it.

If someone would like to sticky me a top 10 site for a competitive phrase that has 50+ repetitions of the main keyword in anchors on the home page, I'd be grateful, and a little surprised.
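
A minimal sketch of how that kind of anchor-repetition count can be scripted (Python, standard library only; the URL and the 50-repetition threshold are illustrative placeholders from the discussion above, not anyone's actual tooling):

# Count how often each anchor text repeats on a page.
from collections import Counter
from html.parser import HTMLParser
from urllib.request import urlopen

class AnchorCollector(HTMLParser):
    """Collects the visible text of every <a> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.texts = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
            self.texts.append("")
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
    def handle_data(self, data):
        if self.in_a:
            self.texts[-1] += data.strip()

# Placeholder URL -- point this at the home page you want to check.
html = urlopen("http://www.example.com/").read().decode("utf-8", "replace")
collector = AnchorCollector()
collector.feed(html)
counts = Counter(t.lower() for t in collector.texts if t)

# Print the ten most repeated anchor texts, flagging heavy repetition.
for text, n in counts.most_common(10):
    flag = "  <-- 50+ repetitions" if n >= 50 else ""
    print(f"{n:4d}  {text}{flag}")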

norton radstock

The first two sites of mine affected did not have AdSense or affiliates.

sailorjwd




msg:3125588
 1:55 pm on Oct 18, 2006 (gmt 0)

I was penalized 18 months ago for a dup content issue.

It took exactly 6 months for the penalty to be removed.

During that time, an exact search for my unique company name listed my site way down (may have been 50+, I didn't count). Most if not all sites that referred to my company appeared before mine.

Besides fixing the dup content issue, I also greatly reduced my internal linking. Since then I've only brought the internal linking back to 50% of what it used to be... search results have never been better.

lfgoal




msg:3126343
 11:31 pm on Oct 18, 2006 (gmt 0)

"If your site is about red cars, you would not put “red cars” in every anchor for site visitors. That much repetition, can only possibly be for search engines, your visitors know your site is about red cars, no need to tell them in evey anchor."

I would put "red cars" in every single link that led to a page titled "cleaning tips for red cars" because that's what the page is about. And if I wanted every visitor to my site to have an optimal chance of finding the vaunted "cleaning tips for red cars" page (because it's just so doggone good), I would place the link on every page of my site and I wouldn't worry about it. That would be building the site for visitors and not for search engines.

Whitey




msg:3126449
 2:18 am on Oct 19, 2006 (gmt 0)

Not sure if this means anything to anyone, but I'm observing on one of our sites:

"uncontested phrase in brackets" - Result No 1/2 - page 1
without brackets - page 3

"heavily competed phrase in brackets" - page 3/4 [ approx 30 below ]
without brackets - page 3/4 same as above! [ that's strange ]

I'm not sure if we're classified as an old [ 3.5 yrs ] site being reintroduced to the SERPs after a few fixes, going through some filter checks,

or

if we're continuing to trigger a filter that puts us into the -30 bracket for only some common phrases,

or

a combination of both.

....................................................................

Somehow, I think this is a penalty-by-degree system which ascribes a score to a range of filters for promotion or demotion. What each filter check is worth [ other than the obvious ones ], only Google will ever know, but I think we're getting closer to a stronger gut feel for which ones are the most severe and the least severe [ accumulated, they could be a problem ]. So some degree can be estimated in conceptual terms.

It would be good to have a dot-point list of things that might trigger this, and a dot-point list of *factual* things that oppose it [ ie Trust - IBLs in ; age ; brand ; fixing the problems! ; etc ].

Then from this, it may be possible to arrange those things into *severity* levels.

Then from this, we might be able to see which sites are existing, recovering, and new.

This is sort of a parallel thread with Filters exist - the Sandbox doesn't. How to build Trust. [webmasterworld.com], where we've cobbled together some thoughts.

I think this would get us closer to understanding things.

...................................................................

However, I do appeal to our friends Matt, Adam, Vanessa and GoogleGuy to shed some light on this. Clearly, webmasters need to understand in broad terms what these triggers are so they can better quality-control their sites, and they have *TRUST*, at least from those who subscribe and identify themselves at WMC/Sitemaps.

We really need some systematic guidance beyond the existing guidelines on the application of filters, broken down and reportable, either here or, eventually and ideally, through Webmaster Central.

In the interim, something spoken or written would help. We don't need secret soup.

[edited by: Whitey at 2:21 am (utc) on Oct. 19, 2006]

daveblake




msg:3126817
 11:31 am on Oct 19, 2006 (gmt 0)

My PR5 site was demoted by Google on April 26 to the 31 spot across hundreds of key phrases, and for the unique domain itself (xyz of xyz.com). All were formerly at the No 1 or No 2 spot. I have documented the terms, and I also keep a weekly spreadsheet of Google site rankings and site counts.

We are >30 for every single phrase, not just some phrases.

We have a Google webmaster account and regularly submit a new sitemap. We have contacted Google via this account, plus about a dozen other ways. All Google ever did was send an automated response telling me to read the webmaster guidelines.

We have spent hundreds of man-hours on this 4-year-old site, which has several thousand pages of unique content, working step by step to eliminate any potential problems. At several points we have updated Google that changes have been made and requested that they re-evaluate (whether manually or automatically, who knows or cares?). These corrections were mostly minor: a little inter-site linking, some 301 redirects that needed to be put in place, etc. Some things we just can't help, like several spam sites that have duplicated bits of our content. There is no hope of getting those sites to remove it.

The rub is that we have invested years of effort to build up a useful site that brought in some good business. It gets dumped by G (and obviously you don't get told about this), and it matters not that we were a Tier 1 AdWords customer. We rigorously tried to detect the reason for the penalty, and having failed, we wasted more money contracting a top SEO firm to see whether they could pick up on something we missed. Still no luck.

Where is the communication from Google? It is the complete lack of any support or help that irks me. We have given up any hope of getting this site back up, and from all I have read on forums etc. about PENALTY 31, none of the victims have recovered; even those who have completely rebuilt their sites have not been restored. Which makes me wonder whether it would take manual intervention to remove a PENALTY 31.
