Google SEO News and Discussion Forum

Penguin Recovery Tips - a think tank thread
bostonyear




msg:4451493
 7:35 pm on May 9, 2012 (gmt 0)

Since the main Penguin Update thread has 700 posts and counting, I'm hoping to start a new thread solely focused on Penguin recovery tips. I have a site that was hit by Penguin and I am trying to work my way out of it.

I think the reason I was penalized was my content. I was inadvertently keyword stuffing. This is just the way I have been writing content for years. I have updated the content on my main pages, where I have fixed the blatant keyword stuffing. My density levels are much more in line. My main question is:

I have over 80 blog posts that have some instances of keyword stuffing. Do I need to go back and fix all of these pages? Some of the posts are over 3 years old. I also have some really old pages that are buried in my site that may have poor content. Should fixing these old pages be a priority?
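
For anyone who would rather triage a backlog like that by script than by hand, here is a minimal sketch. It assumes the posts sit on disk as HTML files under a blog/ folder and that you already know which phrases you tend to over-use; the phrases, path and 3% threshold are illustrative assumptions, not recommendations.

# Rough keyword-density triage for a pile of old posts: flag the pages
# worth fixing first. Paths, phrases and threshold are placeholders.
import re
from html.parser import HTMLParser
from pathlib import Path


class TextExtractor(HTMLParser):
    """Collects visible text, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)


def density(text, phrase):
    """Percent of the page's words accounted for by `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * hits * len(phrase.split()) / len(words)


phrases = ["blue widgets", "cheap blue widgets"]    # hypothetical target phrases
for path in sorted(Path("blog").rglob("*.html")):
    parser = TextExtractor()
    parser.feed(path.read_text(encoding="utf-8", errors="ignore"))
    text = " ".join(parser.chunks)
    score, phrase = max((density(text, p), p) for p in phrases)
    if score > 3.0:                                 # arbitrary triage threshold
        print(f"{path}: '{phrase}' at {score:.1f}%")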

 

claaarky




msg:4454777
 10:25 pm on May 17, 2012 (gmt 0)

I don't think the -domain.com thing has anything specifically to do with Penguin.

To me this is more of a general diagnostic tool. It seems to indicate how much impact other sites have on your rankings, and if that impact is significant, I would say it's a strong indication that Google sees your page as very similar, but weaker in some way.

Then you can start figuring out why. It could just be a duplicate content or domain authority issue. Solve that and you could flip positions with the site holding you down.

It would be interesting to know if any sites seeing higher rankings when using -amazon.com or -ebay.com are affiliates of Amazon or eBay.

smallcompany




msg:4454855
 3:57 am on May 18, 2012 (gmt 0)

Just tested it and yes,

-amazon.com
-example.com

brought it back to where it was prior to April 24.
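
A small convenience sketch for running that test systematically: it only prints the baseline query plus the negative-match variants discussed in this thread, so the result pages can be opened and compared by hand. The keyword and the excluded domains are placeholders, not a suggestion of which domains matter.

# Print query variants for manual comparison; keyword and exclusions are examples.
from urllib.parse import quote_plus

keyword = "blue widgets"                  # hypothetical query that tanked on 4/24
exclusions = ["amazon.com", "example.com", ".co.uk", "waffle"]

print("baseline     https://www.google.com/search?q=" + quote_plus(keyword))
for ex in exclusions:
    print(f"-{ex:<11}  https://www.google.com/search?q=" + quote_plus(f"{keyword} -{ex}"))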

RedCardinal




msg:4454870
 4:43 am on May 18, 2012 (gmt 0)

Try [keyword -.co.uk] or any other TLD.

Maybe this isn't about domains affecting your rank at all, but about a hole they missed when they were building in the Penguin filter.

I'll bet this will be patched fairly quickly. There may also be light at the end of the tunnel :)

diberry




msg:4454871
 4:54 am on May 18, 2012 (gmt 0)

This is so odd. I just did a "domain name -wikipedia" search for my Penguinized domain, and it brought me back from #4 to #1 (even though Wikipedia was not the only site ranking above me). It also brought back my most popular - and most Penguinized - page to #2. No clue what it means.

johnmoose




msg:4454906
 8:32 am on May 18, 2012 (gmt 0)

The minus sign before a keyword excludes any result with that term. So you will see serps without that, and consequently your site will seem 'to rise' in the serps.
See also: [support.google.com]

John

Play_Bach




msg:4454930
 9:39 am on May 18, 2012 (gmt 0)

Right John, except -waffle does not exist anywhere on the site I'm testing, yet -waffle reorders the results. Just like back in the Florida days, there's something going on here: adding -waffle reveals pre-Panda/Penguin results.

RedCardinal




msg:4454940
 9:53 am on May 18, 2012 (gmt 0)

I'd say the Penguin filter simply isn't running when you use a specific modifier.

For me this only seems to work with modifier strings that would affect the domains being listed. So [query -waffle] shows Penguin results, but [query -co.uk] shows results sans Penguin. Trying [query -.fr] shows Penguin results, and I can only guess this is due to there being no .fr results in the set.

MamaDawg




msg:4454976
 12:07 pm on May 18, 2012 (gmt 0)

Just tried this myself on a penguinized site (survived all Panda iterations, traffic dropped like a rock on 4/24):

-amazon.com
-example.com
...

Every domain I've tried brings it back to #2. This is for a keyword which is irrelevant to Amazon or any of the other negative-match domain tests I've done. The usual "authority" domains have never ranked for this term.

unikat




msg:4454978
 12:17 pm on May 18, 2012 (gmt 0)

I can also confirm that in my niche, adding -[a-z0-9].tld makes Google show pre-Penguin results.
I've tried many combinations and all of them show pre-Penguin results:

-site.com
-test.com
-#*$!.com | .fr | .co.uk | whatever tld

crobb305




msg:4454985
 12:31 pm on May 18, 2012 (gmt 0)


The minus sign before a keyword excludes any result with that term. So you will see serps without that, and consequently your site will seem 'to rise' in the serps.
See also:[support.google.com ]

John


This particular phenomenon has nothing to do with excluding phrases that exist on the site. That's what is baffling about it. You can make up a domain, with any word, say supercalifragilisticexpialidocious.tld and it will still work. I guarantee that word isn't on my site.

unikat




msg:4454993
 1:01 pm on May 18, 2012 (gmt 0)

Also, I've just noticed that the results aren't just pre-Penguin.
My website actually goes up in these results.

I remembered the positions of several keywords for which I ranked well (let's say positions 8-9-10 on the first page). Now they are 4-5 positions up in these results, along with the slightly changed titles and descriptions which I made recently as a test.

But these pages still don't show up in the regular SERPs :(

Planet13




msg:4454998
 1:11 pm on May 18, 2012 (gmt 0)

OK, so what does this all mean?

That Penguin is more of an exact-match filter (as opposed to Panda, which is supposed to be more domain-wide)?

Also, another example of Penguin at work (I am pretty sure) is when my target page dropped out of the SERPs and was replaced by my home page.

That dropped page is NOT back in the SERPs when using ANY of the -keyword / -domain/tld techniques mentioned above (although they do rearrange the listings of the SERPs for the other sites that are in there).

crobb305




msg:4455000
 1:21 pm on May 18, 2012 (gmt 0)


OK, so what does this all mean? That Penguin is more of an exact-match filter (as opposed to Panda, which is supposed to be more domain-wide)? ... That dropped page is NOT back in the SERPs when using ANY of the -keyword / -domain/tld techniques mentioned above.


I am wondering if it can help us see whether a site is:

- penalized (with a possible penalty expiration at some unknown date in the future),
- deindexed or hit by complete backlink devaluation (will not appear even with -example.com, assuming example.com doesn't exist on the site), or
- hit by partial backlink devaluation plus a temporary penalty (-example.com brings the page back up, but not exactly where it was).

In my case, the operator brings my site back to position 4 or 5, but not #1 where it used to be. In this case, I think it tells me that some of my backlinks have been devalued, and when/if my penalty is lifted, I won't rank as well as before. This is just a theory. Many here report they are still #1 with the operator... maybe those pages have been dinged for lighter forms of spam or fewer unnatural links (a very sensitive filtering process).

I feel that if backlinks were truly devalued, as some have speculated, then those links would no longer have any impact on ranking, even if some gibberish were appended to the query string. To me, that's the takeaway. We may glean some insight into the fraction of backlinks that may have been permanently devalued. Or, maybe a page is filtered/penalized for a reason unrelated to backlinks.
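
As a toy illustration only, that interpretation could be written down as a small decision helper. It encodes the poster's speculation rather than anything Google has confirmed, and the inputs (positions, with None meaning the page is absent) are simplified.

# Toy encoding of the speculation above; not anything Google has documented.
def interpret_exclusion_test(normal_rank, excluded_rank, pre_penguin_rank):
    """Ranks are positions (1 = top); None means the page does not appear."""
    if excluded_rank is None:
        return "absent even with -example.com: deindexed or links fully devalued"
    if normal_rank is not None and normal_rank <= pre_penguin_rank:
        return "no measurable Penguin impact on this query"
    if excluded_rank <= pre_penguin_rank:
        return "filtered/penalized, but underlying link value looks intact"
    return ("filtered/penalized AND some links likely devalued: "
            "recovery may land below the old position")


# e.g. a page that used to be #1, is gone from the normal SERP, and comes
# back at #4 or #5 with the operator:
print(interpret_exclusion_test(normal_rank=None, excluded_rank=4, pre_penguin_rank=1))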

Jez123




msg:4455002
 1:35 pm on May 18, 2012 (gmt 0)

I think it tells me that some of my backlinks have been devalued, and when/if my penalty is lifted, I won't rank as well as before.


Or it means that the links are devalued, no penalty is applied, and that's where you will stay until you either get more links or your existing links are restored to their former value.

This is something I have been worrying about. If the links are just devalued, then there is no re-evaluation and no restoring of former positions. We are all waiting, basically, when we should be link building. Whatever link building is now allowed within the Google regime, that is.

Planet13




msg:4455008
 1:46 pm on May 18, 2012 (gmt 0)

We are all waiting, basically, when we should be link building...


Or improving quality content... or advertising / marketing...

crobb305




msg:4455011
 1:50 pm on May 18, 2012 (gmt 0)

We are all waiting, basically, when we should be link building...


Some of us are "waiting" for Google to account for all the changes that have been made over the past 4 weeks (i.e., link removal, new links, new content, new marketing). In my case, Google still shows a sitewide link in my WMT that was removed a month ago. Also, doing a site: search shows that my title changes made a month ago have still not been incorporated into the index. In fact, I haven't had any deep spidering activity in 5 weeks. Some of us are working.

Jez123




msg:4455012
 1:54 pm on May 18, 2012 (gmt 0)

I'm not saying we aren't working :-) Just that we may not be working on the right things :(

Shaddows




msg:4455014
 1:54 pm on May 18, 2012 (gmt 0)

The only two explanations that make any sense (and neither impacts recovery strategy) are:

1) There is a different recipe for cooking up SERPs that include excluded terms/domains. As Penguin (and possibly Panda) are resource-intensive, they are not baked into this recipe. Implication: the SERP is generated holistically, rather than built for the "include" terms and then having the "exclude" terms subtracted.

2) Adding an "exclude" breaks the sequence for generating displayed SERPs, the assumption being that you build "normal" SERPs and then remove "excluded" items. Implication: Penguin (/Panda) is a post-algo modifier, applied to every result after the core rank/score is generated.

Of the two, only the first seems remotely likely.

In other words, given that Penguin is not present in the process, this artefact has no explanatory power for investigating that particular ranking module, though it does give an opportunity to look under the bonnet (hood) of Google's core algo.

Given that [<term> -<term>] gives NO results, it implies that the page-score of the excluded term is wholly subtracted from the page-score of the included terms. This in turn suggests a powerful investigative tool: using related terms to see the relative scoring power of each term, <automobile -car> for example.
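
A purely illustrative sketch of the distinction being drawn here, with made-up domains, scores and penalties: it only shows why "the exclusion recipe never runs Penguin" and "apply Penguin first, then strip excluded results" would produce different SERPs. The first model matches what posters above are seeing (pre-Penguin order whenever any exclusion is added); none of this reflects Google's real pipeline.

# Toy model: two hypothetical orderings of exclusion handling vs. Penguin.
def serp_penguin_skipped(pages, query, excluded=None):
    """Hypothesis 1: when an exclusion is present, a cheaper recipe runs and
    the Penguin module is never applied."""
    candidates = [p for p in pages if not (excluded and excluded in p["domain"])]
    ranked = sorted(candidates, key=lambda p: p["relevance"][query], reverse=True)
    return [p["domain"] for p in ranked]


def serp_penguin_then_exclude(pages, query, excluded=None):
    """Alternative: build the normal Penguin-adjusted SERP, then drop exclusions."""
    ranked = sorted(pages,
                    key=lambda p: p["relevance"][query] - p.get("penguin_penalty", 0),
                    reverse=True)
    return [p["domain"] for p in ranked
            if not (excluded and excluded in p["domain"])]


pages = [
    {"domain": "rival-site.com", "relevance": {"widgets": 10}},
    {"domain": "my-penguinized-site.com", "relevance": {"widgets": 12}, "penguin_penalty": 5},
]

# Normal SERP with Penguin applied: rival first, because 12 - 5 < 10.
print(serp_penguin_then_exclude(pages, "widgets"))
# Add "-amazon.com", a domain not even in the result set:
print(serp_penguin_skipped(pages, "widgets", excluded="amazon.com"))       # pre-Penguin order
print(serp_penguin_then_exclude(pages, "widgets", excluded="amazon.com"))  # still penalized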

crobb305




msg:4455016
 1:57 pm on May 18, 2012 (gmt 0)

Just that we may not be working on the right things :(

That may be true, but Google really should be able to spider the web and incorporate new data/changes much faster than every 4 to 6 weeks. It's not just my site that I'm seeing stale results for. I've had some links removed, incorrect BBB pages removed, changed my business FB page title, all over 3 weeks ago... Google still displays the old stuff.

I know what you're saying though Jez123 :) ...there is much work to be done. Unfortunately, once you do that work, it will take Google a complete moon cycle to detect the changes.

The only two explanations that make any sense (and neither impacts recovery strategy) are: ...


Sounds reasonable. Thanks Shaddows.

Planet13




msg:4455036
 2:49 pm on May 18, 2012 (gmt 0)

Out of curiosity, for people who were victims of Penguin, how do you rank for plural/singular variations of the keyword(s) that were hit?

I ask because my landing page was removed from the index for the (more popular) plural version of the keyword, while it still ranks near the top of page 2 for the singular version (that's about four spots below where it used to rank for the plural version).

(I sent in a reconsideration request to Google and they said there was no manual penalty applied, so I can only assume this falls under Penguin.)

Jez123




msg:4455042
 3:07 pm on May 18, 2012 (gmt 0)

I am #2 for singular of popular KW and page 8 currently for the plural. Crazy.

chalkywhite




msg:4455071
 4:09 pm on May 18, 2012 (gmt 0)

I'm sure I'm not alone in having a certain site in Webmaster Tools showing THOUSANDS of links to my site. It goes along the lines of "downupper.TLD". Does anyone think these types of links hold any standing with Google?

I've requested that the domain in question remove all the links immediately, and sure enough they responded and have done so. When WT catches up it will be interesting to see what happens, as they linked to me 4 thousand times, and when those links go I'll lose around 60% of my links.

cr1t1calh1t




msg:4455074
 4:26 pm on May 18, 2012 (gmt 0)

@chalkywhite - you are definitely not alone.

I have wondered how Google treats links like these, where a single site will scrape part of your page for a very specific keyword and/or use that scrape as the anchor text in the 'Resources' section of links at the bottom of the page, often alongside many/every other site in your niche or for that particular keyword or query.

I'm interested to hear what happens to your site's rankings after Google's had a chance to digest all of those links being removed.

I've been reluctant to go after these links because, for so long, the conventional wisdom and the Google party line were that they ignored these links or didn't factor them into your rankings, and more recently because I'd lose such a large # of links overnight, like you did.

What about the hundreds of whois, 'site profile', 'site info', and 'site valuation' spam sites that form a baseline of spam/unsolicited links that almost any new site will get - how does Google treat these? While most of them are only one link each, should we try to get these links removed too?

cr1t1calh1t
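
One rough way to size that problem is to group a backlink export by linking domain and flag the obvious whois/"site info" clones. The sketch below assumes a CSV with one linking URL in the first column; the file name and the spam patterns are assumptions, not a definitive list of what Google ignores.

# Group a backlink export by linking domain and flag likely site-info/whois clones.
import csv
import re
from collections import Counter
from urllib.parse import urlparse

SITE_INFO_PATTERN = re.compile(r"whois|site-?info|site-?value|webstat|statistic", re.I)

domains = Counter()
with open("links_export.csv", newline="", encoding="utf-8") as f:   # hypothetical export file
    for row in csv.reader(f):
        host = urlparse(row[0].strip()).netloc if row else ""
        if host:
            domains[host] += 1

for host, count in domains.most_common():
    flag = "  <-- looks like a whois/site-info clone" if SITE_INFO_PATTERN.search(host) else ""
    print(f"{count:6d}  {host}{flag}")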

diberry




msg:4455082
 4:53 pm on May 18, 2012 (gmt 0)

I'm seeing the same plural/singular weirdness. One of my top phrases was plural, and now I've tanked for that phrase, but I'm not doing so badly for the useless singular version of the phrase.

Re: this qualifier business, it only works on my domain name. It doesn't reorder anything for the search terms that have tanked. Which maybe means the domain name IS punished (because I never tried to influence my inbound links, many of them are "look at this article on {link}MyDomain.com{link}"), and I've fallen on other search terms because of an actual reordering. Only, the falling happened on April 24-25, so I could've sworn it was Penguin and not the Panda update of the week or so before.

Oh, well. I know the site needs some improvement from a visitor's perspective, so I'm rewriting a lot of pages with that in mind. If Google likes it, great. If not, whatever. I'm focusing on how visitors respond to determine how to write my pages.

netmeg




msg:4455085
 5:30 pm on May 18, 2012 (gmt 0)

Out of curiosity, for people who were victims of Penguin, how do you rank for plural/singular variations of the keyword(s) that were hit?


I don't think this is a new thing. None of the sites I oversee were hit by Penguin, but we've *always* had issues on plurals and singulars. I even have vastly different quality scores in AdWords on them. Google hasn't done a good job on those for a long time.

Re the updowner thing - I have a boatload of links from that place in my GWT for several sites. Thousands and thousands. I'm pretty sure Google isn't counting them, because it's a useless site that allows garbage pages to be indexed. And it doesn't seem to have affected anything of mine when they first showed up. Got no boost, and got no drop.

soflasurfr




msg:4455087
 6:03 pm on May 18, 2012 (gmt 0)

Doesn't work for any of my sites, but then again I've already removed links and done all kinds of crap to fix them, to no avail.

I did notice Google not indexing or crawling either on the penguinized sites; it's almost like we're in a penalty box until a certain time.

I read somewhere else to try removing your sitemap and re-submitting it in WMT after you send Googlebot to fetch your pages. It did get my site reindexed, but no rankings recovery.

crobb305




msg:4455091
 6:51 pm on May 18, 2012 (gmt 0)

I did notice Google not indexing or crawling either on the penguinized sites; it's almost like we're in a penalty box until a certain time.


It will just have to run its course. If Penguin is iterative like Panda, it could be weeks before we see an update. I've never noticed sitemap resubmissions or crawl-rate adjustments having any impact on the duration of a penalty. Just my experience, and I have had to "wait out" my share of penalties on different sites.

Out of curiosity, for people who were victims of Penguin, how do you rank for plural/singular variations of the keyword(s) that were hit?


Planet13, I started seeing ranking differences between plural/singular over a year ago...significant differences. Very frustrating at times. I'd be page 1 for one version, page 4 or higher for the other. I think it's just the phrase-based scoring, but I don't know if it was turned up a notch with Penguin. Just sharing my observations for the past year.

johnmoose




msg:4455098
 7:11 pm on May 18, 2012 (gmt 0)

@crobb305: I also meant keywords not on the site. Anyway, it looks like with this "-" filter, Penguin gets switched off one way or the other.

Gemini23




msg:4455103
 7:47 pm on May 18, 2012 (gmt 0)

Just to say that I believe my site was hit by Penguin on April 24th... but the above search anomalies don't work for my website.

Gemini23




msg:4455115
 8:09 pm on May 18, 2012 (gmt 0)

I am not sure if it has been said before, but adding the "-" before anyname.com seems to simply remove "anyname.com" from the search results...

backdraft7




msg:4455130
 8:44 pm on May 18, 2012 (gmt 0)

I reported the singular/plural issue several days ago. And no, it hasn't always been that way. I have keyword reports to prove we ranked on all popular plural and singular versions of key phrases right up until this mess hit. I'm seeing very slow recovery across 1,000 select key phrases I am now monitoring.
We lost thousands of long-tail phrases in this update alone. We didn't just lose traction; the wheels came off. At this rate, it's going to take about 8 weeks to fully recover (like May Day 2010).
I'm trying really hard to find what constitutes web spam across my site and I'm at a loss.
From what I can tell, Google's new "quality" trigger is having one or two paragraphs of thin content surrounded by a galaxy of pop-up ads.
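
For anyone monitoring a comparable set of phrases, a minimal sketch for diffing two rank snapshots, say one from just before 4/24 and one from this week. The "keyword,position" CSV layout and the file names are assumptions standing in for whatever your keyword reports actually export.

# Diff two rank snapshots to list phrases that dropped out and phrases that moved.
import csv


def load(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {kw.strip().lower(): int(pos) for kw, pos in csv.reader(f)}


before = load("ranks_2012-04-23.csv")   # hypothetical file names
after = load("ranks_2012-05-18.csv")

gone = sorted(kw for kw in before if kw not in after)
print(f"{len(gone)} phrases no longer ranking at all")

for kw in sorted(before.keys() & after.keys()):
    delta = after[kw] - before[kw]      # positive = moved down, negative = moved up
    if delta:
        print(f"{kw}: {before[kw]} -> {after[kw]} ({delta:+d})")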
