| 9:56 pm on Mar 11, 2011 (gmt 0)|
There are a few reports in the various threads of gradually improving rankings for some affected sites - but no one that I've seen is reporting full recovery right now. I work with one site that saw some bounce back, but again, traffic is maybe 30% recovered.
By the way, that site made no changes at all - because the traffic they lost was poorly targeted anyway and rarely converted.
| 10:09 pm on Mar 11, 2011 (gmt 0)|
I removed 20 or so "thin" pages and replaced them with new original content. I also pulled all ads off the home page.
There seems to be a lot of jumping around going on; sites come up from the 2nd page, then drop back.
If/when I see any improvements on this I will post up with what I did.
Sooner or later someone will figure out the trigger and maybe we can all make the changes and move on to the next time.
| 8:53 am on Mar 15, 2011 (gmt 0)|
I deleted lots of tag pages and a few thin ones, and have been trying to get Google to revisit and remove them as soon as possible. Also visited the 404 center on Webmaster Central and solved that. I am convinced that in a week we will see another major recalculation and maybe some minor algo changes.
| 9:00 am on Mar 15, 2011 (gmt 0)|
I think it appears to be a site penalty, not a page penalty.
Still working on specifics, but I've done some pretty insane things with no bounce back.
| 9:01 am on Mar 15, 2011 (gmt 0)|
I assume they will keep the SERPS frozen for a while just to evaluate the impact of the changes ( user behavior, etc )
| 9:48 am on Mar 15, 2011 (gmt 0)|
incrediBILL, I agree with you AND tranquilito.
| 12:21 pm on Mar 15, 2011 (gmt 0)|
I'm also thinking we are going to see something big in a week, somewhere between the 21st and the 25th, but I'm just guesstimating.
My coping strategy is an ambitious proactive plan to protect myself from Google doing this garbage to me again. I never want to be in this position. I'm working toward massive diversification on every level and taking an anti-Google attitude.
| 12:45 pm on Mar 15, 2011 (gmt 0)|
I think it's more of a site report card: the A's are lumped in with the A's, B's with B's, etc. Like your momma always said, get good grades and you can do anything; don't get good grades and you'll have to work harder.
My question is, "When is the next report card update, and what can I change in my study habits?", assuming this grading isn't a one-time pass.
Right now I'm looking at exact-match page titles, especially for products, and seeing they no longer rank as highly as they once did. Perhaps the type of site that had exact-match titles didn't often land with an A grade, and that's why they aren't as prevalent? Or was it a slight tweaking of exact-match title value? Or a by-product of a mix of things?
Freedom, your plan is a good one. If you work hard on your 2nd, 3rd, 4th sources of traffic, etc., it stands to reason that Google may even reward you for having other traffic sources, and if they don't, you still have the traffic. Is it really time to focus on the #2 free traffic source instead of #1?
[edited by: Sgt_Kickaxe at 12:48 pm (utc) on Mar 15, 2011]
| 12:48 pm on Mar 15, 2011 (gmt 0)|
|I assume they will keep the SERPS frozen for a while just to evaluate the impact of the changes ( user behavior, etc ) |
They're definitely not keeping the SERPs frozen, for the record. I've just had a new site jump 5 pages this morning for one of its main keyphrases. (To the sweet and heady heights of page 15. Damn it baby, I'm good. Megabucks, here I come!) If anything, I'm seeing the SERPs as more fluid than normal. I put that down to more and more data being read by the new algo and the SERPs shifting as a result.
A bit of general advice - don't assume, and don't trust what you read here or elsewhere. Ever. Anyone decent and making a lot of money in this game will keep their best tricks close to their chest. It's business. A lot of other people will have the best will in the world, but not know what on earth they're talking about. I had plenty of wrong ideas about SEO when I started out. Some of my stuff may still be wrong. Test. Think the SERPs are frozen? Track a few keyphrases and find out. If you assume the SERPs are frozen, you are limiting your scope to experiment. In the meantime, someone like me will test, will figure things out, will use it and will beat you in the rankings. I know that makes me sound arrogant - but I really do usually win, and I do it by taking everything I read with a pinch of salt and testing different ideas on junk sites. Just some advice.
Anyway, on-topic. I suspect my client base is different from some of yours (no adverts, no thin sites, usually quite specific niches, often brochure websites), but to me the only effect I've seen is on sites that have articles on generic article websites as part of their backlink profile. I haven't touched this form of link building in years (too much effort for too little result, for my tastes), and my more recent projects seem completely unaffected by Farmer. Some of the sites I used to place articles on were on that 'biggest loser' list that's been floating around since the update.
Those clients who have seen a drop tend to be consistent with part of their backlink profile being discounted, rather than a penalty as such. I wonder whether those of you experimenting with on-site changes have considered that Farmer may have hurt the sites linking to you, as opposed to you personally, and that what you are seeing is the knock-on effect?
Anyway. That's my data set. Hope it helps :)
| 12:50 pm on Mar 15, 2011 (gmt 0)|
They aren't frozen in my GWT either, pages are rising and falling a little bit daily.
Did anyone see any whopping drops of 100 places or more in "average position" all in one shot with Panda? If so, did any pages rise? It might be interesting to compare the losers with the gainers on a page-by-page basis.
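The loser/gainer comparison suggested above is easy to sketch. The page URLs and position numbers below are invented for illustration; in practice you would export "average position" per page from Webmaster Tools before and after the update and feed in those figures instead.

```python
# Hypothetical "average position" data for the same pages, before and
# after the update. Lower position numbers are better.
before = {"/widgets": 4.0, "/red-widgets": 6.0, "/blue-widgets": 12.0}
after  = {"/widgets": 3.0, "/red-widgets": 110.0, "/blue-widgets": 11.0}

# Positive delta = the page dropped (its position number got worse).
deltas = {page: after[page] - before[page] for page in before if page in after}

losers  = sorted((p for p in deltas if deltas[p] > 0), key=deltas.get, reverse=True)
gainers = sorted((p for p in deltas if deltas[p] < 0), key=deltas.get)

for p in losers:
    print(f"DROP {deltas[p]:+.0f}  {p}")
for p in gainers:
    print(f"GAIN {deltas[p]:+.0f}  {p}")
```

With real exports, a page-level split like this is what would distinguish a uniform sitewide demotion from a mix of winners and losers.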
| 2:44 pm on Mar 15, 2011 (gmt 0)|
|Also visited the 404 center on Webmaster Central and solved that. |
Walkman, where is the 404 center and what are you solving?
| 2:52 pm on Mar 15, 2011 (gmt 0)|
|but to me the only effect I've seen is on sites that have articles on generic article websites as part of their backlink profile. |
I would say that depends on whether the articles got picked up by real sites as well or not. I have one site where most of the back links actually are from syndicated articles and that site's traffic and income are way up.
| 3:23 pm on Mar 15, 2011 (gmt 0)|
|I have one site where most of the back links actually are from syndicated articles and that site's traffic and income are way up. |
Yep, and my site has no generic article-site links at all, and it's been hammered by the Panda.
| 3:54 pm on Mar 15, 2011 (gmt 0)|
|Walkman, where is the 404 center and what are you solving? |
Log on to Google Webmaster Central and, under Diagnostics > Crawling, you'll find crawl errors, etc. Check the 404 pages in the sitemap, or those that have internal/external links.
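You can do a rough version of the same audit yourself: pull the URLs out of your own sitemap and check which ones no longer resolve. This is a stdlib sketch, not Google's tooling; the example.com sitemap is invented, and the network check is left commented out.

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen
from urllib.error import HTTPError

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> entries from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

def status_of(url):
    """Return the HTTP status code for a URL (200, 404, 410, ...)."""
    try:
        return urlopen(url).status
    except HTTPError as e:
        return e.code

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/widgets</loc></url>
  <url><loc>http://www.example.com/old-tag-page</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(urls)
# To find dead sitemap entries (needs network access):
# dead = [u for u in urls if status_of(u) in (404, 410)]
```

Any URL that comes back 404 or 410 but is still in the sitemap is exactly the kind of entry the Webmaster Tools crawl-errors report flags.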
| 6:46 pm on Mar 15, 2011 (gmt 0)|
It's clearly a site penalty for most of us, yet there was a whole thread here a couple weeks ago trying to definitively state the opposite :-)
Since I don't have any thin content and never played any SEO tricks, I've got nothing to change but our simple HTML site architecture, and I'm not going to do that. I have gotten busy with DMCA complaints and filing Google spam reports, etc.
While I don't track individual search terms as a rule, I've kept up with a couple exact phrases (that nobody ever searches on) to see the results. In at least one case where my own page didn't even appear in the rankings anymore, getting the #1 result (a verbatim infringement) removed and using the public URL removal tool to report that allowed my page to reappear in the rankings, and in the #1 slot to boot.
But the loss of visibility for search phrases is just a symptom of the whole site being penalized so that unauthorized duplicates are now seen as more authoritative, even those on article farms. Whether the massive infringements and unauthorized syndication over the years are responsible for the site penalty, only Google knows.
And getting back that #1 slot has zero impact on the traffic for that page. Getting the duplicate removed didn't increase the trust rating for the original, it just got rid of the squatter that had taken its place.
| 7:26 pm on Mar 15, 2011 (gmt 0)|
From what I can see, it is both a site penalty and a page penalty. Some sites I work with saw some pages improve while other pages went down.
It seems to me like the process begins with a page-by-page assessment. A site-wide score is then applied, based on how many pages the algorithm scores as good quality versus low quality. So it's not a black-and-white situation, not just page-specific or only sitewide.
[edited by: tedster at 7:51 pm (utc) on Mar 15, 2011]
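That two-stage idea (score pages, then derive a site-wide modifier from the proportion of low-quality pages) can be illustrated with a toy model. Every number, threshold, and formula below is invented purely for illustration; nobody outside Google knows the real scoring.

```python
# Hypothetical per-page quality scores, 0..1 (higher is better).
page_scores = {
    "/guide": 0.9,
    "/review": 0.8,
    "/tag/blue": 0.2,
    "/tag/red": 0.1,
}

LOW_QUALITY = 0.4   # assumed cutoff for a "thin" page

# Stage 1: count the low-quality pages.
low = sum(1 for s in page_scores.values() if s < LOW_QUALITY)

# Stage 2: an invented site-wide modifier - the more thin pages,
# the more every page on the site is discounted.
site_modifier = 1.0 - (low / len(page_scores)) * 0.5

# Even the good pages get dragged down by the site-wide factor.
effective = {p: s * site_modifier for p, s in page_scores.items()}
print(site_modifier, effective)
```

The toy model matches what posters here report: it behaves as both a page penalty and a site penalty at once, and deleting the thin pages raises the modifier for everything that remains.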
| 7:49 pm on Mar 15, 2011 (gmt 0)|
Content_ed, from what you've described and what I am seeing in some cases, it may be that the scrapers show up as a symptom of weakness, not as the cause of it. I go back and forth on this, but I have seen some evidence that strong sites with scrapers are not affected. So I am inclined to think that if we focus on making our sites stronger, that "weight" will be lifted, enough to dislodge all the scrapers who are ahead.
That said, I found out my site had accidental hidden text. Wow. My tech guy, who is also my spouse, left some hidden text in my headers by accident. It was not spammy text and is rather innocuous, but who knows, given that every little thing seems to count. I am now paranoid about what else is lurking under the site sheets at the moment, and I guess the best thing is to do a full-on site audit for these things!
I'm surprised, though, that if this were so bad (i.e. against guidelines), I was not delisted a while back. So I am not sure how much of a factor this is, as I had this stuff for about a year now. My site is also authoritative, with a strong backlink profile, old, etc., so could that be enough not to get a stronger penalty? Or is it the case that certain hidden text is not considered too bad (if it was seen as accidental, not using strong keyword phrasing, etc.)?
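The kind of site audit described above can be partially automated. This is a crude sketch that only inspects inline `style` attributes for the usual hiding tricks (a real audit would also need to resolve external stylesheets and JavaScript); the sample markup is invented.

```python
import re

# Common CSS patterns used (accidentally or not) to hide text.
HIDING_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",   # large negative indents
]

def find_hidden(html):
    """Return the opening tags whose inline style hides their content."""
    hits = []
    for tag in re.finditer(r"<[^>]+style\s*=\s*\"([^\"]*)\"[^>]*>", html):
        style = tag.group(1)
        if any(re.search(p, style, re.I) for p in HIDING_PATTERNS):
            hits.append(tag.group(0))
    return hits

page = '<div style="display:none">old header text</div><p>visible</p>'
print(find_hidden(page))
```

Running something like this over every template file is a quick way to surface leftover hidden blocks like the header text mentioned above before guessing whether they matter to the algorithm.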
| 8:06 pm on Mar 15, 2011 (gmt 0)|
|It seems to me like the process begins with a page-by-page assessment. A site-wide score is then applied, based on how many pages the algorithm scores as good quality versus low quality. So it's not a black-and-white situation, not just page-specific or only sitewide. |
Yep, and Google admitted as much. Now the question is, what does it take to fix it, and how long after Google sees the fix are rankings restored?
| 8:08 pm on Mar 15, 2011 (gmt 0)|
I pointed out that the scrapers ranking may just be a sign of the whole site being penalized. However, nobody but me has ever done the coding, so I know there's no monkey business. And I know from the past that Google does hand out penalties for duplicate content issues.
I don't know how somebody can focus on being stronger. We were as strong as it gets in our particular niches; I'm not going to compete with Amazon, Yahoo, etc., under any circumstances. We already were stronger than all the garbage sites stealing from us. If Google has changed the criteria to favor large community sites, it's not something that can be fixed on our end.
| 8:15 pm on Mar 15, 2011 (gmt 0)|
And with no history to go by, we are really in the dark. Panda looks to me like a major, major algo change. Some engineers were working on it for more than a year? And Google never used quality as a factor before, only relevance.
We'll be looking at this change for a long time, I think. A lot of webmasters have ideas about Google that were formed quite a while back. They've been moving forward, but our old models still worked, at least to a degree. But with this change, we're really "not in Kansas anymore."
Not only that, but apparently this new "quality" factor in the algorithm is going to be a major, year-long focus for Google. Let's hope so - they've certainly got a lot to fix ;) I'd say when they work out "Layer 2" and it goes live, we'll know it without any official announcement. Just look at how our Update Thread [webmasterworld.com] exploded when Panda first launched.
| 8:49 pm on Mar 15, 2011 (gmt 0)|
I think the real clues are coming from some of my competitors that didn't get hit as hard; comparing against them suggests which elements seem to count more than others.
However, being in the local directory biz, the main thing I've noticed is that Google appears to have dialed up local sites' rankings over directories, and Google Places is definitely pushing many local directories down into an extinction-level event.
Some of it is insane, like a PR 1 page for a single company ranking above a PR 4 page representing 20+ companies that is obviously more useful for the user searching on this topic. But push 20 of those low-PR local sites to the top, and you have now replaced the PR 4 page representing 20+ companies with SERPs that do the same job.
Bye bye directories.
| 9:10 pm on Mar 15, 2011 (gmt 0)|
I have a directory that went by unscathed. It pretty much has the same traffic and rankings as before Panda.
| 9:17 pm on Mar 15, 2011 (gmt 0)|
|I have a directory that went by unscathed. |
Do you target local or category?
I know about 40 that took a pounding.
| 9:22 pm on Mar 15, 2011 (gmt 0)|
|And with no history to go by, we are really in the dark. Panda looks to me like a major, major algo change. Some engineers were working on it for more than a year? And Google never used quality as a factor before, only relevance. |
Yes, that's the scary part. Some new sites were penalized on March 12th, so the gift keeps on giving. I have checked dozens of competitors and my other sites and cannot pin it on one thing. My only guess is too many 'bad' pages. So I deleted them [tags] and many others, and bombarded Googlebot with pings to get it to see the noindex/410. If this works, I'll be thankful to Google, since my job will be a lot easier (my pages have to be updated quite often :). But I have very ugly and thin sites doing 30-40% better since the update. They do, however, have what people are looking for.
incrediBILL, anything that Google can automate is over for everyone else. Google will eventually get to it.
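On the noindex/410 point: a 410 ("Gone") tells Google the removal is deliberate, where a 404 only says "not found". One minimal way to serve 410s for a batch of deleted pages, sketched with the Python standard library; the `/tag/` prefix is an assumption standing in for whatever URL pattern the deleted pages share.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical: all deleted tag pages lived under these path prefixes.
REMOVED_PREFIXES = ("/tag/",)

def status_for(path):
    """410 for deliberately removed pages, 200 for everything else."""
    return 410 if path.startswith(REMOVED_PREFIXES) else 200

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(status_for(self.path))
        self.end_headers()

print(status_for("/tag/blue-widgets"), status_for("/widgets"))
# To actually serve it:  HTTPServer(("", 8080), Handler).serve_forever()
```

In practice most people would do this with a rewrite rule in the web server config rather than a standalone handler, but the status-code logic is the same.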
| 9:22 pm on Mar 15, 2011 (gmt 0)|
|I have a directory that went by unscathed. |
Me too: a global niche trade specialist. Traffic has increased, but I'm not too sure whether this was from Panda or merely a seasonal boost.
| 10:41 pm on Mar 15, 2011 (gmt 0)|
"My tech guy who is also my spouse, left some hidden text on my headers by accident"
What do you mean by that? What kind of hidden text?
| 10:55 pm on Mar 15, 2011 (gmt 0)|
|Do you target local or category? |
Nothing by location.
| 11:04 pm on Mar 15, 2011 (gmt 0)|
That's probably why you were spared; location directories took a beating from what I'm seeing.
| 11:34 pm on Mar 15, 2011 (gmt 0)|
|Log on to Google Webmaster Central and, under Diagnostics > Crawling, you'll find crawl errors, etc. Check the 404 pages in the sitemap, or those that have internal/external links. |
Ah, you mean Webmaster Tools. I thought there was some new 404 thing going on in the Google groups.
The directory portion of my site was hit bad for the word "shops", as in "[insert state or city] widget shops". Substitute the word "store" for "shops", though, and the site is pretty much still #1 for most states or cities. I don't get it.
| This 332 message thread spans 12 pages |