| 1:18 am on Apr 7, 2011 (gmt 0)|
I also think their Panda algo is a one-way street for at least a year. The sites will stay buried. They just really don't want to see all these Panda-affected sites in their index.
| 1:52 am on Apr 7, 2011 (gmt 0)|
Nice theories, but I don't have any really bad pages, unless you're a literary critic:-) Nothing copied, nothing derivative, lots of high quality organic deep linking.
The penalty is applied pretty evenly across the site, though shorter pages seem to be affected somewhat less than longer pages.
Clearly, everybody isn't seeing the same thing with Panda, but I've taken the time to look into other high-quality sites with low to medium page counts (a couple dozen to a couple hundred) that have been hit. I can't lay a finger on anything except that they get ripped off a lot and aren't eHow or Amazon.
| 1:55 am on Apr 7, 2011 (gmt 0)|
How are the design and layout of the sites you've been reviewing that were affected?
You can't simply look at pages, content or links with Panda, it's not that simple.
| 2:01 am on Apr 7, 2011 (gmt 0)|
My theory grows from the advice to first address the pages that have taken the biggest loss. Google has told us that the pages the algo selects then influence rankings for the rest of the site.
| 2:02 am on Apr 7, 2011 (gmt 0)|
They vary. I wondered if my old-fashioned design was killing us, even though I have a newer site that benefited about 10% from Panda with exactly the same template, etc. But the sites I reviewed were all over the map for design, CMS, number of pages, etc.
Some looked just like magazines, in fact, some were magazines published by one of the top magazine publishers in the world. And only a couple of their magazines got killed, the others were OK. I suggested here at the time that subject matter may have something to do with it, but nobody was buying in.
Subject matter is the main difference between my two hurting sites and my happy site. The happy site rarely gets ripped off because it's not stuff that monetizes well with Adsense, and it's only been around a couple years.
| 2:07 am on Apr 7, 2011 (gmt 0)|
I never saw that advice, was it from Google official?
In any case, no pages got really killed; the greatest variation is maybe 10%, i.e. some pages lost 25%, some lost 35%, and it seems to be more dependent on how far down the long tail they were drawing traffic than anything else. That may be why the short pages were hurt less: they had less content for long-tail matching.
All I remember Google saying was to eliminate low quality content. The strongest and longest pages certainly weren't the lowest quality, even in our limited spectrum of super high quality to very high quality.
| 2:33 am on Apr 7, 2011 (gmt 0)|
|If Google feels they identified pages that give their users a poor experience, then they would not let those pages rank again just because a certain amount of time has passed. |
@tedster, in one of the previous posts, you gave an example of one site which was back to normal after they removed thin pages.
Your example and the above statement contradict each other.
| 2:47 am on Apr 7, 2011 (gmt 0)|
|I never saw that advice, was it from Google official? |
JohnMu < correction, this quote is from Wysz > gave this advice in the monster thread on the Google Webmaster Forums:
|...if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content. |
The part about finding the pages that were dropped the most is my added interpretation of HOW to spot the pages that Google considers low quality.
|Your example and the above statement contradict each other. |
The one site that seems to have recovered involved lots of repair work to the content, including removing a lot of URLs and improving others. I should not have used the word "page" because it is not a technically precise word - it can mean either "content" or "URL".
So what I'm trying to say about Panda not being a true penalty is this - the same URL certainly may rank well again, but not if content that was previously rated as low quality hasn't changed. With penalties, the same URL can sometimes rank after a period in the penalty box, even if no changes are made to the content.
[edited by: tedster at 3:20 am (utc) on Apr 7, 2011]
| 2:54 am on Apr 7, 2011 (gmt 0)|
Every single page on my site has been hit, with one exception.
I have pages with somewhat lengthy ads for retail widget stores. Most of these pages still rank #1-#3 for phrases such as "[insert name of a state] widget stores" or "widget stores in [insert a city name]". For some reason these pages--and the pages that list the various stores by state and from which the visitor clicks to get to the ads--are still ranking very well.
There are only one or two other pages left in the top ten.
I see some of my worst-ranking pages pulling down others associated with them. There are some pages that went to page two, and pages associated with them are pretty much in the top 50 results.
My gut is telling me that there won't be a recovery as there has been with past updates, where sites that were in the top ten will return to the top ten. This algo is completely new, and I think that getting those first-page rankings will be like starting over again with a new site.
I've now "noindexed" about 1,000 pages of the 3,000+ pages on my site. Any page that ranks 70 or worse has been noindexed, and 70 is just an arbitrary point for now. If pages that rank better than 70 don't improve, maybe I'll go down to 50. Pretty soon, though, I wouldn't have any site left.
| 3:03 am on Apr 7, 2011 (gmt 0)|
Seems to me your interpretation is a guess, an interesting guess, but not one that matches what I've seen. And I suspect that JohnMu's advice really isn't Panda specific at all; it's just standard Google "build a quality site and the rest will follow" jazz. It used to be true, at least for us, but isn't now, so it's hard to believe it will work for sites that really do have low quality content.
| 3:09 am on Apr 7, 2011 (gmt 0)|
dickbaker - if I may offer my perspective: I would not noindex pages solely on their current rank. I think this algorithm may apply a site-wide penalty, in which case you can't be sure why certain pages rank really low. Maybe it has nothing to do with quality; maybe it's the backlink profile, or on-page SEO problems that can be fixed.
I would caution against overdoing the noindex thing. Just stand back and ask: does this page offer decent value? If the answer is yes, who cares where it ranks right now; leave it in the index.
Something that gets lost in this mess is the fact that about 30% of traffic (or more in some cases) comes from non-G search engines. Noindexing will affect you on all the search engines.
My stance has been to just eliminate/noindex the stuff you would not want to encounter in the search results yourself.
| 3:26 am on Apr 7, 2011 (gmt 0)|
|I suspect that JohnMu's advice really isn't Panda specific at all |
First, thanks to browsee for letting me know that the quote is from Wysz, not JohnMu. I made a note above.
Second, it is advice that was specific to the Panda update - it is much more tailored to the specific situation than the generic Google line. And I went one step further from that advice, because I say it's best to focus on the pages that the algorithm demoted rather than just looking all over your site subjectively and making changes helter-skelter.
|My stance has been to just eliminate/noindex the stuff you would not want to encounter in the search results yourself. |
I like that way of thinking, too.
So far there have been very few reports of anything actually working - bringing back lost traffic. So even that advice from Wysz is still kind of theoretical. The only case I know of that seems to have bounced back did as I described.
I don't personally work with a site that was affected, but I know several people who do. Some of those sites really confuse me - if they aren't a false positive, then I'm baffled by what this algo is targeting. So the only way I know how to start is looking at the URLs that lost the most and go from there. What other approach might there be?
| 3:50 am on Apr 7, 2011 (gmt 0)|
maximillianos, I'm noindexing pages that are clearly thin. Some pages I've found are almost anorexic. I'd forgotten they existed.
I'm only noindexing for Googlebot. I thought about using robots.txt, or even removing pages altogether, but I can think of a reason why noindexing might be preferable.
Google is a business, and it has competitors. If there are poor quality pages on a site, Google doesn't want to serve them up to its users. However, it's to their advantage to have Bing and Yahoo serve up what Google thinks are poor quality pages. If I'm correct about that mindset (and I'd certainly think that way about competitors), then noindex would be the way to go.
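For reference, Google-only noindexing like this is usually done with a crawler-specific robots meta tag (the `googlebot` value is documented by Google; the placement shown is a generic sketch):

```html
<!-- Tells Googlebot not to index this page, while other crawlers
     (Bing, Yahoo, etc.) see no restriction and can still index it. -->
<meta name="googlebot" content="noindex">

<!-- Compare the generic form, which applies to all compliant crawlers: -->
<meta name="robots" content="noindex">
```

The same distinction exists in the `X-Robots-Tag` HTTP header for non-HTML files.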
In the end, all of the theories and guesses and hunches being posted here are just that. We have almost no evidence that anything works, and chances are it will be quite some time before we do, and even then we won't be certain if we know what caused the change.
| 4:16 am on Apr 7, 2011 (gmt 0)|
I think one should do the right thing for each page. The talk about how G looks down on noindex, robots blocking, nofollow, etc. is all misguided - just use them appropriately; not with regard to some magic proportion, just what makes sense in each case.
Panda is a scoring of the sitewide quality of the indexed pages. Delete or noindex? Do what's right for your site and visitors. Noindex does not mean nolook; I'm sure G can work out whether a noindex or nofollow is appropriate or underhanded.
In my overhaul I found quite a few lingering old pages that I deleted and URL-removed, some things I blocked with robots.txt, some things I tidied with htaccess, and most of all a huge number of pages that didn't need indexing but were part of the site. Things improved for me, although I was not 'hit by Panda'; it certainly made me look again at these issues. So perhaps if 'hit by Panda' one won't see the improvements quickly, but I expect they will come. Do nothing and you won't rebound. Tidy up (and that's what this is about, in short) and maybe some day ...
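The kinds of clean-up mentioned here look roughly like the following (the paths are invented for illustration): a robots.txt rule keeps crawlers out of a section entirely, while an .htaccess directive can tell crawlers a deleted URL is permanently gone.

```apache
# robots.txt -- block crawling of a low-value section (hypothetical path)
User-agent: *
Disallow: /old-gallery/

# .htaccess -- return HTTP 410 Gone for a removed page (mod_alias)
Redirect gone /old-page.html
```

Note these do different jobs: a robots.txt block stops crawling but not necessarily indexing of already-known URLs, whereas a 410 signals the content itself is gone.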
It's spring clean time
[edited by: bramley at 4:20 am (utc) on Apr 7, 2011]
| 4:16 am on Apr 7, 2011 (gmt 0)|
I have increased content on low-content pages and saw no improvement. I still have top 3 spots for many keywords, but have lost traffic too after Panda. My new theory is that it's not my content at all; it could be low quality incoming links from directories we used to submit to. I bet in the past I manually submitted links to over 10,000 directories.
P.S. Why would I lose 3-4 word keywords while they let me keep high-priority 2 word keywords for the same page?
Puzzling, that's for sure...
| 4:27 am on Apr 7, 2011 (gmt 0)|
@snickles: I think there is a limit on the number of terms a page will rank for. Might be 2 word terms or 3 or 4 word terms, but G maybe now chooses the top 2 and disregards pretty much all the others. That's my understanding/experience.
This might account for some of the 'mysterious' ranking drops - the page has not been dropped per se, but the limited key phrases assigned to it are not the ones one is looking at.
Also, I have seen improvements as I made changes, so it looks like some sites are caged for a while, perhaps to see if changes persist and are genuine; something like that - it needs patience. I think it was Jane Doe who sounded optimistic in a recent post despite no improvement yet - that is right, I think. In my case it is quite embarrassing really how untidy my site was previously. Go tidy up ...
If using a CMS (and that's pretty much everyone; certainly all of those 'pandalised', I'd guess), look closely at the pages' output and think about what you could improve.
[edited by: bramley at 4:37 am (utc) on Apr 7, 2011]
| 4:35 am on Apr 7, 2011 (gmt 0)|
|I think there is a limit on the number of terms a page will rank for. Might be 2 word terms or 3 or 4 word terms, but G maybe now chooses the top 2 and disregards pretty much all the others. That's my understanding/experience. |
Not the case for sites I work with.
| 4:41 am on Apr 7, 2011 (gmt 0)|
@ Tedster - I didn't write that on a whim; I have been mulling this for some time and experienced it by experiment, shall we say. Perhaps 'term limit' is over the top, but there is a ranking of terms for a page - if it's stronger for one phrase, it will be weaker for another. That doesn't mean a page cannot rank well for multiple phrases, but there is something in this; and it makes sense, after all.
If the page is especially dominant, maybe the effect will not be seen much, but if not quite #1 on all phrases, this does occur - boost one phrase and lose a bit on the others. I've seen it over time; I don't know if it's a new thing (probably not), but I'm surprised not to see this mentioned previously (as far as I know, but I'm quite new to SEO).
I've shared what I found and feel I did the right thing; if it is ignored so only I benefit, that's not a bad thing ;)
| 4:55 am on Apr 7, 2011 (gmt 0)|
I didn't think you shared that observation casually. The difference between our experiences may be the kind of site I work with, but I regularly do see a wide variety of search terms being sent to certain pages.
| 4:59 am on Apr 7, 2011 (gmt 0)|
@ Tedster - no offence for sure; mine is an info site. All I can say is that it perplexed me for some time, but after a while I realised that is what one should expect - a page is either about blue widget history or blue widget alternatives or whatever, and a page that tries to cover both (or all) will be weaker for any particular term.
Some pages I didn't understand why they were weak serp-wise for my desired term - but later I realised they were #1 for something else, which made sense when looking carefully at the page ...
| 5:23 am on Apr 7, 2011 (gmt 0)|
bramley, I do see what you do. for example, there was a page that ranked high for "blue widget" and "blue widget alternatives". Now I do see the page retaining its rank for "blue widget alternatives" while moving to page 2 for "blue widget". I noticed these for a few other pages too.
But there are a few others who compete for the same set of keywords and one such site has either retained or improved the ranks for most of those combinations.
Do you see any pattern in this?
|but later I realised they were #1 for something else - which made sense when looking carefully at the page ... |
Can you give a generic example on why it made sense?
ps: All my examples are observations comparing pre-panda to post-panda.
| 5:31 am on Apr 7, 2011 (gmt 0)|
|bramley, I do see what you do. for example, there was a page that ranked high for "blue widget" and "blue widget alternatives". Now I do see the page retaining its rank for "blue widget alternatives" while moving to page 2 for "blue widget". I noticed these for a few other pages too. |
I've been watching this phrase-based ranking for a few weeks. My site fell to page 55 for "blue widgets" but I rank #3 with four sitelinks for "big blue widget" (singular), and on page 4 for "big blue widgets" plural. Incidentally, the top 3 ranking for the singular form is new, just since yesterday. Plural forms are hardest hit for me. Traffic is up though.
| 5:40 am on Apr 7, 2011 (gmt 0)|
Getting tired (sorry, in the UK and sunrise here), but I think the page match is done something like the way PR is distributed between links. A page titled 'blue and red widgets' will not rank so well for 'blue widgets' as one titled simply 'blue widgets'. Pretty obvious really.
If you have pages called 'blue widgets', 'red widgets', 'black widgets', none will rank well for just 'widgets'; for that one needs a 'widgets' page that links out to 'blue widgets' etc.
The more 'widget of type x' pages there are, the more dilute each becomes for just 'widget'.
A page may rank for terms other than expected because keywords used, maybe unintentionally, tilt the match towards different phrases. If that term is not useful, it is like PR leak; only ranking-term leak in this case.
[edited by: bramley at 6:06 am (utc) on Apr 7, 2011]
| 5:43 am on Apr 7, 2011 (gmt 0)|
OK, a bit of an example: my site has a lot of photos - and 'photos' is a good search term, but I had used the term 'gallery' often in titles, <h1>s, links etc., and this diluted the match power of 'photos' while boosting 'gallery' - unwanted / no use.
In short, what I am saying is that one can't rank so well for both 'gallery' and 'photos' as for one or the other - so look carefully, decide and focus ...
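A minimal before/after sketch of that kind of keyword focus (the markup is hypothetical, not from any real site):

```html
<!-- Diluted: splits match power between "gallery" and "photos" -->
<title>Blue Widget Gallery</title>
<h1>Photo Gallery</h1>

<!-- Focused: commits to "photos" in both title and heading -->
<title>Blue Widget Photos</title>
<h1>Blue Widget Photos</h1>
```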
| 5:59 am on Apr 7, 2011 (gmt 0)|
At some point in the past the decision may have been how well this page matches the search term; now it is how well, but diluted by how well it matches other search terms.
That's why long-tail results are seen as holding up better - because that is the best match of that page. And if there is a more general page, it is too weak to rank (perhaps just a menu, or 'blue widget, red widget' etc.).
In other words page match is not additive but apportioned (jack of all trades, master of none).
| 6:13 am on Apr 7, 2011 (gmt 0)|
Do you see evidence that this change happened as part of Panda? I wasn't thinking there were any relevance elements involved at all - but "quality" might be, in part at least, defined as "focus".
| 6:24 am on Apr 7, 2011 (gmt 0)|
@Tedster - I really can't say, because I happened on the forum here around this time and just implemented changes without thinking about Panda, then thought I was ahead of Panda - but maybe not really ahead, as my domains are .com, although hosted in the EU. I don't think this is Panda specific, but I'm not sure; I know what I saw in my experience though. It's really what I always thought the algo did, just that I didn't think too much about it with regard to my pages until I got a bit obsessed with the topic ;)
In conclusion, it's all common sense, but easily overlooked. The more you look the more you find, so keep looking :)
I really can't relate this to Panda or not, but I think it is a real effect. I didn't do any double-blind trials, but I have sufficient feedback from changes to be confident in this.
'Focus' would be a good term: how well focused a match a page is, rather than just whether it matches.
| 11:52 am on Apr 7, 2011 (gmt 0)|
Everything is getting hard to interpret with all the quotes left and right.
Can we get confirmation that Google said eliminating 'thin' pages will help our other main pages rise in rank for keywords?
|maximillianos, I'm noindexing pages that are clearly thin. Some pages I've found are almost anorexic. I'd forgotten they existed. |
Don't you feel just removing them from your site index is enough? (It's what I thought of doing first.)
And guys, I think we need to be patient and give Google a little time to reflect the changes to our pages. I know this is hurting a lot of us in the pocket... but if we're making minor changes to our site every day, Google could be penalizing us for that as well (for not seeing stable content).
[edited by: tedster at 3:48 pm (utc) on Apr 15, 2011]
| 12:15 pm on Apr 7, 2011 (gmt 0)|
On one set of keywords we fell from page five to seven, BUT for the most part, since I removed some of the CJ and Linkshare ads, we have moved up in the ranks. These ranks fluctuate pretty frequently, but the trend is a little higher.