I disagree with:
|2 – are so dismayed they immediately hit their back button, suddenly popping back onto our radar, alerting us to the fact they were displeased with the result we just showed them – why else would they suddenly come back, without consuming your content? Displeased. |
E.g., what if there was a site that provided a gambling guide to Las Vegas, but ranked for "gambling guide", and most people just wanted to know about gambling, regardless of Las Vegas? Yet it still ranks for the generic term.
Therefore, if this were a factor, it should apply only per page, not site-wide.
Kind of tough mixing a Bing reply with a Google question [webmasterworld.com...]
AFAIK, Bing doesn't penalize site-wide, and, if anything, that might be the significant data to take away in this discussion...
|Remember, Panda is a site-wide penalty, not a page penalty. |
Where did they get this from?
My Pandalised site has been hit in ONE section only, nowhere else.
Does anyone else agree with this or am I unique? I doubt it.
@HuskyPup: What I've seen over the last few months is FOLDER level "penalizations", which fits with your "section" (provide info if that is not correct)... and that will take out a number of (whatever that number of) pages. To date I haven't found credible commentary that anyone has succeeded in reversing this...
No, it's definitely site wide, but perhaps sub-domains might be affected differently.
I have seen some recovery and let me tell you what I think the real secret is to beating Panda:
You can't beat Panda. It just does what it wants. I made tons of changes over the past few months. None of them mattered. I finally got a partial recovery, months after I stopped tweaking... and that leaves me convinced that nothing I did or anyone else has ever done to their site matters one bit to Panda.
I don't believe any of the dozens of things, minor and major, made any difference at all.
I could have done absolutely nothing, and I'd still be in the same place I am right now.
How do you beat Panda? Do nothing and wait it out.
Thought we were safe and bam...yesterday lost number 1 spots for many important keywords. Nothing on the site was changed. Is it Panda? How does one know?
Daniweb recovered and showed the evidence, along with all the changes they made.
I do not think Panda is a site-wide penalty. For one of my clients' sites it hit only selective pages. Some of the pages were hit in 2.2 and some more were hit in the 2.3 update. Still, only half the site is hit (random pages). We have made certain changes and are hoping (fingers crossed) things might improve with coming updates.
|No, it's definitely site wide, but perhaps sub-domains might be affected differently. |
|Does anyone else agree with this or am I unique? I doubt it. |
Both are correct
Panda is site wide in its effect on specific keywords/keyword niches.
So, you can be pandalised for the generic (say 'hoovers' or 'vacuum cleaner') but not for a specific brand search (eg 'Sebo x4') or vice versa or both. It depends where your content problems lie.
Personally, I have plenty of brand + product name searches ranking number 1, yet searching for the brand + generic sees me pandalised.
Likewise, I have very generic terms pandalised and others that are closely related where I am #1 or #2 out of tens of millions of results.
So, I conclude Panda is site-wide in its effects (i.e. any semantically related page/search will be affected), but also ring-fenced semantically within Google's keyword/semantic clusters (the ones that make it so smart at understanding what we are searching for).
The effect is to taint everything related (not necessarily linked) to your bad content.
|However the section on ads doesn't seem to line up with the rest of the points. |
How so? It jibes perfectly with the first point about template footprint - if you have too many ads, your template is going to take up most of the page, because the ads will be part of the template.
The part I might disagree with is the idea that "reduce the sins" will get you out -- I've fixed some but not (yet) all of the issues mentioned, and I've seen very little improvement. Others seem to report a similar experience, leading to the conclusion that merely reducing the "sins" will not be enough, but rather you will have to completely eliminate all of these problems before you'll see much in the way of improvement.
Right now I'm working on rewriting my text content. It seems to be the only consistent factor in the recovery stories I've seen -- either they've changed nothing and recovered due to algo tweaks, or they rewrote or removed substantial amounts of content.
Ok fine, the template point doesn't line up either. It is a separate algorithm/factor from the duplicate content issues, and is only speculation.
10000% disagree with ads. Not gonna buy that BS. Be it from any kid on the block, I'm not gonna buy into it.
If you think like a HUMAN you will beat Panda, and from what I have seen it seems to run almost in real time for short, quick updates.
The initial launch did a lot of damage and took time to recover.
Everything about Panda is only speculation. Even those who have recovered are not usually sure exactly how they did it, as either they've changed nothing or changed a lot of things at once.
As I see it, there are a large number of signals that Panda looks at and combines into a final rating. So what's at work in one case might have nothing to do with another case.
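That "many signals combined into a final rating" idea can be pictured with a toy sketch. To be clear, this is purely hypothetical: Panda's actual signals, weights, and scoring math are unknown, and the signal names below are invented for illustration.

```python
# Purely hypothetical sketch: Panda's real signals, weights, and math are
# unknown. The signal names and weights here are invented for illustration.

def panda_like_score(signals, weights):
    """Combine per-signal quality scores (0.0 = bad, 1.0 = good)
    into one weighted site rating between 0 and 1."""
    total_weight = sum(weights.values())
    return sum(weights[name] * signals.get(name, 0.0)
               for name in weights) / total_weight

# Two imaginary sites: strong content vs. thin, ad-heavy content.
weights = {"content_depth": 3.0, "originality": 2.0, "ad_balance": 1.0}
site_a = {"content_depth": 0.9, "originality": 0.8, "ad_balance": 0.3}
site_b = {"content_depth": 0.4, "originality": 0.9, "ad_balance": 0.9}

print(round(panda_like_score(site_a, weights), 2))  # 0.77
print(round(panda_like_score(site_b, weights), 2))  # 0.65
```

The point of the sketch is just that a site can do well on most individual signals and still land below whatever threshold matters, which would explain why fixing one "sin" in isolation often shows no visible effect.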
Google engineers have mentioned "ads dominating a layout" so frequently that I cannot doubt this is in play. The issue is one of proportion, as I see it - do the ads make the content hard for the eye to locate. In other words, if the page is laid out to almost "force" visitors to go to the ads, and it only uses content as SEO bait for search results, then that is not a page Panda is going to like. This will be magnified if other Panda factors are also present.
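One way to picture that "proportion" idea is to measure what fraction of the above-the-fold area ads occupy. This is an invented heuristic, not anything Google has published; the fold height and block layout are assumptions.

```python
# Invented heuristic only: Google has never published how (or whether)
# Panda measures ad dominance. This sketches the "proportion" idea:
# how much of the visible, above-the-fold area do ads occupy?

FOLD_HEIGHT = 600  # assumed visible viewport height in pixels

def visible_area(blocks, fold=FOLD_HEIGHT):
    """Sum the above-the-fold area of (width, height, top_offset) blocks."""
    total = 0
    for width, height, top in blocks:
        visible_height = max(0, min(top + height, fold) - top)
        total += width * visible_height
    return total

def ad_dominance(ad_blocks, content_blocks):
    """Fraction of above-the-fold pixels occupied by ads (0.0 to 1.0)."""
    ads = visible_area(ad_blocks)
    content = visible_area(content_blocks)
    return ads / (ads + content) if (ads + content) else 0.0

# A leaderboard plus a medium rectangle alongside one content column.
ads = [(728, 90, 0), (300, 250, 100)]
content = [(600, 400, 100)]
print(round(ad_dominance(ads, content), 2))  # 0.37
```

On that model, the higher the ratio, the more the layout "forces" the eye toward the ads before the content, which is the situation the engineers' comments seem aimed at.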
|Google engineers have mentioned "ads dominating a layout" so frequently that I cannot doubt this is in play. |
Yeah, and right after Panda 2 the Adsense team changed their layout guidelines. That was pretty much a dead giveaway.
On the other hand, I changed my layout to feature content much more prominently right after that, and I think it made the site a lot better, but it didn't help my rankings at all as far as I can tell.
|there are a large number of signals |
That's the problem, and why I'm skeptical whenever anyone thinks they've boiled it down to any one factor or even a short list of factors like this one. It's more complicated than that. If it were that simple, it wouldn't take so long to run.
Do any of you know why squidoo is doing well despite others like hubpages cracking? It looks like the move to subdomain was a failure for hubpages as several of their members are posting about loss of traffic for the past 10-12 days. Most of them are those who saw an increase after the initial move to sub-domains. As I suspected, new URLs seem to have temporarily released the pages, giving a strong hint to my oldest finding that URLs are being tagged for this panda demotion.
But does anyone know how squidoo, where people aggregate content from flickr, youtube and other sources, survives Panda? How different is squidoo from hubpages? I see a lot of poor content there too, but the fact is it survives! Is it because of Seth Godin magic?
I don't know how sites like squidoo continue to survive post panda.
However, there is a write up on squidoo about the effects of Panda on both squidoo and hubpages.
Basically, according to quantcast data, prior to panda, the traffic for hubpages had been significantly higher than that for squidoo.
After Panda1, hubpages started to get about the same - or slightly less - traffic than squidoo was getting.
Squidoo traffic seems to be holding steady but that article mentioned a slight rise since April 11th. There was no data on that page regarding whether this traffic has been sustained.
No ads at all on my oldest Pandalized site that was hit the worst. No thin content, etc, etc, etc:-) And VERY site wide, distinct subdirectories that are generally unrelated to each other and don't cross-link.
Nobody is beating Panda, it's just Google beating themselves and providing crappy results.
Content_ed - I am sure if I saw your site I could find something that could be deemed Panda relevant, otherwise you wouldn't have suffered.
One of the quality signals I definitely thought Panda would take into account is spelling and grammar, based upon my past training in tests and measurements. Amazingly, after two months of ongoing study it seemingly has little impact. In fact, upon close inspection I saw sites in the top results with grammar so horrific it's jaw-dropping. The first few sentences of the page could fool anybody, but the rest could be total mush.
My conclusion was that what I thought would be major factors in discerning quality were not. In fact, the abundance of so many faulty pages practically eliminated the idea that the “bottom of the barrel” pages had been removed. In other words, Panda seems very forgiving in areas related to grammar and spelling.
The only thing that did emerge, that I wasn’t testing, was there seemed to be page templates almost immune to Panda. In that case it could have been a high number of similar templates catching my eye.
Even Google has spelling and grammar mistakes, it's something you can't help. However, the difference between Indian English and real English should be a factor.
|I am sure if I saw your site I could find something that could be deemed Panda relevant, otherwise you wouldn't have suffered. |
It's very hard for someone who poured energy and resources into their site to get outside themselves and "be objective". I think anyone who is stymied by a Panda problem should get a third party assessment from somebody who is willing to insult them.
Thanks for sharing that. It is interesting that squidoo is doing well. The voting widget towards the bottom of that page tells the story. There are many votes for "traffic unchanged" for squidoo while the story is different for hubpages.
The following are some of the differences that I notice.
1) Squidoo has longer stories. An average squidoo page has more content than a page on hubpages.
2) There is more diversity of content in squidoo - videos, gallery, text and so on.
3) The total number of Google ads used on a squidoo page is lower, but there are several other types of ads. I am not sure whether Panda is blind to certain types of ads (other than AdSense).
4) The AdSense block used at the top is different in size - not the 336x280 or the 300x250 size. Most successful sites don't seem to use these types of ad formats. 728x90 seems to be popular among them.
Whatson, AFAIK there are several Indians adding content on squidoo. What do you think is helping them?
Since we cannot make Panda reviews with any individual site as an example here, shall we try to find the difference between the bigger ones like squidoo and hubpages?
hear hear tedster.
let me offer my insults to someone today.
One thing to remember though is that Squidoo did not have as much traffic as hubpages had, so it was akin to Panda "knocking down" hubpages' traffic to the level of Squidoo traffic.
While we are talking about this, I think it is also important to note that while ehow escaped Panda 1, it was hit by Panda 2.
I bring this up because I think that to better understand Panda, we have to look at where it WORKED well (hubpages, mahalo, suite101, etc.,) and see if there is anything that we can learn from them. There might be another common thread among them that we are missing.
What was the difference between Panda 1 and 2 anyway? And don't feed me this BS about an international launch on 2, because that's not true.
Everybody on this list and every website in existence has "Panda relevant" issues, including all of you who haven't had a site impacted by Panda. If you don't realize this, I suggest third party reviews.
The self-accusation business isn't for me. I'm not interested in spending my time trying to please Google's latest algo and have never designed for algos. I'm a webmaster by default, it's not a career choice:-) If our content wasn't good, there wouldn't be tens of thousands, if not hundreds of thousands of copyright infringements on it.
That said, after sixteen years of publishing online, I'm curious to discover where Google screwed up. If I thought there was some sensible change that would make our site better for visitors and for Google I would do it, but I've seen zero evidence that anybody has a handle on that.
I enjoyed watching the Daniweb interview, primarily because Dani is very likeable, but we don't have any forums with empty questions or duplicate content to worry about, other than what's been stolen from us.
I am a businessman by default, and I rely on income from my web site. I have specialized my business around Google, so I do rely on it.