| This 85 message thread spans 3 pages |
|Google's Panda - The Main Factors|
| 6:24 am on Dec 19, 2011 (gmt 0)|
OK, so as always, every time Google comes out with a new algo update I become obsessed with trying to understand it as much as possible. I have taken on clients for free in an effort to learn about Panda and recover them, usually on a pay-for-performance basis.
We know that Panda mainly targets sites with a lot of pages, and I am seeing that as well. Some of my theories may have been mentioned elsewhere, but some have not. This is what I have seen from my own experience. Of the 9 clients I have worked with, 4 have made some sort of recovery. There were originally 11 test subjects, but it turned out that 2 of them were affected not by Panda but by a regular Google penalty.
Rewritten Content - When you mention the term "unique content", most webmasters generally think of content that is not plagiarized from another source. Technically, paraphrasing an existing article usually passed for unique content - but not since Google Panda came along. Years ago one of my sites was mentioned on a very popular blog. After that story was published, many other small blogs started writing about my site. It was essentially the same article, just reworded. That is no longer considered unique content. Writing about a story that's already all over the web no longer does you any good. If your entire site, or the bulk of it, deals with writing about popular stories that you did not start, you could be a target for Panda.
All pages look visually similar - Look at any content farm and you will see page after page of nothing but text: paragraphs and essays of boring text that is all unique, usually written by a freelance writer at a very low cost per word. You know how Google asks, "do you have any graphs to support your argument?" Well, in one case I added dynamic, feature-rich graphs to a portion of the pages of a Panda-hit site as needed. This added life to every page and made every page visually better. On other sections I advised the client to add videos, images and anything else that would improve the usefulness of each page, and that client has recovered over 60% of their lost Google traffic within 2 months. She is still applying these rich media elements, so I think her recovery will continue to improve. Not only that, her bounce rate has improved from an average of 46% over the last 3 years to 35% over the last month. Videos and images are proving very useful, especially in my early tests. I have not done much testing in this regard, but it just makes sense, so that's why I am listing it here.
Too many auto-generated pages vs. high-quality content pages - If your site automatically generates content, or you have a ton of user-generated content, you may be in trouble. User-generated content left unmoderated can cause big quality concerns for your site. If your site is a UGC type of site, that is fine; just make sure you have some useful articles as well. This is also a big cause of empty/shallow content pages.
poorly placed content, ads given prime real estate over actual content - Where your content is located is a big deal right now. What is your intention for your visitors? Do you want them to browse your site and read your articles, or do you want them to click your ads? If you're giving priority placement to your ads over your content, you deserve to be pushed back. If your only intention is to get your visitor to another site then your site is just essentially a doorway page that users can do without.
Content Intention - What purpose does your content serve? Google has made it well known they don't want anyone to try to manipulate their SERPs. Over-optimization is a real issue, and it plays a role in Panda. Do you have to scroll all the way down on your homepage to view a welcome paragraph? That's not very welcoming, is it? Is your welcome message really a welcome message, or is it just a place to stuff your money keywords?
A lot of time went into this, and I usually hate sharing information I have put a lot of time into with strangers, but I am just paying it forward. Tedster's sharing of his experiences helped me become very successful, so I feel I owe it to him and this community.
It is very possible to recover from Panda. It's hard to get a 100% recovery, but the more you improve your site, the better your chances are of recovering traffic. Not only will you recover traffic by improving your site, but your site will be of better quality, users will be more likely to link back to it, conversions will improve, bounce rates will go down, etc.
I have one client who I was unfortunately not able to recover at all; in fact, he took an even bigger hit after working with me. However, we improved his site and he recovered his sales, so that he is almost making as much money as before Panda hit. If he ever recovers from Panda, he will be earning much more than pre-Panda. There is more than one way to recover. Alternative traffic sources are also a great place to start: get up a Facebook page, a Twitter page, a free iPhone app, etc. Be creative and beat Google.
| 7:39 am on Dec 19, 2011 (gmt 0)|
|poorly placed content, ads given prime real estate over actual content - Where your content is located is a big deal right now. What is your intention for your visitors? Do you want them to browse your site and read your articles, or do you want them to click your ads? |
Google tells AdSense publishers to put the users first, but they also want them to click the ads, so I very much doubt that they would penalise anyone for any particular placement.
|FROM GOOGLE: Certain locations tend to be more successful than others. This "heat map" illustrates the ideal placing on a sample page layout. The colors fade from dark orange (strongest performance) to light yellow (weakest performance). All other things being equal, ads located above the fold tend to perform better than those below the fold. Ads placed near rich content and navigational aids usually do well because users are focused on those areas of a page. |
| 8:01 am on Dec 19, 2011 (gmt 0)|
Of all the points, this is probably the only one I can vouch for as a Panda factor. There have been so many discussions lately about penalties, and everyone mentions Panda, but I bet you something else has been going on behind Panda for a couple of months now.
| 8:05 am on Dec 19, 2011 (gmt 0)|
BeeDee, I am not sure how familiar you are with Panda. Of all the theories that have come out, ads seem to be the one mentioned the most.
Just do a search for "google panda adsense".
The day Panda was released, a member here even noticed that AdSense had changed its best-practices guide.
A site that overwhelms you with ads and makes it extremely hard to find any actual content - should a site like that rank well? I certainly don't think so.
| 8:15 am on Dec 19, 2011 (gmt 0)|
Nor do I but it still happens.
| 9:08 am on Dec 19, 2011 (gmt 0)|
it was reported at [webmasterworld.com...] that 'Content/Ads position on a certain page is going to be a factor in ranking' in the future - which suggests it isn't implemented yet.
Given also that (1) there are many sites that escaped panda but have excessive numbers of ads and (2) I and others have tried removing all ads from a site to see if it causes recovery from Panda (it doesn't) I suspect that either panda ignores ad position completely or at best (worst?) only gives it a very low weighting.
| 10:02 am on Dec 19, 2011 (gmt 0)|
Thanks Brinked, as usual you are sharing your actual experiences, not just empty theories (which is all that many of us have to go on, much of the time).
I wonder if the use of videos means that the pages have become stickier as people spend time watching them and Google is rewarding this. I have found that relevant, watchable flash introductions often result in a rise in the SERPs which is otherwise difficult to justify.
| 10:27 am on Dec 19, 2011 (gmt 0)|
I think the point is that the inclusion of videos/multimedia etc. gives a page more variety than standard text - which shows that effort has been made to give the user a better experience.
Would content farms make this extra effort? No.
Nice post, thanks.
| 12:16 pm on Dec 19, 2011 (gmt 0)|
Do you think these are factors that Google is measuring directly? It seems to me that pages with no rich media, auto-generated text, and ads in prime real estate could all be detected using user experience metrics (long click detection). It may not be they are measuring ad placement, but measuring visitors' reactions to overwhelming ad placement.
I'd think they must have written dedicated algorithms to detect rewritten content - I don't see how that could be detected from user metrics alone. Though if I were asked to write such an algorithm, I can't imagine how I would start.
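For what it's worth, the "long click" idea mentioned above can be sketched in a few lines. Everything here is invented purely for illustration - the thresholds, the field names and the labels are assumptions; nobody outside Google knows what they actually measure:

```python
# Hypothetical sketch of "long click" classification. The 10s/60s cutoffs
# and the "satisfied visit" labels are invented for illustration only.

def classify_click(dwell_seconds, returned_to_serp):
    """Label a search click by how long the visitor stayed on the result."""
    if not returned_to_serp:
        # Visitor never came back to the results page: treat as satisfied.
        return "long"
    if dwell_seconds < 10:
        return "short"   # quick bounce back to the SERP: likely a poor result
    if dwell_seconds < 60:
        return "medium"
    return "long"

def satisfaction_ratio(clicks):
    """Fraction of clicks on a page that look like satisfied visits."""
    long_clicks = sum(1 for c in clicks if classify_click(*c) == "long")
    return long_clicks / len(clicks)

# (dwell time in seconds, did the visitor return to the SERP?)
visits = [(4, True), (90, True), (30, True), (120, False), (5, True)]
print(round(satisfaction_ratio(visits), 2))  # 0.4
```

A page drowning in ads would presumably rack up lots of "short" clicks, which is how ad placement could be picked up indirectly, without measuring the layout at all.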
| 1:04 pm on Dec 19, 2011 (gmt 0)|
|Too many auto generated pages vs high quality content pages (tons of user generated content) |
I have heard this a few times, but this is the first time that I think that this could be true in my case:
I have a small blog with approximately 100 unique articles. Since last year the comments (I switched to Disqus) on a few articles literally exploded (nearly 4000 comments on 5 different articles).
I moderate them, but never correct grammatical mistakes.
Could this be it?
On the other hand, Google must know that this content is located in the comment-section, so why see this as "real" content?
| 2:40 pm on Dec 19, 2011 (gmt 0)|
Brinked: Excellent article which will help many people.
Incidentally, your findings - perhaps unintentionally - support Google's case for Panda.
Re-written content? The sludge of the internet; untold billions of pages that essentially add nothing (except a few Adsense clickers trying to escape).
Unmonitored auto-generated pages? A total waste of space.
Poor page design? MFA in effect, though possibly not in intention; no more, no less.
And Panda goes to the heart of the problem - intention - as no update has done before.
Because it's effectively very cheap to produce 'content', and search engines have always favoured 'content rich' sites, there has been an explosion in totally pointless pages. Just like the tonnes of pizza delivery leaflets I recycle each year, we have become buried in quantity, leaving quality at the bottom of the pile. Panda has sought to redress this.
I do accept - and always have - that there are some innocent victims of the collateral damage of the Panda process, but you have illustrated the awful truth that if people have nothing to say, they should shut up - not plaster the web with mind-numbingly dull, repetitive content. In fact the only place I would disagree with you, is that I advise people to simply lose those pages, not just try to 'pretty them up' in order to escape the Panda scythe - I suggest that your unsuccessful cases might try that approach with success.
If Panda really succeeds in letting a bit of quality shine through, then the price is worth paying.
Let's face it, if there was a $1.00 per page tax on URLs, 99% of pages on the web would disappear overnight, and no-one would miss them.
| 5:19 pm on Dec 19, 2011 (gmt 0)|
Domains are not free. Yet, in our niche, for each key phrase I can come up with, .com, .net, and .org are already registered.
| 5:19 pm on Dec 19, 2011 (gmt 0)|
Poor Business Model
I think when you are diagnosing Panda, you need to go back behind the website. Are you tossing up a stock shopping cart site with products gleaned from a manufacturer's/affiliate RSS feed, like a thousand other sites? Has your (once original and unique) static site remained largely static, while others around you have progressed? Are you doing anything that is immediately and obviously different (and better!) than anyone else in your niche?
I have been shown pandalyzed sites that looked pretty darn good to me, until I went and looked at the niche and there were hundreds of other sites that were doing essentially the same thing in essentially the same way. Only so many of those are ever gonna rank.
| 5:48 pm on Dec 19, 2011 (gmt 0)|
Vamm: I would never suggest that they are. But they are cheap. And domain names don't HAVE to include your niche keywords to be successful. A .com domain with a six-letter made-up name, which you could build into a brand (or not), costs a few dollars a year.
But the cost of web publishing is negligible compared with print, and once established, the marginal cost of another 1,000,000 pages, set up for SEs and never visited by human beings, is virtually zero.
That's my point.
| 9:15 pm on Dec 19, 2011 (gmt 0)|
Thanks Brinked. If you feel that some sites have recovered, what was the length of time between changes made, and detecting positive results?
It took me a long time to realize: rewritten content. As my site became larger, it was inevitable that something written about in 2010 would also be written about in 2011, and over time content was effectively, but unintentionally, being duplicated.
However this is still speculation. Until I see some positive results - I cannot be sure why the huge panda hit.
| 10:51 pm on Dec 19, 2011 (gmt 0)|
synthese, I can't really give a definite timeline. The changes that were made to these sites were gradual and not all made in one day. I would say a little less than 2 months after they started applying the changes.
One site that I recovered took about a week and saw over a 300% increase in traffic from before the penalty, so that was huge. In that case I believe it was an over-optimization penalty and not related to Panda.
| 11:56 pm on Dec 19, 2011 (gmt 0)|
|rewritten content. As my site became larger it was inevitable that something written about in 2010, would also be written about in 2011, and over time content was effectively - but unintentionally being duplicated. |
The term "rewritten content" covers a very broad spectrum. At one end is the scraped, auto-generated trash that gets crammed into worthless blogs. There are sites that provide spinning techniques for generating and distributing numerous variants of the original... which is often a cut-and-paste from the real original anyway. There is straight-out, word-for-word plagiarism, where one site steals from another.
IMO, the sooner this crud is driven from the internet, the better.... once Google can reliably figure out which is the original to leave alone!
But beyond those obvious examples, the interpretation of "rewritten content" seems to mean very different things to different people.
There seems to be an argument being put forward that if a subject has any coverage at all in existing sites, that any coverage of the same subject in new sites will be seen as "rewritten content", ie... duplicate content.
That's basically saying that every site after the original is at risk of being seen as duplicate content. If a pure affiliate site is simply rolling out the same product pages as the merchant, then yes, that is duplicate content and the affiliate site adds nothing to the web. If WordPress blogs are being duplicated throughout the same site, then yes, that is duplicated content.
But if I write a new 300 page site, using my own unique style and presentation, are pages of that site going to be seen as "rewritten content" simply because Mary Smith wrote 6 paragraphs on the same subject back in 2003?
There is no possible way that Panda or any other algo function is going to smack a site just because there are other existing sites on the same subject.
I think for most people the term "rewritten content" means copying content from someone else's site and re-wording it so that it appears unique on your site. I guess what the OP and others are saying is that Google has become very good at recognising and smacking these pages.
| 1:08 am on Dec 20, 2011 (gmt 0)|
You are spot on about adding rich media applications as part of the content. This will reduce the bounce rate and could possibly lead to a conversion.
| 1:26 am on Dec 20, 2011 (gmt 0)|
If they were correctly determining that something is rewritten, then 90% of the big media news and newspaper sites ought to be in the dumpster now, and the news/press release services would be at the top - but they are not. There are ONLY so many ways to say something. "John S., 99, died today of cancer. He's survived by... He was CEO of..., he invented the... etc." FACTS are FACTS! There is NO WAY to avoid repeating part of what someone else wrote if you are all trying to provide the same FACTS. Unless one of you is WRONG or just making stuff up - in which case... do you really want the "different" one to be on top? (Hmm, perhaps time for an experiment there ;-)

If everyone is rushing to tell the same "scoop", someone - perhaps even the first to know about it and the first to broadcast/print it, but with the slowest web team or the last to be crawled by G - is going to end up last. G has tried various things to tell who was first, apparently without luck so far, so WHO should be penalized for "rewriting"? Perhaps both or all parties equally. Perhaps that IS what Panda is about: SPLIT the credit for repeated content equally among all sources? Otherwise, I'm still of the opinion that duplication issues in Panda are 95% restricted to same-site content duplication.
Mentioning images, I just noticed today (I'm probably a bit behind the curve) that on-site Google AdSense search is now putting small (90x90?) versions of photos from the pages it returns in the results, when there is one and they think it applies to the term. It works about 50% of the time, insofar as the photo actually represents the keyword being searched for, especially when there are multiple photos and topics on the page (a daily news page, for instance). So they're apparently taking more interest in photos, for sure.
| 1:51 am on Dec 20, 2011 (gmt 0)|
Maybe when it's *facts* it's not so much who reports it first but who reports it authoritatively.
| 3:38 am on Dec 20, 2011 (gmt 0)|
I don't like the term "re-written content" in connection with Panda. It's too slippery. "REDUNDANT CONTENT", on the other hand, makes sense. As noted above, there are only so many ways one can chat about "large red widgets" as "content" (articles). Product descriptions most likely get a pass... but then we have the "redundant sources" for "large red widgets"... so Panda looks for other metrics to apply against ecommerce to generate SERP rank, and thus "brand" becomes important.
| 3:41 am on Dec 20, 2011 (gmt 0)|
My crazy theory from day one is that Google started using screenshots of webpages to detect low-quality sites. Any webmaster could identify an MFA/parked page just by looking at it. I think Google's big idea is to analyze the page visually, instead of just looking at the code, and that they are also comparing pages/websites visually. Most of the new features introduced in the last 1.5 years fall in line: instant previews, reverse image search, detecting and grouping the same resized images, executing Ajax, showing the snippet on preview, etc.
As for the AdSense-Panda relation, it should be rather easy to take the snapshot and calculate the (above/below the fold weighted) content/ad ratio.
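A rough sketch of what that fold-weighted ratio could look like, purely as illustration - the block coordinates, the fold position and the 2x above-the-fold weighting are all assumptions, not anything Google has published:

```python
# Illustrative only: given bounding boxes for content and ad blocks from a
# rendered-page snapshot, compute a content/ad area ratio that counts
# above-the-fold pixels double. All numbers here are made up.

FOLD_Y = 600  # assumed fold position in pixels

def weighted_area(blocks):
    """Sum block areas, weighting above-the-fold pixels 2x."""
    total = 0.0
    for x, y, w, h in blocks:
        above = max(0, min(y + h, FOLD_Y) - y)  # pixels above the fold
        below = h - above
        total += w * (2.0 * above + 1.0 * below)
    return total

def content_ad_ratio(content_blocks, ad_blocks):
    ads = weighted_area(ad_blocks)
    return weighted_area(content_blocks) / ads if ads else float("inf")

# (x, y, width, height) boxes from a hypothetical page snapshot
content = [(0, 650, 800, 1200)]              # main article, below the fold
ads = [(0, 0, 728, 90), (0, 100, 300, 250)]  # leaderboard + rectangle up top
print(round(content_ad_ratio(content, ads), 2))  # 3.42
```

Push the article up and the ads down and the ratio climbs; bury the content under above-the-fold ad units and it sinks - which is exactly the layout complaint being discussed in this thread.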
| 8:55 am on Dec 20, 2011 (gmt 0)|
|As for the AdSense-Panda relation, it should be rather easy to take the snapshot and calculate the (above/below the fold weighted) content/ad ratio. |
All the more reason to get out of adsense. I was dumped by G a couple of years ago, they didn't think my site was good enough quality, so I dealt directly with the merchants that had been advertising on it and now I make more than double the EPC that I got from them - plus no worries about how it affects SEO, and no need to be sweating all the time over whether or not the plug would be pulled on me without any notice whatsoever. Let's face it, adsense is a lazy way of making money and it should be viewed as a temporary step at the most. Developing relationships with merchants isn't all that difficult and it's much more profitable for both partners in the long run.
| 9:27 am on Dec 20, 2011 (gmt 0)|
The point of the web is that you don't NEED to rewrite Mary Smith's article - you can provide a link to it.
If you are discussing her article - maybe adding to, or dissenting with her content, then go for it.
But, currently, 99% - yes, 99% - of the web is regurgitated non-original content.
Call it what you like, but the test is - "Does this article have ANYTHING new to add?" or would I be better off writing something else and using a plain old fashioned hypertext link. To paraphrase, it's the age-old self-question "Am I publishing for people or for search engines?"
If you choose to have digital diarrhoea (simply a form of sick site syndrome), that's fine - but don't blame Google when they make an accurate diagnosis and send you home without your lunch.
Of course Panda ain't perfect - but it has got closer than anything Google have done before to separating the wheat from the chaff - and millions of miles further than Bing has even dreamed of - which is exactly why I'm always suspicious when webmasters praise Bing. Think about it ;)
| 9:44 am on Dec 20, 2011 (gmt 0)|
I agree that this is mostly true but ...
|Of course Panda ain't perfect - but it has got closer than anything Google have done before to separating the wheat from the chaff - and millions of miles further than Bing has even dreamed of - which is exactly why I'm always suspicious when webmasters praise Bing. |
Do you really, honestly think that Google is that much better than Bing? About six or eight weeks ago I changed my default search engine to Bing. Since then, I have to say, I have had no problems finding what I need there. Google has certain tools that I miss, so I still use it when I need them, but I can honestly say that I am not aware of any major problems with Bing that are not present in Google.
I would say that you need to try Bing for an extended period because (in my experience) to suggest that it is that far away from Google is just wrong.
| 9:58 am on Dec 20, 2011 (gmt 0)|
|adsense is a lazy way of making money and it should be viewed as a temporary step at the most. |
I wholeheartedly agree with this comment. Many people seem to miss the point and bet everything on AdSense. The business plan could not be more wrong.
| 11:03 am on Dec 20, 2011 (gmt 0)|
I don't think it matters if you're the originator of a piece of content or regurgitating it, the bit Google is interested in is whether you're doing something different with it (which it detects by looking at the content on your page, the internet noise around the page and the internet noise around your site generally).
Here's an example. I run a pandalised ecommerce site which was using the original supplier descriptions (hands up) so we now write the product descriptions ourselves and have seen ranking improvements as a result (although no Panda recovery overall).
Recently we were the first to latch onto a new product which was very different, we produced a great product page for it and very quickly ranked top (even above the supplier). Then another reseller (a newcomer to the niche) latched onto it, created a better page with video and images, wrote a review, promoted the product like crazy on well known sites and published all the customer questions (and answers) on their product page, making their page even more unique.
They now rank top and as more sites have also latched onto it we've slid down the rankings related to that product because our page was trumped by more and more resellers.
In short, we were initially seen as the authority; then other sites put more effort into their content and promotion, and Google now sees them as more of an authority for that product, which makes sense when you're talking about something lots of people are selling.
I don't think rewritten content protects you from Panda, you need to add something extra to it either in terms of content or promotion. Then, to be top, you need to make sure you're always putting more effort into it than your competitors (content, promotion and use of user generated content).
Making a noise creates user interaction which produces unique content that keeps growing while you're making a noise about it, and that all produces good rankings.
To successfully make a noise you have to use well known sites (which usually equates to high ranking, trusted sites). That takes money or time or great contacts but you'll end up with a very lively site with great content and great links from trusted sites. That, I believe, is what Google sees as quality.
I doubt ad placement or visual appeal are much to do with it, they are too easy to fix.
| 11:25 am on Dec 20, 2011 (gmt 0)|
|I wholeheartedly agree with this comment. Many people seem to miss the point and bet everything on adsense. The business plan could not be any more wrong. |
As the hundreds of thousands of people who got rich with Adsense will no doubt testify.
| 12:01 pm on Dec 20, 2011 (gmt 0)|
I am seeing lots of sites with rewritten content doing well... including my own. It all depends on the rewriting.
When using our shampoo 80% of people notice an increase in the shine of their hair within 2 weeks of using it
4 in 5 users of our shampoo report improved luster in their hair after only 14 days of giving it a try.
Sentence by sentence rewritten like this. Same meaning, very different structure and words.
But sure, if you do a rewrite and then, when you search Google using snippets of the text, the source document comes up? You have a problem.
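That snippet-search test is essentially a crude similarity check. A textbook way to do it algorithmically is word-shingle overlap (Jaccard similarity): a light paraphrase keeps most shingles, while a sentence-by-sentence rewrite like the shampoo example shares almost none. This is just an illustration of the technique - nobody outside Google knows what Panda actually computes:

```python
# Near-duplicate detection via word shingles (overlapping n-word sequences)
# compared with Jaccard similarity. Illustration only.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

original = ("When using our shampoo 80% of people notice an increase "
            "in the shine of their hair within 2 weeks of using it")
# Full rewrite, as in the example above: same meaning, new structure and words.
rewrite = ("4 in 5 users of our shampoo report improved luster in their "
           "hair after only 14 days of giving it a try")
# Lazy spin: one word swapped ("notice" -> "see").
light_edit = ("When using our shampoo 80% of people see an increase "
              "in the shine of their hair within 2 weeks of using it")

print(round(jaccard(original, rewrite), 2))     # 0.0: reads as unique
print(round(jaccard(original, light_edit), 2))  # 0.74: flagged as a rewrite
```

The lazy spin still shares 17 of its 23 distinct three-word shingles with the source, which is exactly why snippet searches expose it, while the genuine rewrite shares none.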
| 12:47 pm on Dec 20, 2011 (gmt 0)|
Fair point, but don't forget that Google looks at the whole source, not just your 'content' - many sites, especially machine-built sites, have acres of links, promos, crud and general trash that feature on millions of pages; one or two paragraphs rehashed (for the umpteenth time) hardly register as unique content, however advanced the automated gibberish-producer is.
This has long been a source of sick site syndrome, and Panda has moved much further in that direction. These days it would be dangerous to gamble on how well Google can recognize rehashed drivel - I'd wager they've bought all the major crud-producers and reverse engineered them.
Small hand-made sites where rewriting of affiliate content (for example) has been done by a human being, have been largely immune to Pandalization - that's no coincidence, in my view.