| This 216 message thread spans 8 pages: < < 216 ( 1 2 3  5 6 7 8 ) > > || |
|Panda key algo changes summarized|
Folks, I have been reading a lot, thinking a lot and analyzing a lot. I am still not sure how to get the US traffic back to pre-February 24th levels! But I think it is time to summarize the key theories of the algo change in the US:
- Internal links devalued, only external count really
- Thin pages cause substantially bigger problems for a domain
- Duplicate content snippets on your page cause substantially bigger problems
- Too many external links with keyword anchor text ("widget keyword" instead of "more...", e.g.) cause penalties
are what kept me working for the past 4 weeks. Do you have any additional theories to add?
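The "thin pages" theory above is easiest to reason about as a scoring heuristic. Here is a toy sketch - every signal name and threshold is invented purely for discussion; nobody outside Google knows how Panda actually classifies thinness:

```python
# Toy illustration of a "thin page" heuristic. The signals and
# thresholds are hypothetical, not anything Google has confirmed.

def is_thin(word_count, unique_word_ratio, outbound_links):
    """Flag a page as 'thin' when it has little text, low vocabulary
    variety, and no outbound references. Two or more hits = thin."""
    score = 0
    if word_count < 250:            # very little body text
        score += 1
    if unique_word_ratio < 0.35:    # heavily repeated boilerplate
        score += 1
    if outbound_links == 0:         # cites nothing at all
        score += 1
    return score >= 2

# A short, repetitive, link-free page would be flagged:
print(is_thin(word_count=120, unique_word_ratio=0.30, outbound_links=0))
```

The interesting question for this thread is not the thresholds but whether such a per-page score then drags down the whole domain, as several posters here suspect.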
Pontifex, I just posted this reply on another thread, but we have some overlap on two similar topics, so I am going to post it here also.
One point that Bill Slawski makes [seobythesea.com...] :
Assessing the credibility of content and people on the web and social media: Modeling author identity, trust, and reputation
This is something I mentioned a few weeks ago on a different thread. I suspect that author names could be profiled to determine their credibility (i.e., are they posting articles on a commercial site then posting in hubs, or vice versa?)
Some of my writers have historically published content in other places, including some of the hubs like Buzzle or EzineArticles. I think having their names on my site could be lowering its credibility; I might be better off without even showing the author's name. I have always selected writers who do good research, but you know how freelance writers work... they write for many people and publish everywhere. It's perfectly reasonable, but unfortunately it could be guilt by association.
I'll also add that seobythesea has always done an outstanding job of summarizing and documenting the Google patents. I believe that the secret is in the sauce. It just takes time to delve through so much information. Many of the things we're seeing today were proposed in patents many years ago.
Consider, for example, these patents:
"Document Scoring Based on Document Inception Date" filed on Nov. 20, 2006 (by Matt Cutts)
"Document Scoring Based on Document Content Update" filed on Nov. 21, 2006
"Document Scoring Based on Query Analysis" filed on Nov. 22, 2006
"Document Scoring Based on Link-Based Criteria" filed on Nov. 30, 2006
"Document Scoring Based on Traffic Associated With a Document" filed on Nov. 30, 2006
"System and method for modulating search relevancy using pointer activity monitoring" filed on July 13, 2010
One of my sites is a travel site about a city; until recently I only offered tips and information on traveling to and around the city. In the last few months I added a guide to hotels, restaurants, activities, etc. However, to get the information on each of these establishments, e.g. a restaurant, I would go to its web site and copy and paste the content.
Is this an open and shut case of duplicate content resulting in pandalization? Should I drop all these pages immediately, or at least change it to unique content?
|Is this an open and shut case of duplicate content resulting in pandalization? |
If you link between them, it's going to create duplicate content issues
|Should I drop all these pages immediately, |
In its existing state it would likely be poor quality content - scoring very low
|or at least change it to unique content? |
Not sure this belongs in this thread ... back to algo changes summarised
I guess I was more stating a classic example of a duplicate content penalty. I have another similar site in the same industry which was fine and didn't copy any content from other sites, although as a result I would not deem it as useful.
What do you mean if I link between them?
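For what it's worth, copied snippets like the restaurant descriptions above are mechanically easy to detect. A standard technique from the information-retrieval literature (not Google's actual implementation, which is unknown) is w-shingling plus Jaccard similarity:

```python
# Sketch of near-duplicate detection via w-shingling, a well-known
# IR technique. This is illustrative only - Google's real duplicate
# detection pipeline is not public.

def shingles(text, w=4):
    """Return the set of w-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets: |A & B| / |A | B|."""
    sa, sb = shingles(a), shingles(b)
    if not (sa | sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "our restaurant serves fresh seafood daily in the old harbour district"
copied = "our restaurant serves fresh seafood daily right by the marina"
print(round(jaccard(original, copied), 2))  # 0.25 - partial copy is visible
```

Even a partially rewritten paste leaves a measurable shingle overlap, which is why "change it to unique content" usually means genuinely rewriting, not shuffling a few words.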
|Tedster says: Quality score is not a sum of single factors but a decision tree of chained signals. |
Thanks for bringing back this quote. It's probably not even a simple cascade-style decision tree, but one with several loop back branches and so on.
Several times Google engineers have mentioned that Panda generates what they call a "document classifier" for each URL. That terminology has come up a few times. I think I'm going to go on a deep dive and try to fill out the picture a bit more - what role does a document classifier play in the total ranking picture, you know?
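The practical difference tedster's quote points at can be shown in a few lines. In a weighted sum, every factor contributes independently; in a chained (cascade) classifier, one failed gate can cap the score no matter how strong everything else is. All the signal names, weights, and branches below are hypothetical - the point is the structure, not the values:

```python
# Contrasting a linear scoring model with a chained decision cascade.
# Every signal name and number here is invented for illustration.

def weighted_sum(signals):
    # Linear model: each factor contributes independently.
    return (0.5 * signals["content"]
            + 0.3 * signals["links"]
            + 0.2 * signals["ads_ok"])

def chained(signals):
    # Cascade: gates are evaluated in order, and a failed gate
    # caps the score regardless of the remaining signals.
    if signals["content"] < 0.3:       # thin/duplicate content gate
        return 0.1                     # strong links cannot rescue it
    if not signals["ads_ok"]:          # ad-layout gate
        return min(0.4, signals["links"])
    return 0.5 * signals["content"] + 0.5 * signals["links"]

site = {"content": 0.2, "links": 0.9, "ads_ok": 1}
print(weighted_sum(site), chained(site))
```

In the linear model the strong link profile pulls the site's score up; in the cascade the thin-content gate fires first and the links never get a vote. That would explain why some well-linked sites still got Pandalized.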
Also note, for those who may still be wondering about "too many ads" or "content as a percent of space" as a factor, in another thread, suzukik pointed to this new Help page from the Adsense team:
|AdSense team tweeted [twitter.com] about the layout. |
Best practices for laying out your site and your ads [google.com]
|it's also important to consider the user experience and the AdSense program policies when placing ads on your site. Here are some tips to keep in mind: |
First, consider your users: Organize your site's content logically and make your site easy to navigate. If users can easily find what they're looking for, they'll come back to your site. Also, choose an ad color palette that is easy for your users to read.
Show off your content: While placing ads above the fold is a good way to improve ad performance, also make sure that users can easily find the content they are looking for.
As I commented in that thread, it sounds like Google's organic search team had a meeting with the Adsense team about their messaging.
@ Tedster - there is a link there to a One Click Optimizer for ad placement for different kinds of sites. I noticed that the content page structure it offers actually puts a lot of ads in the 'above the fold' area. [google.com]
Don't know if it's an old Optimizer, or whether this is a new addition post-Panda.
Google made sure to add Pandas in their earth day logo. Cheeky. Cool HTML 5 interactive animations by the way.
|Tedster says: Quality score is not a sum of single factors but a decision tree of chained signals. |
Immense processing power would be required to evaluate the large number of interrelated parameters implied in this thread across the whole web, hence my doubts. This would be very costly, if feasible at all with today's technology, and economics is a limiting factor in the real world. They have probably devised a "method" believed to do the trick "efficiently" - thereby creating a "monster" that has messed up everything.
Or, maybe, the system is still in flux - to be on the optimistic side . . . .
|As I commented in that thread, it sounds like Google's organic search team had a meeting with the Adsense team about their messaging. |
Indeed they did! But why? There is no smoke without fire - is AdSense doing badly recently? Can they afford this? Have they bargained something "in the middle of the way"? Or is the system out of control, hurting search results and income at the same time?
Wait & see. Make changes that make sense irrespective of Panda, if something really needed fixing anyway. Don't fix something that is not broken.
I lost big on primary keyword terms. In my case these are two-word strings (e.g. texas widgets, florida widgets, etc.).
But, like others have described, I held position - or maybe jumped up a spot or two - on longer 3-word terms (texas widget repairs, texas widget history).
I'm wondering today if PANDA III, when it comes, will strike against those longer tails.
In other words, the refinement process gradually works its way deeper into sub topics, variations, etc.
PANDA III, PANDA IV, PANDA V...
I was just looking at the top serps for a few competitive queries, and I notice a tendency for them to display their domain name in the title tag. If not on their homepage, then on many/most of their internal pages. For two queries I just looked at, 7 out of the top 10 do this. They display something like "blue and green widgets - example.com". This might reinforce a site's branding a bit.
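The observation above is easy to check at scale. Here is a quick script that counts how many SERP titles include the site's domain name - the helper and sample data are made up for illustration; you would feed it real (URL, title) pairs scraped from the results you are studying:

```python
# Quick check: how many results display their domain name in the
# title tag? Sample data is invented; substitute real SERP rows.

from urllib.parse import urlparse

def title_mentions_domain(url, title):
    """True if the site's name (e.g. 'example' in example.com)
    appears anywhere in the page title, case-insensitively."""
    host = urlparse(url).hostname or ""
    name = host.removeprefix("www.").split(".")[0]
    return name.lower() in title.lower()

results = [
    ("http://www.example.com/widgets", "Blue and Green Widgets - Example.com"),
    ("http://widgetworld.test/blue", "Blue Widgets Guide"),
]
branded = sum(title_mentions_domain(u, t) for u, t in results)
print(f"{branded} of {len(results)} titles include the domain name")
```

Whether the correlation reflects a direct signal or just better branding (and thus click-through) is exactly the open question.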
Internal links have not been devalued IMO. Just the entire site carries much less 'juice' so it looks that way.
People keep posting that Google can't afford to hurt their AdSense earnings. This is rubbish.
What they cannot do is take short term gains by sacrificing the quality of their core product. If their core product is devalued, no one will use it. To that end, the SEARCH team must provide quality results. If that means hurting publishers in the short term, so be it. Indeed, it will increase earnings over the long term, as publishers become more savvy at retaining eyeballs long enough for someone to click an ad.
Cross messaging occurs when one team is incentivised at the expense of the company. So, adsense team is paid on earnings. To them, more ads means more earnings, so they push the message. This is actually detrimental to Google as a whole as results quality drops, and people use alternative SEs.
Happens all the time. Sales guys chase the quick buck, and eventually get pulled back by the more strategic thinkers who are trying to build/maintain a Brand
Exactly, Shaddows. We are so used to seeing financial plans (and even the stock markets) focus on the short term that we can barely register long term thinking when it's right in front of us. But Google has always been willing to take an unconventional route in business, and they do put much more attention on the long term.
Also, there's little doubt in my mind that AdSense income at Google just took a hit with Panda - that, in fact, they just DID sacrifice short-term revenue for long-term purposes.
That doesn't explain how Google let the MFA situation get out of hand for close to a decade. It's not like they didn't know. But I guess better late than never.
I hear you on that, walkman. I think it's more that organic search wasn't paying much attention to MFA than that they were actively encouraging it. The advertising side and the organic search side really are remarkably disconnected - that's not just propaganda.
I just realised how much my sites that had been hit by Panda utilised ads above the fold.
I am an AFS premium partner as well - and they love a heavy ads-to-content ratio. My account manager was very happy with my implementation, which puts heavy ads above everything else.
It does seem like the search team has now had a meeting with the adsense team as now they are saying push the ads lower down the page!
In fact clearly they have had that directive.
How backwards is Google - how amateurish is all this playing out? It feels like this business is run by a bunch of muppets who don't consider other departments - and they do this in public.
Nothing more insulting than seeing a smiling Matt Cutts in his latest video saying "how to tell Google your content is the original" - or should we say "we have no idea whose content is original, so please help us".
He has no clue, and the more I see from his pro-Google blog the more I am offended by his "I love Google" stance - any more like that and you would think he is a fully paid-up member of the "I get paid so much cash I will say anything good about Google" club. Good god, how much does he talk about all the Google products that he uses that are great - but that no-one actually uses!
The main fault lies with the advertising side. They can very easily say that ads on that kind of content will not be allowed. It makes much more sense for the AdSense team to do a page content quality analysis than for Google search to.
Once you remove the 5 ads per page, MFA sites lose their luster. By pushing AdSense, and with the AdSense team closing their eyes, MFA flourished.
|Nothing more insulting than seeing a smiling Matt Cutts in his latest video saying "how to tell Google your content is the original" - or should we say "we have no idea whose content is original, so please help us". |
Matt has barely anything to do with Google search/webmasters now; he's more of a public relations guy pushing and defending everything Google, from Gmail to Google Buzz. Nothing wrong with that, since Google pays him generously, but we shouldn't forget it.
[edited by: walkman at 12:07 am (utc) on Apr 23, 2011]
I was thinking of doing an anti-Google experiment.
Where I blatantly do everything against the guidelines and see what happens - and I do a daily/weekly update.
I have a few PR 5 domains ready to go - shall we see if a merchant datafeed (i.e. duplicate content) can get traffic and the new Panda algo can stop it working?
If there is interest I will setup a real trial with 5 PR 5 domains and see what happens - and you could all give some ideas on how to differentiate them.
walkman, with respect Matt Cutts is not just a PR guy - he is head of web spam.
That is a pretty relevant job, not just being a voice for Google - it is fundamental to what we are talking about.
Just because he is available for talking about Google does not make him a "PR" guy.
|walkman, with respect Matt Cutts is not just a PR guy - he is head of web spam. |
Not trying to speak for Walkman, but just wanted to say that I for one am grateful for MC, his videos, blog, all he's done to bridge the line of communication for so many years. I know it's hard to be the voice of an organization, have to articulate every single word, and often be the punching bag. I have been watching some of his older videos for the past few days, and he has offered some gems that I overlooked...subtle things that seem to be helping my rankings, even on Bing. I'm disgruntled with Panda, as many are, but I don't think any of us fault MC whatsoever...even for advice given years ago that may not necessarily be valid in 2011. Things change.
I do think it's very interesting that, in addition to the discovery by RustyBrick on a 302 redirect from Alltheweb ranking #1 for "Yahoo Search", Walkman discovered yesterday that some of MC's posts on his blog are now getting outranked by scrapers (for snippets of text, in quotes, for example...search some sentences from the 3rd paragraph of his February 10th post). I hope Google addresses these examples to improve scraper detection and any 302 issues popping up.
Swanson, I read your reply the other day (in one of the threads), and saw some of the characteristics between your pandalized/unpandalized sites. Did you see my comment above about my observation that many top-ranking sites seem to have their domain name in the title? Have you noticed that? Even one of my sites (unaffected by Panda) has been doing this on about 50% of its pages for several years. I had forgotten that I did that until today. Either it has helped branding (thus improving click-through rate and trust), or it could even be a direct signal.
crobb305, I have lost all respect for Matt Cutts, Google and all. I feel that the advice Matt himself gave about tags and thin pages ("don't worry") and the AdSense team's suggestions caused us a penalty, all of a sudden. And then they hide and leave the penalty on for at least two months despite our changes.
Absolutely disgraceful, no other way to say it.
|it feels like this business is run by a bunch of muppets that don't consider other departments - and they do this in public |
There is a school of thought that calls this "transparency" ;) We saw some of that with the Google Search team's premature switch to AJAX in the SERPs - it broke Google Analytics and had to be rolled back.
From what we are seeing with the few reports from sites that are recovering traffic, the key changes to the Panda algo seem to be exactly those areas that Matt Cutts and Amit Singhal highlighted in their Wired interview at the TED conference:
1) don't swamp the content with ads
2) make the content unique and of good quality, rather than copied, spun, or shallow
The rest of our conversation these past weeks has been more about HOW they might be scoring this rather than the very basics of what to do (or what to fix) in order not to be Pandalized.
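Point (1) above - "don't swamp the content with ads" - can at least be sanity-checked on your own pages. Here is one rough way to estimate above-the-fold ad coverage; the viewport dimensions, sample ad rectangles, and any threshold you'd apply are all invented, since Google has never published a limit:

```python
# Estimate what share of the above-the-fold area is covered by ads.
# All pixel values are assumptions for illustration; there is no
# known official threshold.

FOLD_HEIGHT = 600   # assumed visible viewport height in px
PAGE_WIDTH = 1000   # assumed page width in px

def ad_fold_ratio(ad_boxes):
    """ad_boxes: list of (top, height, width) rectangles in px.
    Returns the fraction of above-the-fold area covered by ads."""
    fold_area = FOLD_HEIGHT * PAGE_WIDTH
    covered = 0
    for top, height, width in ad_boxes:
        visible = max(0, min(top + height, FOLD_HEIGHT) - top)
        covered += visible * width
    return covered / fold_area

ads = [(0, 90, 728), (100, 250, 300)]   # e.g. a leaderboard + a sidebar unit
print(f"{ad_fold_ratio(ads):.0%} of the fold is ads")
```

This ignores overlap and real rendering, of course - measuring in an actual browser viewport would be more faithful - but it is enough to compare your pandalized and unpandalized templates against each other.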
[edited by: tedster at 3:12 am (utc) on Apr 23, 2011]
Tedster, any 'recovery,' at least for me, is far from pre-Panda. I think I gained back some of what I lost after Panda. Maybe it goes in stages, or maybe they rolled back some of Panda 1.x-2.0.
I have a question to everyone who was hit by panda:
Do you own multiple sites in the same or similar niches? If so, were all your related websites affected?
I have long suspected that google looks for same owners on similar websites trying to rank many sites in the top ten.
@brinked I have two sites in similar niche (but different focus). One (the best one) got hammered. The other one slightly gained at Panda 2.
You know what, I blocked Googlebot from all of my domains since the Illuminati #*$!s started to move my content into the -50 box. I linked from my own high PR domains (on-topic). Linking from my own domains which had been added to WMT did help for one week, then they pushed everything into the -50 hell. This company is hateful; they are out for the kill. I suggest this:
Rule #1: never add your domains to their f#### WMT tool. Rule #2: never use Analytics. Rule #3: never use footprints.
This will help for sure. I hope Bing will gain market share - I love Bing and they love my sites. The Illuminati have polluted Google with their Panda and -50 stuff; they can f** o** in my eyes.
They want a new world order so let the fight begin.
I really hope facebook and bing will crush Google in the next few years!
Once again Goog is out for a kill, better get new income streams.
[edited by: SEOPTI at 3:34 am (utc) on Apr 23, 2011]
I wanted to get your feedback about a few WordPress plugins I use that might possibly be affected by Google's treatment of dated content:
(1) Stick Post: Allows a post to stay at the top of the home page. The post currently stuck at the top is dated 2008; however, the ones that follow are more recent.
(2) PostMash: Allows me to re-order my posts.
Do you think that these plugins would work against me now?
In addition, since my blog gets a lot of traffic, it would get a lot of comment spam (90% spam and 10% relevant comments), providing little content value to other readers. I decided to disable comments two years ago, but still use Facebook apps to allow readers to Share posts on their Walls. Do you think that this arrangement would potentially work against me?
Welcome to the forums, IMangel.
I don't see how either of those plug-ins would create a Panda problem. StickPost just gives your home page some static content, right? That might even help. And as long as you don't mash-up the post order too often, PostMash shouldn't create any Panda issues either - at least as far as we've identified them for now.
I also can't see how sharing on Facebook Walls would work against you. And if you ever want to turn comments back on, there are plug-ins like Akismet as well as challenges to stop automated spam that can cut down comment spam dramatically.
Are you concerned because you have already seen a traffic drop that corresponds to the two Panda roll-out dates? Or are you trying to pro-actively prevent trouble before it shows up?