There's a big surprise for me on the new loser list - Spike.com (the website for the cable TV channel) shows a 64% drop. That's a pretty major brand; there may be some clues in analyzing their situation.
The winners list is mostly big brands, but not just one type of site. Several news sites, but also Sears!
Spike.com has a lot of user profile pages that are quite thin.
The only thing suspicious about Spike I can see off the bat is that I can't see any comments. This one page has 34k views, 81 Facebook likes, and 0 comments?
Other than that, their content is straightforward. If I wanted something Spike-related, it looks like I could get it from there.
Google should just release their own line of CMS software with all the different plugins, and then moderate all our content so that the web can be exactly how they want it to be. We listen to Google: we make sitemaps, we add nofollow to random links (nobody even really knows why they use nofollow; I installed a nofollow plugin on FF, and some sites nofollow every single one of their links, even to their inner pages). I once asked a client why he was nofollowing some of the links on his site, and he told me it was because Google said you need to, or else you will need to submit a "refile request".
Sorry to rant, but getting back on topic: a lot of these sites are quality. Are they the greatest sites in the world? No, but they each have their purpose and have original content that likely can't be found anywhere else. I am all for Google getting rid of content farms, but just because a site has a lot of content, or has a poor design somewhere in its CMS (such as user pages getting duplicated or something silly), shouldn't warrant a penalty/pandalty.
Spike makes ZERO sense on this list.
It's weird how Sistrix just glosses over it in their report and acts as though "yay, Google is only punishing content farms"; I'm sure the rest of the media will do the same (they did before with other sites that don't fit the narrative).
Spike makes sense on the list - you just have to analyze the data in detail and look at the keywords and pages that dropped (there are a lot of adult keywords and thin pages).
Google said the launch of Panda on the 11th affected 2% of queries and that it incorporated user feedback from Chrome.
Well, eHow comes up for pretty much everything, so obviously a lot of people are going to block it. Pretty sad: the more exposure you get, the more people will block you, which doesn't mean you're a low-quality site. This can certainly be gamed; I can see it now, webmasters finding ways to get Chrome users to "block" their competitors.
superpages.com was also hit hard by the Google maps update in October that pushed web results below the fold when people search for local business names.
There is nothing inherently low quality about the site. This looks like a pretty clear case of Google swatting a competitor. Google is trying to own local search (yellow pages), maps, and shopping search. All areas in which superpages is trying to compete toe to toe.
superpages.com actually has a lot of cool features, like guaranteeing the work of many contractors. I'd say that's value.
|Well ehow comes up for pretty much everything so obviously a lot of people are going to block it. Pretty sad, the more exposure you get the more people will block you, doesnt mean you're a low quality site. |
Love 'em or hate 'em, you have to credit Google's algorithms with being a tad more intelligent than that, I'd suggest :-)
A -66% drop in visibility for eHow.com could mean next to nothing in terms of traffic, as I have seen with my own site. It may indicate only a small drop.
This visibility metric is useless for judging traffic, as I have personally experienced with one of my sites.
These reports are good for finding which sites have been hit and which have not.
But you may never be able to judge the traffic loss from this metric. Having experienced this in Panda 1.0, I would say that the traffic drop to eHow.com may not even be noticeable. I have a feeling that Alexa uses a visibility metric too.
[edited by: indyank at 3:36 pm (utc) on Apr 17, 2011]
I will grant you that a search visibility metric can't be an accurate predictor of actual traffic shifts.
But we're trying to decipher a change in the ranking algorithm (rather than its traffic or financial impact) so I do think this kind of data has value for our purposes.
Tedster, I think Google just picks a bunch of sites per update and implements Panda on them.
Spike.com may not have been in the bunch that Google evaluated in Panda 1.0. The same could apply to eHow as well.
I have seen a few sites that remained solid after Panda 1.0 disappear after Panda 2.0.
I also feel that there could be a Panda 3.0, 4.0, and so on, as a few more sites are picked for evaluation.
While these are major evaluations, Google might also be doing evaluations on a smaller scale, regularly.
[edited by: indyank at 4:06 pm (utc) on Apr 17, 2011]
There are three websites on that loser list that normally outrank us. We saw about a 25% increase in traffic this week but we normally see a spike in traffic around this time of year too.
I can't say I feel sorry for eHow. According to WMT, I'm showing around 1,100 links from eHow and 700 from the UK version.
|Type        Requests       Bytes |
|HTML              12     171,573 |
|CSS               12     678,561 |
|Scripts           17   1,324,749 |
|Images            26     814,250 |
|CSS Images       366     592,254 |
|Total            437   3,581,387 |
I don't know about you, but the Spike.com pages are somewhat abusive from a resource perspective. I mean, is 437 HTTP Requests something that may present a performance issue?
I ran a speed test and the home page of Spike.com takes 496.67 seconds to load on a clean 56k. Ya, I know, who the hell browses at those speeds? o_O
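For what it's worth, that figure is in line with the raw arithmetic. A back-of-the-envelope check, assuming the 3,581,387-byte total from the table above and an idealized 56,000 bit/s line (no modem compression, no protocol overhead):

```python
# Sanity check on the 56k load-time figure: total payload over raw line speed.
# Real modems compress text, and HTTP adds overhead, so this is only a ballpark.
TOTAL_BYTES = 3_581_387   # total page weight from the resource table above
LINE_BPS = 56_000         # 56k modem line rate, bits per second

seconds = TOTAL_BYTES * 8 / LINE_BPS
print(f"{seconds:.1f} s")  # ~511.6 s, in the same ballpark as the 496.67 s test
```

So the measured 496.67 s is actually slightly better than the theoretical worst case.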
|Set-Cookie: ak-mobile-detected=no; expires="Mon, 18-Apr-2011 00:17:41 GMT"; path=/ |
Is it just me, or do their pages try to drop 35 cookies like the one above on you?
|<meta name="description" content="Watch full episodes amd exclusive show highlights on SPIKE.com!" /> |
You have to wonder how someone could miss a typo like that. Shakes head...
I'm still apt to believe that there are performance challenges with many of the losers.
p1r - that's a good observation. I think I'm going to run some performance tests on other losing sites. If there is a user data component to Panda (such as fast click-backs to the SERPs) then slow resource-heavy pages could be an important indicator.
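As a first pass, the comparison could be as simple as collecting one load-time measurement per loser site and flagging the outliers. A toy sketch of that idea; every timing below except Spike's 56k figure is an invented placeholder, not a real measurement:

```python
# Flag loser sites whose measured load time crosses an arbitrary threshold.
# Spike's number comes from the 56k test quoted earlier in the thread;
# the other two entries are hypothetical placeholders for illustration.
measured_load_seconds = {
    "spike.com": 496.67,          # from the 56k speed test above
    "example-loser.com": 12.3,    # hypothetical
    "example-winner.com": 2.1,    # hypothetical
}
SLOW_THRESHOLD = 10.0  # seconds; arbitrary cut-off for this sketch

slow_sites = sorted(
    site for site, t in measured_load_seconds.items() if t > SLOW_THRESHOLD
)
print(slow_sites)  # ['example-loser.com', 'spike.com']
```

If slow, resource-heavy pages really do feed a click-back signal, a list like this checked against the Sistrix losers would be an easy correlation to eyeball.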
Why should Sears not be hit?
It shows up #1 for "Sears", and it also shows up all over the web, including on sites like PriceGrabber, which IMO could mean backlash for some of the sites they grab from, since Google doesn't seem to know who originates what content where.
Note that Kmart sells the same things on their site; Sears is now syndicating content, and Sears got hammered just like any other syndication site.
Per p1r's figures, Sears is also heavy on the graphics; however, I don't think that's what's hurting them. It's the dilution of their unique content.
Correction: in your example, Bill, Sears shows up #1, #2, #3, #4, #5 and #6. Silly Google.
That also raises a question of accuracy in these reports: did Sears really gain 60+%, or is each entry being counted as a unique listing and then compared to the pre-"Google multi-listing silliness" numbers? If Sears is now being counted six times per search keyword, the reports are very wrong.
Note: no, I'm not a fan of showing multiple listings from one site. Just the best one, please.
Every time more information comes out, I am increasingly certain that the algo part is just a show to keep webmasters busy, spending time figuring out the unfigurable.
These are very bright folks; they know as well as we do that the current algo does not have the ability to rate subjective human reactions.
Anyway, have fun.
While I agree that search engineers are exceedingly smart and likely want to encourage the best quality possible, the caveat is that the algo is secret and quality content doesn't always rank above spam. Until one of those changes, webmasters will continue scratching their heads and will keep this place very busy :-)
I wasn't clear. I meant that, IMHO, the algo is kind of a list of sites they like.