
Google SEO News and Discussion Forum

The other side of Panda - the winners
whatson

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4321790 posted 11:18 pm on Jun 3, 2011 (gmt 0)

All we seem to hear about Panda is the losers complaining about how much traffic they have lost and what they are doing to try and get it back... with little success by the sounds of it.

So how about we hear from the other side, webmasters that were not [negatively] affected by the Panda update, and what it is that separates their sites from the losers. Let's hear some success stories of the winners.

 

aristotle

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



 
Msg#: 4321790 posted 12:52 am on Jun 4, 2011 (gmt 0)

I think the biggest winners are sites like Wikipedia, Amazon, and well-known brands. Also, there are probably millions of sites, especially small sites, that escaped being hurt but didn't gain much either. For the keywords I watch, which are related to either historical subjects or social issues, there haven't been any significant changes -- all the top sites are still the same as a year ago.

Shatner



 
Msg#: 4321790 posted 1:06 am on Jun 4, 2011 (gmt 0)

I know a lot of unaffected webmasters. I haven't met one that gained.

But I don't know any webmasters for big brands. I think all the traffic went straight to Yahoo, NY Times, AOL, etc.

walkman



 
Msg#: 4321790 posted 1:15 am on Jun 4, 2011 (gmt 0)

I am a winner, big time. Three, or possibly four, of my sites have had their traffic increase, one by as much as 300% (it varies). Three have Google ads, and I'm making double on AdSense, but not enough to replace my pandalized site's lost income. Don't ask me how I did it.

There was a Panda update for content, and then something was added that gives credit for brand names, etc., so we have to differentiate the two. For the winners, the content boost will probably go away when Google decides to stop the non-penalty penalty on pandalized sites. The brand boost will probably stay in some fashion.

Atomic

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4321790 posted 2:03 am on Jun 4, 2011 (gmt 0)

I've gained both traffic and earnings with my primary site. It is not a big brand, but it's 15 years old. It has two Google ad units right now; it used to have more, but I removed all but those two while improving the site over the last few months. The intent of the redesign was to improve user experience. I decided to do this after taking some HCI and social computing classes.

I not only removed ad units; I also moved the remaining ones far away from the content.

I looked at what visitors clicked on/used and removed anything they didn't. Amazon links shot to the top of that list.

Bounce Rate is down 10%.

Traffic is up about 12%.

Time on site is up 10%.

Pages per visit is up 10% or so.

Earnings are up a lot more. I figured earnings would go down when I removed all the affiliate links and two AdSense units. I figured wrong.

mrguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4321790 posted 2:24 am on Jun 4, 2011 (gmt 0)

I am doing great with Panda. Most of my sites are seeing increased traffic.

I love the Panda.

koan

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4321790 posted 4:08 am on Jun 4, 2011 (gmt 0)

I have one site that clearly gained with Panda. It isn't a well-paying topic, but it did help cushion the blow from my other, better-paying sites that were pandalized and reduced to scraps. The differences I see between those sites are mostly the topic, but also a better inbound link profile and more social media engagement. Now you could copy my old articles from the pandalized sites onto a newly created Blogspot site and rank better than the originals. Actually, people do. It's a mess.

thirteen



 
Msg#: 4321790 posted 4:25 am on Jun 4, 2011 (gmt 0)

I have a niche site that did well at the beginning of the Panda rollout. There was no established Main Street brand in that space. Traffic stayed the same, but the AdSense CPC payout went up. I figure Google was giving out more AdSense dollars to the remaining ads that got clicked, since all the other sites lost their traffic. For Google to maintain the same revenue with fewer clicks, they must have given a higher payout per click to the sites that survived Panda.

Unfortunately, the AdSense revenue did not make up for the lost revenue on the affiliate site that got bitch-slapped by Panda.

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 12:04 pm on Jun 4, 2011 (gmt 0)

None of my sites or client sites were pandalized. Most stayed around the same as far as traffic, some gained a bit. The one client site that I relaunched last year on a new platform has had some big leaps in product keywords in the past month, but I don't know that I can blame that on anything panda-related. They're up against some huge brands, including Amazon, and they'll probably never make it to the tippy top on about half their products, but at least they're getting to the first page now.

Most of the sites are as technically sound as I can make them (but they don't all validate). Most of the sites have elements and pages kept deliberately out of Google as I believe what you don't let Google see is almost as important as what you do (and that's not a recent change, they were always that way). Most of the sites are clear authority sites in their niche (even if it's a really small niche). What I mean by this is that either there is no direct match competition, or else the competition is so weak as to not even be a factor. (except for one client who still sits atop a *murderously* competitive niche and even I can't really say why). Some sites improved slightly just because some of the competitors got knocked out.

All of them are in GWT, have analytics; some have AdSense and some don't; some use AdWords to drive traffic and some don't; most of my sites have social media interactions, but most of the client sites don't.

None of them have what I would call impressive link profiles. Link building is not one of my specialties (and most of my clients are B2B, which makes it even more difficult).

Almost all of the sites have a very strong rate of return visits. Some are practically being stalked.

None of the sites have article content, or anything that was not written or re-written by the clients or by me. (We use no verbatim manufacturer descriptions anywhere, even on the sites with 2000+ products) The writing may suck in certain cases, but it's definitely original.

Some have really shallow pages (category and taxonomy pages), some don't. If the shallow pages serve a purpose to the user, then I don't worry about blocking them.

None use nofollow on links except for affiliate links.

Many get scraped to some degree, but we don't spend a lot of time worrying about that. One client has me keeping an eye out for people who swipe his pictures, and I have a rather aggressive republishing policy of my own that outlines what I may do if I find people using my content without attribution, but for the most part, we ignore it. There are only so many hours in a day. I can't think of any instance where scraped content actually outranks any of the sites that I oversee.

All of them are imperfect, and have room for lots of improvement insofar as we have the time and funds to do so. And we (the clients and I) recognize that. It's always a moving target.

I guess the sum-up would be, they all look cared-for.

thirteen



 
Msg#: 4321790 posted 4:54 pm on Jun 4, 2011 (gmt 0)

Most of the sites have elements and pages kept deliberately out of Google as I believe what you don't let Google see is almost as important as what you do


netmeg,

That is an interesting point. Just this week, I came to the decision to remove Google Analytics from one of my sites that got hurt by Panda.

I am exploring the path that you have already traveled. It might be better not to trust Google with full access to monitor all the information on my site.

My next step is to figure out how to SURGICALLY deny Google from probing specific webpages. I do not want to put a generic noindex header on those webpages, because I do want Bing and Yahoo to index those pages while denying Google access to those same webpages.

Do you have any suggestion on how to tackle this?

Thx

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 5:01 pm on Jun 4, 2011 (gmt 0)

Not really. And I have no problems using Analytics. The pages I keep out are out because they don't need to be in (and that goes for Bing and Yahoo as well). I'm not trying to hide anything from Google; I'm just trying to keep a tidy shop. Or at least give the appearance of a tidy shop.

Planet13

WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributors of the Month



 
Msg#: 4321790 posted 6:37 pm on Jun 4, 2011 (gmt 0)

The pages I keep out are out because they don't need to be in...


Could you elaborate a bit on that? Maybe you could use as an example someone's site (not a client's) that you came across and said to yourself, "They should really noindex this page."

Planet13

WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributors of the Month



 
Msg#: 4321790 posted 6:48 pm on Jun 4, 2011 (gmt 0)

I guess my main site might be considered a Panda winner, but it is hard to say exactly because there were so many Panda losers that dropped 50% or more OVERNIGHT, and our site's gains have been gradual (over the three months since Panda 1).

So even though we have seen about an 18% gain over the last three months, it would be hard to determine specifically if it is Panda related. It might have more to do with link building efforts on my part, and also with the fact that I had redirected a lot of material from one site to another site via 301s PRIOR to Panda, and google is just now "trusting" those 301 redirects and sending more traffic to them than when they were on the original site.

It might have something to do with taking a few under performing pages with a lot of content on them and splitting them off into multiple shorter pages so they were more focused.

I can tell you that our gains DON'T have anything to do with social media, which is something I am COMPLETELY inept at. Seriously, there are octogenarians out there who would easily blow my social media presence out of the water...

gadget26



 
Msg#: 4321790 posted 6:53 pm on Jun 4, 2011 (gmt 0)

Did anyone else notice that all of netmeg's sites were either in a niche with "no direct competition" or were a "clear authority" in their niche? Even the one site that is in a very competitive niche "sits atop" it.

More brand favoritism. What I want to know is how Google/Panda determines the various niches and which one each site resides in. It's actually a pretty complex issue if you think about it. Practically every site touches more than one niche (I'm starting to dislike that word). If we could somehow get our Panda-hit sites reclassified into a less competitive niche, we might just get unpandalized.

Leosghost

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 4321790 posted 7:34 pm on Jun 4, 2011 (gmt 0)

A site may touch more than one niche, but if the signals are clear and unambiguous as to which niche is its primary focus, the search engines will pick that up.

If it is trying to be too "catch all" ..it will be weak in all ..and thus weak overall..in all the niches it touches.

Breaking up sites into smaller sites, or at least into very distinctly focused sections, will help search engines decide which "niche" you are trying to compete in.

Amazon has reviews, but it is definitely "selling" ..the best camera review sites are definitely "review" sites, feeding it and other sellers ..some review only or mainly "DSLR" cameras ..some only or mainly "compacts" ..make it obvious what "your niche" is, and you'll rise above less focused sites.

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 7:38 pm on Jun 4, 2011 (gmt 0)

Well for one thing, for myself, I always pick a niche with little or no effective competition if I can. Or it might have competition, but I know I can do it better.

I wouldn't work on or with a site that I didn't think could be a clear authority in *something*. What would be the point? There are only ten spots on the first page (if that) and I'm not playing for #11 or below. There's only so much time in a day, and I'm not getting any younger. Can't afford to waste it.

Everything has a brand, and brands always get favored, whether they're products, services, or people. That's life. They don't HAVE to be BIG brands, either.

@planet13 - I noindex or block pagination, session IDs, multiple paths to the same product, search results, things like privacy policies and affiliate disclosures, contact forms, URLs with question marks in them, and sort orders; I provide rewrites for www or non-www, trailing slashes, upper case/lower case anomalies, and some taxonomy pages. Tag pages generally go, and I'm not averse to blocking category pages until I get around to properly outfitting them for the best user experience.
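For readers who want a concrete picture of that kind of blocking, here is a minimal robots.txt sketch; the paths and patterns are illustrative assumptions, not taken from netmeg's sites, and (as she notes later in the thread) whether robots.txt, a meta noindex, or a rewrite is the right tool depends on the circumstances.

# Illustrative robots.txt only -- all paths are hypothetical
User-agent: *
# internal site-search results
Disallow: /search/
# URLs with query strings (sort orders, session IDs); Google and Bing support the * wildcard
Disallow: /*?
# tag archive pages
Disallow: /tag/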

Planet13

WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributors of the Month



 
Msg#: 4321790 posted 8:23 pm on Jun 4, 2011 (gmt 0)

@netmeg

Wow, that's a lot of blocking.

You prefer blocking to using canonical for pagination / alternate sorting. That is quite contrarian, no?

phranque

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 9:32 pm on Jun 4, 2011 (gmt 0)

My next step is to figure out how to SURGICALLY deny Google from probing specific webpages. I do not want to put a generic noindex header on those webpages, because I do want Bing and Yahoo to index those pages while denying Google access to those same webpages.

Do you have any suggestion on how to tackle this?

<meta name="googlebot" content="noindex">

Using meta tags to block access to your site - Webmaster Tools Help:
http://www.google.com/support/webmasters/bin/answer.py?answer=93710 [google.com]
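As a hedged sketch of where that tag would sit, the page head might look like the following; only the googlebot line comes from the help page above, while the generic robots tag and the title are illustrative assumptions, left permissive so Bing and Yahoo can still index the page. Note that the page has to stay crawlable (not disallowed in robots.txt) for Googlebot to actually see the tag.

<head>
  <title>Example page</title>
  <!-- Google-specific: keep this page out of Google's index -->
  <meta name="googlebot" content="noindex">
  <!-- Generic tag read by other engines such as Bing and Yahoo; left permissive -->
  <meta name="robots" content="index, follow">
</head>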

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 9:56 pm on Jun 4, 2011 (gmt 0)

You prefer blocking to using canonical for pagination / alternate sorting. That is quite contrarian, no?


Pretty much. I do what the situation calls for. For that stuff, if I don't want it in, I make sure it's not in.
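For comparison, the canonical approach Planet13 is referring to would look something like the snippet below, served on a sort-order or pagination variant; the URLs are made-up examples. netmeg's point is that for pages she never wants indexed at all, she keeps them out entirely rather than consolidating them with a canonical.

<!-- Served on a variant such as http://www.example.com/widgets/?sort=price -->
<link rel="canonical" href="http://www.example.com/widgets/">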

thirteen



 
Msg#: 4321790 posted 1:22 am on Jun 5, 2011 (gmt 0)

phranque,

Thank you. That was exactly what I was looking for.

indyank

WebmasterWorld Senior Member



 
Msg#: 4321790 posted 6:07 am on Jun 5, 2011 (gmt 0)

I'm a bit confused here, as I think people are using some words differently than I usually do.

What do you all mean by blocking? Do you consider specifying a "noindex" meta tag for googlebot to be blocking it?

When I use the word "block" or "blocking", I always mean something that one does through robots.txt. I have a feeling that adding a noindex meta tag doesn't necessarily mean you are blocking googlebot, for the following reasons.

1) Googlebot still follows links in the content unless you also add "nofollow" to the robots meta tag.
2) Google recently clarified that you can add a noindex meta tag to pages that you intend to improve, so they can still keep track of the history of such pages. This clearly suggests that they do keep them somewhere in some form, though they don't show them in the results, even as stubs (URL-only listings).

netmeg, do you add a "noindex" meta tag to the example pages you mentioned above, a "noindex, nofollow" meta tag, or do you block them via robots.txt?

Update: Though Google uses the word "block" in the title of the help page linked above, they are contradicting themselves when they ask people to use a noindex meta tag on pages they want to improve, so that a history is kept.

[edited by: indyank at 6:39 am (utc) on Jun 5, 2011]

indyank

WebmasterWorld Senior Member



 
Msg#: 4321790 posted 6:18 am on Jun 5, 2011 (gmt 0)

If you ask me, I still believe that Google fetches and stores a copy of pages that carry a noindex meta tag. The noindex just prevents them from listing those pages in the results in any form.

But when you disallow googlebot through robots.txt, they might not (though I'm not sure) store a copy of the content of such pages on their servers, though they do list them as bare URLs, as if they were stub pages.

I also believe that they keep copies of pages with a "noarchive" meta tag on their servers, though they don't show a cached link for them.
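To illustrate the distinction indyank is drawing, a robots.txt disallow aimed only at Googlebot would look like the sketch below; the path is hypothetical. One caveat worth adding: if a URL is disallowed in robots.txt, Googlebot never fetches the page, so any noindex meta tag on it goes unseen, and the URL can still surface in results as a bare stub.

# Hypothetical example: block only Googlebot from crawling one section
User-agent: Googlebot
Disallow: /private-section/

# All other crawlers (Bing, Yahoo, etc.) remain unrestricted
User-agent: *
Disallow: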

Whitey

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member



 
Msg#: 4321790 posted 6:21 am on Jun 5, 2011 (gmt 0)

I'm seeing stability across some sites, and on others not.

I have absolutely no clue what the reasons for the successful sites are, because they don't meet the guidelines Google put out for the Panda release, whereas the others, which are affected, are closer to the guidelines.

So, to answer the question about the "winners": no idea.

indyank

WebmasterWorld Senior Member



 
Msg#: 4321790 posted 6:33 am on Jun 5, 2011 (gmt 0)

As regards winners, it is clear that Panda is not run for all niches. The research paper by Biswanath Panda also makes references to the intensity of the process, meaning the resources that these Panda runs consume are huge.

They have probably excluded certain niches (for example, niches in which there are not many sites, as in netmeg's case), certain platforms (YouTube, Blogspot, etc.), and several other sites.

However, I believe that Panda is run on sites picked using some criteria (or picked randomly), rather than through explicit exclusions (though there could be exceptions). Those that have not been hit are probably sites that have never been tested by Panda.

Two sites, i.e. DI and cultofmac, were initially caught by Panda but were dropped later on. That does suggest they had signals for Panda to trap them, but they probably removed those signals later on. It could have been done with some help (which they obviously may not share), without requiring Google to make exceptions for them.

[edited by: indyank at 6:58 am (utc) on Jun 5, 2011]

walkman



 
Msg#: 4321790 posted 6:57 am on Jun 5, 2011 (gmt 0)

Two sites, i.e. DI and cultofmac, were initially caught by Panda but were dropped later on. That does suggest they had signals for Panda to trap them, but they probably removed those signals later on. It could have been done with some help (which they obviously may not share), without requiring Google to make exceptions for them.

I personally believe that they were manually white-listed for Panda. Google denies that they were hit by Panda, but it would not look good for them if they admitted it. Nevertheless, one thing is clear: there's no way for them to have fixed things and come back in a few days. For one, it takes Google quite a while just to index tens of thousands of pages.

indyank

WebmasterWorld Senior Member



 
Msg#: 4321790 posted 7:00 am on Jun 5, 2011 (gmt 0)

walkman, what I am trying to say is that they could have removed the signals (with help) that Panda looks for. They might not necessarily have fixed the sites.

indyank

WebmasterWorld Senior Member



 
Msg#: 4321790 posted 7:08 am on Jun 5, 2011 (gmt 0)

Of course, if Google picked sites randomly and these two were included, the possibility of manual exceptions is there.

But they are winners after coming out of it.

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 1:12 pm on Jun 5, 2011 (gmt 0)

netmeg, do you add a "noindex" meta tag to the example pages you mentioned above, a "noindex, nofollow" meta tag, or do you block them via robots.txt?


Depends on the circumstances. My strategy for a brand-new site that has never had links and that Google has never seen before is different from my strategy for a site that's been around a while, maybe not put together well in the first place, with external and/or internal links to pages I don't think should be in Google, and maybe on some shopping cart or CMS where I can't easily perform surgery page by page or directory by directory. I also have to look at how quickly its pages are crawled and indexed. For my primary sites, new pages/posts are indexed in minutes. For new sites, or sites with a lot less traffic, it can take days - which gives me more room to work, actually.

Shatner



 
Msg#: 4321790 posted 6:24 pm on Jun 5, 2011 (gmt 0)

How many actual winners have posted in this thread so far? Three? And it sounds to me like most of them either had only small gains or were smallish sites to begin with.

Accurate?

netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributors of the Month



 
Msg#: 4321790 posted 6:27 pm on Jun 5, 2011 (gmt 0)

Smallish? heh, okay.

Good luck.
