
Is this a classic case of Panda? Hit by Panda 2, or not?

     
10:20 pm on Jun 5, 2011 (gmt 0)



Any help in this matter would be very much appreciated,

On May 19th, after an enormous amount of traffic (a record day!), four of my main sites suddenly lost 90% of their traffic. My first thought was that I might have received lots of links to one of my sites and that it was a case of a Google dance.

But I couldn't figure out why four of my sites had suddenly died at the same time, so I tried to find out what they had in common. It turned out they were all linking to another website of mine, and the sites that survived weren't. So I removed all the links and cleaned up the sites for any common mistakes such as duplicate content, nofollow links, etc., and tried to write articles and build backlinks organically. I even checked with Google, and they said none of the sites had any manual penalties attached to them.
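The diagnostic step described above (looking for a link target that all the affected sites share) can be automated. Here is a minimal sketch using only the Python standard library; the idea is to extract the outbound link domains from each affected site's pages and intersect the sets. All domains in the example are hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect the domains of all absolute <a href> links in a page."""
    def __init__(self):
        super().__init__()
        self.domains = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith("http"):
                    self.domains.add(urlparse(value).netloc)

def outbound_domains(html):
    """Return the set of domains a page links out to."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.domains

def common_link_targets(pages):
    """Intersect outbound domains across the HTML of several affected sites.

    Whatever survives the intersection is a link target every hit
    site has in common -- a candidate 'bad neighborhood' to inspect.
    """
    sets = [outbound_domains(html) for html in pages]
    return set.intersection(*sets) if sets else set()
```

In practice you would feed this the fetched HTML of each affected site's key pages, and also run it over the surviving sites to confirm the suspect domain is absent there.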

So while I'll continue (and pray ;) for a return, somehow, sometime - do you think this is a classic case of Panda 2? Or is it just me having to clean up links and wait it out? I just can't get my head around the fact that all four sites died at exactly the same time.


Oh, and one thing I should add - all of my sites are WordPress, and Google continues to index their content and new posts. But the posts that actually bring in some traffic are old posts from right before the day the traffic died. The new ones only get indexed at page 5 and beyond.

Thanks in advance!
12:33 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Welcome to the forums, hatherley.

The date is wrong for Panda 2, or even for Panda 2.1 (by 9-10 days). If I were you, I would "de-Pandalyze" my thinking and simply look at the situation as lost rankings.

And I found out that they were all linking to another website of mine.

Is there anything "hinky" about either that site or the reason for the other sites linking? If there is something of a bad neighborhood smell, you may have found it. If not, you may just be looking at a coincidence.
12:48 am on Jun 6, 2011 (gmt 0)



Thanks tedster,

I had no idea the Panda algorithmic changes occurred on such specific time frames.
I guess it's a good thing, since my competition, which now ranks above me, is mostly garbage.

The site I linked to had a lot of possible problems, so the bad neighborhood argument could be correct. At least I hope so. But if nothing has happened a month after the cleanup, should I conclude it isn't a bad neighborhood problem?

And again, thanks for your input.
1:29 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member planet13 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



The site I linked to had a lot of possible problems...


What sort of problems are we talking about here?

Was it just bad structure?

Or was it something more nefarious - especially dealing with linking?
1:37 am on Jun 6, 2011 (gmt 0)



We have actually experienced exactly the same problem, hatherley. On our small network of sites, we implemented a link on some subpages back to the main corporate site. Overnight on June 2nd (1 week after implementing this link), we lost rankings across 95% of our network.

Only a few sites were spared, and looking at them, they are sites we have paid little attention to; most of their subpages are either indexed infrequently or have been completely deindexed (leading me to think that Google hasn't picked up the new links yet).

The 95% of sites that were hit are now stuck behind an invisible wall at page 5 and cannot go higher. For certain keywords we have multiple sites (which never ranked for those keywords before) queuing up behind this invisible wall, resulting in large clusters of listings between positions 50-70 in the SERPs.

I don't think it's Panda-related, as we have only benefited from previous Panda iterations. The unaffected sites include both sites with large amounts of content and sites with no content at all, and the same goes for the sites we lost. We don't partake in nefarious link building; in fact, some of our sites that have been number 1 for 4+ years have fewer than 10 links!

To me it seems more like the domain farm penalties of old! We have nofollowed all links between the sites and our corporate site (as these were never meant to have any SEO benefit), and I'm currently waiting to see any effect. I have not yet submitted any reconsideration requests.
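Verifying that every cross-site link really did get the nofollow can be tedious by hand across a network. A small audit script like the sketch below (standard library only, hypothetical `corp.example` domain) reports any remaining followed links to a given target domain in a page's HTML:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class FollowedLinkAuditor(HTMLParser):
    """Find links to a target domain that are NOT marked rel="nofollow"."""
    def __init__(self, target_domain):
        super().__init__()
        self.target = target_domain
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href", "")
        if urlparse(href).netloc == self.target:
            # rel can hold several space-separated tokens, e.g. "external nofollow"
            rels = (d.get("rel") or "").lower().split()
            if "nofollow" not in rels:
                self.followed.append(href)

def followed_links_to(html, target_domain):
    """Return hrefs pointing at target_domain that are still followed."""
    auditor = FollowedLinkAuditor(target_domain)
    auditor.feed(html)
    return auditor.followed
```

Running this over every page in the network and expecting an empty result for the corporate domain confirms the cleanup is complete before you start the wait for recrawling.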
2:37 am on Jun 6, 2011 (gmt 0)



That's a brutal penalty. I could see it for selling links, but for linking back to the corporate site?

Chris, you are in the travel/hotel industry, right?
2:43 am on Jun 6, 2011 (gmt 0)



We are in travel, yes
3:26 am on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have to actually sympathize with those who lose rankings because of cross-linking. To be honest, there seems to be little to no rhyme or reason behind what gets penalized and what doesn't.

In some markets, travel-related ones included, I still see, to this day, VR networks that internally link in the footer to their other 15 websites, from all 15 websites, which I would construe as excessive.

The "big guys" do it all the time with no apparent issues.

I did it a couple of years back in an industry I am part of, and it was done tastefully, with very closely themed websites. We suffered some fairly consistent, and fast, spankings for that.
12:17 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google recently said that they have whitelists so that specific algorithms don't apply to specific sites. For example, if they determine that using purple links is correlated with spammy sites, they might put purple links in the algorithm as a negative factor. But they know of some big, reputable sites that use purple links. So they put those sites in the white list for purple links. Those specific sites don't get hit when this algorithm rolls out.

I believe that the travel industry has several big sites that are on many of these whitelists. It's hard to compare a small site against a big one; different algorithms are going to apply to the little sites.
12:43 pm on Jun 6, 2011 (gmt 0)



I feel for you, Chris. I've made sure to implement every measure I can think of to get out of this mess. Actually, I've noticed that my sites' results aren't even on page 5 or beyond; they can't be found at all unless you quote content exactly and search for it.

I went back and checked, and much like Chris, the Panda changes seemed to have had positive effects on my sites' rankings.

The thing that confuses me the most is how old keywords are still getting flawless #1 rankings while the new content pretty much gets ignored. I've never seen or heard of anything similar before.
2:36 pm on Jun 6, 2011 (gmt 0)



The thing that confuses me the most is how old keywords are still getting flawless #1 rankings while the new content pretty much gets ignored.


Have you checked if your newer posts are taken/copied/stolen by other sites and republished with or without backlinks to your site?

All newer posts on one of my pandalized sites are also ignored by Google, whereas some old (but not all old) articles rank normally. The difference is that the new posts are copied far more by scraper and spam sites, and those sites now rank on Google instead of me (I'm not sure whether that's a symptom or a cause of pandalization).

Also, a big spike of traffic occurred right before an 80% drop on February 24th (Panda 1.0).
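One way to run the scraper check suggested above is to take a distinctive sentence from a new post and test candidate pages for an exact (whitespace- and case-insensitive) match. A minimal sketch, assuming you already have a list of suspect URLs to check:

```python
import re
import urllib.request

def normalize(text):
    """Lowercase and collapse whitespace so markup differences don't hide a match."""
    return re.sub(r"\s+", " ", text).strip().lower()

def contains_snippet(page_text, snippet):
    """True if a distinctive sentence from your post appears in the page text."""
    return normalize(snippet) in normalize(page_text)

def check_candidates(snippet, urls):
    """Fetch each suspect URL and report which ones reuse your sentence."""
    hits = []
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page: skip rather than fail the whole run
        # Strip tags crudely before comparing.
        text = re.sub(r"<[^>]+>", " ", body)
        if contains_snippet(text, snippet):
            hits.append(url)
    return hits
```

Searching Google for the sentence in quotation marks is still the quickest way to build the candidate URL list; the script then confirms which pages actually carry the copied text.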
3:18 pm on Jun 6, 2011 (gmt 0)



Have you checked if your newer posts are taken/copied/stolen by other sites and republished with or without backlinks to your site?


Thanks, Danijelzi, but I have checked for scraping etc., and no one else seems to be copying my content.
3:36 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Google recently said that they have whitelists so that specific algorithms don't apply to specific sites. For example, if they determine that using purple links is correlated with spammy sites, they might put purple links in the algorithm as a negative factor. But they know of some big, reputable sites that use purple links. So they put those sites in the white list for purple links. Those specific sites don't get hit when this algorithm rolls out.

Google recently said that they use EXCEPTION lists - that's a big difference. Exception lists are standard practice in algorithm development, a practice that Bing has also confirmed using.

If engineers notice that a site is incorrectly triggering a specific algorithm, then it gets put on the exception list FOR THAT ALGORITHM, and only until the logic can be improved. Note that first there had to be a false positive. In fact, the first publicly discussed case I heard of (almost three years ago) was for a one-man shop, not a big brand.

A whitelist would work like "these sites can do no wrong," and Google does not do that. However, I do think that big brands can have so many positives in other areas of the algorithm that it tilts the overall balance. They would still do even better if they stopped doing whatever it is that the algo doesn't like.
4:19 pm on Jun 6, 2011 (gmt 0)



deadsea, most of those penalties are probably checked and imposed manually after the algo singles the sites out. So they look at Expedia or Priceline and say, 'well, that's OK because they advertise, I saw their ad last night on CNN, they have a real business, blah blah,' but someone's network of hotel sites with top domain names strikes them as too-easy money or whatever, so: penalty. As a bonus, it's good business for Google, because 9 out of 10 brands probably advertise on Google as well.

Google used to go out of its way to say that it tries to be even-handed; now they don't even bother.
5:16 pm on Jun 6, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It would not surprise me at all if the biggest sites in the travel industry were each on a couple dozen of these exception lists. It's not that these sites can do no wrong, but they have done a lot that is not optimal and have been given passes for it.

It means that if you are starting from the ground up, your job is that much harder. You can't just copy the big boys because their site structure isn't going to work for you. You can't get the same exceptions because you don't have the reputation yet.
3:56 am on Jun 7, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm unclear on the circumstances here.

Two webmasters report a 95% reduction in traffic after interlinking their own websites.

Initially, were these links "nofollowed"?
Was creating "nofollow" links between related (commonly owned) websites penalized?
5:21 am on Jun 7, 2011 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Some quick clarifying questions :

Are the interlinking sites on associated topics? Google tends to have problems with excessive linking between sites on off-topic subjects.

You've de-duped your content - can you elaborate, as this can potentially be complicated? I mean, do you have no duplicate content between your .com sites, for example? Different TLDs don't matter.

Have you worked out which site Google considers as your "main site" and is this holding?

Interlinking should be no problem if it is done correctly and is genuinely useful.

But your point about Panda is fair in the sense that unique content is a lot more reliable than any linking strategy these days, even if your immediate issues may not be related.
 
