| This 81 message thread spans 3 pages |
|Recovered From Panda: Well I Guess So!|
| 9:34 am on Apr 29, 2011 (gmt 0)|
Today, either they reverted their Panda update or I am just lucky to be seeing my old rankings, some of which are even stronger. What I have done so far:
1: Changed hosting
2: Changed nameservers (just for the heck of it)
3: Used the WordPress-recommended robots.txt file for the blog
4: Blocked some internal dupes and weird URLs through robots.txt
5: Fixed more than 500 broken links/images.
6: Added noindex,nofollow to thousands of pages.
7: Removed AdSense altogether from the main pages. At least one unit still exists on some much deeper pages, which I have noindex,nofollowed.
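For anyone attempting a similar cleanup, step 4 in the list above amounts to a few robots.txt rules. This is only an illustrative sketch; the paths and URL patterns are hypothetical, not the poster's actual site:

```
# robots.txt -- keep crawlers out of internal dupes and odd URLs
# (hypothetical paths for illustration; Googlebot honors the * wildcard)
User-agent: *
Disallow: /search/
Disallow: /*?sort=
Disallow: /*?sessionid=
```

Step 6 (noindex,nofollow) is a separate mechanism: a `<meta name="robots" content="noindex,nofollow">` tag in each page's `<head>`. Note that a page blocked in robots.txt can't be crawled, so a noindex tag on it will never be seen; it's generally one or the other per URL.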
Further things I have lined up to do:
1: Thousands of canonical tags (they seem tricky to implement)
2: Thousands of 301 redirects.
3: When I create a Widgets page, it creates 6 other pages, e.g. green widgets, blue widgets, etc. At times it gets very hard to fill those pages, so I am in the process of enabling only those pages for which I have actually added content. That should significantly reduce the thin pages.
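For reference, the canonical tag is a single line in each duplicate page's `<head>`; the URL below is a made-up example, not the poster's site:

```
<!-- In the <head> of each duplicate/variant page, pointing at the preferred URL -->
<link rel="canonical" href="http://www.example.com/widgets/">
```

A 301 on Apache can then be a one-liner in .htaccess per retired URL, e.g. `Redirect 301 /green-widgets-old.html http://www.example.com/widgets/green/` (again, URLs invented for illustration).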
I just hope it's not a temporary thing and the rankings stay. There is still a lot more to fix on my site: the errors I was never bothered about when I took rankings for granted.
Hope it helps someone.
| 9:55 pm on Jun 20, 2011 (gmt 0)|
You mentioned the following:
1: Changed hosting
2: Changed nameservers (just for the heck of it)
Why did you decide to change hosting? Just curious. Did you check whether many sites on your old host were serving malware?
| 10:06 pm on Jun 20, 2011 (gmt 0)|
Were you using shared hosting or a dedicated machine?
| 11:58 pm on Jun 20, 2011 (gmt 0)|
I'm pretty sure I could write a thesis about Bill Gates and pass Google's dupe/shallow content filter called Panda.
Also, my pages all show the updated date as the date the page was opened, and I still recovered.
I was also scraped to buggery. But when you are failing a duplicate content test after being compared with a simulator or the actual manufacturer, having been scraped is a false path to look down.
If your content is unique and you've ruled out the scrapers, maybe it's thin, or duplicated within your own site, etc.
I maintain it's a content problem you can fix.
Let's face it. Most sites that have recovered have fixed their problem content. Owned it and grown it.
| 12:33 am on Jun 21, 2011 (gmt 0)|
Thanks Nippi, you're giving us direction and hope. I aim to be in your shoes one day. I too have had a bad fall with Panda. I have an authority content site (not ecommerce) that crashed on Feb. 24, and crashed again in each of the Panda "runs" that were rumored to happen. I tried restructuring, pulled back monetization, cleaned out links and removed pages. Filed a ton of DMCA requests. Reported spammers/scrapers. It seemed like nothing I did would work, until June 6. At that point things started to turn around for my site: not suddenly, as in Nippi's case, but gradually. Every week the numbers are somewhat higher, 10% here, 5% there, a gradual rise. I'm nowhere close to pre-Panda levels, but I feel like I'm gaining ground one day at a time (or week at a time).
I've been rewriting content a few pages a day; with 2,000 pages, that's a lot for one person (I built it all myself over many years). So maybe I have not reached the tipping point yet, but there are signs that I may be on the right track. I have been focusing on evaluating each page and rewriting the bogus/weak ones. So far, so good.
| 1:27 am on Jun 21, 2011 (gmt 0)|
I have written about 15 posts on my Pandalized (Feb. + April) site in the last two weeks, and two of them got #1 and #2. I think it's due to content. Those posts have more images than most of the others, have an embedded video while the other posts don't, and are written differently, though not by much.
Nothing else is special about them, and there are almost no external inbound links to these pages.
P.S. I'd written a longer version of this post, but it's gone now; Windows decided to shut down to update itself. :)
| 1:39 am on Jun 21, 2011 (gmt 0)|
Windows 7 auto-deleted your post because it was shallow.
You may continue.
| 3:36 am on Jun 21, 2011 (gmt 0)|
|Let's face it. Most sites that have recovered have fixed their problem content. Owned it and grown it. |
I hope people come out and say "I recovered, here are the stats." They can post them anonymously. I've only heard of DaniWeb recovering ("sorta"), but the stats don't show it [quantcast.com...] Look under 6m and weekly. Maybe other traffic went down while Google's increased, but we don't know. Either way it's nowhere near what she lost. We have heard of people gaining, losing, gaining, losing as time goes on, or doing better on a keyword or two, but that's about it. These moves mostly happen as Google updates and data shifts back and forth.
I'm not saying it didn't happen, but we haven't seen it here. It also bears repeating that not all traffic losses are Panda related.
These past 4 months, traffic on my non-pandalized sites has consistently gone up while I did nothing, and has consistently gone down on my pandalized site as I improved content and noindexed lots and lots of tag/thin/shallow/whatever pages.
I'm 95% certain that I know what's going on.
| 4:03 am on Jun 21, 2011 (gmt 0)|
I'm curious as to what you think is going on.
It's fairly obvious there's a heavy dose of anti-gaming being handed out, and it's not just the algo updates.
| 4:14 am on Jun 21, 2011 (gmt 0)|
supercyberbob, I'm done saying what I think :)
But no, I didn't really game Google at all, relatively speaking, and that much I regret. I say relatively speaking because adding an extra synonym or two when your competitors purposefully mention the word dozens and dozens of times is not gaming at all. I got hit because of too many tags.
Come to think of it, my non-pandalized sites have more 'gaming' going on, and that gaming barely registers on the radar because I barely did any.
| 5:01 am on Jun 21, 2011 (gmt 0)|
Didn't mean to imply there was any gaming on your end, was speaking in general. But thanks for the reply.
Since my mic is on, I've been seeing a lot of google.ph pre-Panda referrals the past few days.
Not sure what to think about that. This is for a .com US based site. Just thought I'd share.
| 7:37 am on Jun 21, 2011 (gmt 0)|
|I'm done saying what I think :) |
Meaning that you've said it already: that nobody has recovered from Panda 1 regardless of what they've done, and probably not from Panda 2 either.
It is possible that the recovery reports we've had were due to other factors, not Panda-related. We also have no way of knowing with certainty that our own difficulties are Panda-related, but most of us are pretty sure.
You may be right. Presumably we'll learn more over time. The only bit I really dispute is the idea that, if there has been no recovery for anyone, Google is doing it as "punishment" rather than for other, more technical reasons.
Or, on the other hand, it could be that the reason recovery reports are so sparse is that real recovery requires more work than you can reasonably do in a few months, for most of us.
In either case, I think we may be looking in the wrong direction when we pay attention to the signal we're getting from Google - rather, perhaps we should be paying more attention to the signal we get from our users. It's hard to reverse-engineer Google. People may be easier, and in any case Google is clearly trying to reverse-engineer them too.
| 8:04 am on Jun 21, 2011 (gmt 0)|
|real recovery requires more work than you can reasonably do in a few months |
Certainly that would be true if Google is using significant amounts of user data in this algo. It could take a lot of change even to begin to move that needle.
| 10:07 am on Jun 21, 2011 (gmt 0)|
I do think G is using major amounts of user data in their algo. On one of my pandalyzed sites, the most favored Google pages are now (since panda) the ones on which users spend the most time (perhaps onsite time included too?). I think Google is comparing the user data of my "blue widget" page with other sites' "blue widget" pages and using that comparison to rank my stickier pages higher.
| 10:33 am on Jun 21, 2011 (gmt 0)|
There are probably calculations that assess how much text content is on a page versus how long the user spends viewing that content, comparing it to other pages on the same subject, and thus coming to a conclusion about its quality. (This is not necessarily to say that more content equals better content.)
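As a thought experiment only, the kind of calculation being guessed at here could be as simple as a dwell-time-per-word ratio compared against a topic baseline. Nothing below is Google's actual code; the function, its inputs, and the baseline figure are all invented for illustration:

```python
def quality_signal(word_count, avg_dwell_seconds, topic_baseline):
    """Ratio of this page's reading rate to its topic's baseline.

    topic_baseline is the average seconds-per-word users spend on
    competing pages about the same subject (an invented input).
    Values above 1.0 would mean the page is "stickier" than peers.
    """
    if word_count == 0 or topic_baseline == 0:
        return 0.0
    observed_rate = avg_dwell_seconds / word_count  # seconds per word
    return observed_rate / topic_baseline

# A 1,000-word page viewed for 120 seconds, where peers average
# 0.1 seconds per word, scores 1.2 (20% stickier than the baseline):
score = quality_signal(1000, 120, 0.1)
```

Under this toy model a wall of text that users abandon quickly scores low, matching the caveat above that more content does not automatically mean better content.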
| 11:04 am on Jun 21, 2011 (gmt 0)|
I had a couple of sites hosted on the same hosting; 80% of them were affected by Panda. My reason for changing hosting was the unseen, unknown possibility of IP blacklisting or something like it. I just wanted to give the site a fresh start, separate from all my other sites and the affected host.
Changing hosting, nameservers and the other things I mentioned in my first post might have signaled to Google that the site is now owned by someone else and that a lot of internal issues are now fixed.
The sites were not hosting malware, and even if they were, it's unlikely that all of them would go down on the same date Panda rolled out. Please note that all my other Panda-affected sites are still affected; I haven't worked on them yet. Amazingly, traffic is growing stronger day by day on the site that got out of Panda.
| 11:09 am on Jun 21, 2011 (gmt 0)|
It was shared hosting, but I had a dedicated IP; that IP was in turn shared between 5 or 6 of my own sites. I have now moved the site to an entirely different host. Again it's shared hosting, but on a dedicated IP which is not shared with any other site.
| 1:05 pm on Jun 21, 2011 (gmt 0)|
That's an opinion based on what I see; neither of us has any secret video. I understand how hard it is to change our beliefs. I still haven't gotten over the Santa Claus thing, so I made a deal with myself: he exists, maybe not at the South Pole, but he does exist. :) I hope some here handle it better when their Santa Claus moment comes. What good is saying you can come back, if the comeback requires reaching a nearly impossible goal?
|The only bit I really dispute is that, if there has been no recovery for anyone, Google is doing it as "punishment" rather than for other, more technical reasons. |
|In either case, I think we may be looking in the wrong direction when we pay attention to the signal we're getting from Google - rather, perhaps we should be paying more attention to the signal we get from our users. It's hard to reverse-engineer Google. People may be easier, and in any case Google is clearly trying to reverse-engineer them too. |
I understand that a site can always be made better and better, but I wouldn't assume that all site owners are trying to game Google while ignoring users. Of course one can look at a site and say "But that image should be more to the left, that's why Panda hates you." Yeah, OK. Changes depend on the site: if you have 5.87 million articles it will take a lifetime to fix, but a site of several hundred pages can easily be fixed in a relatively short time with help from others, so that aspect is also false. My site is certainly not worse than it was in February, and my competition didn't improve that much either, yet I lose traffic with every new Panda update. I gain on my non-Pandalized sites each update, almost like clockwork.
Anyway, people need to start ignoring propaganda and using their reasoning. Fine, my site/your site/his site sucks, but as far as we know everyone's site still sucks? I don't have statistics on what percentage of people report comebacks, but we saw them here all the time in previous updates, and even during Panda when people get more traffic.
The only one here saying a recovery is holding is Nippi; if you read everyone else, AFAIK they lost that 10-20% again, and increasing 20% after losing 70% is not a comeback anyway. I try to keep the dates in mind as well: unless G is not telling, Panda runs at certain times, so if you come back 2 weeks before/after it ran, is it Panda? Sites have always gone up and down, Panda or no Panda.
|It is possible that the recovery reports we've had were due to other factors, not Panda-related. |
I am not going to kill myself over "but my site must still suck," because while it's far from perfect, it certainly deserves more Google users than it gets from type-ins or Bing, especially when I consider who's outranking me for certain pages. And those 'experts' that escaped Panda surely don't know jack more than the average webmaster; in fact my worst pages (in every way possible) didn't get pandalized.
That's how I see it, and it becomes clearer with each passing day. Now back to our regularly scheduled programming.
| 3:58 pm on Jun 21, 2011 (gmt 0)|
> > real recovery requires more work than you can reasonably do in a few months
> Certainly that would be true if Google is using significant amounts of user data in this algo. It could take a lot of change even to begin to move that needle.
It's not just the work you do on your site; it's the work Google must do in evaluating your site, apparently. Danny Sullivan believes the Panda update is more like PageRank than a regular update. He recently wrote:
|At our SMX Advanced conference earlier this month, the head of Google's spam fighting team, Matt Cutts, explained that the Panda filter isn't running all the time. Right now, it's too much computing power to be running this particular analysis of pages. Instead, Google runs the filter periodically to calculate the values it needs. Each new run so far has also coincided with changes to the filter, some big, some small, that Google hopes improve catching poor quality content. Source: [searchengineland.com...] |
He suggests there are on average about five weeks between PandaRank updates.
"For anyone who was hit by Panda, it's important to understand that the changes you've made won't have any immediate impact."
| 4:09 pm on Jun 21, 2011 (gmt 0)|
There's much more to this. What if you changed your site drastically and two Pandas didn't make a difference (other than a negative one)? Re-read Danny's article and see the last paragraph of the "Recovering From Panda" section.
| 10:18 pm on Jun 21, 2011 (gmt 0)|
|I wouldn't assume that all site owners are trying to game Google while ignoring users |
I didn't say that, and it's certainly not what I meant. I'm talking about where the best place is to focus your attention when trying to design a recovery strategy.
|I am not going to kill myself over "but my site must still suck," because while it's far from perfect, it certainly deserves... |
I'm not sure whether I disagree with what you're saying, or if I just don't like the whole cognitive framework you're using. I don't think it's useful to think in terms of "sucks" and "punishment" and "deserves" as though we were all recalcitrant teenagers. This is information theory, not criminal justice or child psychology (and I don't think those concepts are particularly useful in those fields either).
| 11:20 pm on Jun 21, 2011 (gmt 0)|
|I don't think it's useful to think in terms of "sucks" and "punishment" and "deserves" as though we were all recalcitrant teenagers. This is information theory, not criminal justice or child psychology (and I don't think those concepts are particularly useful in those fields either). |
;) Let's see:
sucks = not good (according to someone). If Google thinks your site sucks, they will push it back or ban it. It may no longer suck, but they may still do the same, for several reasons.
punishment = you break the rules, you get punished; like spamming (if Google sees it as such) or buying links.
deserves = subjective, I guess. One thinks of the amount of work he/she puts in, the competition, etc. But 'deserves' can be substituted with 'should.'
This has nothing to do with recalcitrant teenagers; it's reality, and it's applied in life every single day.