There are clearly no reports of any rapid returns from these algorithmic changes. Two insights stand out to me so far:
Brett's insight on the back of Eric Enge's remarks re the "20% Panda metric idea": [webmasterworld.com...]
and Tedster's :
|I think some of the delay in recalculating is that Panda works at a very basic level – it’s what Google calls the “document classifier”. I have a feeling that particular type of routine does not run as often as the rest of the scoring that is built on top of it. My current research – looking through patents, papers, and posts that mention “document classifiers”. |
I think this could be a slow process of restoration for existing sites, even though I've personally seen some blips (and falls) on modified pages.
No blips, no unusual movement. I track 50 key pages, and it's a slow, steady crawl upwards. Not a lot each day, and some days no movement at all, but about 2 or 3 places for those ranking 11-30, and about 5 or 6 places for pages that were pushed further back. I thought it might be just flux, but it's a steady improvement across all 50 tracked URLs, so it's unlikely to be flux.
One particular page jumped from way down to the 1st page after fixing errors; I wrote about it in another thread.
Earnings are back to about 80% of pre-Panda.
How many total pages do you have dibbern?
Even if what Brett said is true (and personally I doubt Google would put 20% on that), I'd lose at most 10% of that, since my site is not at zero when it comes to that. And my other sites with zero "social buzz", as I stated above, have gotten a very decent Panda raise.
I do notice a lot of new, really good and profitable referrals, but overall traffic is not moving up, or I can't measure it. I also cannot compare to pre-Panda since I removed a lot of pages; rank or not, I used my gut feeling. Ironically, Bing has increased, most likely because of the changes.
But if I get referrals for most of my pages the way I am getting for some, I will be one happy camper. For now, this is becoming a joke.
|No blips, no unusual movement. |
Any conclusions on this yet?
|And my other sites with zero "social buzz" as I stated above have gotten a very decent Panda raise. |
How's the authority rank compared to the others?
"How's the authority rank compared to the others? "
Non existent. The sites have what the title says and it's 100% accurate but it's nothing special. Panda makes no sense.
I'm glad I've seen this thread. On Saturday the only site of mine that gained significantly in Panda 2, started to disappear off page 1 in searches. By Sunday it had dropped back to page 3 and is now on page 4, worse than pre-Panda. That put me in a bad mood all weekend, but now at least I can see I'm not alone.
I'm starting to think Google has done something particularly vindictive with Panda and is simply not going to let anyone have any traffic back that they lost, no matter what they do. I see people ranking on page 1 who seem to do everything wrong, keyword stuffing, cross linking with themselves, unoriginal content, etc. So I'm going to try that now too.
Depending on the tricks you pull to get out of a penalty, you can accidentally trigger an even worse penalty. I witnessed a competitor do this a couple of years ago, and it invoked a complete ban on his sites for about 18 months, so beware ;)
|Non existent. The sites have what the title says and it's 100% accurate but it's nothing special. Panda makes no sense. |
What about the link profiles of referring sites? Were any of them caught in Panda? This must be causing havoc out there as well with the various authority levels. Although you say these sites have no authority, they must rely on some links.
Interesting summary on Panda 2.1 or is it Panda 3 at SE roundtable:
And Five New Tactics For SEO Post Panda at SE Land:
I am pretty sure I know what caused Panda to maul me. It was too many search tags, something Matt Cutts said not to worry about in 2010. I fixed that and then some, but Google is taking its sweet time.
Having seen what rose and what got hit, I don't buy a lot of the theories on Panda; they're too far-fetched.
@walkman -- are you referring to social and user metrics, or content/ad placement?
I fear both are red herrings.
Seems to me that Google has classified content (pages) as worthwhile or not and, following on from that, websites as generally publishing worthwhile or worthless content.
It may have included a little bounce data, but I can't see that they've gone much beyond that.
|Unless you have massive backing it may be all over for small independent companies like us. |
And it's near over for many individual site owners too. My AdSense income is still down at least 50%, traffic down pitifully. Panda has nearly wiped out my other associate income too. It's too painful to even look at my stats again today.
From what I've read in these forums about what sites were targeted by Panda, my site does not fit into the profile.
My site, like others owned by hard-working individuals, is just collateral damage. I'm not convinced there is much of anything that I can do or have done to my site to change the situation.
I've owned several small businesses during my long life and none ever ended or declined suddenly and without warning like this, even through difficult economic times.
Thing is, I doubt Google even cares. Their pockets will be lined no matter who is left behind.
btw, I'd like to add that my site is educational. It's been featured or quoted in some national media, and one of the largest "widget" retailers in the world asked permission to use some of the content in their employee training manuals.
Oh well, Google now deems my content unworthy. So it must be true.
[edited by: shallow at 5:45 pm (utc) on May 3, 2011]
|Any conclusions on this yet? |
Only in that I believe in taking no drastic measures; the last thing I'd recommend is switching back and forth with big changes.
Today is another small step upwards in recovery.
Today my G traffic is a little better. Not a lot, but noticeable, and still way below pre-Panda levels.
My traffic is still holding at 70% improved. The pages are getting crawled and re-cached VERY VERY slowly. I made a sitewide tweak two weeks ago today, and only about 50% of my pages have been recached. The traffic seemed to increase in direct proportion to the number of newly cached pages, but that seems to have stalled. Google is moving slower than I have seen it in years.
Crobb, this is odd. Your site, IIRC, has around 100 pages and is crawled that slowly?
What's your PR? Do you get a fresh tag on your homepage?
Do you have a sitemap?
Do pages have links to them from outside sites and your home page?
How deep are those pages (Home > Category > Page)? With just 100 pages, almost every one of them should be crawled daily.
I'll be honest, it's very odd. Every day, 50% of the pages on my medium-sized site get indexed. Yesterday and today added up are already at 120%.
Yeah, my site is about 110 pages, PR5, almost 10 years old, the homepage gets crawled frequently. The preview updated almost daily. All pages are on the root, so no subdirectories, just Home > Page. I do have a Sitemap. Despite the 70% recovery from Panda, I have recently seen the slowest crawl rates of the past 3 months. This is also evident in my WMT data. The past 10 days have been pitiful.
Update: I just discovered an error in my sitemap.xml. When I deleted a page from my site (and from the sitemap) a couple of weeks ago, I left an unclosed <url> tag. WMT reports this error. Because the error occurs early in the sitemap, it threw every subsequent line off, and WMT reports about 80 errors. But surely this wouldn't affect crawling? Google always seems to crawl my site from memory, often requesting pages that were deleted months or years ago.
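For what it's worth, an unclosed <url> tag makes the whole sitemap invalid XML, so any strict parser rejects it at the first mismatch, which is why WMT flags every line after it. A quick local check (a minimal sketch using only Python's standard library; the example URLs are made up) catches this before submitting:

```python
# Minimal sitemap well-formedness check using the standard library.
# An unclosed <url> tag raises ParseError at the point of mismatch.
import xml.etree.ElementTree as ET

def sitemap_error(xml_text):
    """Return None if the XML is well-formed, else the first parse error."""
    try:
        ET.fromstring(xml_text)
        return None
    except ET.ParseError as e:
        return str(e)

# Hypothetical sitemap with the kind of error described above:
# the second <url> entry is never closed.
broken = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/page1.html</loc></url>
  <url><loc>http://www.example.com/page2.html</loc>
</urlset>"""

print(sitemap_error(broken))  # reports a mismatched tag
```

Fixing the single unclosed tag makes the parser accept the whole file again.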
I definitely see a pattern of recovery, and then removal of that recovery that seems very deliberate.
Any time a page of mine gets into the first page of results, it never lasts more than 24 hours or so before it either drops two or three pages or is removed from the keyword entirely.
It's as if some of my pages deserve to rank, but then there's something separate yanking them back to make sure they don't.
I wonder if that could have anything to do with scrapers? The older a page is the more it gets scraped? And the more it gets scraped the more it gets penalized?
Shatner, same here. What I see /think:
- Google sends 3-5 people to a page, nowhere near enough to judge the page by their behavior.
- What may yank them from page 1 is the site penalty (or Panda score); without it you'd be on page one, but... G is known to add/remove filters during updates, and this is an update, so nothing is settled. I feel the same: I see an improvement only to lose it in hours, and then it repeats. So, as some have pointed out, you need Google to score each page and then one day do the math and score your site.
|I also cannot compare to Pre-panda since I removed a lot of pages, rank or not, I used my gut feelings. |
I too have removed or de-indexed many pages, but I use a ratio as a measure of how I am doing pre- and post-Panda, for lack of a better measure.
ratio = Visitors / Indexed Pages
By comparing the ratios pre- and post-Panda, I can see that I am now doing much better post-Panda, even with almost 67% of my site de-indexed.
I also see the yoyo pattern you all are referring to. One day great traffic that converts and another day traffic that does not.
mslina, how many pages is your site? Can you give us some percentage numbers on how much improvement you're seeing and when it started?
The thing with my yoyo pattern is it's not really leading to any overall recovery. It all basically evens out to keep me about the same.
|I definitely see a pattern of recovery, and then removal of that recovery that seems very deliberate. |
A month or so ago, I noticed this effect. I was going to #1 for some phrases, then falling. Another set of phrases, then falling. Also, a thread was started where some folks observed one or two of their pages being completely unaffected by Panda -- I saw this effect also. Out of my entire Pandalized site (albeit a small site), ONE page was never Pandalized and I couldn't figure out why (it fit all the characteristics of my hardest hit pages, with an affiliate link and ad copy).
I have a feeling Google is testing site performance on phrases to improve targeting. I say this because my bounce rate is at its lowest point in a year. Also, some of the phrases I saw sporadic testing on a month ago are the very ones I have seen stick at #1 for the past 10 days -- with sitelinks. Maybe Google will eventually return you to the top spot for the phrases that perform best in their testing. Just a guess. I am still waiting for full recovery (if it ever comes); but to be honest, my revenue is almost back to pre-Panda levels simply because my bounce rate is MUCH lower. Perhaps that unpandalized page I mentioned just had good performance from a user standpoint (at least relative to other pages ranking for the same phrases).
When I say yoyo, I don't just mean yo yo in keyword ranking, to be clear. I mean Yo Yo in actual traffic coming from Google.
The yoyoing in keyword ranking has definitely been that way since Panda started, just in the past it actually had zero impact on actual traffic.
There's some confusion though because I suspect that some of what Analytics reports as "Google (Organic)" is actually from Google News. So it could be that when I see an increase in "Google Organic" it's actually just an increase in Google News, and that there's been no change at all.
When looking here on WW I simply don't see a large number of people reporting success in coming back from Panda (I'm not sure I really see any).
Yet, so many of us have made, no doubt drastic, changes to our websites (their format, their content, the way people interact with them).
It's not possible that none of us have accidentally hit upon the correct mix that satisfies the Panda algo.
That suggests to me that there is a "time freeze" associated with sites that have been hit or (as Tedster stated elsewhere) there is a two level crawl associated with the algo now, one of which happens only infrequently.
If I had a new algo, and didn't want anyone to easily figure it out, I would either re-evaluate penalized sites randomly and release them randomly, or else delay their release as long as possible.
There could be a Google attitude that Panda is cleaning up the web, so why don't we let this clean up continue for, say, 6 months.
|ratio = Visitors / Indexed Pages |
Unique visitors or number of visits (hits)? Also, per month or per day?
@Broadway A couple of people here have claimed recovery. Crobb for instance, though he has a very small site. I believe Brinked says he recovered as well though, and he has a larger site.
But it's kind of confusing as to who has actually recovered or not. Not many people certainly.
@Shatner, the site that was Pandalized is about 4 years old. All original content. The main problem was the portion of the site that was dynamically generated. I looked at those pages and discovered many were blank or duplicates. G had gone rampant indexing them, some even up to page 99. If you have a site that is generated in such a way, I would look at the parameter handling and instruct G in WMT how to handle it. I also changed the meta tag to "index, follow" on only the first page, but "noindex, follow" on any subsequent pages. I have looked at 2 big competitors (I am small fry) that have been Pandalized as well, and they both have the same issue: lots of dupe content and blank or nearly blank pages.
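The meta tag approach described above can be sketched as a template helper (hypothetical code, not the poster's actual implementation): page 1 of a dynamically generated series stays indexable, while later near-duplicate pages remain crawlable but are kept out of the index via "noindex, follow".

```python
# Hypothetical helper: pick a robots meta tag for a paginated series.
# Page 1 is indexable; subsequent pages get "noindex, follow" so Google
# still follows their links but drops the thin/duplicate pages themselves.
def robots_meta(page_number):
    content = "index, follow" if page_number == 1 else "noindex, follow"
    return '<meta name="robots" content="%s">' % content

print(robots_meta(1))   # <meta name="robots" content="index, follow">
print(robots_meta(99))  # <meta name="robots" content="noindex, follow">
```

The same rule can of course be applied in whatever templating system generates the pages; the point is simply that only the first page of each series is offered for indexing.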
@browsee, ratio is taken from visits per day divided by number of indexed pages for that day using GA data.
Ratio = Visits Per Day / Number of Indexed Pages
Ratio Feb 23 Pre Panda = 1.32 (Traffic =100%)
Ratio Feb 25 Post Panda 1 = 0.91 (Traffic = 62%)
Ratio Apr 11 Post Panda 2 = 1.72 (Traffic = 77%) (42% de-indexed)
Ratio May 3 = 3.05 (Traffic = 80%) (67% de-indexed)
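The metric above is straightforward to reproduce. Here is a sketch with hypothetical visit and page counts (the real inputs come from GA visits per day and a site:example.com count of indexed pages):

```python
# Visitors-per-indexed-page ratio, as described in the posts above.
# The figures below are hypothetical illustrations, not the poster's data.
def panda_ratio(visits_per_day, indexed_pages):
    """Visits per day divided by indexed pages on the same day."""
    return visits_per_day / float(indexed_pages)

pre  = panda_ratio(1320, 1000)  # a hypothetical pre-Panda day
post = panda_ratio(1000, 330)   # a hypothetical day after ~67% de-indexing

print(round(pre, 2))   # 1.32
print(round(post, 2))  # 3.03
```

A rising ratio with fewer indexed pages suggests the remaining pages are pulling more traffic each, which is the comparison being made across the dates above.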
Thanks mslina2002. If I understand correctly, you are taking the total number of visits per day divided by the number of pages users visited on that day, right?
I don't think you meant the total number of indexed pages; you can find the total number of indexed pages using site:domain.com.
|That suggests to me that there is a "time freeze" associated with sites that have been hit |
Or it could be that Google's user stats for a site require about 3-6 months' worth of data, and any new changes will not significantly affect those stats for quite a while.
browsee, it is the total number of indexed pages using site:example.com on each specific day.
I wanted to see what the impact was on my site of de-indexing all these pages. The site is not fully de-indexed yet, so every day I am checking how much G has de-indexed and the impact on my traffic.
Thanks again mslina. Somehow, my site:example.com count is going down very slowly; it is 10 times more than the actual URLs in my sitemap. Something is wrong with this count. I am not sure if there is any other way to find out how many URLs are actually indexed; suggestions are welcome.