| 6:08 pm on Sep 25, 2012 (gmt 0)|
|Google sends most of the traffic but I sure as hell ain't playing their games in the hope things improve. |
I wish I had. I mean, just look at the good it's done. None. None at all. The harm it's done is it destroyed my Bing rankings, which would have at least kept me in bread and butter. It cost me a lot of money paying people to redesign the site around the reworked pages, and I spent a fortune trying to speed it up, moving all images etc. to the cloud.
I wish in fact I had abandoned it and just started again, as then, if it came back, I might have two sites in the SERPs. Now I'm most likely going to go bust. I have about a month of grace (that's without anything drastic going wrong with the equipment I use to actually make the stuff I sell).
Yeah yeah, I know. I was stupid to rely on organic SERPs. It's harsh though.
| 6:35 pm on Sep 25, 2012 (gmt 0)|
|Here's a thought. What if G was "training" their filters instead of assigning them a value. |
Take geo-location. Instead of using server IP, ccTLD, and potentially whois and WMT settings, what if they just let "the world" tell them which locations respond well to a site, and which ones don't?
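Shaddows' "training" idea can be sketched in miniature. The code below is purely illustrative: the engagement signal, the prior, and every number are invented, and nothing here claims to be how Google actually works. It also hints at why such training would move slowly on rarely-searched queries: a location with only a handful of clicks stays near the neutral prior until enough data accumulates.

```python
# Hypothetical illustration: letting user behaviour "train" a geo signal
# instead of hard-coding it from server IP / ccTLD / WMT settings.
# All names and numbers are made up for the example.

def geo_scores(engagement, prior=0.5, prior_weight=10):
    """Smoothed per-location engagement score.

    engagement: {location: (satisfied_clicks, total_clicks)}
    A Bayesian-style prior keeps locations with little data near the
    neutral value instead of letting them swing to 0 or 1.
    """
    scores = {}
    for loc, (good, total) in engagement.items():
        scores[loc] = (good + prior * prior_weight) / (total + prior_weight)
    return scores

clicks = {
    "UK": (900, 1000),   # the site does well with UK searchers
    "US": (200, 1000),   # poorly with US searchers
    "DE": (3, 5),        # almost no data yet, so it stays near 0.5
}
for loc, score in sorted(geo_scores(clicks).items(), key=lambda kv: -kv[1]):
    print(loc, round(score, 2))
```

The design point is the prior weight: it is what makes "training" slow in the long tail, since a thin-data location cannot earn a strong score either way.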
Shaddows - This is the concept I've been circling around in a bunch of earlier posts, but I've never articulated it nearly so well. I've been calling it "calibration", which is very fuzzy thinking. "Training" crystallizes the idea for me.
Regarding tedster's comment...
|...the machine-learning does seem to be going pretty slow in the mid to long tail queries. |
It would necessarily be going slower, because it would have less data on less frequently searched queries.
| 9:07 pm on Sep 25, 2012 (gmt 0)|
I don't get this. If I look at the no.1 result for my keyword, out of 62 million results, it doesn't comply in any way with anything to do with the algo, Panda or Penguin.
It's an image site with a front page with unique text (now that was the positive view); there are also invitations for link exchange and 6 banners above the fold.
Inner pages: when you have picked an image, you then see 40 words of text with 3 words unique on every page (the rest is the same text as on all pages), 6 banners above the fold, and a lot of link exchanges...
Now I have changed my sites so I offer images with a unique text description on each page, 2 banners above the fold, and no link exchange, but I'm now on page 84 and the other site is no.1, where before April 2011 I ranked no.5. Hmm, am I doing anything wrong? Yes, listening to Google's guidelines, by the looks of it.
I have not seen anyone on the web who can say 100% what Panda is about; nobody knows what Panda is about.
| 11:00 pm on Sep 25, 2012 (gmt 0)|
|nobody knows what Panda is about. |
| 2:24 am on Sep 26, 2012 (gmt 0)|
|I have not seen anyone on the web who can say Panda is about blabla 100% |
I agree with you. Some people know that they recovered from Panda after doing X and Y. Well, those were just the parts of Panda that hurt them specifically, but they aren't nearly the whole story of Panda.
And because Panda is a machine-learning algorithm, its effects on ALL sites over a variety of query terms isn't something that anyone knows in specific detail - even Google engineers most likely know only in a more general way, figured as percentages and statistics. If Google understood Panda in perfect detail, then they would not have had such a monster thread on their own forum asking for input from webmasters who felt they were a false positive.
| 6:28 am on Sep 26, 2012 (gmt 0)|
|If Google understood Panda in perfect detail, then they would not have had such a monster thread on their own forum asking for input from webmasters who felt they were a false positive. |
A very good point. I must admit I was amazed when I saw that post asking webmasters if they thought they were affected! Trouble is, I wonder how much notice they take of the posts; on many issues they fail to even reply...
|Martin Ice Web|
| 6:35 am on Sep 26, 2012 (gmt 0)|
You said that you manage a few sites that did not get hit by Panda/Penguin. What I would now like to know is whether you can see this traffic fluctuation when all the pandalized sites say "yesterday was a bad day"?
Because I can't imagine that they only shake up the pandalized sites?
@ohno, Jezz123: I am sure that Google added a flag or a value to every pandalized site. If your value is low you can fix it; if your value is high, your domain has been burned. That is why MC said that sometimes the best way is to start over with a new domain. The question is: why did the new #1 rubbish sites not get caught by Panda?
| 7:35 am on Sep 26, 2012 (gmt 0)|
I take anything that comes from MC with a very large pinch of salt.
| 8:53 am on Sep 26, 2012 (gmt 0)|
My site got hit in April 2010, but in the last 3-4 months I have not seen any flux in my stats at all, so I'm not sure that all Panda-hit sites do flux. Or maybe it's not even Panda anymore, who knows, because I have noticed that the front page is now last (page 94) for a keyword search.
| 8:57 am on Sep 26, 2012 (gmt 0)|
I thought MC made that suggestion about starting afresh in connection to Penguin, not Panda.
I don't believe Panda 'burns' domains because it's all about the quality of the content, which can be changed.
My guess is Google asked for feedback on Panda as a way to verify it was doing its job accurately, in the same way we ask for feedback on our sites even though we may feel they are already perfect. Panda was a major change, so Google would be crazy not to ask for feedback. They've always said they're very happy with it though, and haven't updated it for months.
|Martin Ice Web|
| 9:47 am on Sep 26, 2012 (gmt 0)|
claaarky, you are right, I mixed them up.
Zeus, no flux at all? Like yesterday very bad, today good?
| 9:48 am on Sep 26, 2012 (gmt 0)|
The thing about Penguin, as I understand it, is that it looks for artificial footprints, then smacks the site in question. The inherent value is basically irrelevant (apart from possibly raising the "score" that the demotion kicks in at).
Thus, if you are able to "fix" your backlink profile to make it less prone to Penguin, you are just confirming that you are a dirty spammer in the first place. Spammers don't get rehabilitated.
Panda, like claaarky says, is about user experience. I don't personally think it uses actual metrics in the way claaarky suggests, but I agree that changing the UX changes the Panda action. Sites can and do improve, and Google is willing to reflect that improvement.
As such, I would be far more worried about a Penguin hit than a Panda one.
| 9:54 am on Sep 26, 2012 (gmt 0)|
Nope, it just slides into the cold dark Panda sea, where the captain can't see anything because of the low-pressure fog named Matt.
Shaddows: UX changes, what's that?
[edited by: zeus at 9:58 am (utc) on Sep 26, 2012]
| 9:57 am on Sep 26, 2012 (gmt 0)|
Seeing a lot of movement this morning (east coast US) on penguined sites, all negative.
| 10:01 am on Sep 26, 2012 (gmt 0)|
|Thus, if you are able to "fix" your backlink profile to make it less prone to Penguin, you are just confirming that you are a dirty spammer in the first place. Spammers don't get rehabilitated. |
I hope that that's not the case with Penguin. In my case a 3rd party created lots of empty forum profiles, all with the same user name, all with 0 posts. It was not authorised by me - they were supposed to be publicising a competition but to fill their time ran xrumer or something like it. I was able to get most of it removed. I also removed links from within my own sites (all on topic but nonetheless in my control).
I am hoping that it's a penalty pure and simple and that it will eventually get removed. Whether it will or not remains to be seen.
I do wonder if it was a data-mining exercise by Google to find out who had access to what.
| 10:10 am on Sep 26, 2012 (gmt 0)|
One of my friends bought some friendly SEnuke X blast from a guy, and now his site is on the 1st page, at #5. He saw good results after only 1 week; before, he was at 50.
| 10:20 am on Sep 26, 2012 (gmt 0)|
I keep hearing stories like this, rzaweb. It's tempting to spam your way out. To be honest, I think some people will feel they have nothing to lose and go for it.
| 10:24 am on Sep 26, 2012 (gmt 0)|
At this rate I think you are right. Who unplugged the server today? Oh, no one :(
|Martin Ice Web|
| 10:27 am on Sep 26, 2012 (gmt 0)|
Shaddows, how does Google measure UX?
I see the following: When we have "good days", users are going to stay on our site and will go through 3-10 pages.
When we have "bad days" users will immediately go back.
So how to measure?
If it is not metrics (and I believe that is only a little part of it), then it is on-page factors. I know all the factors because they have been discussed on WebmasterWorld. But it seems not to stick for all pages.
I think doing nothing (like ohno is saying) is wrong, because the rules have changed to rules that even Google itself does not know. So I began to ask everyone who calls in about my site and what could be better. My UX is not the users' UX and not Google's UX.
I keep thinking of the thread with the "escaped from Panda" site, where the author says that after reworking the site it took 2 months to see improvements. The UX on that site was not my UX, but it seems to fit Panda's UX.
| 10:36 am on Sep 26, 2012 (gmt 0)|
snift, what is UX? I'm not that into those abbreviations.
|Martin Ice Web|
| 10:59 am on Sep 26, 2012 (gmt 0)|
UX = User Experience
| 11:28 am on Sep 26, 2012 (gmt 0)|
At the risk of sounding like a broken record, I just don't see how Google could reliably judge the quality of every page on every site by analysing the content.
Quality is not something you can program into a computer. It's a human perception that changes every day based on our own life experiences, sites we visit, world events, seasons, and fashion. A page that's quality today may not be tomorrow, for those very reasons.
The way to judge which pages and sites people regard as quality and relevant, is to capture data on how people behave and react. Compare the metrics of sites in a niche and the low quality ones will stand out like a sore thumb. Work out the average, build in an allowance so you don't catch borderline cases and you have the basis of a system to demote the worst sites.
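claaarky's idea maps onto a standard outlier test. The sketch below is a toy version with invented data: pick one behavioural metric (exit rate here, as an assumed example), compute the niche average, and flag only sites that exceed it by a generous allowance, so borderline cases are left alone.

```python
# Sketch of the "compare metrics across a niche" idea: work out the
# average, build in an allowance for borderline cases, and flag only
# the sore-thumb outliers. All sites and figures are invented.
from statistics import mean, stdev

def flag_low_quality(exit_rates, allowance=1.5):
    """Flag sites whose exit rate sits more than `allowance`
    sample standard deviations above the niche average."""
    rates = list(exit_rates.values())
    threshold = mean(rates) + allowance * stdev(rates)
    return {site for site, rate in exit_rates.items() if rate > threshold}

niche = {"siteA": 0.32, "siteB": 0.35, "siteC": 0.30,
         "siteD": 0.33, "siteE": 0.78}   # siteE is the sore thumb
print(flag_low_quality(niche))
```

The `allowance` parameter plays the role of the "threshold" debated later in the thread: widen it and fewer borderline sites get demoted, tighten it and more do.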
| 11:29 am on Sep 26, 2012 (gmt 0)|
For a few days in a row now, my sales have completely stopped at around 5pm Eastern, but traffic has continued. It's crazy: good sales from midnight till 5 the next day and then, boom, nothing. Has anyone else seen this?
|Martin Ice Web|
| 11:46 am on Sep 26, 2012 (gmt 0)|
claaarky, I agree with you. But I would like to add that user metrics are not the only factor. And relying on user metrics is going to bite itself in the tail, because a site that is ranking very well will have good user metrics, because it has good user metrics. It is like a circular reference in Excel!
But this would match my observation that on Thursdays all the good rankings are set back to the "Panda entry point" and then regain traffic/user metrics over the week.
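Martin's "circular reference" can be made concrete with a deliberately naive simulation. Everything here is invented for illustration: it only shows that if the #1 position itself generates the engagement that justifies the #1 position, the incumbent is never displaced, whatever its real quality.

```python
# Illustrative only: a naive ranking loop where the score feeds on the
# traffic that the score itself produces (the "circular reference").
# Rich-get-richer: whoever starts on top stays on top.

def simulate(scores, rounds=5, feedback=0.3):
    scores = dict(scores)
    for _ in range(rounds):
        top = max(scores, key=scores.get)   # ranked #1 this round
        scores[top] += feedback             # #1 gets the traffic, and the
                                            # traffic boosts its metrics
    return scores

start = {"incumbent": 1.0, "challenger": 0.9}
print(simulate(start))   # the challenger never catches up
```

A periodic "reset to the entry point", as Martin describes for Thursdays, would be one crude way to break this loop and re-measure sites on a more even footing.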
| 12:16 pm on Sep 26, 2012 (gmt 0)|
Martin, my metrics from before Panda hit and immediately after were basically the same overall (just looking at the site averages). No real difference to speak of, so in my case Panda demotions haven't made getting out of Panda more difficult. That's not to say it's not the case for others.
What I can say though, is that despite all the things I've done to improve the quality of my site, my overall metrics have only marginally improved. That suggests to me that we haven't really made much difference to our visitors' perception of the quality of the site.
I believe that the biggest hurdle to getting out of Panda (if you assume user metrics tell you everything you need to know about your site's quality) is knowing where the threshold is. I've recently been able to obtain the stats of another site in my niche that hasn't been hit by Panda and what I've discovered has shocked me. Their exit rate is absolutely miles lower than mine.
18 months of trying should have told me that, but now I have a much better idea of the scale of the task ahead. I've effectively been nibbling at the problem, even though it felt like I was really going for it.
|Martin Ice Web|
| 12:29 pm on Sep 26, 2012 (gmt 0)|
|I believe that the biggest hurdle to getting out of Panda (if you assume user metrics tell you everything you need to know about your site's quality) is knowing where the threshold is |
I mean the same, although I think it is not all about user metrics, but also site structure (siloing) and unique content. Apparently all of this reduces in some way back to user metrics.
Do you see lower traffic on Thursdays?
| 12:47 pm on Sep 26, 2012 (gmt 0)|
If a site was deemed poor by its (once) users, then if it were an ecommerce site it wouldn't have sold much. What about the ecom sites that were selling just fine up until these so-called quality updates by Google? Has Google deemed them poor quality, hence the lack of traffic, which = reduced conversions? If that were true, why would said sites seem to have on/off periods of conversions? How would this fit in with new sites? Would Google give them a boost to gather data?

Personally I think the algo is just not working, plain & simple, and making any changes right now would be pointless. What I'd love to know is how Google seems to be able to limit conversions by monetary value rather than quantity of sales, as that is what our figures seem to suggest. It's almost like once we have made £X, our traffic & conversions all but go. Spookily, we also see Gbot starting a cart session; I don't recall that happening prior to this year.
| 12:47 pm on Sep 26, 2012 (gmt 0)|
|I mean the same, although i think it is not all about user metric but site structure ( siloing ), unique content. Apparently all this reduces in some way back to user metrics. |
Last week I reported that I observed about 6 sites rotating upwards through the SERPs for a certain query, to page 4, then dropping. This week, I've watched those same sites go full circle again. Yesterday, two of them dropped to page 16; today one of those two is back up to page 4 and the other is on page 9. Tomorrow, the one on page 9 will be up to page 4, and the one on page 4 will drop to page 15 or so. The other sites will rotate up in the same manner. These are all "similar" sites according to G, so I assume they are testing user metrics for those sites.
| 1:51 pm on Sep 26, 2012 (gmt 0)|
Martin, yes, the 'Thursday' effect has become such a regular and obvious occurrence here that each week my customer service staff say "it's really quiet today" and then say "oh, of course, it's Thursday isn't it".
Ohno, since we've been working on our exit rates our conversion rates have gone up and up. My site now converts better than at any other point in the last 5 years. However, I suspect that my non-Panda'd competitors have much better conversion rates as well, so if I can get my exit rate down to the same level as them we should see conversion improve massively.
I've also had the sense for years that Google has been dishing out the converting traffic. It's another phenomenon that's been noticed by the customer services team. Sometimes we have a great start to a day, hit our normal level of sales by 1pm and think we're in for a good day, then the door shuts and the total at the end of the day is exactly what we would have predicted. It could be natural, but it defies all real-world explanation.
| 1:55 pm on Sep 26, 2012 (gmt 0)|
|Sometimes we have a great start to a day, hit our normal level of sales by 1pm and think we're in for a good day, then the door shuts and the total at the end of the day is exactly what we would have predicted. Could be natural, but it defies all real world explanation. |
100% the same, & has been for some time now. The Thursday effect I can recall as far back as 2009.
|Martin Ice Web|
| 2:00 pm on Sep 26, 2012 (gmt 0)|
ohno, I know what you mean, and I also think that a shareholder-profit-driven business is not willing to give money away. You can see that in the introduction of the new Shopping fees or the fees for using Google Maps. It is also clear that a "little" factor in ranking is AdWords earnings. But as I stated before, I think the rules are new, and old sites with Panda sitting on them are not going to get back until something changes. It has been too long now for them to get back.
There are truly some false positives, and I see all the things like traffic shaping, throttling and even sales throttling, like some others here. Heck, I told my wife; she rolled her eyes and said she can't believe it because it would be too conspiratorial. I still think Google found a way to get more data than we know.
But that is the point: once you have this opportunity, you use it.
But I prefer the idea that user metrics are only the verification of an algo that is certainly based upon backlinks, and that algo seems to have some severe problems. As crobb305 is observing with the rotation of some sites, I think this is the reset of the "user metrics confirmation". All pandalized sites have this, but on different days and in more or less heavy ways. To escape from it you have to find claaarky's threshold. Finally, something tedster was talking about, shallow or thin content, hit me like a train: what we are adding as new content is a smattering, because you can't push something up that can't be pushed. And there is already a non-pandalized site at the top with the targeted content. The key is to reinvent the wheel.