|This 75 message thread spans 3 pages|
|Official Google Panda Update Version 3.8 On June 25th|
| 12:46 am on Jun 26, 2012 (gmt 0)|
Just spotted this from SearchEngineLand:
Official Google Panda Update Version 3.8 On June 25th
June 25, 2012 at 5:49pm ET
|Google has announced they pushed out a new refresh to the Panda algorithm recently. |
This update "noticeably affects only ~1% of queries worldwide," said Google on Twitter.
There were earlier rumors of an update over the weekend but Google said the rollout started today and not over the weekend.
[edited by: Robert_Charlton at 1:45 am (utc) on Jun 26, 2012]
[edit reason] fixed link, added quote box, shortened quote [/edit]
| 5:51 am on Jun 27, 2012 (gmt 0)|
Well, I can't see any changes. I already received a -50 penalty last time, and this update put my site behind again. I'm not sure what is happening at Google; all the reputable sites are dropping in the SERPs, and all the spam sites are coming out on top. That's pretty weird.
|Martin Ice Web|
| 8:28 am on Jun 27, 2012 (gmt 0)|
Everything, and I mean everything, we did made it worse.
We deleted links - failed.
We deleted keywords - failed.
We fixed every error in WMT - failed.
We reworked the page design - failed.
We deleted duplicate content - failed.
Oh, stop - it didn't just fail, it pushed things in the opposite direction from where they were supposed to go.
So after 4 months of waiting, I think it is time to try something else.
| 8:58 am on Jun 27, 2012 (gmt 0)|
"Has Google ever gone on the record saying that user metrics matter for Panda? Or is that just a huge assumption everyone is making."
Believe me, very few people seem to be making the assumption about user metrics mattering for Panda. I've been trying to get people to think that way - some are getting it, some don't want to. Panda is all about user metrics and producing a good user experience.
Sites hit by Panda are vulnerable to Penguin. Their user experience isn't good enough to overcome it.
| 10:20 am on Jun 27, 2012 (gmt 0)|
The one thing you don't mention doing is the one thing that google said was the main cause of being hit by panda - improving the quality of content.
I'm not saying I agree with the results of panda, because I most definitely don't in many cases, but to get our main site out of panda we had to spend 6 months of long hours rewriting and dramatically improving almost all the articles on our site (and removing sections that weren't worth improving - noindex didn't work for us)
| 10:26 am on Jun 27, 2012 (gmt 0)|
Claaarky, that's not true. All we hear about Panda is a) user metrics and b) thin content. Over and over and over.
What I've not yet seen is evidence of a recovery based on improved user metrics.
I appreciate you think you're on to something with exit rate, and it's a nice feeling when there seems to be some light at the end of the tunnel. But I seriously wouldn't get your hopes up, certainly not if your entire plan revolves around user metrics.
I've read time and time again that user metrics are being used, from SEOs I have massive respect for, such as Tedster, but where is the evidence? Case studies?
Panda has remained one of the biggest mysteries in SEO history.
| 10:38 am on Jun 27, 2012 (gmt 0)|
This was the first Panda update in which there was not a deep crawl to my site several days before. The results of the update for my site appear to be neutral. Were only certain sites picked to be part of the update? Did they do the refresh using the same data as 3.7? There has been some speculation something didn't go the way Google wanted with 3.7 and that's why 3.8 was run so soon after.
| 10:50 am on Jun 27, 2012 (gmt 0)|
Rasputin wrote "The one thing you don't mention doing is the one thing that google said was the main cause of being hit by panda - improving the quality of content."
From what I have seen, the low-grade to no-content sites are floating to the top.
I have read that some of you are trimming good old content sites down to the bare minimum.
Is it working?
| 11:05 am on Jun 27, 2012 (gmt 0)|
"Panda has remained one of the biggest mysteries in SEO history."
I suspect many people know how it works. I'm realising now that I'm not discovering something new. People with a lot to gain/lose are keeping it under their hats.
| 11:17 am on Jun 27, 2012 (gmt 0)|
"improving the quality of content." That is just a standard answer; don't put too much weight on it. When you look at the rankings, you see what counts.
| 12:02 pm on Jun 27, 2012 (gmt 0)|
I wonder if the googheads have worked out that 100 ~1%s = ~100%
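A back-of-envelope sketch of that point, under the hypothetical assumption that each update hits an independent random ~1% of queries: the naive sum says 100 such updates cover 100% of queries, but once overlap is accounted for, the expected coverage is noticeably lower.

```python
# Hypothetical model: each of 100 updates affects an independent random ~1%
# of queries. Compare the naive sum of the announced impacts with the
# expected fraction of queries touched by at least one update.
updates, impact = 100, 0.01

naive_sum = updates * impact                  # 100 x 1% = 100%
at_least_once = 1 - (1 - impact) ** updates   # accounts for overlap

print(f"naive sum: {naive_sum:.0%}")
print(f"touched at least once: {at_least_once:.1%}")  # about 63.4%
```

So even a long run of "~1%" refreshes would not literally add up to every query, though the cumulative reach is still large.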
| 12:12 pm on Jun 27, 2012 (gmt 0)|
Not much to add but this one summarizes my reaction.
I have never been hit by Panda but still watching out. And I was hit by the penguin but nothing seems to help.
| 12:18 pm on Jun 27, 2012 (gmt 0)|
Claaarky, it seems somewhat odd to me that you've jumped to these conclusions with zero recovery. I understand your theory, and like many other people, it's something I've explored. There may be 'some' truth in it.
I also agree, that many will know bits and pieces about Panda and hold on to it. I have several findings from my own testing that I don't share, simply because I don't want Google to change things again. But I don't believe anybody has 'cracked the Panda code'.
But as far as exit rate goes, it may be factored into Panda. It's certainly not the missing link, though.
Panda is a quality score and any one thing can tip the balance either way. So if exit rate is a factor, you may be able to swing the balance by improving it. But my guess would be you'd need to fix a lot more to swing the balance. It's not all about fixing things, either; it's about earning trust and proving your value.
I'm not dismissing your claims, I just don't want them to be blown out of proportion.
| 12:33 pm on Jun 27, 2012 (gmt 0)|
|Claaarky, it seems somewhat odd to me, that you've jumped to these conclusions with zero recovery |
Identifying the factors and recovering are two different things. Just because you can identify the problem doesn't mean you can fix it, or even know how. I identified my own site issues almost immediately after Panda 1.0. It's taken me over a year to determine the right way to fix things (I've now had an 18-20% recovery from the last two Panda updates after being hit by every other Panda since 1.0).
| 12:54 pm on Jun 27, 2012 (gmt 0)|
|Panda is all about user metrics and producing a good user experience. |
I just don't buy it. I've seen sites with great user metrics crushed by Panda. I've seen sites with poor user metrics crushed by Panda. From what I can tell, user metrics on their own don't do much for you. They might be *a* factor, but they're not *the* factor.
The only thing I've found in common with either type of site is thin content (which technically could fall into the 'good user experience' bucket). I know that thin content isn't as sexy of an answer as user metrics, but from the sites I've seen, it's the only real commonality.
|Martin Ice Web|
| 1:24 pm on Jun 27, 2012 (gmt 0)|
That was the one thing I thought was the primary condition. So I always wrote my descriptions on my own, never copied descriptions from manufacturers. I did my own test reports. I took detail pictures that go far beyond what others do.
The only thing I can think of is that it bothers goomazon that I have to use one keyword because it is my widget; goomazon counts it over 130,000 times on a site with 15,000 pages. But what can I do? Rename my widget (which is also a keyword) to something like RUMBA?
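For scale, the arithmetic behind that keyword count (numbers taken from the post above; "widget" standing in for the real term) works out to roughly nine mentions per page on average:

```python
# Numbers from the post above: the keyword is counted over 130,000 times
# across a 15,000-page site.
occurrences = 130_000
pages = 15_000

per_page = occurrences / pages
print(f"average mentions per page: {per_page:.1f}")  # about 8.7
```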
| 1:27 pm on Jun 27, 2012 (gmt 0)|
In my experience Panda is about unique, relevant content. Gone are the days where you can repeat text across hundreds, or even thousands of pages, swapping a few words here and there. Even re-using descriptions across pages is a definite no.
Panda was designed to weed out content farms after all, so has to be a content based algorithm really... think about it for a moment. The answers are often right there in front of your eyes.
I also think, for each page Google rank there is an associated cost to retrieve and store the information. Multiply this cost by the huge volume of pages and things get quite expensive... why should they spend money retrieving and storing information about your page? Google are looking for value in each page.
I think people over-analyse things, and they can be forgiven, as it's a very easy path to go down.
| 1:36 pm on Jun 27, 2012 (gmt 0)|
I think one of the biggest things that people fail to take into account is that not all niches are subject to the same rules. In some niches generic text with keyword replacement will work just fine.
But as I say, it's not about this or that, it's a whole multitude of factors that all add up to tip the balance.
Websites with greater trust, will get away with more than websites with less. This makes determining the causes of Panda even more difficult.
My #1 tip would be to build trust. That's the one thing I'm absolutely certain will help with Panda. Improving content and UM are also important.
| 2:08 pm on Jun 27, 2012 (gmt 0)|
|In some niches generic text with keyword replacement will work just fine |
I agree 100%, until competition increases and the Panda dial in turn increases in severity. Either way it's best not to go down this road as one day it will fail.
| 2:20 pm on Jun 27, 2012 (gmt 0)|
Can we hear from more people about whether THIS Panda update hurt or helped them? If you were already making Panda-related changes, how did that work out?
| 2:27 pm on Jun 27, 2012 (gmt 0)|
We got a bump on this one but went down on the one before; it seems like we alternate: 1 up, 1 down, 1 up, 1 down.
| 3:44 pm on Jun 27, 2012 (gmt 0)|
In the recent thread about adding internal links to help recover from Panda [webmasterworld.com] I posted this...
|OK, I experimented with this idea a little, so I thought I'd share the results so far. |
On June 17, 2012 I added 52 internal links to 51 primary pages and about 900 pages one level down. All of those links point back to one or another of the primary pages.
So all 52 of the primary pages picked up over 900 internal links.
On June 17, 17 of the 52 primary pages ranked on page 1 for their primary localized query (location+kw1kw2).
Since then, the number of pages on page one has bounced around from 21-25 but seems to have more or less settled on 22 so far.
Probably too soon for these numbers to mean much, but I thought I'd share this now in light of the Panda attack on June 25.
So it doesn't look like the current Panda attack has hurt me so far. Of course there's not much left for the Panda to chew on, so.....
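As a rough way to read those numbers: the share of the 52 primary pages ranking on page 1 moved from about a third to over 40% after the internal links were added. This is just a sketch of the figures reported above, not a controlled result:

```python
# Figures reported in the post above: 52 primary pages tracked; 17 ranked
# on page 1 before the internal-link change, settling around 22 afterwards.
primary_pages = 52
on_page_one_before = 17
on_page_one_after = 22

print(f"before: {on_page_one_before / primary_pages:.0%}")  # 33%
print(f"after:  {on_page_one_after / primary_pages:.0%}")   # 42%
```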
| 4:32 pm on Jun 27, 2012 (gmt 0)|
I've been working hard on beating Panda 1 for about 1 1/3 years now. I got a bump in May '12, but it just brought back long tails. I have a few hundred thin pages yet to clean, but seeing as I've already cleaned 17,000 pages so far, I figured I would see more of a boost.
I took your advice about rich snippets and tying my G+ account to my pages. I just put that in motion on Sunday, probably not quickly enough for Panda to have eaten up a full crawl.
I consulted with @RustyBrick on this. Barry seems convinced that everything has to be clean before you break through. So I just have to keep working away.
One thing is for sure, my site is much better for users now.
| 4:32 pm on Jun 27, 2012 (gmt 0)|
I'm seeing a 25% to 30% improvement with this update. I got hit hard by the April Panda & Penguin updates. I'm still not anywhere back to normal. I have been working extremely hard to bulk up the content on all my indexes. I don't know if that's what paid off, but I am doing better (fingers crossed).
| 8:14 pm on Jun 27, 2012 (gmt 0)|
DigiSEO. What else did you do, other than bulk content up?
| 8:26 pm on Jun 27, 2012 (gmt 0)|
I'm up about 20% from the last few days.
| 8:48 pm on Jun 27, 2012 (gmt 0)|
We had a 10-15% recovery from Panda 3.7. It was looking like we had another 5% or so recovery from Panda 3.8 as of yesterday, but now today that gain seems to be gone. I've been making incremental Panda related changes since Panda 1.0 and June 2012 is the first recovery we've had.
Seeing odd traffic for a Wednesday so I'm not sure what's up and if we have any recovery or not from this latest refresh. Googlebot is spidering my site heavily today too.
| 9:39 pm on Jun 27, 2012 (gmt 0)|
Googlebot is spidering us like crazy as well today and yesterday. Traffic is up around 20% the past couple of days so we'll see if it holds.
| 9:51 pm on Jun 27, 2012 (gmt 0)|
@getcooking - your pattern of Panda hits is very similar to ours...
We were hit at Panda 1.0 and thereafter. We experienced a 15% recovery at Panda 3.6 (April 27, 2012). This was then taken away at Panda 3.7.
So the question is what caused the small recovery. Its too hard for me to tell as we've made so many changes across the board.
As for what caused the Panda 3.7 hit: we made a design change in early May that included a small sidebar with 20-30 links to our best pages. This is the only major change. Could this have led to another Panda hit (too many links on the page)?
| 10:41 pm on Jun 27, 2012 (gmt 0)|
tedster, it hurt us, I had noticed the dip june 25/26 and wondered if there was another update, sure enough, there was. Our site is responding to each and every penguin and panda update since april. Negatively.
Our site is guilty however, not so much in spirit, as in intent, ie, we did things to try to get advantage in google knowing that is what we were doing, but we also added tools and features that were good for users, so it's a gray area type site, I'm not whining or complaining, although I do feel personally vindicated since I have long disliked the methods of generating content just to get seo bonus points. Long have I tried to get owner to focus on generating unique quality content but he just will not see the light, and there's not a lot I can do about it.
Other than taking the lesson and building out my own well situated sites of course, which for some odd reason never seem to see any bumps of note in these google updates over the last decade. Weird what decent hand written content can do, useful pages, information that answers questions quickly, with links pointing OUT to the relevant resources, and the only seo concerns done to make sure google sees and indexes the page topic.
The real trick I find, however, is to release or write things that people actually find useful, then there's this odd result, natural inbounds on quality sites just appear as if by magic.
Just checking some stats, I had 1400 references to one of my things last month on web forums, many of those contained links to the host site. So google sees this thing, notes its name, which is reasonably unique, notes that people link to it from authority sites in the topic area, and lo and behold, high page rank, never did any seo on it at all... beyond maybe using a few well placed backlinks to launch the thing, but only after creating real users on real forums to really post things.
To put this into perspective, I did the same search filters on webmasterworld.com and they showed 2000 references last 30 days. And WebmasterWorld is a unique, valuable resource. So it doesn't need to plant links on repetitious link farms (man... talk about easy to algo detect, I've been going through them, zero creativity, boilerplate, of course that garbage was easy to penalize for... almost certainly blackhat made our case worse, negative seo is looking more and more likely to me)
However, one thing I do see, decent page rank and inbounds from top authority sites is doing a very good job protecting site y, another site we do in the same general topic area as site x, the one that is getting slammed by every single penguin/panda update.
My conclusion: I'd be way better off doing this stuff myself on my own sites than trying to get people who keep looking for shortcuts to change their behaviors.
Just checked: a few days ago I did a quality posting on a well-regarded site, and it now ranks for a two-keyword phrase with 8 million plus results. Not front page, but #12, page two. It has, I believe, only one backlink to it, but that's on an authority site in the topic area.
I think a lot of people aren't getting that content isn't good because you think it's good, it's good because other people think it's good. The ability of the black hats to place bad sites on the top ten should be ignored, they do tricks and do not care about the sites they promote, period. Creating a quality site with products that people organically link to with zero, no, prompting from you, is the only viable long term plan, otherwise you are just relying on a free google ride.
I know my client got caught up in the black hat SEO game, and certainly has always preferred the easy route over the quality route, and now he's paying the price. This was a good site and could have been a great site if they had generated their own content as well as their own tools etc. for users. But no, the temptation of easy content generation for keyword search placement is just too high; it's like a drug.
I have no idea what people are talking about when they refer to quality content as something one writes tons of every day without any thought or research.
I am, again, not discounting that black hats are placing sites on the top ten of their targets, they are always going to be ahead of google in many ways, there are just too many tricks and it's too easy to emulate real sites and real backlink structures.
My guess re small sites ranking, which just popped in my head after this morning's seo chat with client, is this: small sites tend to have real content written by a real person. Not always, not in all cases, but my guess is it's far more likely than not. Large sites, not so much.
I can generate a few quality articles a month max, stuff better than anything out there, or as good. Not more. I can also do a lot of quick forum postings like this, which tend to bounce back to plague me when I do searches on the topics I'm interested in, lol.
To do a quality forum post, that might be worth reading, that's a few hours. And I can only do it on a few topics, and usually am too lazy to make it good enough.
I know I've read, and linked to, many times, my favorite site in a similar topic area to the one that was hit, for over 10 years now. I read it and link to it because the main guy who runs the company writes a weekly blog type item, and what he writes is seriously high quality. He never has to worry about seo is my guess, because every article he writes is going to get first class authority links pointing to it, or at least often enough to not matter. I don't care about his products at all, I care about his analysis. And nobody can copy that. They can scrape it, but it won't work because he's been there so long google knows its his stuff. So that's one good weekly analysis/overview done, I think one a week is about as much as you can really hope for, otherwise it just gets redundant and the quality drops, I know all the real bloggers and authors I read online only do one a week, and that one tends to get instant floods of backlinks to it.
[edited by: lizardx at 11:26 pm (utc) on Jun 27, 2012]
| 11:15 pm on Jun 27, 2012 (gmt 0)|
Small sites also have less data available for Google to make conclusions from.
| 11:29 pm on Jun 27, 2012 (gmt 0)|
There is that, but think about it, it's easy to populate a small site with some pages of typing.
It takes very little in terms of character/word strings to generate a fully unique signature, really just a few sentences is all you need. And if you write those on your own, out of your own brain, they almost have to be unique.
I think that may be it in parts anyway.
I know we hit the main bump after a foolish addition of roughly duplicated content that roughly doubled the site's main content pages over a month, following a fairly long period of relatively low activity. What exactly was going through the client's mind when he had that done is totally beyond me. He's a very smart guy, but this was a huge slip.