
Panda Refresh in 2-4 Weeks - Finally after 6+ Months!

     
11:07 am on Jun 3, 2015 (gmt 0)

Preferred Member

5+ Year Member

joined:Jan 10, 2012
posts:484
votes: 29


After 6+ months we are finally going to see a Panda refresh in the next few weeks.

At SMX Advanced tonight, Google’s Gary Illyes announced that the next Panda update will happen in the upcoming weeks. He said he expects it in the next two to four weeks.

Illyes referred to it multiple times as a data refresh, not an algorithmic change. So sites that have been suffering from this algorithm may see a recovery in the near future. However, not all sites will see a recovery: Some may not recover, and new sites may also be hit by this data refresh.

[searchengineland.com ]
1:39 pm on June 3, 2015 (gmt 0)

New User

5+ Year Member

joined:June 8, 2011
posts: 28
votes: 4


Any word if it's going to become part of the algo like they keep promising rather than refresh updates months apart?
2:56 pm on June 3, 2015 (gmt 0)

Preferred Member

5+ Year Member

joined:Jan 10, 2012
posts:484
votes: 29


Google said

Illyes also explained that it is in Google's best interest to keep this data fresh, so they want to keep it updated as frequently as possible. But it does require manual updates and currently will not run by itself like some of their other algorithms.


Doesn't look like it's something that will be part of the live algo anytime soon.
3:02 pm on June 3, 2015 (gmt 0)

New User

5+ Year Member

joined:June 8, 2011
posts: 28
votes: 4


Nightmare. A 6-month gap between refreshes after a Panda kicking is just too long.
9:44 pm on June 3, 2015 (gmt 0)

Full Member

5+ Year Member

joined:May 20, 2010
posts:219
votes: 7


Never believe anything Google says, ever.
10:01 pm on June 3, 2015 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12201
votes: 358


A much more helpful reference on this same information, also on SEL, is from the live blog of Danny Sullivan's Q&A with Google's Gary Illyes (different from the SEL story linked to above)....

SMX Advanced 2015 Live Blog
Matt McGee on June 2, 2015 at 7:45 pm
[searchengineland.com...]

DS: Panda – are we still working toward a one month, every couple months update schedule?

GI: Our goal is to update it more often. It’s not that easy to refresh Panda that often, we have to get our data right to do a data refresh. But sometimes the data is noisy or something’s wrong with the data, so we have to wait.

Illyes anticipates that Penguin should be updating continuously, but that "Panda, for the foreseeable future, will be manually updated."

I'm thinking that getting a clean Panda data set requires the confluence of many separate databases, tracking many ranking factors (some of which take time to appear), and a fair amount of resources.
10:48 pm on June 3, 2015 (gmt 0)

Full Member

5+ Year Member

joined:May 20, 2010
posts:219
votes: 7


Umm, Panda was being updated MONTHLY way back in 2013. See here:

[searchengineland.com...]

What changed? Their lies and deception changed.

Something wrong with the data? There's no right or wrong with data. If it's wrong, it's the algo that is broken.
2:04 am on June 4, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3353
votes: 705


If Panda is meant to be an assessment of Web sites' quality, the index data shouldn't need to be refreshed constantly, because few sites change in quality from moment to moment or even month to month.

The Panda algorithm, on the other hand, should be updated whenever Google is convinced that it's improved the recipe.
9:42 am on June 4, 2015 (gmt 0)

New User

joined:May 27, 2015
posts:25
votes: 0


I'm sure that most sites haven't recovered from the Phantom and Mobile updates. And now Panda is out to rear its ugly, spotted head again... or maybe it was the Phantom update that went live a week ago.

If it wasn't, then it's still coming, so brace yourselves: it's going to hit us hard again.
10:02 am on June 4, 2015 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:May 9, 2000
posts:25723
votes: 821


Most sites hit by Panda will welcome this long-overdue update, but I'm also sure it will affect other sites not yet penalised.
10:09 am on June 4, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 14, 2006
posts:692
votes: 59


If it's wrong, it's the algo that is broken


The algo has been broken since Panda 1.0 ...
10:29 am on June 4, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 2, 2014
posts:654
votes: 313


I suspect that both Panda and Penguin scores are overridden by brand assessments in Google's algorithm. That would explain why many thin pages on branded domains have been ranking so well and small businesses are hard to find on page 1 of Google's SERPs. As long as Google allows the brand signal to take precedence over actual content quality, many lesser-known/small websites will continue to be handicapped in Google's search results. With that being the case, I doubt a Panda refresh will be felt in a major way. Between big brand dominance and excessive advertisement spam above the fold in Google's search results, it has become, and will remain, harder to be found in Google's organic SERPs.
11:03 am on June 4, 2015 (gmt 0)

Full Member from GB 

5+ Year Member Top Contributors Of The Month

joined:Mar 26, 2013
posts:264
votes: 34


Good days for copywriters. Although I suspect any content changes now will probably not make the actual data refresh which has probably already happened or will do very soon.
12:03 pm on June 4, 2015 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 23, 2004
posts:574
votes: 2


Personally, after being Panda-affected since 1.0 and seeing some nice improvement in May 2014, I'd almost rather they leave it alone. I've done far too much to satisfy this thing. However, one thing I had never looked at was copied content, which I spent four weeks working on earlier this year.

Six months? More like eight. Wasn't the last one in October 2014?

Holding my breath again.
1:25 pm on June 4, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:Mar 9, 2010
posts:1806
votes: 9


Good days for copywriters. Although I suspect any content changes now will probably not make the actual data refresh which has probably already happened or will do very soon.


I am surprised that there are people who still believe that Panda has got something to do with copywriting!

[edited by: indyank at 2:33 pm (utc) on Jun 4, 2015]

2:11 pm on June 4, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1295
votes: 377


I'm sure it will be another punitive iteration to sites that have tried nothing other than to provide good, compelling content and a great user experience.
2:39 pm on June 4, 2015 (gmt 0)

Full Member

5+ Year Member

joined:July 29, 2012
posts:251
votes: 12


It could go either way. I get the feeling they might be trying to fix some of the mess created by the Cutts web spam team. I have seen a few things fixed in WMT that they obviously had wrong. I think the Mobile algo was a bust. Structured data still isn't working properly: it's very obvious that they are either throttling structured-data results or the mechanism needs updating.

I suspect there is pressure from the EU and maybe the FTC, so they have to make it look like progress. It's so simple: you can see in WMT who is trying to be legit and maintain decent sites. Use that data, and then hit the sites producing the referrer spam and all the other garbage.
2:55 pm on June 4, 2015 (gmt 0)

New User from GB 

joined:June 4, 2015
posts:1
votes: 0


It could be good or bad depending on whether everything goes smoothly on Google's end.
2:55 pm on June 4, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 5, 2009
posts:1669
votes: 329


Well, I'm hoping for a flood of happy webmasters on the next rollout! I have a couple of sites that I've stripped and will be watching intently. A bit of hope would be appreciated. I'm counting on Google for this update.
4:02 pm on June 4, 2015 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 23, 2004
posts:574
votes: 2


Ditto Savage

I'd love to be inspired by Google again. Not that I lost it. A boost of confidence always helps.
9:19 pm on June 4, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:July 29, 2007
posts:1823
votes: 107


I'm just not buying the official reasoning behind the slow, manual nature of Panda refreshes, sorry. It's a piece of script designed to sort data based on x factors and "weed out" sites that don't meet the criteria, that's it. Manually, daily, monthly, yearly... makes no difference, it will do the same thing.

What I suspect is really happening is that they need Panda NOT to create weeks and months of manual cleanup afterwards. Human raters go into overdrive on a keyword-by-keyword basis to make sure no spam rises to the top after an update. They only pay attention to page one results, start with the most searched-for keywords, and work their way down. I suspect that Panda has the potential to make quite a mess of things for the human raters, and that's why they can't apply Panda more frequently. I'm sure you've noticed that content doesn't get indexed (well) as quickly as it once did; top 3 seems to need manual approval.

Google prides itself on algorithms and isn't about to tell us it leans on humans quite a bit, but it has no choice. With every major update webmasters spot things that make no sense and Google quickly cleans it up. Panda is probably the biggest mess maker for them.

At any rate, this is good news anyway. I'm really hoping a couple of my competitors eat dirt for their thin image rich content approach that gets mass social appeal but is barely worthy of a serp spot.
11:13 am on June 5, 2015 (gmt 0)

Full Member

10+ Year Member Top Contributors Of The Month

joined:June 3, 2005
posts:298
votes: 12


If we can keep the chit-chat to a minimum, I have something serious to say about those sites waiting for recovery, including mine.

Due to the nature of Panda-affected sites, they are rarely spidered and re-indexed. This means that when the data is refreshed, Google can only refresh against the data it already holds.

You can use the cache: operator directly in Google to see whether your changes have made it into the index.

If not, then like me it's a painful process of "Fetch as Google" in Webmaster Tools, with a limit of 500 URLs.

(Please note I am not an SEO professional, just someone with a keen interest in SEO and quality original content.)
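
For anyone who wants to check a batch of URLs rather than typing cache: queries by hand, here is a minimal sketch of that check in Python. It assumes the public webcache.googleusercontent.com endpoint that the cache: operator resolved to at the time; that endpoint, the example URLs, and the has_google_cache helper are my own illustration, not anything official, and Google may block or rate-limit automated requests, so treat the output as a rough signal only.

import urllib.error
import urllib.parse
import urllib.request


def has_google_cache(url: str) -> bool:
    """Return True if webcache.googleusercontent.com serves a page for this URL."""
    # Build the same request the cache: operator resolves to (assumed endpoint).
    cache_url = (
        "https://webcache.googleusercontent.com/search?q=cache:"
        + urllib.parse.quote(url, safe="")
    )
    request = urllib.request.Request(
        cache_url,
        headers={"User-Agent": "Mozilla/5.0"},  # a bare urllib user agent is often rejected
    )
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status == 200
    except urllib.error.HTTPError:
        # 404 generally means no cached copy; 403/429 usually means the request was blocked.
        return False


if __name__ == "__main__":
    # Hypothetical URLs, for illustration only.
    for page in ("http://www.example.com/", "http://www.example.com/updated-page"):
        status = "cached" if has_google_cache(page) else "no cache (or request blocked)"
        print(page, "->", status)

If a recently changed page comes back as "no cache" (or the cached copy is stale), that's the kind of URL worth spending one of the 500 "Fetch as Google" submissions on.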
12:26 pm on June 5, 2015 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 23, 2004
posts:574
votes: 2


Rarely spidered and re-indexed?

I have no problems with that. In response to your PM, Johan: I just got an overall rise in the tide with Panda 4.0. Actually a 100% boost, but still down about 55% from pre-Panda 1.0 levels. I've been through the wringer, spending well over 1,000 (probably many more) hours on a site that is now 255 pages or so. If I don't pull out, it's just a matter of focusing on more important things such as keeping visitors on the site. I consider the site unique in that I was one of the very early ones in my industry to actually put authentic (I really do the actual work) content out there. With very good visibility prior to Panda 1.0, it could have been a case of stolen/copied content. There was a TON I discovered in December of 2014, and I spent a month contacting hosts with DMCA takedowns. Overall, about a 95% success rate.

I had two of the senior members here at WW look at the site briefly through PMs about three years ago. I didn't ask them to look; actually, I think a few of my original posts were rejected and we exchanged a few messages. One was the late Tedster. Great guy. Even those guys never mentioned copied content. I also had the Ninjas go through the site two years ago. Copied content wasn't a focus at all. They did, however, do a great job of getting the site structured better than it was, and they helped me tremendously with a lot of 301s to the new architecture.

I'm not a web design expert, nor do I know all the bells and whistles. I'm just a simple guy who enjoys informing people about my industry, showing them all sides of the real story without any biased agendas. It is nice to make a living from it rather than working like a dog at 58 years old. Times change and I realize that. I no longer look at who is ranking ahead of me and such, but I get dismayed when I find some of my better content on page three of the Google index.

Will the removal of the stolen content help? It's anyone's guess.
12:56 pm on June 5, 2015 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:May 24, 2012
posts:648
votes: 2


GI: Our goal is to update it more often. It’s not that easy to refresh Panda that often, we have to get our data right to do a data refresh. But sometimes the data is noisy or something’s wrong with the data, so we have to wait.


That's enlightening for me. It hasn't been refreshed because they don't have confidence that the data is right. And, it's bad enough that they don't see a fix that would allow for monthly updates. That implies a manual effort to scrub/tweak/fix the data...probably only against some fixed list of popular sites until it matches some pattern they like. My confidence level that this sort of process results in better SERPS is, uh, low.
2:29 pm on June 5, 2015 (gmt 0)

New User

5+ Year Member

joined:June 8, 2011
posts: 28
votes: 4


I'm fairly sure Panda = punishment, and Google doesn't want to update it too often, so as to make it hard for nefarious webmasters to find ways to dodge it. It's just a shame that valid sites that get caught up in it then have such a hard time escaping because of the long waits between updates.
7:32 pm on June 5, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3353
votes: 705


I'm fairly sure Panda = punishment, and Google doesn't want to update it too often, so as to make it hard for nefarious webmasters to find ways to dodge it

That may be true of Penguin (which is intended to punish and discourage questionable SEO techniques), but Panda is about quality, not about SEO. People who own low-quality sites aren't necessarily doing anything nefarious; they just don't deserve to have their sites rank as high as sites of higher quality.
7:43 pm on June 5, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3483
votes: 309


but Panda is about quality, not about SEO. People who own low-quality sites aren't necessarily doing anything nefarious; they just don't deserve to have their sites rank as high as sites of higher quality.

EditorialGuy -- Please forgive me if I'm wrong about this, but didn't your own site get hit by Panda?
9:25 pm on June 5, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


Plenty of good sites get hit by Panda; it's an imperfect algorithm. Too many good sites suffer and there are too many whitelisted sites. That's why it needs to be improved and run regularly, because there is a lot of room for improvement in this algo.
9:53 pm on June 5, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:June 6, 2010
posts:759
votes: 100


All these updates are for Google, not for publishers. Anything the big G does these days is designed to increase the bottom line for G.

If some sites see improvement, it's just a side effect.

Bottom line: updates are for Google's benefit, not publishers'.

If you think otherwise I have some nice property in the middle of the Mojave Desert you might be interested in.
1:22 am on June 6, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3353
votes: 705


Plenty of good sites get hit by Panda; it's an imperfect algorithm.

Of course it's an imperfect algorithm. That's why it has evolved and will continue to evolve.

But this thread isn't about the Panda algorithm: It's about Panda data refreshes. Is there a compelling need for the data that the algorithm looks at to be refreshed constantly or even frequently? How many sites change drastically, in terms of quality (or even in terms of the quality signals that the algorithm looks at), from day to day or week to week?