Forum Moderators: Robert Charlton & goodroi


John Mueller talks Panda and Penguin penalties on hangout-30Dec14


Whitey

6:27 am on Jan 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



English Google Webmaster Central office-hours hangout

Streamed live on Dec 30, 2014
https://www.youtube.com/watch?v=Ba_qLBFlIe4&t=08m37s [youtube.com]

I thought this was one of the more notable conversations involving a potential specific Panda penalty issue on Barry Schwartz / RustyBrick's [webmasterworld.com...] website, the things that the algorithm may be looking for, and later some further insight into Penguin penalties and the disavow tool.

John Mueller said that he would pass the Panda query on to his team, and hopefully we get some insight into the potential Panda quality issues Barry will need to resolve. It would be good to have some response from the G team.

All sorts of gems in the mix: things like proportions of content, comments, time to recovery, process clarification, etc. The link starts the video at 8 mins 37 secs, so I may have missed some earlier gems, which I'll check when I next have a spare hour or so.

Anyway, the hangout might be worth having a look at, and passing on your analysis and comments. One thing, ..... it's complicated [ as if you didn't know ]. Any inputs appreciated.

Can we keep this thread on topic and avoid the temptation to run Google down - rather, let's try to work out what this video teaches us.

[edited by: Robert_Charlton at 9:36 am (utc) on Jan 2, 2015]
[edit reason] fixed YouTube url [/edit]

Whitey

6:43 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



ps. Discussion of this particular site could really benefit from being in separate minor thread.

@roshaoar - agreed. But let's support the momentum of interest in Barry's site for a while; then when that drops, maybe we can pick up on some of the other things. I think the dilemma is that there is a lot of rich material to be talked about in JM's hangout, and ad hoc discussions may interfere with the train of thought on a single subject. That's just my suggestion - it's the Mods' call.

like the fact the site has all of the tag and category pages available for indexing (12,000 pages of it!). This is crap duplicate content that Google has long punished.

@elguiri - I wonder how much of Barry's site has been blocked from the SE's with "no-index". @Rustybrick / Barry?

rustybrick

7:28 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Planet13 I think John was looking for a nice way to tell me my content isn't unique enough for Panda. So he blamed the comments.

I know Googlers, such as John and the web spam team, read my site daily. So I think this is a bit awkward for them.

rustybrick

7:28 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Whitey, I blocked only one page from the index [seroundtable.com...]

Planet13

7:31 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ rustybrick

Thank you.

Please keep us updated.

rustybrick

7:34 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I will, fun being transparent like this.

aakk9999

7:42 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wonder how much of Barry's site has been blocked from the SE's with "no-index"

Hopefully none as it is "noindex" and not "no-index" :)

I would agree with elguiri though with his view on noindexing category and tags pages.

Barry, the page in robots.txt is blocked from crawling, not from indexing :)
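To illustrate the distinction (a sketch with made-up paths, not Barry's actual setup): a robots.txt Disallow only stops crawling, so the URL can still end up indexed from external links; keeping a page out of the index requires the page to be crawlable and to carry a noindex directive.

```
# robots.txt -- blocks crawling only; the URL can still
# appear in the index if other sites link to it
User-agent: *
Disallow: /example-blocked-page/

<!-- On a page that remains crawlable: removes it from the index -->
<meta name="robots" content="noindex">
```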

Whitey

1:46 am on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hopefully none as it is "noindex" and not "no-index" :)

@aakk9999 - Thanks for picking me up on this :)

Yes I agree, cull all the low quality pages with high bounce rates. With Panda lurking for over 2 years now I would think this would be a priority. Got to focus in on the quality content.

I think reverse engineering Panda is fun, and at times really interesting, but not as productive at turning things around as a focus on the user's perspective. Hopefully many will agree with me that we have to set our eyes firmly on the content that is being created. Is it unique? Are people engaging with it? What's the clear differentiating message that you are sending to your audience that Google is picking up on, or not picking up on? This is the key.

Please be clear on the definition of content [ for the newbies ] - it is much more than blocks of text. JM emphasises this with the Spanish SEO in the hangout [ some day I'll learn his name :) ]

Here, and in the 2015 - emerging trends on search, what are you predicting? [webmasterworld.com...] thread, I'm really trying my hardest to encourage us to engage as a community, to think together, to think about the future, and to come up with new ideas about how we analyse and tackle problems.

I'm absolutely against defeatist negative sentiment, running Google down, or running other folks down [ because it is utterly unproductive ], but I understand the battle individuals have had adjusting their mindset for the next round. Google is committed to excellence; otherwise it does not function as a business. Equally, nothing would please me more than to see lots of successful small / medium size businesses and individuals mixing it with the best.

If only a lot of the old Mom and Pop sites that gave SEO all away, or folks who struggle to adjust to the new way of thinking, could take a tentative step forward, I think it would reinvigorate our collective thinking.

If we all came together and gave Barry our feedback on his site and what this side of the business does for us, it could go to the next level with 1-2 days' work. I think we'd all benefit from the exercise, change the way we interact for a stronger outcome, and it would answer a lot of the issues around what JM was being asked about.

To get out of Panda, or to protect against future updates, the site must support the story that "I/We do something different, that is better than anyone else" - bake that into your site / business and I think it will rock.

Aside from the technical SEO, this is vastly more relevant to Panda protection/release, IMO.

I wonder, if JM were asked this question, how he would respond, compared to the technical questions he has been asked. Just curious :)

btw - I'm no expert; the expertise is here, reading these threads ; just trying to be helpful and stir things up positively, not rant (apologies in advance if this is interpreted that way ).

@Barry, 3 people now suggest you cut the crap pages as a starter :) We're here to help.

Clay_More

7:30 am on Jan 6, 2015 (gmt 0)

10+ Year Member



encourage us to engage as a community


I pretty much consider everyone here a competitor. Talking shop is fine, but there are a lot of things it doesn't make sense to blurt out just to be the first person who blurts it out. Apologies to the blog people who live their life by that.

rustybrick

12:09 pm on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am not changing anything right now but how the comments are displayed. I am okay with testing this slowly.

rish3

7:04 pm on Jan 6, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



I know Googlers, such as John and the web spam team, read my site daily. So I think this is a bit awkward for them.


Best statement in this thread.

"We, uh, find your site valuable enough that we check it every day, but the algorithm is looking for, uh, uh, uh..."

So much for "Focus on the user and all else will follow".

EditorialGuy

7:53 pm on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So much for "Focus on the user and all else will follow".


Or not, depending on what the search engine's algorithm is designed to reward.

Take a story like "Webmasters Talk About 2015 SEO & Search Trends & Predictions." Do a search on "2015 search predictions," and it comes up on page 3 of the SERPs (at least for me). That seems reasonable, because it's merely a summary of a thread at Webmaster World. Barry's users may like it, but Google may feel that a searcher is looking for something meatier, not just an aggregator's abstract with a link to an original source.

To use an analogy, it's like going to Google News to learn about a news event. Should Google News give its most prominent link to a Reuters, New York Times, or BBC article about the event, or should it give top billing to a Huffington Post teaser that links to the Reuters, New York Times, or BBC article? Most people would say the former, but there's nothing to stop people who prefer Huffington Post teasers from going directly to the Huffington Post.

rish3

8:16 pm on Jan 6, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Or not, depending on what the search engine's algorithm is designed to reward.


Well, the context is that the site in question is visited daily by a multitude of experts in the industry.

Rationalizing why Google would penalize it, whilst it's engineers simultaneously visit it every day is just...funny.

edit: Before you call it out...[sic] on it's :)

Leosghost

8:33 pm on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Personally, I've never been to Google News; for "general news" I go direct to Reuters, BBC, Le Point, Al Jazeera, etc.

EditorialGuy

11:12 pm on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Well, the context is that the site in question is visited daily by a multitude of experts in the industry.


If that's true, Barry might want to ditch the comments, since the comments are mostly coming from trolls, not from industry experts.

Rationalizing why Google would penalize it, whilst it's engineers simultaneously visit it every day is just...funny.


A site isn't being "penalized" just because it isn't ranking on the first page of the SERPs. And, as I pointed out earlier, what a community may enjoy isn't necessarily what a searcher is looking for (See my post above).

Wilburforce

11:44 pm on Jan 6, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A site isn't being "penalized" just because it isn't ranking on the first page of the SERPs.


That isn't what is described in the interview or Barry's posts: we are not talking about ranking, but a sudden substantial drop in Google-referred traffic.

If the SERPs alone account for this, then the effect is site-wide affecting many terms.

We can bicker about the terminology, but it looks like a site favoured by senior Google staff has fallen foul of a new or adjusted algorithm factor.

I find that ironic, whether you call it "penalized" or not.

EditorialGuy

1:25 am on Jan 7, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We can bicker about the terminology, but it looks like a site favoured by senior Google staff has fallen foul of a new or adjusted algorithm factor.

I find that ironic, whether you call it "penalized" or not.


I don't, probably because "senior Google staff" aren't typical searchers. They may well visit Barry's site just to see what he and his commenters are saying about them, but I can't believe that any high-ranking person at Google needs to search on "Panda 4.1" or "Penguin 3.0" to find out what's happening with the algorithm this week or this month.

incrediBILL

3:16 am on Jan 7, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I can't believe that any high-ranking person at Google needs to search on "Panda 4.1" or "Penguin 3.0" to find out what's happening with the algorithm this week or this month.


I'm pretty sure some high-ranking Google search engineers do read what's going on in the outside world to monitor the collateral damage reported. If they didn't, how would they ever comment on what's being written, which they have done in the past?

Sometimes at conferences like PubCon the Google engineers have mentioned things they've seen going on at the various search discussion sites.

Let's keep to the topic at hand and not drive it off course speculating on Google's reading habits and other trivial minutia that has nothing to do with the topic and adds no value to the thread.

Thanks.

Clay_More

4:46 am on Jan 7, 2015 (gmt 0)

10+ Year Member



Way back in the day, and just to reinforce incrediBILL's comment: I got an email from a Yahoo! employee asking whether a forum statement (on a different forum than this) had any substance. It was obvious that there was internal discussion about the statement and people were seeking additional information. I threw in my substantiated 2 cents.

People trying to push agendas through forums is nothing new. I find this specific thread interesting because we get to see whether Barry gets a pass, or whether he suffers Google's wrath for an indefinite period. Kind of sucks for Barry, but interesting to watch.

rish3

5:28 am on Jan 7, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Let's keep to the topic at hand and not drive it off course speculating on Google's reading habits and other trivial minutia that has nothing to do with the topic and adds no value to the thread.


Sorry, but that's exactly what makes this newsworthy. You can call it minutia if you like, but it's what makes this an interesting topic.

The *owner* of the web site in question posted this in the thread:

So I think this is a bit awkward for them.


I'll bow out of the thread, but the cat is already out of the bag.

GreyBeard123

8:28 am on Jan 7, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



If that's true, Barry might want to ditch the comments, since the comments are mostly coming from trolls, not from industry experts.

I was under the impression that visitors leave comments.

rustybrick

12:20 pm on Jan 7, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google has admitted they use the site and the forums as a gauge to see if the algos they release were noticeable to the webmaster community.

Honestly, I've uncovered cases of bugs in the algo, several times, where they made changes after I reported sites getting hit when they should not have.

They make mistakes and the community here and other places helps them surface those mistakes.

jrs79

1:07 pm on Jan 7, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



aren't typical searcher


You keep saying this, but I don't think there is such a thing as a "typical searcher." There are different types of searches and different levels of preexisting knowledge behind these searches, but what is "typical"?

Care to describe the typical searcher that you think is looking for info on SEO and Panda updates?

aristotle

2:34 pm on Jan 7, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



rustybrick wrote:
Honestly, I've uncovered cases of bugs in the algo, several times, where they made changes after I reported sites getting hit when they should not have.

Unfortunately the problem is more fundamental than just a few "bugs" here and there. A good algorithm would produce rankings that evolve slowly and gradually. Instead, we're continually seeing reports where a site has ranked number 1 for years, then overnight falls to page 10. Either the algorithm's evaluation of the site was badly wrong before, or it's badly wrong now. When this keeps happening so often with a 15-year-old algorithm, it's more than just growing pains. There's at least one basic flaw somewhere.
They make mistakes and the community here and other places helps them surface those mistakes.

Google is taking the wrong approach if it's trying to fix these "mistakes" individually one at a time. Their engineers should be wondering why so many of these mistakes keep "surfacing", and try to address the fundamental causes.

LostOne

12:33 pm on Jan 8, 2015 (gmt 0)

10+ Year Member



after I reported sites getting hit when they should not have.


Guess one has to be in the tech or SEO industry?

WebEnthusiast

12:54 pm on Jan 8, 2015 (gmt 0)

10+ Year Member



Guess one has to be in the tech or SEO industry?

Actually not. I remember when he had an article about a site being hit in the music industry.

roshaoar

3:16 pm on Jan 8, 2015 (gmt 0)

10+ Year Member



^^ Strongly agree with what you posted Aristotle. And further to that, I think an algo that keeps changing is surely indicative of a bad algo. A great algo would change little, it wouldn't need to. But the problem is, I guess, that people keep successfully gaming the algo.

silentneedle

4:16 pm on Jan 8, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



@aristotle I agree with that, and I'm also pretty sure the Google engineers have that problem on their screens. As a programmer myself, I would throw an algo that requires manual corrections that often into the trash bin, or at least rethink the concept, especially if it's big-data related.

Whitey

7:17 am on Jan 12, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@roshaoar
ps. Discussion of this particular site could really benefit from being in separate minor thread. The issues in the Google presentation are really bigger than one guy's site.

Since this thread has gone quiet, maybe it's a good time to see if there is any interest in analysing and discussing some of the many other aspects of the hangout, which folks may have missed out on. There really are some juicy snippets worthy of discussion IMO, as you say @roshaoar. So here goes again:

Here's a Penguin specific question ( I've abridged it ) :

33m52s Will quality link building help release a site from a Penguin penalty, without the need for Webmaster Tools access or the disavow file being applied, by shifting the percentage of low quality links, so that the high quality links become the majority? [youtube.com...]

JM: That would definitely help. We look at it on an aggregated level across everything that we have from your website, and if we see that things are picking up and things are going in the right direction, then that's something our algorithms will be able to take into account. But if you have access to the disavow file you should look to clean up those old issues as well.

So, in Google's eyes, it will make no difference if you don't use the disavow file.

What about the talk of ratios of building new good links vs. parallel bad-link take-downs? Is JM indicating a more measured approach to safeguard traffic while rebuilding good links? He talks of Google generally "seeing things going in the right direction".

There's another question arising from this, because if you don't have access to WMT, you won't see this:
•The Partial matches section lists actions that impact individual URLs or sections of a site. It's not uncommon for pages on a popular site to have manual actions, particularly if that site serves as a platform for other users or businesses to create and share content. If the issues appear to be isolated, only individual pages, sections, or incoming links will be impacted, not the entire site. [support.google.com...]
So you will recover without a reconsideration request on targeted manual actions. Y/N ?

Thoughts?
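For anyone following along who hasn't used the tool JM mentions: the disavow file is just a plain-text file uploaded through Webmaster Tools, one linking URL or domain per line. A minimal sketch (the domains below are made-up placeholders, not real examples):

```
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-widgets.example.com
# Disavow a single linking page:
http://blog.example.net/paid-links-page.html
```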

GreyBeard123

8:19 am on Jan 12, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



So you will recover without a reconsideration request on targeted manual actions. Y/N ?

I think the answer is yes...

We recovered from a manual penalty, on a few pages (not site-wide), without a reconsideration request.

It might have been due to the fact that we obtained a few quality links, or these kinds of penalties have expiry dates.

Whitey

8:48 am on Jan 12, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@GreyBeard123 - Thanks for sharing that. Can you confirm your logic behind not submitting a reconsideration request for this scenario?

In this hypothetical example, there isn’t a site-wide match, but there is a “partial match." A partial match means the action applies only to a specific section of a site. ........ By fixing this common issue, the webmaster can not only help restore his forum's rankings on Google, but also improve the experience for his users.

Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration. With this new feature, you'll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking "Request a review."

........... We hope it reassures the vast majority of webmasters who have nothing to worry about

[googlewebmastercentral.blogspot.com.au...]

My logic says:

The manual action is against the link. You can't change the link. What are you submitting a reconsideration request for? This instruction from Google makes no sense IMO.


Taken in the context of JM's remarks above on Penguin, about not having access to WMT as well, it kinda backs that thinking up.
A lot of folks are divided on this. ( btw - I'm with you on this one. )

I wish Google would clarify this in black-and-white terms, because it is confusing the absolute heck out of many webmasters and site owners.