Forum Moderators: Robert Charlton & goodroi


John Mueller talks Panda and Penguin penalties on hangout-30Dec14

         

Whitey

6:27 am on Jan 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



English Google Webmaster Central office-hours hangout

Streamed live on Dec 30, 2014
https://www.youtube.com/watch?v=Ba_qLBFlIe4&t=08m37s [youtube.com]

I thought this was one of the more notable conversations involving a potential specific Panda penalty issue on Barry Schwartz / RustyBrick's [webmasterworld.com...] website, the things that the algorithm may be looking for, and later some further insight into Penguin penalties and the disavow tool.

John Mueller said that he would pass the Panda query on to his team, and hopefully we'll get some insight into the potential Panda quality issues Barry will need to resolve. It would be good to have some response from the G team.

All sorts of gems in the mix, things like proportions of content, comments, time to recovery, process clarification etc. etc. The video starts at 8mins 37secs, so I may have missed some other gems which I'll check when I next have a spare hour or so.

Anyway, the hangout might be worth having a look at, and passing on your analysis and comments. One thing, ..... it's complicated [ as if you didn't know ]. Any inputs appreciated.

Can we keep this thread on topic and avoid the temptation to run Google down - let's try to work out what this video teaches us instead.

[edited by: Robert_Charlton at 9:36 am (utc) on Jan 2, 2015]
[edit reason] fixed YouTube url [/edit]

netmeg

2:12 am on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Netmeg - interesting, you said you did nothing to these sites. What if you had, hypothetically, say, promoted them? What do you think would happen if the festivals started to get talked about more out there? Was there an issue perhaps around co-citation, social, or the lack of it? What do you think Google picked up on to give it that current growth? What's your hunch? Just curious / asking :)


I should have said I did nothing extra for the one affected site that I didn't do for the others. But if I had to guess - and it's a total guess - while the site got almost no traffic at all from Google for 19 mos (even during peak season) it ranked very well in Bing (and obviously Yahoo) and it developed a small but enthusiastic fan base from that. Plus all my sites are linked together. Plus some Facebook promotion to the people close to the target area.

I suspect that Google uses site engagement metrics, and I can't believe that all that Chrome and Android info is just going unused either. So if I had to guess, I would say that G made some tweak having to do with site engagement for Panda 4 that flipped a switch and started the traffic coming back in.

But of course, I will never know for sure.

Whitey

2:53 am on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



6 months? WTF? No wonder my new sites are not ranking!

@RedBar - I read this a little differently, more along the lines of "if your site had no good content, and now it does". Not all sites will have started from a standstill. And some are in a very high-grade/competitive area, like the Spanish SEO's customer e-commerce site.

Have you done anything in the interim - promoted, lifted the chatter, social activity, continued to improve the content - or are you just preferring to wait? Thought I'd compare your experience to @Netmeg's [although they may be completely different situations].

FranticFish

8:28 am on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



that doesn't mean a forum that's perfect for its community is what searchers are looking for when they ask a question or enter a keyphrase in a search box.

But there are ten results in a SERP right?

It's called DIVERSITY: something that Google don't seem to appreciate, but something that is essential for a healthy search engine and a healthy web in general.

And, as others have pointed out, if there was ever a site crying out for comment moderation, it would be YouTube. Spending ten minutes there is pretty much a crash course in everything that's wrong about the way far too many people behave on the internet, but it doesn't seem to have been hurt by any quality-focused updates or algo changes. In fact, I can't recall the last time I saw a non-YouTube video in the SERPs.

netmeg

1:13 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't think Joe User wants diversity. Joe User wants his answer. None of us here are Joe User.

roshaoar

1:54 pm on Jan 4, 2015 (gmt 0)

10+ Year Member



My experience is on trend with EditorialGuy - it is in Google's interest to promote the sites that satisfy most people. And the way to do that isn't obsessing about small incremental SEO (which should really just be the technical starting point), but obsessing about having unique content that people will want to share, bookmark, cite, translate and reread.

To answer the OP's thoughts, saw the vid, makes perfect sense to me. If you write short articles and then have lots of comments, then you're bound to run into problems imo. Now if you write long beefy comprehensive articles which get shared, bookmarked, cited, translated and reread, and just show a few of the best comments, then you'd actually be producing something worthwhile rather than relying on others to do it for you.

glakes

3:03 pm on Jan 4, 2015 (gmt 0)



Since when do site owners think they do not need to moderate the comments, especially SEO experts like Barry? I think that is just common sense.

I think you and many are forgetting that Barry's SER site reports on SEO news and measures the pulse of the SEO community. I've seen Barry moderate comments, but for the most part he lets people expand on topics, vent, etc. These comments are not spam, but many of the comments have a common theme - they demonize Google. And that demonization of Google is the *unfiltered* general consensus of the SEO community, whether you want to believe it or not. So instead of giving Barry's site great ranks, Google can bury it and all the negative comments about Google at the same time.

Regardless of what Google does with Barry's site, he has a solid following. If I want to get an idea of how people are perceiving a new tool from Google, an algorithm change, etc., Barry's site is the first place I visit, because I know Barry runs more of a fair and impartial (neutral) site than any other SEO/webmaster blog or forum. Most of the other SEO/webmaster blogs and forums (this one included) pander too much to Google and have biased/restrictive posting policies that limit discussions to the point that they rarely get past a superficial level.

Wilburforce

3:08 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



if you write long beefy comprehensive articles


You will probably only attract a very small minority of internet users.

I don't know of anyone who would use their phone to read long beefy comprehensive articles (Google's current focus on mobile users is worth keeping in mind).

Certainly adding a lot of UGC to a concise well-written article might dilute its focus in any algorithm's view (although most humans would have no difficulty in seeing the article as the important content). However, what we are looking at here is a sudden drop in ranking for pages that Google previously had no difficulty with either, so it isn't safe to assume the UGC has anything to do with it.

However, we have all seen pages consisting of spam spam spam keyword1 spam spam spam spam keyword2 spam spam..., and possibly what looks like a minor tweak to an algorithm component dealing with that kind of abuse could have resulted in this kind of collateral damage.

The conclusion that something is wrong with a presentation style or content that has a long-proven track record isn't warranted by the evidence. That something is wrong with the algorithm tweak is more likely.

EditorialGuy

3:19 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've seen Barry moderate comments, but for the most part he lets people expand on topics, vent, etc. These comments are not spam, but many of the comments have a common theme - they demonize Google.


Exactly. And they're just as worthless to the searcher who wants useful information about "panda 4.1," "adwords quality scores," "john mueller hangout," etc. as they'd be if they demonized Bing, Yandex, Barack Obama, or Kim Jong-un.

glakes

3:47 pm on Jan 4, 2015 (gmt 0)



And they're just as worthless to the searcher who wants useful information about "panda 4.1," "adwords quality scores," "john mueller hangout," etc. as they'd be if they demonized Bing, Yandex, Barack Obama, or Kim Jong-un.

Instead of finding actionable information, these people may just need a shoulder to cry on until they pick up the pieces and move on. That may be of more value to them than any alleged actionable tasks promoted by the many SEO wannabes that frequent many blogs and forums. Mixed in with their anger, tears, etc., there are a good number of professional SEOs who comment on Barry's site with good information. I'd say Barry's site is well-rounded and offers something of value for everyone interested in SEO/webmaster news.

Value is in the eye of the beholder, and in my opinion Barry's SEO blog ranks #1. If nothing else, biased and subjective comments are permitted in both directions on Barry's site - that's far more valuable than any strictly anti-Google or pro-Google blog or forum. Discussions on Barry's site are permitted to evolve into deeper topics, including how Google's search algorithm changes affect the hundreds of other properties they own or invest in, and the impact on other sectors, instead of the common on-topic shallow discussions we see at most places (here included).

[edited by: Robert_Charlton at 6:37 pm (utc) on Jan 6, 2015]
[edit reason] removed personal comments, per Google forum Charter [/edit]

Wilburforce

3:53 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



they're just as worthless to the searcher who wants useful information


That isn't strictly true: the number of individual posters gives an idea of the scale of the problem, and a fair number comment on how they have been affected (it isn't all blame and flame).

Also, the main article usually provides very useful information (and it is at the top of the page, "front and centre"). People who want to read the compelling summary do not have to read the less compelling comments, but if the former is buried because of the latter, they are denied both.

Whether, if UGC is having this effect, it would currently make any difference to use e.g. <article>/<footer> to differentiate between levels of importance is debatable, but if Google is intentionally de-ranking good content that allows added UGC then something like this needs to be implemented. Otherwise, what we will get is censorship by the back door.

The argument that "Google has penalised Barry's site, therefore what is wrong with Barry's site is A (B, C...)" is predicated on the assumption that Google is infallible (or at least always reasonable). I don't make that assumption.

RedBar

4:41 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Companies have and still do wait like this all the time. This is how typical business actually works.


I have been in international business for 47 years; this is not how the typical business in my sphere works, period.

RedBar

4:51 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Joe User wants his answer. None of us here are Joe User.


And I also feel that, for some of us, Joe User is getting his answer elsewhere - maybe other sites, maybe social - however, within my industry the consensus is rapidly returning to word of mouth/personal recommendation, as it was before the noughties.

My products are heavy even for wholesalers; for retail purchasers they're almost impossible to transport...the logistics are just against travelling far and wide for a retail purchaser unless buying a small truck load.

Oooops, went a bit off topic!

EditorialGuy

5:35 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The argument that "Google has penalised Barry's site, therefore what is wrong with Barry's site is A (B, C...)" is predicated on the assumption the Google is infallible (or at least always reasonable)


No, it's based on the assumption that Google gets to make its own judgments about a searcher's intent, the value of original information vs. short summaries published by aggregators, and the value of a given set of user comments.

Google isn't keeping anyone from reading Barry's summaries or the chaotic comment threads that accompany them; it just isn't assigning them as much value as it does to, say, original and in-depth articles by Barry and other reporters at Search Engine Land.

FranticFish

6:35 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Joe User wants his answer

But what if the answer's not straightforward? And if every single one of the ten sites on page 1 returns the same 'answer' to a 'question' that might have different shades of meaning, then why have ten results?

Whitey's '2015 predictions' thread contained some interesting points about whether search engines should be responsible for veracity, and this seems relevant here.

WebmasterWorld has a strict moderation policy and a high signal-to-noise ratio as a result. For what it's worth, that would be my personal preference if I ran a discussion website. Other sites are more freely moderated or hardly moderated; there's doubtless more noise, but there are still some real gems too. And (for example) if you're trying to understand an algorithm run by a profit-making enterprise, then that enterprise's business operations could well be relevant - as could what people are actually observing, as opposed to what that company's public relations team are allowed to say on the record.

It seems to me that - if the comments are the problem - the site owner is basically being told that they need to completely change their editorial slant to conform. As Barry said, where would he start?

The popularity of the site is not in question, so it seems that Google are effectively saying that a lively discussion site is low quality. Anyone who has spent any time reading comments on any newspaper sites would mark SER as HIGH quality by comparison.

It's not being asserted that the comments are fake, or by bots, or being spammed for links. So I would have thought that anyone who prided themselves on neutrality would see penalisation as a false positive, rather than telling the site owner that the problem is theirs: effectively 'We don't like your community and how they express themselves'. That's an editorial decision, and it shows bias. It's not "organising the world's information".

EditorialGuy

7:06 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's not being asserted that the comments are fake, or by bots, or being spammed for links. So I would have thought that anyone who prided themselves on neutrality would see penalisation as a false positive, rather than telling the site owner that the problem is theirs: effectively 'We don't like your community and how they express themselves'. That's an editorial decision, and it shows bias. It's not "organising the world's information".


Quality scores aren't the same as "penalties," and all search algorithms involve editorial decisions. Nearly a dozen years ago, a U.S. federal judge wrote:

Page Rank [the judge obviously meant "page ranking"] is an opinion - an opinion of the significance of a particular web site as it corresponds to a search query. Other search engines express different opinions, as each search engine's method of determining relative significance is unique....

...A statement of relative significance, as represented by the Page Rank, is inherently subjective in nature.


Source:

[internetlibrary.com...]

FranticFish

9:51 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That source mentions an editorial decision on the significance of a site to one query.

What's being made here looks to me more like a judgement on a website as it corresponds to ANY search query - a blanket editorial decision that, once made, is made in advance and pre-judges the site as being a less satisfactory answer for anything going forward.

Now, you could argue - and convincingly - that this is still reasonable, were you able to show that this is how your evaluation process ran across the board. But you might look like a complete hypocrite if you also ran the web's biggest publicly accessible repository of bad grammar, bad spelling, trolling and advocacy of hate crimes including rape, racism, and even genocide.

---- Added:
To turn it around, what signals might you look for that would indicate quality, with the absence of them more likely to indicate lower quality or even spam?

- comments unique to this site (no cutting and pasting, or drive-bys)
- similarity of comments from page to page (i.e. lots of similar comments indicates little discussion, could indicate spam)
- number of different commenters involved (per post, on the site as a whole)
- topical interest vs long-term interest (how comments are spaced out, time- and date-wise; an old article being 'resurrected' might indicate spam)
- number of sub-threads (i.e. replies to replies)
- topical relevance of comments to subject of article
- number of comments that cite sources relevant to article (or irrelevant to article)

These are all pretty clean signals, and they side-step others that are - to my mind at least - far less clear, like post length, spelling/grammar, or conversational tone (including use of expletives or exclamations).

Note that whilst all of these indicate quality, none of them try to judge anything other than noteworthiness as implied by usage/popularity.
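A rough sketch of how a few of the signals listed above could be computed for one page's comment thread (the data shape, field names, and everything else here are hypothetical illustrations - there's no confirmation Google measures any of this):

```python
from collections import Counter
from datetime import datetime  # comment timestamps are datetime objects


def comment_signals(comments):
    """Crude quality heuristics for one page's comment thread.

    Each comment is a dict: {"author": str, "text": str,
    "posted": datetime, "parent": comment id or None}.
    """
    texts = [c["text"].strip().lower() for c in comments]
    # Near-verbatim repeats suggest copy-and-paste or drive-by commenting.
    dupes = sum(n - 1 for n in Counter(texts).values() if n > 1)
    # Replies to replies indicate actual discussion (sub-threads).
    replies = sum(1 for c in comments if c["parent"] is not None)
    dates = sorted(c["posted"] for c in comments)
    span_days = (dates[-1] - dates[0]).days if len(dates) > 1 else 0
    return {
        "commenters": len({c["author"] for c in comments}),
        "duplicate_ratio": dupes / len(comments),
        "reply_ratio": replies / len(comments),
        "span_days": span_days,  # topical burst vs long-term interest
    }
```

A search engine (or a site owner auditing their own comment sections) could combine heuristics like these into a per-page score; the point of the list is that they judge participation patterns, not writing style.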

Martin Ice Web

10:58 pm on Jan 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




6 months? WTF? No wonder my new sites are not ranking!


I guess it depends on brand factor and niche.
I just rebuilt a site for a customer. I split the site from a single multilanguage domain into per-language subdomains, and the old pages got redirected to the subdomains. The different language sites did not rank very well before, but now, 2 weeks after moving the sites, the pages are ranking very well for the targeted keys in all languages, across multiple pages.
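The kind of move described here - sending each old multilanguage URL to its language subdomain - usually boils down to a per-language 301 mapping. A minimal sketch of that mapping logic, assuming a hypothetical `/lang/path` URL structure and `example.com` subdomains:

```python
# Hypothetical mapping of language prefixes to the new subdomains.
LANG_SUBDOMAINS = {
    "de": "de.example.com",
    "en": "en.example.com",
    "fr": "fr.example.com",
}


def redirect_target(old_path):
    """Return the 301 target for an old '/lang/rest' URL, or None
    if the path has no known language prefix (leave it untouched)."""
    parts = old_path.lstrip("/").split("/", 1)
    host = LANG_SUBDOMAINS.get(parts[0])
    if host is None:
        return None
    rest = parts[1] if len(parts) > 1 else ""
    return f"https://{host}/{rest}"
```

In practice the same mapping would live in the web server config (e.g. rewrite rules) rather than application code, but the one-to-one old-URL-to-new-URL redirect is what lets the new subdomains inherit the old pages' standing.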

Wilburforce

12:02 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



No, it's based on the assumption that Google gets to make its own judgments about a searcher's intent


This equates "judgments about a searcher's intent" with the effects of an algorithm change, which are not even apples and oranges.

It isn't impossible for Google to get it wrong (i.e. for some algorithm changes not to fulfil its judgment about a searcher's intent): some effects of an algorithm change may be unforeseen or unintentional, and if every change always exactly fulfilled its intention there would be no need for further evaluation or refinement.

Of course Google is ENTITLED to get it wrong (or to reduce any page's rank intentionally for that matter), but what JM actually said was "I don't really know exactly what our algorithms are looking at specifically with regards to your website". That statement qualifies the series of guesses and hints that followed it, which may or may not apply to Barry's site.

rustybrick

1:33 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



FYI, just to chime in about the comments: I delete/spam over 50% of the comments left on the site. So I do spend an enormous amount of time moderating the comments.

I am really enjoying the feedback here.

RedBar

1:54 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I guess it depends on brand factor and niche.


In my widget sector there are some massive brands, including mine; MOST are only known as wholesale B2B. Globally, many, many of these huge businesses do NOT want to be known by any search engines - they are, quite simply, not interested!

Has that come as a surprise that some companies do not want to be known?

Maybe so, but some of us DO want the products to be known but when the originator gets buried beneath all the spammy crap, then what?

What has this to do with this discussion? Google's not getting its facts correct much of the time - as someone said here or in another thread, it's a question of veracity. Is it any wonder that we as webmasters have to question almost every twist and turn Google makes these days, since it seems to be getting things wrong more and more all the time?

I am really enjoying the feedback here.


There has been some valuable feedback here for all, and especially for Google...I do hope they are reading this. After all, WE are in the real "world"; we are the ones having to bear the brunt of any algo change they make - any algo change THEY believe is better. For whom? Them, or the user?

They may not suffer, many of us do!

Selen

2:43 am on Jan 5, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



all search algorithms involve editorial decisions

I think that's incorrect. If that were the case, search engines would be responsible for the content they present to users (if someone is actively moderating comments, he/she may be legally liable for their contents). At least that's what my professor recently told me in college ;).

Planet13

2:50 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ rustybrick / Barry:

Can you please let us know more about the drop in traffic / visibility you have experienced?

For instance, can you deduce what TYPE of keywords have lost traffic?

I ask because I have seen your site outrank the original source article many times. For instance, I have seen a post of yours referencing a Matt Cutts blog post on XYZ actually outrank the original Matt Cutts blog post on XYZ.

Whitey

3:05 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just rebuilt a site for a customer. I split the site from a single multilanguage domain into per-language subdomains, and the old pages got redirected to the subdomains. The different language sites did not rank very well before, but now, 2 weeks after moving the sites, the pages are ranking very well for the targeted keys in all languages, across multiple pages.

@Martin_Ice_Web - was this a Panda-affected site? I'm just wondering if the content on the new subdomains has been properly assessed for Panda, and has proven stability on quality. The aggregation process seems to take some time, according to JM.

but what JM actually said was "I don't really know exactly what our algorithms are looking at specifically with regards to your website". That statement qualifies the series of guesses and hints that followed it, which may or may not apply to Barry's site.

@Wilburforce - yes, I don't think it's clear at all - but the discussion helps strengthen some focus, I suppose, on areas of attention. If it becomes obvious to a group, then you'd have to hope the consensus would better guide our decision-making around the quality guidelines and how to break the Panda penalty.

I am really enjoying the feedback here.

@rustybrick - thanks for your continual contributions with JM on the hangouts, and to the others on the panel, for making these discussions possible. Let's keep our fingers crossed for you.

incrediBILL

5:13 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the other SEO/webmaster blogs and forums (this one included) pander too much to Google and have biased/restrictive posting policies that limit discussions to the point that they rarely get past a superficial level.


That's not true whatsoever.

We post topics pro and con all the time. Admins and mods often post things that aren't in favor of what the SE's are doing, I do all the time. We're about as unbiased as it gets as long as you stick to the topic.

What some call censorship, we call keeping on topic. If keeping ON TOPIC and being respectful of other posters is a restrictive posting policy then yes, it's restrictive.

If you want to post something off topic, start a new thread, it's as simple as that.

Martin Ice Web

7:15 am on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@whitey, the site never did suffer from Panda or Penguin - even though there were 4 languages on one domain and the content was mixed-language and very shallow. The German domain seems to be considered a brand. I guess the backlinks and the aggregation of the German and .com domains in WMT helped this site. It has about a thousand backlinks from one domain; that never hurt it. The bad ranking was due to the fact that the site wasn't supported by the owner. With the German page we can rank for nearly everything we like that is close to the niche.
The site has now been set up with quality content, to the degree that is possible in this niche.

roshaoar

11:10 am on Jan 5, 2015 (gmt 0)

10+ Year Member



One thing to observe: his site may be popular, but that is no guarantee of quality. Like McDonalds.

I sampled 3 random articles - all less than 200 words, 2 were regurgitated items from elsewhere made into a very thin article, and one had some (limited) gravitas, but the 15 comments of bile below it made it horrible to read.

Now if this site used to outrank originals then I'd have to say that overall it's a good thing that it no longer does.

ps. Discussion of this particular site could really benefit from being in a separate minor thread. The issues in the Google presentation are really bigger than one guy's site.

rustybrick

12:39 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Planet13 truth is, I don't track keywords. I just saw a Google organic traffic drop on that date. Maybe Searchmetrics, which only looks at ranking data, has better metrics? I assume it was not just one or two keywords, but many, many keywords. I do have tens of thousands of stories spanning over 11 years, with on average 5 new stories per work day. So a lot of content.

Planet13

2:48 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@ rustybrick

Thanks for the elaboration.

Another question, if I may: many on this forum seem to think that John Mueller was implying your site could have crossed a threshold ratio of bad comments versus good comments, or of user-generated info versus author-originated info.

Was that your takeaway from John Mueller as well? Or did you get a different impression?

I know that you have been able to ask JM quite a bit through the Google hangouts, so maybe your overall impression is different, since you have had more opportunities to pick his brain than the average WebmasterWorld user (such as myself) has had.

elguiri

4:25 pm on Jan 5, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



I think there is a load of BS in John Mueller's reply to Barry's problem. I don't think Google can distinguish between "ranty" and "non-ranty" comments, and why the hell would it care anyway?

I think there are some specific things that are more likely to cause problems, like the fact the site has all of the tag and category pages available for indexing (12,000 pages of it!). This is crap duplicate content that Google has long punished.

Sometimes things are right in front of your nose.

Planet13

5:13 pm on Jan 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think there are some specific things that are more likely to cause problems, like the fact the site has all of the tag and category pages available for indexing (12,000 pages of it!). This is crap duplicate content that Google has long punished.


While you could be right about that, to play Devil's Advocate: what if all those tag and category pages increase user engagement in a way that is significant (and discernible) to Google?

Some users probably LOVE tag clouds.

And I would speculate that google is smart enough to discern between "duplicate" content and content that is available under multiple URLs. (I don't know if Barry's site makes effective use of the canonical link tag or not.)
This 126 message thread spans 5 pages.