


John Mueller talks Panda and Penguin penalties on hangout-30Dec14

     
6:27 am on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3072
votes: 27


English Google Webmaster Central office-hours hangout

Streamed live on Dec 30, 2014
https://www.youtube.com/watch?v=Ba_qLBFlIe4&t=08m37s [youtube.com]

I thought this was one of the more notable conversations: it covers a potential Panda penalty issue on Barry Schwartz / RustyBrick's [webmasterworld.com...] website, the things the algorithm may be looking for, and later some further insight into Penguin penalties and the disavow tool.

John Mueller said that he would pass the Panda query on to his team, so hopefully we'll get some insight into the potential Panda quality issues Barry will need to resolve. It would be good to have some response from the G team.

All sorts of gems in the mix - things like proportions of content, comments, time to recovery, process clarification, etc. The link starts at 8 mins 37 secs into the video, so I may have missed some earlier gems, which I'll check when I next have a spare hour or so.

Anyway, the hangout might be worth having a look at, and passing on your analysis and comments. One thing ..... it's complicated [ as if you didn't know ]. Any inputs appreciated.

Can we keep this thread on topic and avoid the temptation to run Google down? Let's try and work out what this video teaches us.

[edited by: Robert_Charlton at 9:36 am (utc) on Jan 2, 2015]
[edit reason] fixed YouTube url [/edit]

9:10 am on Jan 2, 2015 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11952
votes: 308


Whitey, thanks for that. Regarding which site of Barry's has been hit by Panda... it's not the rustybrick site in Barry's profile, but rather Search Engine Roundtable, and the seroundtable discussion is this one...

Did Comments Cause This Site To Get Hit By Google Panda 4.1?
Dec 31, 2014 - by Barry Schwartz
https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html [seroundtable.com]

As is typical of many of Barry's articles, this one is very concise, and the extended discussion is in the Comments section of the article. Barry announced in the article that he's decided to remove the Comments from the HTML source code, based on this input from Google's John Mueller...

He said that maybe, he said he didn't look into the site specifically, that maybe, if there are a lot of low quality written comments compared to the quantity of content, then maybe that is bringing the site down?

In the comments section of this article, there's considerable discussion about what the other issues might be. As Whitey suggests, there are a lot of gems in the mix, with various considerations of how Google might score the type of material Barry publishes - a mixture of Barry's reporting and UGC comments.

One factor, for example, might be Barry's method of working. He characterizes his own content as "short, quick, and to the point", and often a distillation of previously published material....

...90% are from digging through the discussion forums and bringing out tidbits of information no one would have seen otherwise.

Is Google regarding the brevity of Barry's content as a shortcoming and discounting it, rather than rewarding it as a distillation? What about questions of originality, also discussed in the comments?

Or is it, as Barry believes John is suggesting, simply the ratio of useful content to fluff?

It's worth mentioning that Barry is a very highly regarded SEO journalist, with a large following, myself included... and one would think that Barry's authority and the traffic the site attracts would be a large factor in its favor.
10:05 am on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2609
votes: 92


Judging from the non-answers being given, it seems that Google has a lot of problems with UGC. That point about Google seeing the whole page and not necessarily differentiating between site content and UGC is especially interesting for sites with their own commenting software/engine that is not based on WP/Joomla or the commercial commenting systems. Perhaps it might be best for newer sites to use a commercial commenting system, since the people at Google apparently can't cope with custom ones?

That splitting-of-UGC-to-a-separate-URL suggestion is worrying, because it is effectively a suggestion to help Google rather than site users, and many small sites might not be able to afford or implement the extra complexity.
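For anyone who does want to experiment with it, here's a minimal sketch of what the separate-URL approach could look like - assuming a Flask app and a hypothetical load_comments() helper, and not anything Google has actually prescribed:

[code]
from flask import Flask, jsonify

app = Flask(__name__)

def load_comments(post_id):
    # Placeholder for whatever storage the site actually uses.
    return [{"author": "anon", "text": "example comment"}]

@app.route("/comments/<int:post_id>")
def comments_fragment(post_id):
    # Serve comments from their own URL so the article's HTML stays comment-free.
    response = jsonify(comments=load_comments(post_id))
    # Ask crawlers not to index the comment fragment itself.
    response.headers["X-Robots-Tag"] = "noindex"
    return response

# The article page would fetch /comments/123 client-side and render it,
# so the comment text never appears in the article URL's own source.
[/code]

Whether Google's rendering crawler would still see comments injected client-side like this is an open question, so treat it as an experiment rather than a fix.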

The question about links from directories got another problematic non-denial denial, in that the answer indicated that dodgy link building should be repaired by the site owner. This does raise questions about how banjaxed Google's algorithm is if it cannot automatically flag links from spam directories. Google seems to rely on site owners fixing its problems rather than site owners fixing their own problems.
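If "repair it yourself" means disavowing, the mechanics are at least simple. A hypothetical helper that writes a disavow file - the domains below are made up, but the format (comment lines plus "domain:" entries) is Google's documented disavow format:

[code]
# Build a disavow.txt from a list of directories you've decided are dodgy.
dodgy_directories = ["spam-directory.example", "free-links-4u.example"]

lines = ["# Directory links we no longer want counted"]
lines += [f"domain:{d}" for d in sorted(dodgy_directories)]

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
[/code]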

Regards...jmcc
10:25 am on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2609
votes: 92


Is Google regarding the brevity of Barry's content as a shortcoming and discounting it, rather than rewarding it as a distillation? What about questions of originality, also discussed in the comments?
Looks like a classic hammer flaw in Google's algo. (If all you've got is a hammer, then everything looks like a nail.) The problem with using any kind of textual analysis on a body of text is that you need a decent body of text for the algorithm to be reasonably effective. Short, concise posts may not provide a sufficient body of text, and may look spammy to a crudely developed algorithm that isn't modified for exceptions like limited-text cases. It might be worth seeing if long posts rank/fare better than short posts on the affected sites, as this might highlight or confirm any flaws in the algo.

Thinking like a search engine developer, this is a bit of a mess when the parser cannot discriminate between site content and UGC due to a custom commenting system. The algo might be taking all the text as its input.
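To make that concrete, here's a toy illustration - emphatically not Google's algorithm, just invented statistics - of why very short texts are hard to score: with only a handful of words the numbers mean little, and padding with repetition fools them in the other direction.

[code]
import re

def crude_text_stats(text):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return {"words": 0, "lexical_diversity": 0.0}
    return {
        "words": len(words),
        # Unique-word ratio: a stand-in for the kind of signal a crude
        # quality heuristic might lean on.
        "lexical_diversity": len(set(words)) / len(words),
    }

short_post = "Google confirmed the update. More details in the forum thread."
long_post = short_post * 40  # same vocabulary, padded out to article length

print(crude_text_stats(short_post))  # 10 words: too little text to mean anything
print(crude_text_stats(long_post))   # length up, but repetition collapses "diversity"
[/code]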

Regards...jmcc
3:43 pm on Jan 2, 2015 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:May 9, 2000
posts:24719
votes: 612


You know, this is a good example of a site that appears to have been hit, yet, on the face of it, is a quality and authority site.

Yes, there are a lot of good little snippets in the video, and it's worth listening to (you don't actually need to see it, just listen, so run it in the background).

It's speculation that the site was hit by Panda - it's not confirmed.

This is the kind of thing that could impact almost every site accepting comments and discussions - just about every blog and forum.

The key thing I read into this is the importance of ensuring quality content and minimising off-topic comments.
4:05 pm on Jan 2, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:2752
votes: 328


Let's try and work out what this video teaches us.


That Google really doesn't have a clue what its algo is doing!

I'm not bashing - this is the reality. This video is, uhm, ahhh, one thinly-veiled "I don't know" excuse after another.

So UGC can now bring down a page/site unless it is moderated and the lower-quality UGC is deleted. Awesome - now we know yet another way to ruin someone else's site.
4:52 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:3050
votes: 581


Maybe Google is getting better at distinguishing aggregator sites from sites with original content, and giving a boost to the latter at the expense of the former? (Barry may be a fine search journalist, but his posts at Search Engine Roundtable are often little more than abstracts of more detailed articles that he wrote for Search Engine Land or summaries of threads at forums like Webmaster World.)

Still, I'd be inclined to place most of the blame for Barry's problems on the user comments that accompany his short posts. Barry does little if any moderation, and the comments on Search Engine Roundtable are a perfect example of what Eric Schmidt meant when he said the Internet is becoming a "cesspool" where false information thrives. SERoundtable's comments are loaded with fake identities ("Matt Cutts," "Larry Page"), personal attacks, and bitter rants.

IMHO, comments should add value to the author's content. If they detract from the value of that content, it shouldn't be surprising when third parties (such as search engines) are unimpressed.
4:54 pm on Jan 2, 2015 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:May 9, 2000
posts:24719
votes: 612


RedBar, you could look at it that way, in the sense of negative SEO, but you could also look at it from the positive perspective of how to help ensure your site keeps itself in good standing. :)
5:19 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:3050
votes: 581


So UGC can now bring down a page/site unless it is moderated and the lower-quality UGC is deleted. Awesome - now we know yet another way to ruin someone else's site.


Maybe, but so what? Readers don't care about the motives behind nasty or useless comments; they care about what's on the page.

The solution is simple: If you're going to rely on user-generated content, take responsibility for what gets published.
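As a toy sketch of what "taking responsibility" might look like in practice - the checks, word list and thresholds below are invented for illustration, not a recommendation from anyone quoted in this thread:

[code]
BLOCKLIST = {"viagra", "casino"}   # hypothetical spam terms
MIN_WORDS = 5

def triage_comment(text):
    words = [w.strip(".,!?").lower() for w in text.split()]
    if any(w in BLOCKLIST for w in words):
        return "reject"
    if len(words) < MIN_WORDS or text.isupper():
        return "hold_for_review"   # too short, or all-caps shouting
    return "publish"

samples = [
    "Great post!",
    "GREAT POST!!!",
    "This matches what I saw on my niche sites after the October refresh.",
]
for sample in samples:
    print(triage_comment(sample), "->", sample)
[/code]

Only comments passing the basic checks go straight to the page; everything else waits for a human, which is one way of keeping the published page in your own hands.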
5:29 pm on Jan 2, 2015 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14627
votes: 392


...simply the ratio of useful content to fluff?


I think that's simply it. The entire page, including comments, is considered the content. So the fluff is overwhelming the signal. The hardass response from Google could be that the ball is in Barry's side of the court to moderate the comments or come up with a technical solution to block the comments from crawlers. Would be nice to be able to no-index/no-follow a portion of a page, heh.

Blocking a portion of a page from crawling could be construed as cloaking. But that's similar to what we do with no-followed links published in a UGC context because the links do not necessarily reflect a vote by the web page. Similarly, UGC may not be considered trustworthy by the publisher. Perhaps a schema.org solution could help the search engines identify content that is not trusted and instruct them to not index or follow it?

Oh! This is very interesting!
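To put a rough number on that "ratio of useful content to fluff" idea, here's a diagnostic sketch. The CSS selectors are hypothetical placeholders, since every site's markup differs, and this is a toy measurement, not how Google actually scores pages:

[code]
from bs4 import BeautifulSoup

def content_to_comment_words(html, article_sel="div.post", comments_sel="div.comments"):
    soup = BeautifulSoup(html, "html.parser")
    # Word counts for the editorial part of the page vs. the comment container.
    article = sum(len(node.get_text().split()) for node in soup.select(article_sel))
    comments = sum(len(node.get_text().split()) for node in soup.select(comments_sel))
    return article, comments

html = """
<div class="post">Short two-paragraph recap of a forum thread.</div>
<div class="comments">Dozens of one-line reactions, arguments and name-calling,
easily outweighing the article itself when the whole page is read as one block of text.</div>
"""
article, comments = content_to_comment_words(html)
print(f"article words: {article}, comment words: {comments}, "
      f"content-to-comment ratio: {article / max(comments, 1):.2f}")
[/code]

Run over a site's templates, something this crude would at least show which pages are mostly comment text in the HTML a crawler sees.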
6:24 pm on Jan 2, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 16, 2009
posts:1067
votes: 73


Wasn't a 'noindex' tag for parts of a page on a wishlist here a while back?

[opinion/rant]I think this is a TERRIBLE signal for Google to give a damn about because it's so messy, and a penalty on the site in question should be a cause of embarrassment for the search engine concerned as a glaring false positive.[/opinion/rant]

If you were going to deal with comments differently from other parts of the page, what would you look for?

- brevity of post (i.e. pure character count)
That would make Netmeg our lowest quality poster. Yeah.

- superlatives (i.e. words like great / good / super)
If you're smart enough to detect these, then you should also be smart enough to dial them down and count them as noise.

- bad grammar/spelling
That looked to me like a signal, going by JM's answers.

- irrelevant to the topic
Possibly valid on a quality count as it points to a non-moderated page.

My takeaway from this is that popular website owners should either block or edit their comments. Not a good outcome.
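Pulling the signals listed above into a toy scorer shows both how such a check might work and how crude it is - the weights, word list and thresholds are invented, nothing here is a known Google signal, and (exactly as feared) it punishes short posts:

[code]
SUPERLATIVES = {"great", "good", "super", "awesome", "amazing"}

def score_comment(comment, article_keywords):
    words = [w.strip(".,!?").lower() for w in comment.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    length_score = min(len(words) / 30.0, 1.0)                        # brevity
    superlative_penalty = sum(w in SUPERLATIVES for w in words) / len(words)
    relevance = len(set(words) & article_keywords) / max(len(article_keywords), 1)
    # Grammar/spelling is skipped here; it would need a dictionary or language model.
    return round(length_score + relevance - superlative_penalty, 2)

keywords = {"panda", "penguin", "recovery", "comments", "quality"}
print(score_comment("Great post, super helpful!", keywords))   # short + superlatives: negative
print(score_comment("After the Panda refresh, our recovery came only once we pruned "
                    "thin comments and improved overall page quality.", keywords))
[/code]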
7:24 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:3050
votes: 581


I think that's simply it. The entire page, including comments, is considered the content. So the fluff is overwhelming the signal.


Sure. And in the case being discussed here, the "fluff" isn't just fluff: It's mostly nasty, abusive content of a kind that Google may not want to promote in its SERPs. After all, the person who's searching Google for "panda" or "penguin" or whatever probably wants useful information, not vituperative rants from trolls.

Some site owners may feel that an "anything goes" approach to UGC is essential to building community. They have the right to choose that path, but Google, Bing, and other search engines also have the right to judge whether the resulting pages are what searchers are looking for.
7:25 pm on Jan 2, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8552
votes: 239


Perhaps a schema.org solution


This comes up in the comments, and John Mueller says that it would not have an effect on the overall algorithmic quality rating:

structured data markup like that doesn't significantly change the quality of a page / site, so if you're looking to improve the quality, I'd start elsewhere.
7:26 pm on Jan 2, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8552
votes: 239


followup...


John Mueller 5 hours ago

I'd just see this like any other kind of UGC you might have on a site. In the end, the webmaster is the one who publishes the content and provides the framework for it to be crawled, indexed, and shown to users. It's not a matter of saying "please ignore this part, I didn't write it myself" (the random visitor of your site wouldn't do that either), it's really more of a matter of making sure that the site overall is of high quality.

For most sites, completely turning off UGC just because there's some low-quality UGC out there seems a bit too much to me, just like you wouldn't remove all comments on a blog just because there's some comment spam getting through. UGC can provide a lot of value, and if there's a passionate community on a site, I'd try to find the right balance or split it in a useful way. Other sites work hard to get that kind of UGC :)
9:42 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2609
votes: 92


@Martinibuster
I think that's simply it. The entire page, including comments, is considered the content. So the fluff is overwhelming the signal. The hardass response from Google could be that the ball is in Barry's side of the court to moderate the comments or come up with a technical solution to block the comments from crawlers. Would be nice to be able to no-index/no-follow a portion of a page, heh.
Ironic given that such a problem is really Google's doing in that its parsers cannot differentiate between content and UGC. With ordinary commenting systems, it would be a case of modifying the parsers. However there would be a scalability issue. The real problem is that custom commenting systems would require custom parsers. This is, I think, why the use of a separate URL for comments was being so strongly "suggested".

Oh! This is very interesting!
I think that the Spanish webmaster talking about a comparison website mentioned making some content in a page non-indexable, or something similar. The separate comment URL approach is messy, but if a site has been hit, then there's little to lose from a bit of experimentation.

Regards...jmcc
11:10 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:3050
votes: 581


The hardass response from Google could be that the ball is in Barry's side of the court to moderate the comments or come up with a technical solution to block the comments from crawlers. Would be nice to be able to no-index/no-follow a portion of a page, heh.


It might be nice for Barry, but would it provide a good experience for Google searchers? How would searchers benefit from having Google ignore low-quality content on a page?
11:49 pm on Jan 2, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3072
votes: 27


If you drill down into JM's line of conversation, he expresses a vagueness about what it could be (without analysis) and says he would need to dig deeper into it. But what I find intriguing is the talk about "negative comments" (phrases), excessive content that does not add value, proportions on a page, and assessments being made over the entire site.

In the conversation JM describes the concept of the algorithms assessing a block of content and determining whether it is quality or low quality. I would have thought Barry's blog was high quality, because he trawls the SEO world and surfaces content others don't have time to trawl. How can an algorithm determine if this is high or low quality? Is it based on the extent of the originality? I get a sense that false positives can play into this, and that something strong needs to be done to differentiate.

Maybe Barry has to think about writing in-depth, original articles and posting them on Forbes etc. That would be sad, IMO, given the value users place on his resource.

Then take this together with the questions the Spanish SEO asked regarding his client, who has a site in a highly competitive niche that relies on database-driven content. JM is clear in how he speaks about this: it is content that Google considers, and the layout / presentation is important. To me, the inference is that Google is watching how users interact with the UI.

Also, when asked whether JM had access to Panda analysis tools for specific websites, the answer seemed to be "yes", but it looked like JM was uncomfortable. My interpretation was both a need to be guarded and also a need to be open. I wonder what Google's policy is these days on discussing penalties. Has it loosened?

Some of you may have a keener technical eye for detail than me, so again your better inputs would be appreciated.

[edited by: Whitey at 11:59 pm (utc) on Jan 2, 2015]

11:54 pm on Jan 2, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 982
votes: 79


How would searchers benefit from having Google ignore low-quality content on a page?


How does Google identify low-quality? It isn't possible to ignore something you can't see.

There is a difference between a good book and a long book. Everything an algorithm considers must be quantifiable.
1:04 am on Jan 3, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month

joined:June 28, 2013
posts:3050
votes: 581


There is a difference between a good book and a long book.


There's also a difference between a page that's 100% useful content and one that's 20% useful content and 80% rants, hate comments, insults, etc.

Based on what John Mueller seems to be saying, I'd guess that Barry got himself into trouble by letting the inmates run the asylum. To solve the problem, he could:

- Use his editorial discretion to moderate the comments on his pages, so that comments add value to his posts (which they certainly aren't doing now);

or:

- Put the comments on separate pages (preferably pages that aren't indexed).
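The second option, sketched as a hypothetical server-rendered page: the comments live at their own URL carrying a robots noindex meta tag, and the article simply links to them. The template and URL scheme are invented for illustration:

[code]
from html import escape

def render_comments_page(post_slug, comments):
    # Standalone comments page: noindex,follow keeps it out of the index
    # while still letting crawlers follow the link back to the article.
    items = "\n".join(f"    <li>{escape(c)}</li>" for c in comments)
    return f"""<!doctype html>
<html>
<head>
  <meta name="robots" content="noindex,follow">
  <title>Comments: {escape(post_slug)}</title>
</head>
<body>
  <ul>
{items}
  </ul>
  <a href="/{escape(post_slug)}/">Back to the article</a>
</body>
</html>"""

print(render_comments_page("panda-4-1-comments", ["First!", "This matches what I saw in October."]))
[/code]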
1:06 am on Jan 3, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3072
votes: 27


@Wilburforce - if only we really knew. If I were a betting man, I'd say Google can see what we think is blocked and nofollowed, and it does take this into consideration when assessing quality. If the user can see it, so can Google. Forget about indexing; for users, it's there or it's not. Just my 2 cents' worth of speculation here.

But my speculation is short of scientific certainty.

@ergophobe - thanks for the update from JM. Kinda supports my earlier thought.

@EditorialGuy - actually, I did notice a comment on Barry's article re this, criticizing him for having lots of thin content pages and posting an example page. Not that I like rants and abuse, but the commenter may have a point around maintenance / cleaning up constantly. I know Barry claims not to earn anything from his efforts, but I'm sure it gives him recognition and business referrals. So maintenance may not be an excuse for forgetting about quality.

Just reminding folks there's a lot more to the hangout than just Barry's site. Thoughts and analysis appreciated if you can spare the time :)
3:37 am on Jan 3, 2015 (gmt 0)

Full Member

joined:Dec 11, 2013
posts:229
votes: 38


I'm not sure about a good definition of 'UGC.' It looks like WebmasterWorld is 100% UGC, then? Is UGC automatically penalized in search results?
3:47 am on Jan 3, 2015 (gmt 0)

Moderator

WebmasterWorld Administrator ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8552
votes: 239


@Whitey - RE your sticky mail asking where I was seeing Mueller's comments, I figured I'd answer here in case others were wondering...

They are in the comments to the video you linked to. I couldn't figure out how to link to them any more precisely.

Expand the replies to the query from John Britsios and you'll see Mueller's replies.
5:09 am on Jan 3, 2015 (gmt 0)

New User

joined:Jan 3, 2015
posts:6
votes: 2


Long-time lurker making my first post.

I think more webmasters would be inclined to entertain Google's suggestions if they led by example on their own properties, like YouTube. Show us how you want the content and comments presented and rid the domain of all the comments that add no value.
5:20 am on Jan 3, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3072
votes: 27


@ergophobe - thanks for the clarification. Is it possible to embed key videos into the discussions for future use with the time and text of commentary switched on?

@Bucklee - welcome to the forums
5:29 am on Jan 3, 2015 (gmt 0)

Junior Member

joined:May 16, 2014
posts:141
votes: 0


My overall opinion of John Mueller is that he speaks in generalities, with specifics rarely seen. In the portion of the video I watched, though, John Mueller seemed to be stating that the comments were the probable issue, which I found very refreshing.

There were several references to design, which I found interesting, and also a note that ads were not an issue in this case, though ad content could potentially be an issue for a site. UGC is kind of a double-edged sword: it works great, but you have to keep an eye on that other edge.
5:30 am on Jan 3, 2015 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 22, 2011
posts:98
votes: 0


@Bucklee, very true about the YouTube comments. At the end of the day, links and popularity determine what you can get away with. For example, smaller sites would never get away with the sheer duplication happening on Pinterest.
6:06 am on Jan 3, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member planet13 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 16, 2010
posts: 3823
votes: 29


I am not 100% sold on the comments being the root of the problem, and I would be reluctant to have any sort of strong suspicion that "hiding" comments from Googlebot will help rankings. (As far as I understand, Barry has made efforts to prevent Google from indexing comments - apologies if I understand this incorrectly.)

To me, it is not a preponderance of unhelpful content, but more a lack of extremely helpful content. Is his site being punished for having unhelpful comments? Or is his site being punished for NOT HAVING ENOUGH HELPFUL content? (Whether that content is generated by the site owner or the users.)

You can take away the comments from seroundtable, and you will still have a dearth of truly helpful content. No offense to Mr. Schwartz, but his site is more-or-less a directory that lists various SEO articles around the web, with minimal input from himself. Quite frankly, the BEST thing about his articles is the user-generated comments, and as people have mentioned, those comments aren't all that great.

And out of curiosity, what keywords SHOULD seroundtable rank for?

To be honest, I have always thought that seroundtable ranked higher than it should. I have often searched for SEO articles from different people and found that some of the seroundtable.com pages that REFERENCED the original article outranked the original article itself.

I figured that was part of the branding / trust algorithm kicking in. The articles on Search Engine Roundtable would be maybe two paragraphs tops, have a link to the original article, and then have 200 comments. None of the material on the seroundtable.com page would really make it more helpful than the original article.

~~~~~


And yeah, I watched the video and listened to JM's comments. While some people insist that JM was implying that enough bad comments could sink your ship, I didn't get that.

And if a simple ratio of bad user-generated content - compared to original, authoritative content - were the key to ranking well, then Yelp would have face-planted long ago. They have NO original content, and the reviews are often beyond ridiculous. (Reprinting the phone number and address of a business is NOT original content.) But they have enough helpful info in the reviews that Yelp still ranks extremely high in local search.
7:11 am on Jan 3, 2015 (gmt 0)

Full Member

5+ Year Member

joined:May 30, 2009
posts:233
votes: 7


I used to read that site often, but lately it's just, as someone said above, regurgitated fluff. And most of the comments come from the same people on almost every post, and add nothing useful or actionable.

As far as trying to figure out what Mueller really meant... it's moot... we never figured out what Cutts was trying to say, either. Very little of what they say is definitive and makes sense 100% of the time.
8:19 am on Jan 3, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 30, 2002
posts: 2609
votes: 92


@Wilburforce
How does Google identify low-quality? It isn't possible to ignore something you can't see.

There is a difference between a good book and a long book. Everything an algorithm considers must be quantifiable.
That's the big problem with Google's approach. You only have to read what was recommended as a "good" site to realise that Google has a very restricted and almost completely academically influenced view of what a "good" website is. It completely differs from the vast majority of real-world websites, but that doesn't stop Google trying to evaluate what is essentially a human response. This stunted academic view versus real-world view issue is at the heart of the problem, and one of the main reasons for Google's near-complete failure in social media. It might be very good on content-rich issues, but it is absolutely pathetic at what are essentially human value judgements - hence it has to have panels of "quality" raters.

People write differently depending on their audience. That means that if I were writing a technical article, I would use technical terms and not explain everything for people outside the field. If someone is writing a fanzine-type article about celebrities, it would be a completely different writing style. The content length might also be different. There's been a gradual change in writing styles over the past few decades: the stuffy academic tomes that Google seems to have in mind (large multi-page articles) have largely disappeared and have been replaced by bursty, short articles that are more like news bulletins than long, well-reasoned academic papers. Unfortunately, the long academic-style articles offer the most data for Google to analyse, whereas the short articles don't have enough content for reliable content analysis. And where catalogue and comparison sites are concerned, most pages will be, by their very nature, thin content.

I'm sure there are some Google patents where Google describes how it can measure quality in a webpage.

Regards...jmcc
9:46 am on Jan 3, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 982
votes: 79


@EG

There's also a difference between a page that's 100% useful content and one that's 20% useful content


Yes, we know that, but how does Google differentiate?

In the discussion JM refers to "difficulty" with what is principally that kind of differentiation on Barry's site.

It is no use writing "good content" - many of us already do - if Google is applying the same kind of grammatical evaluation that I get from MS products. Similarly, there is no point in blocking or editing UGC if Google cannot tell the difference between a rant and a sound argument.

The other question that isn't really addressed in that discussion is the search-term(s) affected. While "good content" might be front and centre on the page, the search term should be front and centre in the algorithm.

What JM is saying is that the importance of matching search-terms with content can be diluted by some "quality" evaluation which - by his own admission - has difficulty with some types of UGC.

I don't think we should assume that every time a site loses position it needs "cleaning up". Possibly it is the algorithm that requires it.