Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 100 message thread spans 4 pages: < < 100 ( 1 [2] 3 4 > >     
Google Rewrites Quality Guidelines
netmeg

WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributor of the Month
Msg#: 4686381 posted 3:06 pm on Jul 9, 2014 (gmt 0)

I'm going to post this link here, because it's from a credible source (WebmasterWorld user jensense) and because it touches on some of the things we've been discussing lately - particularly the Knowledge Graph. Interesting, and worth a read.

[thesempost.com...]

Here's the analysis on "Supplementary Content"

[thesempost.com...]

[edited by: brotherhood_of_LAN at 2:35 pm (utc) on Jul 11, 2014]
[edit reason] Added extra link [/edit]

 

jmccormac

WebmasterWorld Senior Member, 10+ Year Member, Top Contributor of the Month
Msg#: 4686381 posted 1:01 am on Jul 11, 2014 (gmt 0)

I think you misunderstand how human quality raters are being used. It would be idiotic for a search engine to create algorithms without human input or feedback.

I think that you don't understand how search engines are built. But then you are no different from the millions of other search engine users who see search engines as simple black boxes where a query provides a result.

In the past week, I've run 110K domain website usage and development surveys on COM/EU/FR/CO/CO.UK and am currently running other web usage surveys on a number of other TLDs. On June 26th, I ran website usage surveys on all domains in the top ten new gTLDs. These website usage surveys categorise how websites are being used in various TLDs. They rapidly categorise how websites are being used and identify holding page sites, PPC parked sites, compromised sites (something that Google cannot even do properly), clone websites, duplicate content websites, and brand protection registrations. These website usage surveys are precursors to building search engine indices. This gives me a bit more of an insight into how the web really looks. Last month, the surveys covered about 1M websites. I've also run multi-million domain website usage surveys. So what experience do you have in search engine development?
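The categorisation step jmccormac describes can be sketched with a couple of toy heuristics. Everything below is an illustrative assumption, not his actual method: the marker phrases are invented, and real surveys would use far richer signals (DNS, registrar data, templates, link graphs) than string matching and a content hash.

```python
import hashlib

# Illustrative marker phrases only; a production survey would use
# much richer signals than substring matching.
PARKING_MARKERS = ("domain is for sale", "buy this domain", "related searches")
HOLDING_MARKERS = ("coming soon", "under construction", "site is parked")

def categorise(html, seen_hashes):
    """Assign one coarse usage category to a fetched page."""
    text = html.lower()
    if any(marker in text for marker in PARKING_MARKERS):
        return "ppc-parked"
    if any(marker in text for marker in HOLDING_MARKERS):
        return "holding-page"
    # Exact-duplicate detection by content fingerprint: the second
    # site serving identical markup is flagged as a duplicate/clone.
    digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
    if digest in seen_hashes:
        return "duplicate-content"
    seen_hashes.add(digest)
    return "developed"
```

Run over a million fetched homepages, even heuristics this crude separate the parked and holding pages from developed sites, which is the point being made about pre-filtering an index.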

Most of the problems with Google are easily solved and can be solved from an algorithmic point of view. Of course to SEO fans, who really haven't much of a clue about this subject from the search engine developer point of view, this reliance on human quality raters is a great thing. It means that search switches from the objective to subjective opinion of a bunch of human raters who may or may not know anything about the subject of the site they are rating. So it is more money for the cargo-cult high priests while the fundamental problems of Google's search are ignored in favour of simple sticking plaster efforts.

Google's fundamental problem is this: GIGO (garbage in, garbage out). Rather than stopping the garbage getting into the index, Google is using the raters approach to figure out what is garbage and what is gold.

Regards...jmcc

[edited by: jmccormac at 1:36 am (utc) on Jul 11, 2014]

lucy24

WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributor of the Month
Msg#: 4686381 posted 1:02 am on Jul 11, 2014 (gmt 0)

In the science world papers can be published by ANYONE to be pulled apart, disproved and debunked, by fellow scientists.

Well, hardly. If a paper's background doesn't include peer review before publication, most scientists won't even consider it worth the bother of debunking. Unless, of course, some non-scientist who can't tell the difference gets hold of it. But that's another issue.

Now, if every website had to be approved by at least two or three other humans with websites before it could go online ...

Hm. Interesting line of thought.

Saffron
Msg#: 4686381 posted 4:58 am on Jul 11, 2014 (gmt 0)

I believe the most well known and highly regarded expert on feline genetics had no "formal" qualifications. But I guess Google would consider your everyday vet more qualified.

Another man I know of is well known for his expertise on a personality disorder; he spends his life writing about it. He has no formal qualifications, but he suffers from the disorder himself. Not everybody likes him, but he's certainly somebody who knows what he is talking about in regard to the condition.

It's just not always that black and white.

Jenstar

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 4686381 posted 11:56 am on Jul 11, 2014 (gmt 0)

A huge percentage of the people building websites in my niche are nowhere near claiming the title of "WEBMASTER". They're hobbyists/enthusiasts who put up some kind of website or page using the easiest way they can find.

Webmastering IS NOT what they are doing, or ever will be doing!


Well, we do see more and more of them popping up because suddenly their $200 AdSense check turned into $2 and they start trying to figure out why. But those don't tend to be in highly competitive fields (if they were, spammers would either try to spam them to death, or buy the forum/site to use and abuse).

I think having quality raters that aren't SEOs is the smart thing for Google to do. After all, SEOs aren't a very big % of legitimate searchers, and we tend to look at things much differently than my mom would.

I link to the Quality Raters' Guidelines now in the post netmeg linked (I am not sure whether the rules allow me to post it here), and I did an in-depth piece on the "Supplementary Content" that was talked about quite a bit here - local and small business ones will be coming next week by request :)

And I must say, I am impressed this thread managed to derail into the internet's most favorite topic ever... cats!

netmeg

Msg#: 4686381 posted 2:26 pm on Jul 11, 2014 (gmt 0)

I'll post it.

Here's the analysis on "Supplementary Content"

[thesempost.com...]

Catalyst

10+ Year Member
Msg#: 4686381 posted 6:04 pm on Jul 11, 2014 (gmt 0)

Bingo: just found a leaked copy. Just got my hands on it and am digging through. I hate to link to it in case Google finds out and buries it.

Also am really busy so don't want a bunch of PMs asking about it. So I guess I'll give hints, but Google can easily track it down that way too if they are reading this.

Hint: Ana Hoffman G+. (Mods can delete if it's best this info is not shared.)

Not much in it pertaining to local, which is my main focus. But I'm still digging around. Lots of info inside about how Google evaluates quality content.

Linda

LostOne

5+ Year Member
Msg#: 4686381 posted 8:42 pm on Jul 11, 2014 (gmt 0)

Lots of info inside about how Google evaluates quality content.


Yes, very interesting. Though I'm a bit spooked to look at the example PNGs, since they go to a Google login.

lucy24

Msg#: 4686381 posted 9:20 pm on Jul 11, 2014 (gmt 0)

Google can tell if users are finding what they want on your page by if those users go and click another search result within a very short time span of visiting your site

Is there a post somewhere that spells out "very short"? I guess it's another ante quem / post quem issue: if someone clicks on a second search result one minute after seeing your page, it means you didn't have what they're looking for. But if the clicks come only one second apart-- especially if there's a whole string of them-- that's someone opening results in tabs before even looking.
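The two-threshold reading of click gaps described above can be sketched as a tiny classifier. Both threshold values are hypothetical, chosen only to illustrate the distinction; nothing here is Google's actual logic:

```python
def classify_click_gaps(timestamps, tab_threshold=3.0, pogo_threshold=60.0):
    """Label the gaps between successive result clicks in one SERP session.

    Gaps under tab_threshold seconds look like someone opening results
    in tabs before reading any of them; gaps under pogo_threshold look
    like pogo-sticking (the page didn't satisfy); anything longer is
    treated as a genuine read. Both thresholds are made-up examples.
    """
    labels = []
    for earlier, later in zip(timestamps, timestamps[1:]):
        gap = later - earlier
        if gap < tab_threshold:
            labels.append("tab-opening")
        elif gap < pogo_threshold:
            labels.append("pogo-stick")
        else:
            labels.append("long-read")
    return labels

# Clicks one second apart: tabs opened up front, says little about quality.
# A click 30s after the previous one: that page likely didn't satisfy.
print(classify_click_gaps([0, 1, 2]))     # ['tab-opening', 'tab-opening']
print(classify_click_gaps([0, 30, 200]))  # ['pogo-stick', 'long-read']
```

The point of the sketch is that the same "clicked another result" event means opposite things depending on the gap, which is exactly why "very short" needs a definition.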

brotherhood of LAN

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month
Msg#: 4686381 posted 9:23 pm on Jul 11, 2014 (gmt 0)

>Is there a post somewhere that spells out "very short"?

I think the consensus is "relative to all the other results for that query", or similar queries.

lucy24

Msg#: 4686381 posted 9:28 pm on Jul 11, 2014 (gmt 0)

relative to all the other results for that query

That could lead to some interesting shuffling, since the last-clicked result on any one page will always have the longest lag time.

brotherhood of LAN

Msg#: 4686381 posted 9:34 pm on Jul 11, 2014 (gmt 0)

Perhaps, and that particular URL could get shuffled up the deck until that variable wouldn't apply, which'd level it out.

Historically rank #10 always got a better CTR than rank #9 just because it's at the bottom of the page. I imagine that kind of signal is something that Google are on top of.
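The positional-CTR point can be illustrated with a toy normalisation. The baseline numbers below are invented for the example (including a bump at rank 10 to mirror the bottom-of-page effect just mentioned); a real system would estimate them from query logs:

```python
# Invented per-rank baseline CTRs; rank 10 gets a bump over rank 9 to
# reflect the bottom-of-page effect.
BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.03}

def ctr_lift(rank, clicks, impressions):
    """Observed CTR divided by the positional baseline.

    A lift above 1.0 means the result out-pulls its slot (a candidate
    for shuffling up the deck); below 1.0 means it under-pulls.
    """
    observed = clicks / impressions
    return observed / BASELINE_CTR[rank]
```

Normalising per rank like this is what would stop a bottom-of-page result from being rewarded merely for the slot it happens to occupy.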

Jenstar

Msg#: 4686381 posted 9:44 pm on Jul 11, 2014 (gmt 0)

But if the clicks come only one second apart-- especially if there's a whole string of them-- that's someone opening results in tabs before even looking.


Google won't take the click behavior of one person into account (unless that person happens to be part of Google's internal search quality team!) so I don't think we need to worry about one person opening all results in individual tabs - that isn't really a widespread searcher behavior, but more of a one-off.

Now if the majority of 10,000 people only spend 15 seconds on your site before clicking on result #2, that is the kind of behavior to be worried about.

Shepherd
Msg#: 4686381 posted 1:33 am on Jul 12, 2014 (gmt 0)

I don't think we need to worry about one person opening all results in individual tabs - that isn't really a widespread searcher behavior, but more of a one-off.


That's interesting. I was just thinking about what I do: I, and many people I know, always right-click and open in a new tab when looking at the search results.

lucy24

Msg#: 4686381 posted 2:44 am on Jul 12, 2014 (gmt 0)

There's actually a Google search prefs setting to open links in a new window. It's the last thing on the Preferences page. afaik, they can't tell whether it's physically a window or a tab. That's a browser setting.

But I was thinking particularly of opening multiple results before you look at any of them. Are the people who do this the same people who select Recent Posts and open each one in a new tab? Probably...

Jenstar

Msg#: 4686381 posted 4:17 am on Jul 12, 2014 (gmt 0)

There's actually a Google search prefs setting to open links in a new window. It's the last thing on the Preferences page. afaik, they can't tell whether it's physically a window or a tab. That's a browser setting.


Well, your searches are tied to your Google profile (for that setting, you need to be logged in to save it), so I bet that is something they internally track. And it wouldn't be hard for them to see "this person opened 6 webpages in 5 seconds" either.

But there still is a time difference between the time it takes for 5 results to open in web tabs versus the time it takes to click a link, let the landing page load, realize it doesn't have what you want, click back and then click another result.

micklearn

5+ Year Member
Msg#: 4686381 posted 4:29 am on Jul 12, 2014 (gmt 0)

I think having quality raters that aren't SEOs is the smart thing for Google to do.


Sorry, but in my experience and from things I've read, there is no way Google could know with 100% certainty that it isn't hiring an SEO. There are many ways around that: one being having a spouse or friend sign up/get hired as a quality rater, with the SEO then stepping in and doing their thing.

Selen
Msg#: 4686381 posted 4:31 am on Jul 12, 2014 (gmt 0)

I read the guidelines. What worries me is that there is a focus on the mere existence of 'supplemental content' or custom 404 pages, but at the end of the day the quality of content is what matters. And I'm not really sure a rater is able to tell the difference. If content is good and detailed enough, the visitor doesn't have to look further for related content/links...

brotherhood of LAN

Msg#: 4686381 posted 4:44 am on Jul 12, 2014 (gmt 0)

> 'supplemental content'
> If content is good and detailed enough

I understand where you're coming from with that, but consider a document that has nothing but text (much like the early nineties web)... though perhaps in some cases it helps Google join the dots contextually where a visitor does not need it.

There are a lot of conceptual things that are hidden from the bot. One of the images in the document highlighted a "print a page" option as 'supplementary content' and that makes sense to me. There's such a link at the foot of the threads on here, and there has been for years. Same with "email a friend".

I can think of a lot of cases where content may not be obvious to a robot, like the "free tools" section of this site. There are hundreds of offerings of SEO tools on the web, some better than others, but it's hard for a bot to know that outside of citations (which we know can be a problematic area for measuring quality).

Jenstar

Msg#: 4686381 posted 5:21 am on Jul 12, 2014 (gmt 0)

Oh, I am sure there are some SEOs or friends/spouses of SEOs who are quality raters, but from what I understand, they try to avoid anyone with more than a passing knowledge of the industry.

Google not only wants the person to read great content, but also to have a great user experience wherever they are sent. Given two sites with similar content, one with great supplemental content and one with little or none, I can see why Google wants to send searchers to the place where the experience itself is better.

It's important not to forget that supplemental content is only one extra piece of the puzzle that was in the guidelines. But it is one of the parts that SEOs can easily change.

superclown2

WebmasterWorld Senior Member, 5+ Year Member, Top Contributor of the Month
Msg#: 4686381 posted 11:16 am on Jul 12, 2014 (gmt 0)

I've got sites that fit in with all the quality signals which are nowhere in the SERPs as well as others that break every rule in the book but which have still been consistent earners for years. No doubt the lessons to be learned from this 'leaked' article are valuable but they are not the only ones.

superclown2

Msg#: 4686381 posted 11:20 am on Jul 12, 2014 (gmt 0)

It's important not to forget that supplemental content is only one extra piece of the puzzle that was in the guidelines. But it is one of the parts that SEOs can easily change.


On a number of occasions in the past I've added good relevant supplemental information to sites only to watch them slide down the rankings soon afterwards. Handle with care is my advice.

Shepherd
Msg#: 4686381 posted 12:59 pm on Jul 12, 2014 (gmt 0)

I've got sites that fit in with all the quality signals which are nowhere in the SERPs


I think it's important to think about where this guide and the human rater fall in the life-cycle of a search result. It is my understanding that the human rater only comes into play once the algo has determined that a webpage should be returned in the search results. I see the human raters as grading the algo more so than the webpages themselves.

netmeg

Msg#: 4686381 posted 2:07 pm on Jul 12, 2014 (gmt 0)

You still have to be useful. I can write 1000 pages of deep content about Caesar's adventures in Gaul, and it probably won't do as well as a single page that solves some everyday problem in a new and unique way.

Good and detailed isn't enough. Someone has to *want* it.

I've got sites that fit in with all the quality signals which are nowhere in the SERPs as well as others that break every rule in the book but which have still been consistent earners for years. No doubt the lessons to be learned from this 'leaked' article are valuable but they are not the only ones.


These guidelines (and Jen's summaries) are great and useful, but they are still just one piece of the entire pie.

philgames
Msg#: 4686381 posted 2:21 pm on Jul 12, 2014 (gmt 0)

a new and unique way.


But how will people find your new and unique way? People won't be searching for the unique stuff, because it is unique, so you will still be competing in the tough generic keyword searches for the niche and be at the bottom of the pile. And then someone reads what you have put up, rewrites/rehashes your unique insight (and may or may not give a link), and what you have written is no longer unique.

To me, building relationships with local businesses and other websites is the only way to get links. But oh wait, that's against Google's guidelines.

EditorialGuy

WebmasterWorld Senior Member, Top Contributor of the Month
Msg#: 4686381 posted 3:03 pm on Jul 12, 2014 (gmt 0)

To me, building relationships with local businesses and other websites is the only way to get links. But oh wait, that's against Google's guidelines.


No, it isn't.

webcentric

WebmasterWorld Senior Member, Top Contributor of the Month
Msg#: 4686381 posted 3:14 pm on Jul 12, 2014 (gmt 0)

To me, building relationships with local businesses and other websites is the only way to get links. But oh wait, that's against Google's guidelines.


And this is where the FUD meets the real world, causing all sorts of knee-jerk reactions. When the goal is to please Google and your decisions are based on myth and speculation, it's not gonna be good for your user, so...

Rule #1: Consider your audience BEFORE you consider what Google wants. What your audience wants should lead you to a better understanding of what Google wants, but, if not, so what? If your visitor comes back to you over and over without stopping at Google first, who cares?

netmeg

Msg#: 4686381 posted 3:24 pm on Jul 12, 2014 (gmt 0)

But how will people find your new and unique way? People won't be searching for the unique stuff, because it is unique


Because I know how to work more channels than just organic search. And because if I don't think there's a reasonable demand for it, I won't bother.

And thne someone reads what you have put and then rewrites/rehashes your unique insight (may or may not give a link) and then what you have written is no longer unique.


Yep, and that will always be the case. It's a cost of doing business on the web. You have to have more authority (and expertise) than your scrapers. That's part of the job.

To me, building relationships with local businesses and other websites is the only way to get links. But oh wait, that's against Google's guidelines.


And this is where the FUD meets the real world, causing all sorts of knee-jerk reactions.


That's not FUD, that's a complete misunderstanding of the document and the environment. It's usually more useful to read and think than it is to skim and freak out.

superclown2

Msg#: 4686381 posted 3:24 pm on Jul 12, 2014 (gmt 0)

Another issue I have with these guidelines is a comment that the reviewer should reward constantly updated content. My experience is that making too many small changes to a site can condemn it to purgatory. However, major changes, like a complete re-write of individual pages, can have no effect whatsoever on where the various key phrases reach in the SERPs. These are B2C sites in the UK.

I see the human raters as grading the algo more so than the webpages themselves


That's an interesting thought. Are there any indications as to whether or not the raters have the power to alter the position of the reviewed site in the SERPs?

[edited by: superclown2 at 3:27 pm (utc) on Jul 12, 2014]

brotherhood of LAN

Msg#: 4686381 posted 3:27 pm on Jul 12, 2014 (gmt 0)

There are a lot of ideas to digest in those guidelines. Some of it is common sense, some not, some is new ground. Hopefully this thread has done its job of sharing the guidelines with people who can use them to their advantage.

Perhaps if anyone wants to talk more specifics, you can create a new thread? Cheers.

Selen
Msg#: 4686381 posted 3:46 pm on Jul 12, 2014 (gmt 0)

I understand where you're coming from with that, but consider a document that has nothing but text (much like the early nineties web)...

WebmasterWorld has nothing but useful text, and thousands of educated people fully enjoy it :).

n0tSEO
Msg#: 4686381 posted 3:54 pm on Jul 12, 2014 (gmt 0)

That's right, Selen!

Also, many academic pages with amazingly helpful content are nothing but text with some basic HTML formatting. Yet, they provide great value!

I really can't see a lot of wisdom in these new guidelines for quality raters.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved