CNN: Growing Backlash to AdSense Farm Update
Brett_Tabke (WebmasterWorld Administrator) posted 3:17 pm on Feb 26, 2011 (gmt 0)

[money.cnn.com...]

Google made one of the biggest changes ever to its search results this week, which immediately had a noticeable effect on many Web properties that rely on the world's biggest search engine to drive traffic to their sites.

The major tweak aims to move better quality content to the top of Google's search rankings. The changes will affect 12% of Google's results, the company said in a blog post late Thursday.

Comments from site operators lit up on the WebmasterWorld.com forum starting on Wednesday. Many webmasters complained that traffic to their sites dropped dramatically overnight, and others expressed concern that they can't adapt quickly enough to Google's changes to its algorithm.

 

tedster (WebmasterWorld Senior Member) posted 4:56 am on Feb 28, 2011 (gmt 0)

I think that's the problem with detecting user dissatisfaction for some content farm pages. The return to the SERPs is too slow, because it takes a decent amount of reading before users realize the article is totally messing with them. Because it's so slow, it doesn't meet the criteria for a bad result, where the user would bounce pretty darned fast.

Reno (WebmasterWorld Senior Member) posted 5:58 am on Feb 28, 2011 (gmt 0)

how can Google even begin, at an advanced level to detect "low quality"

A number of us have been saying for weeks now, "tell us exactly what is meant by 'quality'". Even a cursory review of threads will see that question raised over and over. They can't tell us because it's in the eyes of each viewer, not a dictionary, and one person's "low quality" is another person's "exactly the answer I was looking for". Google dug themselves a pit and now they've fallen into it.

..........................

fathom (WebmasterWorld Senior Member) posted 10:49 am on Feb 28, 2011 (gmt 0)

A number of us have been saying for weeks now, "tell us exactly what is meant by 'quality'".


Quality to a blind bot is a relative thing.

Even Wikipedia can be considered a content farm... it has a massive archive, and within that lots of rather "thin" pages, which look especially thin when compared to its genuinely compelling pages...

In context... Wikipedia's "thin" pages shouldn't get the VIP treatment simply because they're associated with its compelling pages... that would be unfair... wouldn't it?

Now apply that standard to all domains uniformly... and all domains are compared as equals.

This, IMHO, is still Google's link obsession... pages with lots of unique links pointing to them are treated as compelling, so pages with fewer (relative to the rest of the domain) are considered "low quality".

The "FARM" part is about cultivating a volume. An ezine isn't cultivating a single SEO page and that page acquires lots of unique links for SEO... it has 20,000 similarly crafted SEO pages for 20,000 different SEOs all after their page and none are consider "high quality" because none stand out in the crowd. It's up to the ezine to get rid of 19,999 and focus on the one!

All Google is saying is that quality trumps quantity... having 10 million pages of nothing much isn't better than a lot fewer pages that are spectacular!

Back at your simple question "tell us exactly what is meant by 'quality'"... that's easy... UNIQUE LINKS!

<added>In the same context, WebmasterWorld could also be a content farm... most threads get few unique external links, while some really outstanding threads get massive amounts of unique links because they are compelling... but not every thread on WebmasterWorld should be at the top of Google just because a few great resources are there.</added>
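
fathom's purely link-based definition of quality lends itself to a toy implementation: count unique referring domains per page and flag pages that fall below a threshold. The link records, page URLs, and threshold below are all invented for illustration, a minimal sketch and nothing like Google's actual signals.

```python
from collections import defaultdict

# Hypothetical inbound-link records: (referring_domain, target_page).
# In practice these would come from a crawl or a link-index export.
inbound_links = [
    ("example-blog.com", "site.com/page-a"),
    ("news-site.org",    "site.com/page-a"),
    ("forum.net",        "site.com/page-a"),
    ("example-blog.com", "site.com/page-b"),
]

def unique_linking_domains(links):
    """Count distinct referring domains for each target page."""
    sources = defaultdict(set)
    for domain, page in links:
        sources[page].add(domain)
    return {page: len(domains) for page, domains in sources.items()}

def flag_thin_pages(link_counts, all_pages, threshold=1):
    """By this purely link-based definition of quality, pages at or
    below the threshold of unique linking domains are 'thin'."""
    return [p for p in all_pages if link_counts.get(p, 0) <= threshold]

counts = unique_linking_domains(inbound_links)
pages = ["site.com/page-a", "site.com/page-b", "site.com/page-c"]
print(flag_thin_pages(counts, pages))
# -> ['site.com/page-b', 'site.com/page-c']
```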

TheMadScientist (WebmasterWorld Senior Member) posted 12:06 pm on Feb 28, 2011 (gmt 0)

Sure they can tell user feelings. They hit the back button and select another listing on the serp.

LMAO! Maybe we could score thread pages the same way? Do you have any idea how many pages (and probably full threads - 'bad results') you could eliminate using the same logic? I bet you could take 15 pages out right now based on my behavior last week alone.

Do you think it matters how I go back?
e.g., does it mean something different if I press 'CMD + [', press 'delete', or physically click?

Just because I wanted to see something different at the time I was clicking to and fro doesn't necessarily mean I liked or disliked what I saw somewhere ... It may well have been something I liked and would like to come back to at a later date - If my results change how can I do that? - I've got enough bookmarks already ... Or maybe it does mean I didn't like it and I just don't know I didn't?

[edited by: TheMadScientist at 12:23 pm (utc) on Feb 28, 2011]

iThink (10+ Year Member) posted 12:17 pm on Feb 28, 2011 (gmt 0)

Wikipedia's "thin" pages shouldn't get the VIP treatment


Couldn't agree more with this. A large number of Wikipedia pages with no more than 50 words of actual text about a topic rank for competitive keywords.

Today the debate is about whether content farms are worthy of getting traffic from Google. Tomorrow the debate will be whether forums (WebmasterWorld included) are worthy of getting traffic from Google. After all, most threads in most forums are just conversations among a group of people trying to state their opinions as facts. People with vested interests in promoting their goods or services often participate in forum threads, and they have no regard for the truth or accuracy of what they post. So the quality of content in forums is suspect. I tend to take most of what I read in forums with a pinch of salt. So should we expect Google to devalue forums in the rankings?

To me the internet looks more and more like a dictatorship, with Google ruling with an iron fist. History has shown us that weak dictators are the most insecure and the most worried about their survival, and they like to tinker with the rules all the time, mostly with negative results.

fathom (WebmasterWorld Senior Member) posted 12:33 pm on Feb 28, 2011 (gmt 0)

A large number of wikipedia pages with no more than 50 words of actual text about a topic, rank for some competitive keywords.
They did... but now?
iThink (10+ Year Member) posted 12:38 pm on Feb 28, 2011 (gmt 0)

They did... but now?


Maybe give a -30 penalty to Wikipedia, or is -90 better? :)

econman (10+ Year Member) posted 1:57 pm on Feb 28, 2011 (gmt 0)

how come it didn't learn what people like and take corrective action filtering out the 'content farms' one SERP at a time all along?


I don't have all the answers, but I think this update involves a fundamental shift in Google's goals -- for the first time they are attempting to assess the "quality" of a web page or website (we still don't know which).

Until now, they have been focused on "relevance" without regard to "quality." These are not the same thing.

As to why they are doing this in a big splash, rather than through tiny incremental changes -- perhaps to reap as much public relations benefit as possible?

And, perhaps to "shock" the system, so they have a sharp before and after contrast which allows them to do a clean A vs B analysis on the data they started obtaining late Thursday?

Brett_Tabke (WebmasterWorld Administrator) posted 1:59 pm on Feb 28, 2011 (gmt 0)

> it takes a decent amount of reading

For sure, time spent on an external page is important; however, we don't know the extent of that without seeing G's data. The fact that they return to the same SERP and then click on another link is much more telling to me. I think G would be looking for a 'dead head' search (where the searcher stops searching after a result) as a signal for higher ranking.

> you could eliminate using the same logic?

There is some merit to that line of thinking; however, we are not a search engine and that is not our mission in life. Our goal is on-site engagement, whereas Google's goal is higher OFF-site engagement.
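
To illustrate the distinction Brett is drawing, here is a rough sketch of how one query's click trail might be bucketed into 'pogo-sticking' versus a 'dead head' search. The session format, dwell threshold, and labels are assumptions made for the example, not anything Google has confirmed.

```python
from dataclasses import dataclass

@dataclass
class Click:
    url: str
    dwell_seconds: float   # time on the result before any return to the SERP
    returned_to_serp: bool

def classify_session(clicks, short_dwell=15):
    """Toy classifier for one query's click trail. A session that ends
    on a result (no return to the SERP) is the 'dead head' case that
    suggests satisfaction; rapid return-and-reclick on every result
    ('pogo-sticking') suggests bad results. Thresholds are invented."""
    if not clicks:
        return "abandoned without a click"
    if not clicks[-1].returned_to_serp:
        return "dead head (likely satisfied)"
    if all(c.dwell_seconds < short_dwell for c in clicks):
        return "pogo-sticking (likely dissatisfied)"
    return "ambiguous"

session = [Click("a.example", 8, True),
           Click("b.example", 5, True),
           Click("c.example", 240, False)]
print(classify_session(session))  # -> dead head (likely satisfied)
```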

TheMadScientist (WebmasterWorld Senior Member) posted 2:06 pm on Feb 28, 2011 (gmt 0)

...the fact that they return to the same SERP and then click on another link is much more telling.

But why did they return and click on another one?

Did the first one not answer the question?
Is it a new topic and they wanted confirmation of the first answer?

Why did they click on 3 different results?
Were they comparing services taking notes as they go?
Did they get conflicting answers from the first 2?

Why was the visit time longer on one than the other?
Did a site disable the back button with JS?
Or were they trying to find the answer they thought should be on the page and wasn't?

Those are only a few I can think of that don't tell you anything about correct result vs. incorrect result based on visit behavior ... There's too much you can't tell from click data and visit time, imo.

Brett_Tabke (WebmasterWorld Administrator) posted 2:14 pm on Feb 28, 2011 (gmt 0)

Exactly the right questions to ask of the data, TheMadScientist. Unfortunately, we don't have the data. Interpreting that data has to be a fairly involved thing, but with the quantity of traffic going through Google, there have to be parts of the data that stand out in detectable patterns. Remember, Google has a treasure trove of data to work with from multiple sources.

TheMadScientist (WebmasterWorld Senior Member) posted 2:23 pm on Feb 28, 2011 (gmt 0)

Yeah, I totally agree with you on the amount of data, and being able to tell what should or shouldn't be there to some extent on a broad scale I think is possible, but the granularity of personalization like they're trying for? Hmmm...

ADDED: Here's another example of 'can't tell' ... There are some sites I visit for longer than others, even though they don't have the answer I'm looking for, because I think 'WOW! This is a cool design, how can I incorporate something like this on, blah...' ... It sends a 'false positive' for the result relative to the query, even on a broad scale and more so on a personal one ... I personally think they're getting a bit too granular, given the individuality of the personalities involved, to actually give people what they're looking for ... It's really an interesting question and would be fun to try, but when I think about it, behavior without direct input from the user, imo, doesn't give anywhere near enough of a granular picture to say 'correct' or 'incorrect'.

Jon_King (WebmasterWorld Senior Member) posted 2:48 pm on Feb 28, 2011 (gmt 0)

Consider how the AdWords-to-organic click ratio plays into this. They have been playing with "organic" title tags, imo, to test influence on this ratio (and other signals).

i.e., if the title is changed to the most compelling possible based on existing user data, but the user comes back for more, or comes back quickly, or jumps into the AdWords column... what does that say about that page?

When we talk about time between SERP clicks, I think the data over hours or days, maybe even years, could be a stronger signal than seconds.
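
A small sketch of that multiple-time-scale idea: bucket the gap between a SERP click and the searcher's return into coarse windows, since a return days later reads very differently from a return seconds later. The window boundaries are arbitrary choices for illustration, not known Google parameters.

```python
from datetime import timedelta

def bucket_return_intervals(intervals):
    """Group the gap between a SERP click and the searcher's return
    into coarse windows. Boundaries are arbitrary for illustration."""
    buckets = {"under a minute": 0, "minutes": 0, "hours": 0, "days or more": 0}
    for gap in intervals:
        if gap < timedelta(minutes=1):
            buckets["under a minute"] += 1
        elif gap < timedelta(hours=1):
            buckets["minutes"] += 1
        elif gap < timedelta(days=1):
            buckets["hours"] += 1
        else:
            buckets["days or more"] += 1
    return buckets

sample = [timedelta(seconds=20), timedelta(minutes=5), timedelta(days=2)]
print(bucket_return_intervals(sample))
# -> {'under a minute': 1, 'minutes': 1, 'hours': 0, 'days or more': 1}
```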

mslina2002 (10+ Year Member) posted 2:48 pm on Feb 28, 2011 (gmt 0)

Well, let's take a look at the Super Bowl commercial as an example where the bloke used Google to find a mate, find a restaurant, shop for flowers, find a ring, etc.

These were all positive experiences that followed from positive searches. Nothing to do with bounce rate, time on site, keyword density, etc., which are all factors ON the site. The positive or successful experience was revealed by the search that followed.

Perhaps if enough people search for "best engagement ring" after visiting a dating site, that would deem that site a great user experience (vs. "how to get rid of a stalker"). LOL.
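
A toy version of that follow-on-search idea might classify the next query as a refinement of the last one (same task, possibly unsatisfied) or a new task (possibly satisfied). The token-overlap measure and cutoff below are invented for illustration; any real system would need far richer query understanding.

```python
def token_overlap(q1, q2):
    """Jaccard overlap between the word sets of two queries."""
    a, b = set(q1.lower().split()), set(q2.lower().split())
    return len(a & b) / len(a | b) if (a | b) else 0.0

def next_query_signal(query, next_query, overlap_cutoff=0.3):
    """Read the follow-on search as a satisfaction hint: a near-identical
    next query suggests the first results failed; an unrelated next query
    suggests the task moved on. The cutoff is arbitrary."""
    if next_query is None:
        return "no follow-on search (task may be complete)"
    if token_overlap(query, next_query) >= overlap_cutoff:
        return "refinement (previous results may have failed)"
    return "new task (previous search may have succeeded)"

print(next_query_signal("best dating site", "best engagement ring"))
# -> new task (previous search may have succeeded)
print(next_query_signal("best dating site", "best free dating site"))
# -> refinement (previous results may have failed)
```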

TheMadScientist (WebmasterWorld Senior Member) posted 2:53 pm on Feb 28, 2011 (gmt 0)

If the title is changed to the most compelling possible based on existing user data but the user...

But it's not, even if they would like to think it is, or they think that's what their data seems to be telling them ... I know, based on clicks for the titles and previews they refuse to use.

A site (page) in the top 10 on Google hasn't generated a click in 3 days with the title and preview they decided are best ... The same site (page) is listed in other places using the title, description, and preview of the actual page the visitor lands on, and it gets all kinds of clicks ... Unless the best title is the one that drives users to AdSense clicks, they're not using the best title or preview ... If they want AdSense clicks, then they probably nailed it!

Jon_King (WebmasterWorld Senior Member) posted 3:03 pm on Feb 28, 2011 (gmt 0)

Yes indeed TheMadScientist. The data can be applied in many ways...

Mslina, that way of interpreting positive vs. negative signals strikes me as important.

TheMadScientist (WebmasterWorld Senior Member) posted 3:24 pm on Feb 28, 2011 (gmt 0)

The data can be applied in many ways...

Yeah, imo too many ways ... One for each page in the system + One for each searcher + One for each search ... What's that Chaos Theory thing about? Little changes having a large effect like ripples on water, or something ... How many ripples are they putting into the system with the 'microgranularity' they insist on inserting?

Is there data to support the 'too many ways' and 'too much microgranularity' theory I have?
Seems there may be: Bing On the Up? [webmasterworld.com]

albo (5+ Year Member) posted 4:06 pm on Feb 28, 2011 (gmt 0)

I heartily agree with @Brett_Tabke's #4273671. And I think the goodness of G's change should be determined by the relevance and availability of user search results, of course.

But, as ReadWriteWeb pointed out this AM, '[W]hat's "useful" is in the eye of the beholder and in the eye of Google engineers, one might argue.'

RWW published a chart showing (according to Sistrix) a drastic decrease in hits for e.g. suite101. [readwriteweb.com...]

With this, I'm disappointed. But who am I? I searched for some general info on other things, and found I had to scroll through several pages of results before finding a useful result. But who am I? G's idea differs from mine. And so it shall be...

[edit to add URL and wisecrack: A while ago, in a different context, G said something like, "Another SE is just a click away." (In other words, "Like it or lump it.") Me? Click! Bing.]

freejung (WebmasterWorld Senior Member) posted 5:02 pm on Feb 28, 2011 (gmt 0)

But why did they return and click on another one?

Did the first one not answer the question? ...

I would think that with a large volume of data you wouldn't want to try to work out these kinds of questions on a case-by-case basis. Rather, you would want to develop several independent quality metrics, and then test them against each other and against known datasets to extract patterns, then apply those patterns to new datasets and test again.

User behavior will vary a lot from user to user, from site to site, and from one niche to another, but on a large scale patterns will emerge.
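
Here is a minimal sketch of the workflow freejung describes: combine several independent quality metrics as features, test them against a "known dataset" of rated pages, and validate on held-out data. The three metrics and the labels are simulated for illustration; real work would use actual human ratings and far richer features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000

# Three hypothetical per-page quality metrics (the independent signals):
# mean dwell time, unique linking domains, ad-to-content ratio.
X = np.column_stack([
    rng.exponential(60, n),   # mean dwell time in seconds
    rng.poisson(5, n),        # unique linking domains
    rng.uniform(0, 1, n),     # ad-to-content ratio
])

# The 'known dataset': simulated human quality ratings loosely tied
# to the metrics. Real work would use actual rater labels.
y = (0.02 * X[:, 0] + 0.3 * X[:, 1] - 2.0 * X[:, 2]
     + rng.normal(0, 1, n)) > 1.5

# Test the combined metrics against held-out folds, then the learned
# patterns could be applied to new, unrated pages.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y.astype(int), cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```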

moTi (WebmasterWorld Senior Member) posted 5:03 pm on Feb 28, 2011 (gmt 0)

How to detect quality? Define quality! Which kind of user wants or needs quality at all? And in which situation? Quality can be difficult to comprehend at times, all the more if you're in a hurry. Bad navigation forces longer visits; rubbish articles on crappy web pages take more time to scan. Doesn't AdSense contradict on-page quality? If complete information is given, there's no need to click an ad, and people who value quality don't like to click ads. And so on and so on...

What's relatively clear to me is that only a human, a sophisticated and educated one, can define and determine real quality. Not with click-through behavior, but with common sense. Google is lost there. Algos lack media competence. Just like most humans.

tedster (WebmasterWorld Senior Member) posted 5:09 pm on Feb 28, 2011 (gmt 0)

For several years, Matt Cutts, along with other Google spokespeople, has been talking about the fact that "bounce" is a noisy signal. Yes, they measure it, but apparently they can't use it for any heavy lifting in the algo.

Spencer (5+ Year Member) posted 5:16 pm on Feb 28, 2011 (gmt 0)

There's only one way to get through to Google, and that's to stop taking advantage of its money-making schemes. Until enough people stop using them, Google will continue to believe it has a legitimate mandate to carry on in the same way.

The problem with the SEO and webmaster community has always been its inability to speak with one cohesive voice, and until that happens it will never be heard.

apauto (5+ Year Member) posted 5:22 pm on Feb 28, 2011 (gmt 0)

For those of us who have been wrongly hurt in this update, what if we all submit "reinclusion" requests to see if Google could manually verify our sites?

Reno (WebmasterWorld Senior Member) posted 5:25 pm on Feb 28, 2011 (gmt 0)

Until now, they have been focused on "relevance" without regard to "quality." These are not the same thing.

Excellent observation. Websurfers are individuals who bring to their search their own set of criteria. I want a page without any popup windows, unsolicited music, slow-loading widgets, or megabyte graphics, and I want the answer to my query where I can see it.

Someone else may want a "cool" looking interface (I couldn't care less) and the latest Rihanna song playing in the background. To each their own. Google seems to have the idea that it can detect what I want, and it can't. Nor can it detect what anyone else wants. What it used to be able to do, and do fairly well, was help me find the answer to my query. When they were "relevant", they built their empire; but now, as they seek to define what we all should regard as "quality", they overstep their bounds, and the mess we are seeing is the result.

.....................

backdraft7 (WebmasterWorld Senior Member) posted 5:36 pm on Feb 28, 2011 (gmt 0)

@apauto - I suggest taking a few steps back, re-examine your site and fix anything that makes it spammy, thin or farm-like. Don't just swamp Google with re-inclusion requests. Make it a point to find a dozen things wrong with your site and fix them, then and ONLY then should you file a reinclusion request.

I suspect most will deny anything is wrong with their site. That's typical, and usually wrong.

kd454 (5+ Year Member) posted 5:49 pm on Feb 28, 2011 (gmt 0)

From what Matt said in his video, "re-inclusion requests" are only for manually given penalties; if the drop is caused by the algo, you have to fix whatever the algo flagged you for. Good luck trying to figure that out.

Spencer (5+ Year Member) posted 6:28 pm on Feb 28, 2011 (gmt 0)

Google is just the guy with the ball in the playground.
The guy with the ball gets to say what game is played.
Eventually, the playground will develop a new game.

Copeland (10+ Year Member) posted 6:41 pm on Feb 28, 2011 (gmt 0)

"Some of my sites were up 30% while others were down 20%. My content is all original and really strong. Not a content farm. I run ad-sense on some but not the others." - This type of post is worthless and clutters the forum. If I had one wish for WebmasterWorld for the coming year it would be for forum members to self police their posts by asking themselves "Is my post adding to the conversation or not?". "Reporting" how an update impacted you akin to saying "it snowed at my house too" after a blizzard. Really? Wow... thanks for the insight.

My thoughts on the update: Google identified the large content-syndication sites and devalued them along with sites linking to and from them. The Sistrix report (can I mention this on WBW?) shows a substantial list of these sites and their estimated traffic losses. My initial hypothesis is that Google identified variables associated with "content farms" (high publishing volume, poor inbound link profile) and devalued them along with sites that link to/from them. IMO, if eHow (again, hoping I can name them) indeed avoided a decline, it is because eHow's backlink profile is slightly better than the rest of the "content farms".

This is all just my hypothesis. Hoping others can chime in with more thoughts on what Google changed, and then how it impacted their particular situation.





apauto (5+ Year Member) posted 6:52 pm on Feb 28, 2011 (gmt 0)

@apauto - I suggest taking a few steps back, re-examine your site and fix anything that makes it spammy, thin or farm-like. Don't just swamp Google with re-inclusion requests. Make it a point to find a dozen things wrong with your site and fix them, then and ONLY then should you file a reinclusion request.

I suspect most will deny anything is wrong with their site. That's typical, and usually wrong.


There is nothing that makes it spammy. It's an eCommerce site with hand-written descriptions.

backdraft7 (WebmasterWorld Senior Member) posted 6:56 pm on Feb 28, 2011 (gmt 0)

There is nothing that makes it spammy.

No need to tell me...
You may want to explore other areas that could be making your site less relevant.
I certainly don't suggest knee-jerk changes, and your lower position may be temporary.
My point is simply to follow the WMGLs (Webmaster Guidelines) as closely as possible.
It's their sandbox; unfortunate as that may be, you gotta play by their rules.

Bewenched (WebmasterWorld Senior Member) posted 7:36 pm on Feb 28, 2011 (gmt 0)

@TheMadScientist
Someone asked why people 'hate' eHow in a thread and I don't remember exactly which one, but I'll tell you why I despise eHow, it's simple: jsNoFollow in the source code instead of proper, linked attribution like it should be. Makes Me Livid!


It's funny that you mention them. I just got two Google notifications of our name being used on their site. Yup, they quoted us word for word on their stupid site... gave us "bogus" link credit with their nofollow... but our site has an explicit copyright notice and they copied us anyway. GRRrr.
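
For readers wondering what that kind of neutered attribution looks like in practice, here is a hedged sketch: two invented attribution snippets, one a plain followable credit link and one stripped of value with rel="nofollow" and a JavaScript handler, plus a small check that tells them apart. The HTML is hypothetical, not taken from eHow's actual markup.

```python
from bs4 import BeautifulSoup

# Two invented attribution snippets: a plain, followable credit link
# versus one neutered with rel="nofollow" and a JavaScript handler.
proper = '<p>Source: <a href="https://example.com/article">Example.com</a></p>'
neutered = ('<p>Source: <a href="#" rel="nofollow" '
            'onclick="window.open(\'https://example.com/article\')">'
            'Example.com</a></p>')

def attribution_passes_credit(html):
    """True only if the attribution is a plain, followable link:
    a real href, no rel=nofollow, and no JS click handler."""
    link = BeautifulSoup(html, "html.parser").find("a")
    if link is None or link.get("href", "#") in ("", "#"):
        return False
    if "nofollow" in (link.get("rel") or []):
        return False
    if link.has_attr("onclick"):
        return False
    return True

print(attribution_passes_credit(proper))    # -> True
print(attribution_passes_credit(neutered))  # -> False
```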
