Earlier this week Google launched an algorithmic change that will tend to rank scraper sites or sites with less original content lower. The net effect is that searchers are more likely to see the sites that wrote the original content. An example would be that stackoverflow.com will tend to rank higher than sites that just reuse stackoverflow.com's content. Note that the algorithmic change isn't specific to stackoverflow.com though.
I know a few people here on HN had mentioned specific queries like [pass json body to spring mvc] or [aws s3 emr pig], and those look better to me now. I know that the people here all have their favorite programming-related query, so I wanted to ask if anyone notices a search where a site like efreedom ranks higher than SO now? Most of the searches I tried looked like they were returning SO at the appropriate times/slots now.
I just wanted to give a quick update on one thing I mentioned in my search engine spam post.
My post mentioned that “we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.” That change was approved at our weekly quality launch meeting last Thursday and launched earlier this week.
This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice. The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.
But I'm wondering: how does Google know which site is the "crap" one and which one isn't?
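Google hasn't published how this change decides which copy is the original, so the question stands. One well-known technique for *detecting* that two pages share copied text (though not for attributing originality) is w-shingling with Jaccard similarity. The sketch below is purely illustrative, not Google's actual method; the example strings are made up for the demo.

```python
# A minimal sketch of near-duplicate detection via w-shingling and
# Jaccard similarity. This is NOT Google's algorithm (which is
# unpublished); it only illustrates one classic way a search engine
# could measure how much of one page's text is copied from another.

def shingles(text, w=4):
    """Return the set of w-word shingles (overlapping word n-grams)."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical pages: an original Q&A title, a scraper copy with some
# boilerplate appended, and an unrelated page.
original  = "How do I pass a JSON body to a Spring MVC controller method"
scraped   = "How do I pass a JSON body to a Spring MVC controller method plus extra ads"
unrelated = "Amazon EMR lets you run Pig scripts against data stored in S3"

print(jaccard(shingles(original), shingles(scraped)))    # high overlap
print(jaccard(shingles(original), shingles(unrelated)))  # no overlap
```

A high score only says the pages share text; deciding which one *wrote* it first would need extra signals, such as crawl date or inbound links, which is presumably where the hard part of Google's change lives.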
"Hello world!" This is close to ridiculous!
Posted on January 26, 2011 by admin
Welcome to hhu Sites. This is your first post. Edit or delete it, then start blogging!
Those who achieved success using black-hat methods are not stupid, just lazy.
I think these pages have such distinctive uniqueness patterns on-page, due to all of the crazy mashup content that's going on, that they end up looking the most unique to Google for the query.
I don't know if that makes any sense or not, but that's what I am seeing.
Not all, but most of the sites that have moved up are ehow-type sites with large amounts of on-page junk all jumbled up with an article written by someone who really is clueless.
Why would anyone be worried about something that only affects 2% of queries, unless the changes are limited to specific verticals?
Basically we have five short-tail queries and about 1,000 long-tail queries. With 2% of queries affected, we would statistically lose about 0 short-tail queries and about 20 long-tail ones. Not something I would care about.
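The arithmetic behind that estimate can be checked directly, assuming (simplistically) that the 2% rate applies uniformly to every query:

```python
# Sanity check on the poster's back-of-the-envelope estimate:
# five short-tail queries, ~1,000 long-tail queries, and the
# "slightly over 2% of queries" rate from the announcement,
# applied uniformly (a simplifying assumption).

short_tail = 5
long_tail = 1000
affected_rate = 0.02

expected_short = short_tail * affected_rate   # 0.1 -> "about 0"
expected_long = long_tail * affected_rate     # 20.0 -> "about 20"

print(expected_short)  # 0.1
print(expected_long)   # 20.0
```

Of course, the uniformity assumption is the weak point: if the change concentrates on programming or how-to verticals, a site in those niches could see far more than 2% of its queries affected.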