|Google Using Site-Unique Bias?|
Is Google’s ranking algorithm biased against specific sites?
| 4:17 pm on Jun 19, 2006 (gmt 0)|
A report on the net suggests Google is using “site-unique bias” to suppress access by their users to sites they don’t like. Site-unique bias is the use of a ranking process (“algorithm”) that contains or refers to lists of domain names or other site-unique information such as unique character strings or phrases. Sites that are negatively affected still show up in a search for “site:domainname.com” but rank very poorly in any other search. Positive ranking bias for “friends of Google” could also be used.
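The mechanism the report describes can be sketched as a toy scoring function that consults a hard-coded list of penalized domains. This is purely illustrative: the domain names, scores, and penalty factor below are invented for the example and say nothing about how Google actually ranks.

```python
# Toy illustration of "site-unique bias": a ranking function that
# refers to a list of domain names and demotes (rather than removes)
# matching sites. All names and numbers are invented for the example.

PENALIZED_DOMAINS = {"example-buried.com"}  # hypothetical penalty list
PENALTY_FACTOR = 0.01  # demote heavily, but keep the site in the index

def rank(results):
    """results: list of (domain, base_score) tuples.
    Returns domains ordered best-first after applying the bias."""
    scored = []
    for domain, score in results:
        if domain in PENALIZED_DOMAINS:
            score *= PENALTY_FACTOR  # still indexed, ranks very poorly
        scored.append((domain, score))
    return [d for d, s in sorted(scored, key=lambda x: -x[1])]

ranking = rank([("example-buried.com", 0.9), ("other-site.com", 0.5)])
# the penalized site stays in the results but drops below weaker pages
```

The point of the sketch is that the same function runs for every site ("the same algorithm applies to everybody"), yet the embedded list makes the outcome site-specific.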
The report suggests Google may be switching from outright banning to site-unique bias because the new approach involves fewer hassles with site owners while otherwise delivering essentially the same benefits in suppressing user access to spam, competitive, inconvenient, or otherwise editorially undesirable sites. Site-unique bias is much harder for a site owner to prove than outright banning. Google can claim that “the same algorithm applies to everybody” (technically true), and that any sudden catastrophic loss of rank is “due to a change they made to their algorithm” (also technically true). Google can save money on “re-review” of complaining sites and avoid some potential legal problems associated with outright banning.
The Kinderstart case is cited as a demonstration of site-unique bias. Kinderstart’s company web site (kinderstart.com), as you might expect, ranks number one on a Yahoo or MSN search for their company name (“Kinderstart”), but ranks very poorly on Google, even on a search for the company name.
| 10:39 am on Jun 21, 2006 (gmt 0)|
I have no idea whether that is the case or not, but there is evidence that Google is 'demoting' sites, rather than necessarily banning them.
This new-ish behaviour is quite likely to protect them from litigation, and would be a good tactic in that area.
The main benefit to the wider web of such an algo change is rather different, however:
After an algo change, spammers often need to develop new tricks to replace those that no longer work. Once upon a time, they could try something and see if the site got banned; if it didn't, chances are they'd 'got away with it'.
Now, the spam development program is much more complicated, as Google does not give yes/no feedback to spammers. This slows them down, and makes for better serps for the rest of us.
| 3:52 pm on Jun 21, 2006 (gmt 0)|
My site has been similarly hit. A bunch of sites were hit with what you mention on April 26th. Some have returned after reinclusion requests... many have not. The classic symptoms, including for the site you mention, are exactly the same:
Buried on all search terms from obscure to competitive.
Buried for a "uniquecompanyname" search, as well as "uniquecompanyname.com"
Assuming the site has no subdomains, the index page will never appear as the first listing on a site:domainname.com search. A random page or two will always appear above the home / index page.
I have done everything possible to clean up even a minuscule hint of spam on my site, to no avail. I really have no clue what put me on the list; however, I am thinking I might as well start over with a new domain and a 301. Problem is, I rank well in other search engines.
[edited by: tedster at 6:21 pm (utc) on June 21, 2006]
| 2:45 pm on Jun 27, 2006 (gmt 0)|
Quad: I think that the advent of scraper sites changes the "benefit" that you mention. Now algo changes adversely affect normal "honest" web sites but do not adversely affect the spammer, who can merely rerun his scraper program to adjust his site to the new algo. The scraper script can do automated searches using their favorite search engine and steal the text from the top-ranked sites. When the algo changes, they just rerun the script.
The algo changes are hurting the good guys more than the bad guys.
| 2:48 pm on Jun 27, 2006 (gmt 0)|
Jim: If you start over with a new domain name and use a 301, won't Google find and ban or "demote" the new site by following the 301?
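The worry in that question can be made concrete with a toy sketch: if the crawler follows a 301 from a penalized domain, the penalty could simply travel with it. This is pure speculation matching the thread's hypothesis, not a description of any real system; every name below is invented.

```python
# Hypothetical sketch of penalty propagation through a 301 redirect.
# If a demotion list exists and the crawler follows redirects, the
# redirect target could inherit the flag. Invented names throughout.

def propagate_penalties(penalized, redirects):
    """penalized: set of flagged domains.
    redirects: dict mapping source domain -> 301 target domain.
    Returns the penalty set after following each redirect once."""
    updated = set(penalized)
    for source, target in redirects.items():
        if source in updated:
            updated.add(target)  # the new domain inherits the penalty
    return updated

# A site owner 301s their buried domain to a fresh one:
after_recrawl = propagate_penalties(
    {"old-domain.com"},
    {"old-domain.com": "new-domain.com"},
)
# on this model, the new domain ends up flagged as well
```

If something like this were in place, the 301 would defeat the purpose of the fresh start, which is exactly the question being asked.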
| 4:24 pm on Jun 27, 2006 (gmt 0)|
I realize that link dropping is frowned upon here, but it's hard to put much faith in an unattributed "report."
| 4:49 pm on Jun 27, 2006 (gmt 0)|
Easy to find by searching Google for the phrase.
It's all very convincing - use of ODP data is one thing that leads to getting buried in the SERPs.
I think of my main site and yes, I use reordered/recategorised ODP data, and it's buried. Proved then. The only problem is that I have two other sites that have been buried at some time that don't use ODP data.
So, we have had competing explanations for buried sites over the past 18 months:
Canonical www and non-www problems
Linking patterns (in and out)
Use of recycled ODP data
Everyone can choose the explanation that is most likely for their own site - if you have some spammy links then that's the problem, no 301 redirect for non-www then that's the problem, use ODP data then that's the problem. I'm sure this must be paralleled by a Star Trek episode where an alien force creates hallucinations that prey on each crew member's deepest fears.
| 5:40 pm on Jun 27, 2006 (gmt 0)|
Having sites buried in the SERPs because they incorporate recycled ODP data is hardly proof that, in the original poster's words, "Google is using 'site-unique bias' to suppress access by their users to sites they don’t like."
| 7:40 pm on Jun 27, 2006 (gmt 0)|
My take: if Google wants to ban a site, more power to them. After all, it's their servers and equipment. I do not agree with the Kinderstart lawsuit one bit.