
Google SEO News and Discussion Forum

Does Google Whitelist Certain Websites?
tedster
msg:4281618
8:32 pm on Mar 14, 2011 (gmt 0)

Webmasters have long wondered if Google maintains a "whitelist" that protects certain domains from penalties. As reported on SERoundtable today, Google issued a statement to clarify the question. Their answer denies anything like a "get out of jail free" card, but it explains what they call "exception lists".

Like other search engines (including Microsoft's Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don't keep a master list protecting certain sites from all changes to our algorithms.

The most common manual exceptions we make are for sites that get caught by SafeSearch - a tool that gives people a way to filter adult content from their results. For example, "essex.edu" was incorrectly flagged by our SafeSearch algorithms because it contains the word "sex." On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.

[seroundtable.com...]
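To make the SafeSearch example concrete, here is a minimal sketch of the kind of false positive Google describes and how a per-algorithm exception list would correct it. This is purely illustrative: the hash, the function name and the second domain are invented, not Google's actual code.

use strict; use warnings;

# Hypothetical sketch only - not Google's SafeSearch code.
my %safesearch_exceptions = map { $_ => 1 } ('essex.edu');   # per-algorithm exception list

sub looks_adult {
    my ($domain) = @_;
    return 0 if $safesearch_exceptions{$domain};   # rescue known false positives
    return 1 if $domain =~ /sex/;                  # naive substring check misfires on "essex"
    return 0;
}

print looks_adult('essex.edu'), "\n";               # 0 - excepted, not filtered
print looks_adult('adult-sex-widgets.example'), "\n";   # 1 - still filtered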

 

mromero
msg:4282976
2:50 pm on Mar 17, 2011 (gmt 0)

Madscientist

Mr. Wall has a good piece on this topic - the screen shots are there for anyone to see.

If this is not evidence of some type of (insert color here) list, I do not know what else it would be.

tedster - the CIA and State have tons of backlinks, and likely extra tonnage from the NSA that we cannot see. These guys cannot shoot straight, much less see a revolution coming <GRIN>

netmeg
msg:4283008
3:06 pm on Mar 17, 2011 (gmt 0)

(there is no "FAIR" in search)

yea I SAID it!

TheMadScientist
msg:4283031
3:32 pm on Mar 17, 2011 (gmt 0)

Mr. Wall has a good piece on this topic - the screen shots are there for anyone to see.

If it's the blog I stickied you about, then I'm not seeing a screen shot of anything except a result where Chrome is in an advertising spot above IE9, which isn't 'organic search whitelisting', and a page that had a tweet on it from a Large News Authority, which Does Not Mean it's there because of some kind of 'white list'; it just means a page had more 'weight' than it should. To the best of my knowledge Google has not said the Panda update was aimed at Content Farms ... We're the ones who have been saying that ... The Panda update, to the best of my knowledge, was aimed at eliminating low quality Sites (<-- this is a very key word imo), which would implicitly indicate it would be easier for an overall high quality site to be 'mis-ranked' too high occasionally.

Nothing I saw on the blog I looked at was any type of evidence of a 'white list' imo.

bwnbwn
msg:4283633
1:05 pm on Mar 18, 2011 (gmt 0)

If the guys who wrote the algo were just down the hall,
The boys that wrote the algo own the company, and I highly doubt any single department has full access to it. I am sure the departments are split up, each working on a small portion, and nobody can get at the whole code except maybe three people, if that many. Anything else would be a very stupid mistake on Google's part, and I know they are much too bright to allow that to happen. SERPs being fixed I don't believe at all, but sites that are major players in their field - yes, I have a strong feeling they are helped. But then again, we have seen where major players violated certain parts of Google's policy and were filtered.

aristotle
msg:4283818
7:21 pm on Mar 18, 2011 (gmt 0)

I strongly doubt that a manual whitelist exists.

Some sites might appear to be whitelisted, but this is the result of how the algorithm works. It enables some major sites to build up a high trustrank, and this high trustrank overcomes negative factors that would lower the rankings of lesser sites.

But the high trustrank isn't assigned manually. Instead, it is calculated by the algorithm from various factors. So there isn't any manual intervention or adjustment.
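A minimal sketch of that distinction, with invented signal names and weights (this is not Google's formula): the trust value falls out of measured factors rather than a lookup keyed on the domain name.

use strict; use warnings;

# Illustrative only - the signals and weights are made up.
sub trust_score {
    my (%signals) = @_;
    return 0.5 * $signals{link_authority}
         + 0.3 * $signals{content_depth}
         + 0.2 * $signals{site_age};
}

# Any site can earn a high score; nothing here checks which domain it is.
my $score = trust_score(link_authority => 0.9, content_depth => 0.7, site_age => 0.8);
print "$score\n";   # roughly 0.82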

Webwisemedia
msg:4283863
8:39 pm on Mar 18, 2011 (gmt 0)

I do believe that is true. What I have noticed is that Google is digging deeper with the algorithm, making sure the keywords and meta tags fit the content of your website. That is why Google keeps changing the look of its website and the way the organic section looks.

iamlost
msg:4283864
8:42 pm on Mar 18, 2011 (gmt 0)

There is various evidence that 'organic' query results are not always wholly 'natural':
* Google properties' positions hard coded or otherwise ensured.

* undisclosed exception lists utilised via undisclosed methodologies.

* prominent sites' immediate emergence following public mention of a situation. Similarly, the near-immediate drop or removal of sites once their misbehaviour becomes public.

In one sense, to expect otherwise would be foolish: why wouldn't a business ensure its own position?
An algorithm is a blunt instrument, almost requiring exception lists to mitigate collateral damage and manual interventions to correct gross (public) errors.

However, Google's default position of denial, followed by definition parsing, is too similar to the behaviour of, for instance, certain politicians caught in the wrong, to be accepted at face value. Once may be chance, twice might be coincidence, but we are long past either.

Call them what you will: Google 'helps' its algorithm via various non-algorithmic means.

TheMadScientist
msg:4283876
9:08 pm on Mar 18, 2011 (gmt 0)

They're still not Yahoo! [webmasterworld.com...]

Yahoo! had (has?) a patent for hand-editing results and that was all right and legal, but Google lets false positives through parts of its algo using an exception list and that's wrong? I doubt anyone is winning a rankings-related lawsuit any time soon ... They're opinions and they can obviously be hand edited.

anallawalla
msg:4284879
2:45 am on Mar 21, 2011 (gmt 0)

I have long believed a whitelist exists that includes a number of major publications. One reason I think so is because a number of such websites are not "penalized" for duplicate content (unlike lesser mortals like us) from sources like Reuters, AP,etc. (which is carried by a number of these big publications).


This is again a case of backlinks and not a whitelist. The newspapers that repeat the newswires are probably linked from more sites, and from higher-TrustRank sites, than the wire services, which most people only know as a byline.

I believe that the duplicate filter algorithm is given a lot less importance when the site is very trustworthy; otherwise the newswires would always be the first result. IOW, when relevance is evaluated, ceteris paribus, the duplicate site with the most TrustRank is displayed at the top. In practice, such pages contain a lot of side matter, which would also contribute to the relevance, whereas the Reuters original copy is probably a single page per story. (I didn't investigate.)
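A rough sketch of the selection being described, with hypothetical URLs and trust scores (the real duplicate handling is not public): among near-identical copies of a story, the copy on the highest-trust host is the one displayed.

use strict; use warnings;

# Hypothetical - pick one representative from a cluster of duplicate pages by trust.
my @duplicates = (
    { url => 'wireservice.example/story',  trust => 0.6 },
    { url => 'bignewspaper.example/story', trust => 0.9 },   # syndicated copy on a higher-trust host
);

my ($shown) = sort { $b->{trust} <=> $a->{trust} } @duplicates;
print "Displayed: $shown->{url}\n";   # the newspaper copy, not the wire original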

danimalSK
msg:4285035
1:41 pm on Mar 21, 2011 (gmt 0)


Question: what is the difference between a whitelist and an algorithm? This is an algorithm:

if ($domain_name =~ /wikipedia|google|youtube/) {
    $rank += 5;
}

I don't really see the difference (nor the point in this conversation). It's pretty clear Google tweaks the "algorithm" to prefer certain sites / characteristics; whether this is through a whitelist or not is a pretty moot point.

TheMadScientist
msg:4285040
1:50 pm on Mar 21, 2011 (gmt 0)

It's pretty clear Google tweaks the "algorithm" to prefer certain sites / characteristics;

That's what makes 'em a search engine and not a random jumble of results, isn't it? You're right, the point of this discussion is moot and imo borders on inane ... In some senses it's a completely ridiculous discussion.

To have a top 10 you have to favor certain sites or pages in one way or another, don't you? If people want to think Google has a white list that covers all algorithms and that's not fair, then so be it ... Life's not fair either.

danimalSK
msg:4285051
2:14 pm on Mar 21, 2011 (gmt 0)


That's what makes em a search engine and not a random jumble of results, isn't it?


Yea, sorry - on reflection that sentence isn't the most sensible thing I've ever written. But the point still stands: whether the algorithm is tweaked using a whitelist on Cutts' USB stick, a bunch of Larry Page clones turning dials on a Heath Robinson contraption, or a few hundred engineers writing code is irrelevant.

tedster
msg:4285055
2:16 pm on Mar 21, 2011 (gmt 0)

And yet - to be blunt - if someone doesn't see the difference between an exception list and a whitelist, then they're going to be crippled as they try to analyze Google results.

Sometimes I think people begin by being upset with Google - and then they look for ideas that justify their feeling. That, however, is not analysis and it generates terrible SEO. You might as well throw chicken bones.

TheMadScientist
msg:4285072
2:34 pm on Mar 21, 2011 (gmt 0)

Yea sorry on reflection that sentence isn't the most sensible thing I've ever written.

Actually, it's very sensible, imo.

...if someone doesn't see the difference between an exception list and a whitelist...

I've almost given up on confusing people with logic and reason rather than just letting their raw emotion run ... It's better for me and the sites I'm involved with that way ... If people want to think their site(s)' lack of rankings is due to a white list, then you're right, their SEO is going to be terrible; better for everyone else who takes the time to understand.

danimalSK
msg:4285075
2:38 pm on Mar 21, 2011 (gmt 0)

And yet - to be blunt - if someone doesn't see the difference between an exception list and a whitelist, then they're going to be crippled as they try to analyze Google results.

Sometimes I think people begin by being upset with Google - and then they look for ideas that justify their feeling. That, however, is not analysis and it generates terrible SEO. You might as well throw chicken bones.


Agreed. But what most people seem to be taking umbrage at is the idea of "a list", not whether there are per-algorithm exception lists, carte-blanche across-the-board whitelists, or can't-get-penalised-unless-the-NYT-calls-them-out lists.

Anyway, my point is there's basically no difference between a "list" and the algorithm. Matt Cutts can say it's always algorithm based, even if he just added another regex to the "Panda exception list" if statement. The algorithm is constantly tweaked, and that's the end of it. Whether that algorithm is tweaked via "a list" or a code change is irrelevant.
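A sketch of that point, echoing the earlier snippet (the list contents, regex and function are invented): an "exception list" is just data consulted by one more branch in the code, so the distinction between "a list" and "the algorithm" is mostly bookkeeping.

use strict; use warnings;

# Hypothetical - an "exception list" folded into the scoring code as one more branch.
my @panda_exceptions = qw(trustedsite.example anothersite.example);
my $exception_re     = join '|', map { quotemeta } @panda_exceptions;

sub panda_penalty {
    my ($domain, $raw_penalty) = @_;
    return 0 if $domain =~ /^(?:$exception_re)$/;   # the "list" is now just part of the algorithm
    return $raw_penalty;
}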

TheMadScientist
msg:4285076
2:45 pm on Mar 21, 2011 (gmt 0)

But what most people seem to be taking umbrage with is the idea of "a list"

I really think it has more to do with their rankings: not being able to regain them, not wanting to admit their site might not be the 'quality' or 'the answer' the end user(s) want to visit as much as they may like to think it is, and needing someone or something to blame other than themselves, more than it has to do with any list ... If the complainers thought they were on it, somehow I doubt they would be complaining any more, so it's not about the list, it's about their rankings, imo.

I can't ever remember being upset at Google for not ranking my site(s) as high as I would like ... They run their site and do the best they can, however they do it ... I run my sites the best I can ... Hopefully, the algo and I agree on a definition of quality and where a site or page should rank, but if not, then that's life ... Not everyone likes every site, including Google, to begin with, so their opinion of my site, my opinion of my site, and even the opinion of the end user of one of my sites may differ, but I guess that's where not relying solely on Google for business, and maturity, come into the decision-making process...

TheMadScientist
msg:4285083
3:01 pm on Mar 21, 2011 (gmt 0)

It's really sad to me how little respect people give the Googlers for running a great website ... That's what Google is ... It's a website ... They do the best they can with it, and between dissecting the English (and other) languages, handling spam, parsing, sorting, ordering, and returning results by choosing from around a trillion pages for something like a billion queries a day system-wide, I think they do a pretty f*ing good job, personally.

It's funny they only built and concentrate on one site, isn't it? Why is that? Why is their quality so much higher than everyone else's for the most part? How many sites do they have? One? What are they thinking?

I wonder what would happen if the people here who think more is better only concentrated on one website? How many sites does Wikipedia have? How many does the NYTimes? IRS.gov? Twitter? Facebook? How many sites do the 'big players' run?

I read so many posts about people running 50+ websites like it's something special ... Anyone can build a bunch of websites, but it takes something a bit more special to only need one or two, imo ... Maybe people should complain about the white list (exception list) and not being on it when they can honestly say they have a site so special they basically concentrate on running only one?

aristotle
msg:4285154
4:59 pm on Mar 21, 2011 (gmt 0)

if ($domain_name =~ /wikipedia|google|youtube/){
$rank += 5;
}



That is incorrect. The algorithm doesn't give special treatment to particular websites. Some sites gain a higher trustrank than others through algorithmic calculations. But all sites have the same chance.
