
Google's Banning Procedure Not Logical

banning the entire site instead of the pages that violated...

         

sohu8976

1:10 am on May 13, 2003 (gmt 0)

10+ Year Member



Hi, I am new here. From some search engine comparisons I have done (Google vs. some foreign-language search engines), I personally think Google's banning procedure (or penalty) is not very logical.

I will put it in simple terms, and my English is not very good, so please bear with me. :)

Say a website has over 10,000 pages, many volunteer editors, guest books, forums, etc., and some of its pages violate Google's rules, but the violations were not intentional. For example, say some guest left some funny "hidden" links in his/her posting, or one of the site's volunteer editors copied an article from about.com. Then the BIG GOOGLE SPAM filter comes in and BAM, this site is gone... maybe for good!

There are some foreign-language search engines that ban particular pages instead of the entire site, according to their spam policies. What do you guys think?

GoogleGuy

1:27 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We can do penalties on individual pages or parts of a site as well. If one person with a Geocities account is spamming, it wouldn't make sense to throw the entire site out.

BigDave

1:41 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sohu,

Are you suggesting that this has happened? Or is this a hypothetical worst case scenario that you fear might happen?

sohu8976

3:08 am on May 13, 2003 (gmt 0)

10+ Year Member




GoogleGuy, thanks for the clarification. Geocities is a good example. I assume Google has its own detailed policies regarding the banning process.

BigDave, the reason I posted this is that I have read many postings on the net about sites banned by Google and other search engines, and I have to say I found that many sites are banned entirely by Google or other engines because a few of the pages on the site violated the search engines' rules. My understanding of the banning process carried out by some foreign-language search engines is:

spider (deep crawl) all sites (of course excluding the extremely bad ones) every month -> filter out the bad pages (hidden links, duplicated content, etc.) -> show all legitimate pages (no matter whether the site spams or not).

For the extremely bad sites, send a deep crawler only every 3-6 months, etc...
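The per-page filtering step described above can be sketched in a few lines. This is a hypothetical illustration only, not any engine's actual code; the page fields and spam signals are invented for the example:

```python
def is_bad_page(page):
    """Flag a single page for spam signals (hidden links, duplicated content)."""
    return page.get("hidden_links", False) or page.get("duplicate", False)

def filter_site(pages):
    """Drop only the offending pages; keep every legitimate page in the index."""
    return [p for p in pages if not is_bad_page(p)]

# A made-up site mirroring the scenario in the thread:
site = [
    {"url": "/home"},
    {"url": "/guestbook", "hidden_links": True},  # a guest's hidden links
    {"url": "/article", "duplicate": True},       # copied from another site
    {"url": "/forum"},
]

indexed = filter_site(site)
print([p["url"] for p in indexed])  # ['/home', '/forum']
```

The point of the sketch is that the penalty is applied per page: the guestbook and copied article are dropped, while the rest of the site stays in the index.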

In my opinion, this is a very logical process...

BigDave

3:27 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It is also logical to make trying to cheat very expensive, so that those who do it have a bigger disincentive.

If you were in a poker game, and someone found some cards up your sleeve, they would not say "this hand does not count". At the very least they would never play with you again, at the most someone might find what was left of your body in a ditch.

If you are a real problem, why should Google bother with you? There are plenty of sites that do not cheat, so they can just use those.

It is also good to remember that of all the sites that claimed to be completely clean when banned, only a couple actually have been. Those webmasters are not telling you everything.

WebGuerrilla

3:30 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




soho,

The threat of losing the entire site is the only thing that guarantees that most of us will behave ourselves. :)

If Google were to only dump the actual page that contained offensive content, then Google would end up with much more of it.

sohu8976

4:48 am on May 13, 2003 (gmt 0)

10+ Year Member



Hi WebGuerrilla and BigDave:

I see your points. However, if Google can come up with a kick-ass AI, and since they never reveal their code, spammers may very well create test spam pages to keep probing the filter, but then the filter will only become more and more intelligent.

You may say that banning the entire site saves Google from worrying about the details and scares webmasters more, but there is a better way, which is to filter individual pages instead of sites...

A bad Chinese king once said he would rather kill 1,000 innocent people than risk letting one enemy slip through. I hope Google does not turn out to be a bad king in people's hearts. :)

BigDave

5:10 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If Google can come up with a kick-ass AI,

and

which is to filter individual pages instead of sites...

My simple reply is 5,000,000,000 pages.

They can only run very simple filters on *all* the pages. Fancy AI filters would only be able to run on a few million of the pages each month. And we are not even getting into the fresh pages here. It may happen in the future, but it ain't going to happen yet.
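The scale argument can be made concrete with back-of-the-envelope arithmetic. Every number below except the 5-billion page count from the post is an assumption invented for illustration:

```python
# Why cheap filters can cover the whole index while expensive "AI"
# filters cannot. Rates and machine counts are made up; only the
# page count comes from the post above.

total_pages = 5_000_000_000          # figure quoted in the post
seconds_per_month = 30 * 24 * 3600   # ~2.6 million seconds

# Cheap rule-based filter: fast, deployed across many machines (assumed).
simple_pages = 10_000 * 100 * seconds_per_month  # pages/sec * machines * time

# Fancy AI filter: slow, affordable on only a couple of machines (assumed).
ai_pages = 1 * 2 * seconds_per_month

print(simple_pages >= total_pages)   # True: simple filters cover everything
print(ai_pages)                      # 5184000: only a few million pages/month
```

Under these assumed rates, the simple filter covers the full index many times over each month, while the expensive filter reaches roughly one page in a thousand, which is the disparity BigDave is pointing at.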

With this disparity, they *must* have a disincentive for the cheaters.