

Does Google Ban or Filter Web Directories?

   
1:06 pm on Jul 28, 2005 (gmt 0)

10+ Year Member



I think this subject is worth a thread of its own. It's only a suspicion so far, but I don't see DMOZ, Yahoo, or any other major web directory being banned, filtered, or PRed to zero the way my web directory was. I checked in Alexa (powered by Google) and I still see some results from my site. Apparently Alexa serves old results from Google, though oddly Alexa itself has PR0 now. But that's another story!

If you run a web directory, feel free to post your experience here.

10:37 pm on Aug 9, 2005 (gmt 0)

10+ Year Member



I tried opening the following related subject in another thread, but apparently the WW moderators did not approve it.

If you have pages that do not comply with Google's webmaster guidelines (duplicate content, etc.), and it happens that these pages are very useful to your users, would banning Googlebot via robots.txt from crawling/indexing those pages be enough before filing a reinclusion request?

For example, assume your site was banned for carrying an ODP clone. Would excluding Googlebot from crawling/indexing the ODP section of your site help remove the ban?
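For reference, the robots.txt exclusion being discussed would look something like this (the /odp/ path is hypothetical - substitute wherever the ODP section actually lives):

```
User-agent: Googlebot
Disallow: /odp/
```

Note that robots.txt only stops future crawling; pages already in the index can linger, so a noindex meta tag on those pages may also be needed to get them dropped.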

10:51 pm on Aug 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My site was not a directory. It *was* the recipient of a site-wide link on another banned site. Perhaps the filter penalized both the "sellers" of links and the "buyers" of same.

Now that would be interesting - if they really penalized sites for ROS (run-of-site) links instead of just neutralizing them. I have seen evidence in the past (though not conclusive enough to state as fact) that ROS links did not help sites at all. I never believed in them anyway ;)

5:44 am on Aug 10, 2005 (gmt 0)

10+ Year Member



I tried opening the following related subject in another thread, but apparently the WW moderators did not approve it.
If you have pages that do not comply with Google's webmaster guidelines (duplicate content, etc.), and it happens that these pages are very useful to your users, would banning Googlebot via robots.txt from crawling/indexing those pages be enough before filing a reinclusion request?

For example, assume your site was banned for carrying an ODP clone. Would excluding Googlebot from crawling/indexing the ODP section of your site help remove the ban?

My best guess is no. When you file a reinclusion request, I don't think they spend much time reading about what you have done. They probably just visit the pages they flagged as problem pages, and if those are still there they move on to the next site. I think you are lucky if they spend more than 30 seconds reviewing your site.

5:52 am on Aug 10, 2005 (gmt 0)

10+ Year Member



I think you are lucky if they spend more than 30 seconds reviewing your site

C'mon, my logs show they spent a whole two minutes on their last review. Don't exaggerate ;)

2:06 pm on Aug 10, 2005 (gmt 0)

10+ Year Member



I am not. I can only speak for myself and my site ;o)
4:30 pm on Aug 10, 2005 (gmt 0)

10+ Year Member



I have a manually compiled directory - the equivalent of online yellow pages, 100% original content, which took more than 4 years to compile. Earlier I got very little traffic from Google; most of it came from Yahoo. Suddenly, since the start of this month, traffic from Google has increased a lot.
I have done no SEO and zero link exchanges, so I don't think they are banning real web directories that take a lot of effort.
8:50 pm on Aug 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



caran1

>>I have a manually compiled directory - online yellow pages equivalent, 100% original content, took more than 4 years to compile.<<

I think that manually compiled are the magic words in your case.

You may wish to see webdude's post (msg #52), which points in that direction:

[webmasterworld.com...]

10:27 pm on Aug 10, 2005 (gmt 0)

5+ Year Member



I still think what I thought on day one: they banned some spammers just to make a smokescreen for banning future competitors.
Anyway, my directory is recovering its traffic (from the serious search engines, of course - not Google) and life continues.

I would be proud to exchange links with any webmaster who has serious content but a website banned by Google.

In September I will start several anti-Google projects. I still think it is very important to let users know about Big G's perversions and contradictions.

3:11 pm on Aug 13, 2005 (gmt 0)

10+ Year Member



But still the magic of Google continues..
<snip>: nothing but an exact ODP copy with AdSense and some other ads. Thousands of pages indexed by Google, ranking high in the SERPs. Discrimination! :)

[edited by: lawman at 12:30 pm (utc) on Aug. 14, 2005]

3:26 pm on Aug 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



moftary

>> Discrimination!:)<<

No, no. That's what's called: Mutually Beneficial Business Relationships ;-)

3:40 pm on Aug 13, 2005 (gmt 0)

10+ Year Member



I wish it were, but no, my friend :)

Unfortunately my nuked directory was making thousands of dollars from AdSense alone, but at least it was only seeded by ODP - not an exact clone *sigh*

12:27 am on Aug 14, 2005 (gmt 0)

5+ Year Member



Some news - fresh to me, though probably not to others.

It seems Google bought the services of some hundreds (probably thousands) of kiddies from different universities to review the SERPs.
As far as my information goes, these kiddies are the ones giving subjective ratings and getting hundreds of honest websites banned...

Does anyone have more on that?

2:43 am on Aug 14, 2005 (gmt 0)



are these kiddies the ones giving subjective ratings and getting hundreds of honest websites banned...

It's highly unlikely that contract quality evaluators would have that authority. More likely scenarios are:

1) The evaluators flag sites for manual review by Google employees, and/or...

2) Their quality evaluations provide data that can be used to refine and test Google's search algorithms.

5:11 am on Aug 14, 2005 (gmt 0)

10+ Year Member



Their quality evaluations provide data that can be used to refine and test Google's search algorithms.

I think that's all it is..

IMO the ban was applied automatically by an algorithm, but if that's the case it's surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

12:18 pm on Aug 14, 2005 (gmt 0)

10+ Year Member



Algo changes are made to remove spammy sites from the serps. Whenever these changes are made there will always be some good sites that get knocked down along with the bad. Maybe it's time to start blaming the spammers instead of blaming google.
12:45 pm on Aug 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But still the magic of Google continues..
www.someonesdomain.net, nothing but an exact ODP copy with AdSense and some other ads. Thousands of pages indexed by Google, ranking high in the SERPs. Discrimination!

First of all, why are you putting other people's URLs into focus when you know that's not allowed? Second, I cannot find that site in the top 100 results for any topic in its categories - I have checked (just a dozen or so).

The only differences I see between the URL you mentioned and your own directories that were hit are:

They give attribution to dmoz.
They show the actual listings without making you scroll a page and a half past the ads.

IMO the ban was applied automatically by an algorithm, but if that's the case it's surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

Define legitimate, please. If you have multiple copies/mirrors of the same site, or a very high percentage of duplicate content taken from other places, you have something to worry about - if you don't, there isn't anything to worry about… yet.

1:40 pm on Aug 14, 2005 (gmt 0)

10+ Year Member



If you have multiple copies/mirrors of the same site, or a very high percentage of duplicate content taken from other places, you have something to worry about - if you don't, there isn't anything to worry about… yet.

One directory site that I thought might be a duplicate/mirror of mine has been removed from that directory at my request. Now that same site appears as a dead link, and it still remains in the SERPs as a dead link a week later. It is also still positioned ahead of mine most of the time. Shouldn't a dead link have zero value almost instantly in the SERPs? If it is a case of being given a grace period to recover from dead-link status, shouldn't it at least drop to the bottom of the list until it is officially declared zero value?
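The zero-value treatment described above is at least easy to approximate on one's own directory. A minimal sketch (the `check_status` callable is a stand-in for a real HTTP HEAD request, stubbed here so the example is self-contained):

```python
def prune_dead_links(entries, check_status):
    """Keep only directory entries whose URL still responds with HTTP 200.

    entries      -- list of (title, url) tuples
    check_status -- callable taking a URL and returning its HTTP status code
    """
    return [(title, url) for title, url in entries if check_status(url) == 200]

# Example with a stubbed status checker instead of a live HTTP request:
listings = [("Widgets Inc", "http://example.com/widgets"),
            ("Gone Site", "http://example.com/dead")]
alive = prune_dead_links(listings, lambda url: 404 if "dead" in url else 200)
```

In practice the checker would issue a HEAD request (e.g. via `urllib`) and treat timeouts as dead as well.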
3:45 pm on Aug 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Shouldn't any dead link have zero value almost instantly in the SERPs?

Yes, in a perfect world, but I have a site that was taken down for almost 3 years and its pages continued to rank in the top three positions. I recently brought the site back to life and plan on redeveloping it - why waste a domain that doesn't want to go away?

4:17 pm on Aug 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The Contractor

>> I have a site that was taken down for almost 3 years and its pages continued to rank in the top three positions.<<

That says more about the quality of Google's SERPs (The Graveyard) than about your site that refuses to die ;-)

4:27 pm on Aug 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That says more about the quality of Google's SERPs (The Graveyard) than about your site that refuses to die ;-)

Yeah, except it did just as well in Yahoo also...hehe

4:40 pm on Aug 14, 2005 (gmt 0)



IMO the ban was applied automatically by an algorithm, but if that's the case it's surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

A search algorithm is a work in progress, just like the Web itself.

It's reasonable to expect improvement but not perfection.

10:45 pm on Aug 14, 2005 (gmt 0)

10+ Year Member



Is it normal for banned sites to get cache updates? I just noticed that the cache of my main page was updated 11 Aug (banned 28 Jul).
12:40 am on Aug 15, 2005 (gmt 0)



>> Is it normal for banned sites to get cache updates? I just noticed that the cache of my main page was updated 11 Aug (banned 28 Jul)

You aren't banned, then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

1:44 am on Aug 15, 2005 (gmt 0)

10+ Year Member



You aren't banned, then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

I cannot find the term "banned" in any Google reference, so a "penalty" is the same as a "ban". A penalty does not mean that your site ranks #81479193 in the SERPs; it means that it is completely filtered/deindexed from the SERPs.

My sites that were banned disappeared completely from the index - filtered - but they were still cached. Then, day after day, they were deindexed, removed permanently, and accordingly dropped from the cache as well.

So I find it strange that Google updates its cache for a penalized/banned site. Equally strange is that some Googlebots are still spidering the penalized sites.

I tried opening a "why do Googlebots spider penalized sites?" thread, but it seems the moderators disapproved it.

3:05 am on Aug 15, 2005 (gmt 0)



So I find it strange that Google updates its cache for a penalized/banned site. Equally strange is that some Googlebots are still spidering the penalized sites.

That doesn't seem unreasonable, since the googlebots have no way of knowing when or whether a penalty might be lifted. If a penalty were lifted, Google would obviously want to have up-to-date data in its index.

3:52 am on Aug 15, 2005 (gmt 0)



From the 28th, when my site was banned, until the 5th, when I submitted a reinclusion request, Gbot only downloaded the sitemap, maybe 2-4 times a day, and did nothing else.

Since the 5th, about 25% of my pages have been spidered each day. But some pages are inexplicably visited repeatedly, so I don't think the coverage is necessarily complete.

My cached pages are still all dated pre July 28.

I take the spidering as an indication that there's still hope for reinclusion without further modification, but I'll give it another week before I block Gbot and begin making changes.

Since Y and M still provide some income from the (still) suspect pages, I won't remove them completely.

This would be easier if Google didn't make it into some petty guessing game. They are definitely losing what good will they may have had with me.
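(As an aside, the kind of coverage figure quoted above is easy to pull from raw access logs. A minimal sketch, assuming the common log format and simply matching any line whose user-agent mentions Googlebot - a stricter check would verify the IP as well, since the user-agent can be faked:)

```python
import re
from collections import Counter

def googlebot_hits_per_day(log_lines):
    """Count requests per day whose log line mentions 'Googlebot'."""
    # Matches the date portion of a common-log-format timestamp, e.g. [15/Aug/2005:...
    date_re = re.compile(r'\[(\d{2}/\w{3}/\d{4})')
    hits = Counter()
    for line in log_lines:
        if "Googlebot" in line:
            m = date_re.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [15/Aug/2005:04:01:00 +0000] "GET /sitemap.xml HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [15/Aug/2005:04:05:00 +0000] "GET /page1.html HTTP/1.1" 200 2048 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [15/Aug/2005:04:06:00 +0000] "GET /page1.html HTTP/1.1" 200 2048 "-" "Mozilla/4.0"',
]
counts = googlebot_hits_per_day(sample)  # -> Counter({'15/Aug/2005': 2})
```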

4:15 am on Aug 15, 2005 (gmt 0)

10+ Year Member



EFV, that makes sense, really.

andrea99, there is a very important issue we, the victims of Google, must be aware of. A spammer is a spammer, and all the search engines' guidelines are much the same. If your website is considered spam, you would be banned from all the search engines - and they would likely ban you even before Google noticed you were spamming, since their indexes are smaller than Google's. Anyway, the point is: if you are totally confident that you are on the right track, and you find every other search engine ranking you well except our beloved Google, then obviously they are the ones who should correct their mistake, not you.

Yes, they might have been providing 70% of your traffic, but a principle is a principle; it's not the case that all the world is wrong and Google is right.

Besides, when you send a reinclusion request and a Google engineer visits your site for a couple of minutes, then leaves your website banned as is - can't this engineer take a little of his time to let you know which part of the TOS your site broke? Or is it Google policy to make us play the guessing game?

On another thread we see Matt Cutts, a very friendly Google engineer, contacting a webmaster who was complaining about his site's penalty and telling him in deeeep detail where he went wrong. I think Google should make that the standard and start being a good boy again :)

Peace

5:00 am on Aug 15, 2005 (gmt 0)

10+ Year Member



You aren't banned, then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

site:www.domain.com = no results since 28 jul
link:www.domain.com = no results since 28 jul
cache:www.domain.com = 11 aug cache

That doesn't seem unreasonable, since the googlebots have no way of knowing when or whether a penalty might be lifted. If a penalty were lifted, Google would obviously want to have up-to-date data in its index.

My site still gets spidered a little, but there is a big difference between before and after 28 Jul.
It almost looks like Googlebot gets 2-3 pages deep before it suddenly remembers that I am banned ;o)

11:53 am on Aug 15, 2005 (gmt 0)

10+ Year Member



Doesn't Google work like this:

1. It sends out spiders.
2. It grabs the page data, and saves it.
3. Other software goes through the saved data, and decides its value?

No. 3 happens separately, on a hard disk in America somewhere. That's where sites get filtered - the bots don't do it. If you're on a ban list, that's where you're filtered out.
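(That crawl-then-filter separation can be sketched roughly like so - a toy model, not Google's actual pipeline; `fetch` here is a stand-in for a real HTTP fetch:)

```python
def crawl(urls, fetch):
    """Steps 1-2: spiders grab and save page data indiscriminately."""
    return {url: fetch(url) for url in urls}

def build_index(stored_pages, ban_list):
    """Step 3: a separate pass evaluates the saved data and drops banned hosts."""
    return {url: content for url, content in stored_pages.items()
            if not any(banned in url for banned in ban_list)}

# The bot saves everything; the ban list is only applied at index-build time,
# which is consistent with banned sites still being spidered.
pages = crawl(["http://good.example/", "http://spam.example/"],
              fetch=lambda url: "<html>%s</html>" % url)
index = build_index(pages, ban_list=["spam.example"])
```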

I got my sites back in within a week by completely removing the 'grey' content: pseudo-directory pages with the links running through a redirect script. If the junk is still there when the engineer comes to visit, I imagine you stay banned.

1:29 pm on Aug 15, 2005 (gmt 0)

10+ Year Member



And what benefit does Google get from spidering banned sites? Do they have too many resources?
Also, the penalty is clearly connected to Googlebot activity: before the penalty I had tens of Googlebots spidering my directory; after the ban it is only one or two.
