Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 588 message thread spans 20 pages: < < 588 ( 1 ... 8 9 10 11 12 13 14 15 16 17 [18] 19 20 > >     
Does Google Ban or Filter Web Directories?
moftary
msg:726185
 1:06 pm on Jul 28, 2005 (gmt 0)

I think the subject is worth a thread of its own. It's only a suspicion so far, but I don't see dmoz, Yahoo, or any other major web directory getting banned/filtered or PRed to zero the way my web directory was. I tried checking in Alexa (powered by Google) and I see some results from my site. Apparently Alexa serves old results from Google, but the weird thing is that Alexa itself has PR0 now. But that's another story!

If you run a web directory, feel free to post your experience here.

 

moftary
msg:726695
 10:37 pm on Aug 9, 2005 (gmt 0)

I tried opening the following related subject in another thread, but apparently the WW moderators did not approve it.

If you have pages that do not comply with Google's webmaster guidelines (duplicate content, etc.) and it happens that these pages are very useful for your users, would blocking Googlebot from crawling/indexing those pages via robots.txt be enough before sending a reinclusion request?

For example: assume your site is banned for having an ODP clone. Would excluding Googlebot from crawling/indexing the ODP section of your site help get the ban removed?
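For reference, the kind of robots.txt exclusion being discussed would look roughly like this - a minimal sketch, assuming (hypothetically) that the ODP clone lives under a /directory/ path:

```
# Hypothetical robots.txt: block Googlebot from an ODP-derived section
# while leaving the rest of the site crawlable.
User-agent: Googlebot
Disallow: /directory/

# All other crawlers may fetch everything.
User-agent: *
Disallow:
```

Note that robots.txt only stops compliant crawlers from fetching those pages; URLs Google already knows about can take time to drop out, and blocking crawling is not by itself the same as removal from the index.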

The Contractor
msg:726696
 10:51 pm on Aug 9, 2005 (gmt 0)

My site was not a directory. It *was* the recipient of a site wide link on another banned site. Perhaps the filter penalized both the "sellers" of links and the "buyers" of same.

Now that would be interesting if they really would penalize a site for ROS links instead of neutralizing the ROS links. I have seen evidence (although not conclusive enough to state it as fact) in the past where ROS links did not help sites at all. I never believed in those anyways ;)

runboy
msg:726697
 5:44 am on Aug 10, 2005 (gmt 0)

I tried opening the following related subject in another thread, but apparently the WW moderators did not approve it.
If you have pages that do not comply with Google's webmaster guidelines (duplicate content, etc.) and it happens that these pages are very useful for your users, would blocking Googlebot from crawling/indexing those pages via robots.txt be enough before sending a reinclusion request?

For example: assume your site is banned for having an ODP clone. Would excluding Googlebot from crawling/indexing the ODP section of your site help get the ban removed?

My best guess is no. When you send a reinclusion request, I don't think they spend much time reading about what you have done. They probably just visit the pages they have flagged as problem pages, and if those are still there they move on to the next site. I think you are lucky if they spend more than 30 seconds reviewing your site.

moftary
msg:726698
 5:52 am on Aug 10, 2005 (gmt 0)

I think you are lucky if they spend more than 30 seconds reviewing your site

C'mon, my logs show that they spent a whole two minutes on their last review. Don't exaggerate ;)

runboy
msg:726699
 2:06 pm on Aug 10, 2005 (gmt 0)

I am not. I can only speak for myself and my site ;o)

caran1
msg:726700
 4:30 pm on Aug 10, 2005 (gmt 0)

I have a manually compiled directory - an online yellow pages equivalent, 100% original content, which took more than 4 years to compile. Earlier I got very little traffic from Google; most of it came from Yahoo. Suddenly, since the start of this month, the traffic from Google has increased a lot.
I have done no SEO and zero link exchanges, so I don't think they are banning real web directories that take a lot of effort.

reseller
msg:726701
 8:50 pm on Aug 10, 2005 (gmt 0)

caran1

>>I have a manually compiled directory - online yellow pages equivalent, 100% original content, took more than 4 years to compile.<<

I think that manually compiled are the magic words in your case.

You may wish to see webdude's post (msg #52), which points in that direction:

[webmasterworld.com...]

carlosnx
msg:726702
 10:27 pm on Aug 10, 2005 (gmt 0)

I still think the same as on day one: they banned some spammers just to create a smokescreen for banning future competitors.
Anyway, my directory has recovered its traffic (from the serious search engines, of course - not Google) and life continues.

I will be proud to exchange links with any webmaster who has serious content but whose website has been banned by Google.

In September I will start several anti-Google projects. I still think it is very important to let users know about Big G's perversions and contradictions.

moftary
msg:726703
 3:11 pm on Aug 13, 2005 (gmt 0)

But still the magic of Google continues..
<snip>, nothing but an exact ODP copy with AdSense and some other ads. Thousands of pages indexed by Google and ranking high in the SERPs. Discrimination! :)

[edited by: lawman at 12:30 pm (utc) on Aug. 14, 2005]

reseller
msg:726704
 3:26 pm on Aug 13, 2005 (gmt 0)

moftary

>> Discrimination!:)<<

No..no. That's what's called: Mutually Beneficial Business Relationships ;-)

moftary
msg:726705
 3:40 pm on Aug 13, 2005 (gmt 0)

I wish it was, but no, my friend :)

Unfortunately my nuked directory was making thousands of dollars from AdSense alone, but at least it was only seeded from ODP, not an exact clone *sigh*

carlosnx
msg:726706
 12:27 am on Aug 14, 2005 (gmt 0)

Some news - fresh for me, but probably not for others.

It seems Google has bought the services of some hundreds (probably thousands) of kids from different universities to review the SERPs.
As far as my information goes, these kids are the ones giving subjective evaluations and getting hundreds of honest websites banned...

Does anyone have more on this?

europeforvisitors
msg:726707
 2:43 am on Aug 14, 2005 (gmt 0)

these kids are the ones giving subjective evaluations and getting hundreds of honest websites banned...

It's highly unlikely that contract quality evaluators would have that authority. More likely scenarios are:

1) The evaluators flag sites for manual review by Google employees, and/or...

2) Their quality evaluations provide data that can be used to refine and test Google's search algorithms.

moftary
msg:726708
 5:11 am on Aug 14, 2005 (gmt 0)

Their quality evaluations provide data that can be used to refine and test Google's search algorithms.

I think that's the only case that applies..

IMO the banning happened automatically, via an algorithm - but if that's the case, it is surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

voices
msg:726709
 12:18 pm on Aug 14, 2005 (gmt 0)

Algo changes are made to remove spammy sites from the serps. Whenever these changes are made there will always be some good sites that get knocked down along with the bad. Maybe it's time to start blaming the spammers instead of blaming google.

The Contractor
msg:726710
 12:45 pm on Aug 14, 2005 (gmt 0)

But still the magic of Google continues..
www.someonesdomain.net, nothing but an exact ODP copy with adsense and some other ads. Thousands of pages indexed by Google and ranking high in the SERPs. Discrimination!

First of all, why are you putting other people's URLs into focus when you know that's not allowed? Second, I cannot find that site in the top 100 results for any topic in its categories - I have checked (just a dozen or so).

The only differences I see between the URL you mentioned and your own directories that were hit are:

They give attribution to dmoz.
They show the actual listings without a 1-1/2 page scroll down past the ads.

IMO the banning happened automatically, via an algorithm - but if that's the case, it is surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

Define legitimate, please. If you have multiple copies/mirrors of the same site, or a very high percentage of duplicate content taken from other places, you have something to worry about – if you don't, there isn't anything to worry about…yet.

frfvr
msg:726711
 1:40 pm on Aug 14, 2005 (gmt 0)

If you have multiple copies/mirrors of the same site, or very high percentage of duplicate content taken from other places you have something to worry about – if you don't, there isn't anything to worry about…yet.

One directory site that I considered could be a duplicate/mirror of mine has been removed from that directory at my request. Now that same site appears as a dead link, and it still remains in the SERPs as a dead link a week later. Also, it's still positioned ahead of mine most of the time. Shouldn't any dead link have zero value almost instantly in the SERPs? If it is a case of being given a grace period to recover from dead-link status, shouldn't it at least drop to the bottom of the list until officially declared zero value?

The Contractor
msg:726712
 3:45 pm on Aug 14, 2005 (gmt 0)

Shouldn't any dead link have zero value almost instantly in the SERPs?

Yes, in a perfect world, but I have a site that was taken down for almost 3 years and its pages continued to rank in the top three positions. I recently brought the site back to life and plan on redeveloping it - why waste a domain if it doesn't want to go away?

reseller
msg:726713
 4:17 pm on Aug 14, 2005 (gmt 0)

The Contractor

>> I have a site that was taken down for almost 3 years and its pages continued to rank in the top three positions.<<

That says more about the quality of Google's SERPs (The Graveyard) than about your site that refuses to die ;-)

The Contractor
msg:726714
 4:27 pm on Aug 14, 2005 (gmt 0)

That says more about the quality of Google's SERPs (The Graveyard) than about your site that refuses to die ;-)

Yeah, except it did just as well in Yahoo also...hehe

europeforvisitors
msg:726715
 4:40 pm on Aug 14, 2005 (gmt 0)

IMO the banning happened automatically, via an algorithm - but if that's the case, it is surely a weak algorithm, since it caught some scraper sites, left other scraper sites alone, and hit some legitimate ones.

A search algorithm is a work in progress, just like the Web itself.

It's reasonable to expect improvement but not perfection.

runboy
msg:726716
 10:45 pm on Aug 14, 2005 (gmt 0)

Is it normal for banned sites to get updates in the cache? I just noticed that the cache of my main page was updated Aug 11 (banned Jul 28).

walkman
msg:726717
 12:40 am on Aug 15, 2005 (gmt 0)

>> Is it normal for banned sites to get updates in the cache? I just noticed that the cache of my main page was updated Aug 11 (banned Jul 28)

You aren't banned then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

moftary
msg:726718
 1:44 am on Aug 15, 2005 (gmt 0)

You aren't banned then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

I cannot find the term "banned" in any Google reference, so a "penalty" is the same as a "ban". A penalty does not mean that your site ranks #81479193 in the SERPs; it means that it's completely filtered/deindexed from the SERPs.

My sites that were banned disappeared completely from the index - filtered - but they were still cached. Then, day after day, they got deindexed, removed permanently, and accordingly removed from the cache as well.

So I find it strange that Google updates its cache for a penalized/banned site. Just as strange is that some Googlebots are still spidering the penalized sites.

I tried opening a "why do Googlebots spider penalized sites?" thread, but it seems the moderators disapproved it.

europeforvisitors
msg:726719
 3:05 am on Aug 15, 2005 (gmt 0)

So I find it strange that Google updates its cache for a penalized/banned site. Just as strange is that some Googlebots are still spidering the penalized sites.

That doesn't seem unreasonable, since the googlebots have no way of knowing when or whether a penalty might be lifted. If a penalty were lifted, Google would obviously want to have up-to-date data in its index.

andrea99
msg:726720
 3:52 am on Aug 15, 2005 (gmt 0)

From the 28th, when my site was banned, until the 5th, when I submitted a reinclusion request, Gbot only downloaded the sitemap, maybe 2-4 times a day, and did nothing else.

Since the 5th I have been getting about 25% of my pages spidered each day. But some pages are inexplicably visited repeatedly, so I don't think the coverage is necessarily complete.

My cached pages are still all dated pre-July 28.

I take the spidering as an indication that there's still hope for reinclusion without further modification, but I'll give it another week before I block Gbot and begin making changes.

Since Y and M still provide some income from the (still) suspect pages, I won't remove them completely.

This would be easier if Google didn't make it into some petty guessing game. They are definitely losing what goodwill they may have had with me.

moftary
msg:726721
 4:15 am on Aug 15, 2005 (gmt 0)

EFV, that makes sense, really.

andrea99, there is a very important issue that we, the victims of Google, must be aware of. A spammer is a spammer, and all search engines' guidelines are about the same. If your website is considered spam, you would be banned from all the search engines - and they would actually ban you before Google even became aware that you were spamming, since their indexes are smaller than Google's. Anyway, the point is: if you are totally confident that you're heading in the right direction, and you find all the other search engines ranking you well except for our beloved Google, which is banning you, then obviously they are the ones who should correct their mistake, not you.

Yes, they might have been providing 70% of your traffic, but still, a principle is a principle; it's not the whole world that's wrong with only Google right.

Besides, when you send a reinclusion request and a Google engineer visits your site for a couple of minutes, then leaves your website banned as-is - can't this engineer take a little of his time just to let you know which TOS your site broke? Or is it Google's policy to make us play the guessing game?

On another thread we see Matt Cutts, a very friendly Google engineer, contacting a webmaster who was complaining about his site's penalty and telling him in deeeep detail where the webmaster went wrong. I think Google should enforce this policy and start being a good boy again :)

Peace

runboy
msg:726722
 5:00 am on Aug 15, 2005 (gmt 0)

You aren't banned then - maybe penalized somehow. Banned sites have no cache or indexed pages at all.

site:www.domain.com = no results since 28 jul
link:www.domain.com = no results since 28 jul
cache:www.domain.com = 11 aug cache

That doesn't seem unreasonable, since the googlebots have no way of knowing when or whether a penalty might be lifted. If a penalty were lifted, Google would obviously want to have up-to-date data in its index.

My site still gets spidered a little, but there is a big difference between before and after Jul 28.
It almost looks like it gets 2-3 pages deep before it all of a sudden remembers that I am banned ;o)

tigertom
msg:726723
 11:53 am on Aug 15, 2005 (gmt 0)

Doesn't Google work like this:

1. It sends out spiders.
2. It grabs the page data and saves it.
3. Other software goes through the saved data and decides its value?

No. 3 happens separately, on a hard disk in America somewhere. That's where sites get filtered - the bots don't do it. If you're on a ban list, that's where you're filtered out.

I got my sites back in within a week by completely removing the 'grey' content: pseudo-directory pages with the links running through a redirect script. If the junk is still there when the engineer comes to visit, I imagine you stay banned.
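The crawl-store-evaluate model described in this post can be sketched as a toy pipeline. This is pure illustration - all site names and data below are made up, and nobody outside Google knows how the real system is implemented:

```python
# Toy sketch of the crawl-then-filter model: the spiders fetch and save
# everything; a separate pass decides what is allowed into the results.

def crawl(urls, fetch):
    """Steps 1-2: grab the page data and save it; no ban list is consulted."""
    return {url: fetch(url) for url in urls}

def build_serps(crawled, ban_list):
    """Step 3: a separate pass over the saved data filters banned sites."""
    return {url: page for url, page in crawled.items() if url not in ban_list}

fake_web = {
    "good-site.example": "original directory listings",
    "odp-clone.example": "scraped ODP copy",
}

crawled = crawl(fake_web, fake_web.get)               # both sites get spidered...
serps = build_serps(crawled, {"odp-clone.example"})   # ...but only one is allowed in
print(sorted(serps))
```

On this model, unbanning is cheap: the saved data is already there, so the site reappears as soon as the filter stops excluding it.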

moftary
msg:726724
 1:29 pm on Aug 15, 2005 (gmt 0)

And what benefit does Google get from spidering banned sites? Do they have too many resources?
Also, the penalty is clearly connected to Googlebot activity: before the penalty I had tens of Googlebots spidering my directory; after the ban it's only one or two.

tigertom
msg:726725
 2:02 pm on Aug 15, 2005 (gmt 0)

Just guessing but:

Googlebot can spider banned sites and save the data. At Google HQ, other software filters said data. Banned sites are filtered there; they aren't allowed into the SERPs.

That way Googlebot doesn't have to carry a long 'banned' list - it can follow links to any site, which is much less resource-intensive. And if a site gets unbanned - bingo, they've already got its data, and it's back in immediately.

That doesn't explain, though, why some banned sites don't get spidered; mine didn't. I thought that was normal: you're banned, Googlebot stops coming. Maybe all banned sites are flagged into a 'sin bin' at Google HQ: if a link leads there, don't follow it.

Maybe someone who knows for sure can comment? :)

This is speculation.

PS: You have to make sure it's the regular data-gathering Googlebot that's visiting, and not Googlebot-Image or Mediapartners-Google.
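One way to check that is to look at the user-agent strings in your access logs. A minimal sketch in Python - the log lines below are hypothetical, and the crawler tokens assumed are Googlebot, Googlebot-Image, and Mediapartners-Google:

```python
# Count visits per Google crawler by matching user-agent substrings.
from collections import Counter

GOOGLE_BOTS = ("Mediapartners-Google", "Googlebot-Image", "Googlebot")

def classify(user_agent):
    """Return the first matching bot name, or None for other visitors.
    Order matters: the bare 'Googlebot' token would also match 'Googlebot-Image',
    so the more specific tokens are checked first."""
    for bot in GOOGLE_BOTS:
        if bot in user_agent:
            return bot
    return None

# Hypothetical user-agent strings pulled from an access log.
log_user_agents = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Googlebot-Image/1.0",
    "Mediapartners-Google/2.1",
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
]
counts = Counter(filter(None, map(classify, log_user_agents)))
print(counts)  # only the plain Googlebot entries are the web-search crawler
```

Only the plain Googlebot visits say anything about your standing in web search; the image and AdSense crawlers operate independently.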


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved