


Does Google Ban or Filter Web Directories?

     
1:06 pm on Jul 28, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 21, 2003
posts:427
votes: 0


I think the subject is worth a thread of its own. It's only a suspicion so far. Yet I don't see that dmoz, Yahoo, or any major web directory was banned/filtered or PRed to zero the way my web directory was. I tried checking in Alexa (powered by Google) and I see some results from my site. Apparently Alexa serves old results from Google, but the weird thing is that Alexa itself has PR0 now. But that's another story!

If you run a web directory, feel free to post your experience here.

8:37 pm on Aug 2, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 21, 2003
posts:427
votes: 0


jd01, it's Google's right to ban/filter all sites that contain ODP data. But that would include other websites that use the ODP, like Excite, Lycos, and Alexa, as well as the Google Directory itself. I have said this many times, and I have also said that they should update their TOS and Webmaster Guidelines accordingly.
8:49 pm on Aug 2, 2005 (gmt 0)

Senior Member

joined:Oct 27, 2001
posts:10210
votes: 0


Anyway, if they were a responsible, accountable company, what they would do is test-apply all of the index changes, and then email all of the adversely affected (i.e. banned) people before the change is applied. (They surely have the email addresses listed on your web pages.)

Logistics would make that impossible.

Here's a more practical suggestion: Ask Google to create an e-form that Webmasters could use to request manual quality evaluations of their sites (maybe for a fee) before problems arose with the SERPs. The e-form could have checkboxes for "Duplicate content," "Empty directory categories," "Autogenerated template-based pages," "Scraper pages," and other topics of concern. Although Google wouldn't be able to guarantee a clean bill of health and continued rankings in the SERPs, Webmasters would at least be alerted to potential problems, and they'd have peace of mind if their Google reviews didn't identify any trouble areas.

(I'm assuming that everyone reading this is "white hat" and would have nothing to hide from Google.)

8:54 pm on Aug 2, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 21, 2003
posts:427
votes: 0


europeforvisitors, if you aren't being sarcastic (English is my third language, so forgive me) then it's a great idea.

It can't be implemented, though, for a simple reason: they cannot guarantee that ANY site would remain in their SERPs through their algorithm/policy changes.

8:57 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


jd01

>>I think the real problem here is people have gotten away with reproducing content for so long, they think it is their right to reproduce it, then SEO it, then rank for it.<<

The top of the SERPs I'm dealing with is STILL junk after 6 months of continuous updating, reshuffling, and tweaking. Run a query for "free online advertising" and see the top sites for yourself. Is that something Google engineers should be proud of?

9:00 pm on Aug 2, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Mar 29, 2002
posts:244
votes: 0


I think the real problem here is people have gotten away with reproducing content for so long, they think it is their right to reproduce it, then SEO it, then rank for it.

Right! I have seen dumped ODP directories with AdSense slapped into them to try and earn a buck. That alone is no different from a scraper site. It's like saying: Hey! I'm going to scrape your content and make money from it.

moftary: In relation to your problem, which applies to a few others here:

Google does not have to update their TOS, because the Webmaster Guidelines already say it:

Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

This does not mean dump or scrape content into your site.

Dayo_UK

9:00 pm on Aug 2, 2005 (gmt 0)

Inactive Member
Account Expired

 
 


moftary

EFV is only being sarcastic in his last sentence.

Although it is a good idea from EFV, you will just have companies permanently pushing the borderline and asking Google all the time: is this too much? Google says no, so they push a bit harder. Is this too much? Google says yes, so they launch another site just below the threshold. And obviously Google will want to change aspects all the time.

So it will probably be too hard to implement.

9:11 pm on Aug 2, 2005 (gmt 0)

Senior Member

joined:Oct 27, 2001
posts:10210
votes: 0


europeforvisitors, if you aren't being sarcastic

Well, maybe a little tongue-in-cheek, but not really sarcastic. :-) And while it may or may not be a good idea, I don't think it's likely to happen. To use a term that's often associated with Google, it probably isn't "scalable" (at least not easily).

9:12 pm on Aug 2, 2005 (gmt 0)

Senior Member from KZ 

WebmasterWorld Senior Member lammert is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 10, 2005
posts:2895
votes: 5


Ask Google to create an e-form that Webmasters could use to request manual quality evaluations of their sites (maybe for a fee) before problems arose with the SERPs.

It could be done even more easily. There are HTML and CSS validators on the net, so why not a search engine verifier? I don't think Google will make such a service available--because with a Google-certified verifier it would be too easy for SEOs to reverse-engineer Google's algorithm--but an independent service tuned with knowledge from people on this board could be made very accurate, IMHO.

There are many smart programmers around here; it could be a nice project to set up such a system. Parameters that could be taken into account include "percentage of reciprocal links", "percentage overlap with ODP", "number of content pages vs. number of directory pages", etc. By feeding this system enough examples of both healthy and banned websites, I believe it is possible to tune it to give fairly accurate answers.
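
Purely as an illustration of how such a tuning loop might start (every feature name, field, and weight below is invented for the sketch; none of it reflects anything Google actually measures):

# A minimal site-quality scorer, tuned on labeled examples.
# All features and weights are hypothetical.

def extract_features(site):
    # 'site' is a dict of raw counts assumed to come from your own crawler.
    links = max(site["outbound_links"], 1)
    pages = max(site["total_pages"], 1)
    return {
        "reciprocal_link_pct": site["reciprocal_links"] / links,
        "odp_overlap_pct": site["odp_matching_pages"] / pages,
        "directory_page_pct": site["directory_pages"] / pages,
    }

def risk_score(features, weights):
    # Weighted sum; higher means "looks more like the banned set".
    return sum(weights[name] * value for name, value in features.items())

def tune_threshold(healthy, banned, weights):
    # Pick the cutoff that best separates the two labeled sets.
    scored = [(risk_score(extract_features(s), weights), 0) for s in healthy]
    scored += [(risk_score(extract_features(s), weights), 1) for s in banned]
    best, best_acc = 0.0, 0.0
    for cutoff, _ in scored:
        acc = sum((sc >= cutoff) == bool(lbl) for sc, lbl in scored) / len(scored)
        if acc > best_acc:
            best, best_acc = cutoff, acc
    return best

Feed it a few hundred sites of each kind and the threshold should settle on something usable. It's a crude linear classifier, but it's the obvious first cut.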

9:25 pm on Aug 2, 2005 (gmt 0)

New User

10+ Year Member

joined:June 16, 2005
posts:18
votes: 0


Perhaps a solution would be to have an ODP dump but only for registered users. Googlebot wouldn't have access to it and there would be no problem.
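
For the crawler side you wouldn't even need the registration wall; a robots.txt rule would keep Googlebot out of the dump (the /directory/ path here is just a made-up example):

User-agent: Googlebot
Disallow: /directory/

Registered-users-only access would then just be belt and braces for bots that ignore robots.txt.
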
9:33 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 16, 2003
posts:992
votes: 0


Create a useful, information-rich site, and write pages that clearly and accurately describe your content.


This does not mean dump or scrape content into your site.

Maybe it also means, don't create a directory. By definition a directory is just a list of links, with brief descriptions. In many ways this is in direct competition with Google for the eyeballs of searchers. Google has an incentive to use the results of unique directories to order its results, but less of an incentive to display those directory pages in its search results.

More importantly, directories do not generally present "natural" links, which is to say links in the context of articles, blogs or other writings. Here's a theory: The greater the quantity of links, the less likely it is that these were compiled by an expert in the subject. Perhaps this is Google's thinking behind this latest move?

9:55 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3444
votes: 1


It's about time those directories were banned from Google. Who wants to search Google, click a result, and then have to search again on another site? I want a result, not another directory/scraper/SE.
10:15 pm on Aug 2, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Oct 26, 2004
posts:319
votes: 0


You got in before me, Oh mighty Zeus! :)

Google is a _search engine_. It doesn't need to list directories; they're the same thing it provides, at one remove. It has a billion pages; losing a few million pages which point to other pages (which it already indexes) is no problem. To Google.

The ODP, and Google's copy of it, is a useful resource to them: a list of sites a human being has given the 'once-over'. It saves them paying someone to do it.

A. Webmaster's copy of same is a waste of their hard-disk space.

A lot of people on WebmasterWorld have complained about scrapers and pseudo or 'me-too' directory sites clogging up the SERPs; is it the case that a different set of people are complaining now that these have been booted?

I got banned too. I know why. I'll never put up substandard pages again. No more link pages of semi-relevant 'me too' sites either. Probably no reciprocal links pages either.

Do a re-inclusion request if you can, hand on heart, say yours is a great site, or even a good one, offering unique content.

If you can't, please don't gripe here about what Google should do, or how bad you think they are. I doubt they're listening, and it means I have to scroll through it all to find the juicy tips WebmasterWorld is famous for.

Did I really read one poster suggesting Google set up a form where one could ask 'Is my auto-generated site spam?'

[giggles insanely]

I want to read tips on what _we_ should do to get back in the index.

10:16 pm on Aug 2, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 9, 2005
posts:17
votes: 0


"Logistics would make that impossible."

That's not true at all. The people can easily be taken out of the equation; it's just a diff program run against the two indices plus an email notice (see the sketch after this list). The point is twofold:

1. Everyone with a legitimate site gets some warning, meaning that objections come in at a slower rate rather than all at once (i.e., people know what is going on).
2. With a simple voting script included on site owners' pages, they can get immediate feedback from distinct IPs about the quality of a site being removed, so it potentially is never removed; and if it is, the owners have still had reasonable warning to object.
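
A minimal sketch of that diff-and-notify idea (the snapshot files, the contact lookup, and the mail server below are all hypothetical stand-ins, not anything Google exposes):

import smtplib
from email.message import EmailMessage

def load_index(path):
    # Assumed format: one domain per line in each index snapshot.
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def notify_dropped(old_snapshot, new_snapshot, contact_for, smtp_host="localhost"):
    dropped = load_index(old_snapshot) - load_index(new_snapshot)  # the "diff"
    with smtplib.SMTP(smtp_host) as smtp:
        for domain in sorted(dropped):
            addr = contact_for(domain)  # assumed lookup, e.g. whois or an on-page address
            if not addr:
                continue
            msg = EmailMessage()
            msg["From"] = "index-notices@example.com"
            msg["To"] = addr
            msg["Subject"] = f"{domain} is scheduled to be dropped from the index"
            msg.set_content("Your site will be removed in the next update. Reply to object.")
            smtp.send_message(msg)

At a few million dropped domains per update the mail volume is real, but it's batch work, not rocket science.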

I'm sick of you Google supporters who are so willing to take their side on account of "being in their shoes". There's nothing special about what they do, the way they do it, or the size of their index. The only thing they have is a bunch of webmasters willing to throw their own hard work out the window rather than stand up and object. They are not out to help you, or anyone else for that matter. Google is out to help themselves. They make the index so they can maintain control of the information-retrieval channel, not to help you or me make the web a better place, or to help us find things. It's a sad day when you are willing to give up your own or your neighbor's work out of deference to an entity that does not have your interests at heart.

Maybe this will put it in perspective: Google's market cap is $80+ billion.

10:22 pm on Aug 2, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 9, 2005
posts:17
votes: 0


The problem is that people here are both making money off of Google and trying to be listed in its results, which creates an obvious conflict of interest.

It's obvious that they are trying to pander to the almighty Google for re-inclusion. I thought this thread was for legitimate discussion of non-scraper, non-SEO'd sites that do normal business but were arbitrarily banned by mistake (i.e., for being web directories).

10:30 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 28, 2002
posts:3444
votes: 1


tigertom - It's nice to see some are still loyal to me down on Earth. :)
10:47 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 12, 2004
posts:1355
votes: 0


Hey, good discussion...

It's funny to hear people say that the reason their site should not be banned is that it had never been banned before. (Not a good reason.) I hate to admit it, but after re-reading the re-inclusion request I sent days ago, that seems to be my best reason as well.

Can anyone tell me if they've even gotten a response from their re-inclusion request? (Not the auto-response, a human response.)

10:54 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 9, 2005
posts:1509
votes: 0


Some of us are not necessarily Google supporters. If you think what they are doing is wrong, then

User-agent: Googlebot
Disallow: /

in your robots.txt is perfectly appropriate.

A little reality - we do not write the rules, so we have to work to find out exactly what they are and then play by them...

It's a sad day when you are willing to give up your own or your neighbor's work out of deference to an entity that does not have your interests at heart.

I guess perspective and the context of posts are important here - I do not consider publishing a DMOZ dump hard work, nor do I believe scraping is very tough.

Would those who are complaining about being lost in the Google shuffle please add a link (right at the top is fine) on every one of your pages to my site(s)? No?

Why not?

But, my site deserves to be listed on your site...
My site is better than all the rest of the ones you have links to now...
How will I continue to make my living if you don't do it?
You are not excluding my hard work from your directory are you?
Why would you do that?

Justin

11:11 pm on Aug 2, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 9, 2005
posts:17
votes: 0


I don't think anyone considers publishing a DMOZ dump or a scraper site hard work. I'm just speaking for the people with legit sites, who don't do anything wrong, who were hit. There is no reason to assume that because Google removed you, you were doing something wrong. That puts too much power in their hands -- the power to decide right and wrong, and to set internet standards. But the internet was around long before Google, and will be around long after. As their results, their quality, and the way they treat businesses decline, that end is likely to come sooner rather than later.
11:28 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 9, 2003
posts:735
votes: 0


These circular arguments are getting old. My directory had 315 category pages, all created by hand, by me. There were no empty categories, and no data was drawn from any other site.

Google removed my PageRank 6 domain, without explanation or warning, and now refuses to respond to inquiries concerning it.

11:31 pm on Aug 2, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 9, 2005
posts:17
votes: 0


J_O
Yeah, no doubt - no new information added to the discussion. Can you sticky me the name of your domain? (If you want.)
11:35 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 9, 2005
posts:1509
votes: 0


My last post here:

without explanation or warning, and now refuses to respond to inquiries concerning it.

Why should they? Because they are big? Because it's the right thing to do? Why exactly should they give a reason to anyone?

If they do - bonus!

If I do not include a site in my links list, directory, or anywhere else after a request, I do not feel an obligation to respond with why I did not add them. Do you? Do you tell every single site owner who requests to be added to your directory but is not, exactly why they were not added and what to do to get added?

There was no reply to my request to be added to the sites that were dumped - why not? What do I have to do to be added to your directory? I asked, so I should at least get a response, right? People owe that to me, don't they?

Justin

11:43 pm on Aug 2, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 9, 2003
posts:735
votes: 0


I guess they could correspond with people for the following reason: they're a global company attempting to interact with the public. Although I'm a webmaster, I'm also a surfer, an investor, etc., who deals with Google on more than one level.

Expecting some sort of professionalism is not asking too much. If they banned your site, they know why, and they could respond with a simple "Yes, your site is banned, because...". This would give mistakenly exterminated site owners a chance to get back in.

1:28 am on Aug 3, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 23, 2004
posts:88
votes: 0


Did anyone with a legit white-hat site that got banned have LESS than 1000 link partners (either reciprocal or not)?
1:51 am on Aug 3, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Feb 4, 2005
posts:231
votes: 0


Buddha - yes, I had LESS THAN 1000 reciprocal links and was banned, but then was fortunate enough to be reinstated in Google's SERPs with no detectable difference. It's almost as if I was never deleted, except for the 2 days of subpar AdSense earnings.
4:06 am on Aug 3, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2004
posts:58
votes: 0


I don't know why everyone thinks this is a directory-related issue (ODP or otherwise). I only got one site banned, and it's not a directory at all. It's a small, original-content-only website about goldfish. I wrote everything in it. No directory, no AWS, nothing to inflate indexed page numbers. A simple original-content site of about 20 pages explaining the basics of good goldfish care. I do have sites with directories and even sites with ODP clones. None of them were banned, only this small content site. Go figure. I did apply for re-inclusion, and I'm confident that the site will be re-indexed soon.
5:43 am on Aug 3, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 7, 2005
posts:79
votes: 0


I had less than 1000 link partners. I did have a link exchange page, but then again, who doesn't? I only had 25 or so listings on it though, and all were pertinent to my topic.

I did have about 50 pages with syndicated articles that I put AdSense on, but other than that, I had 400 pages of original content. No ODP directory either.

From what I can tell, there is no rhyme or reason to the ban. I have several sites, and the biggest and best was the one that was banned. It was also the only one that brought in any decent AdSense revenue whatsoever. That was my suspicion, but from what I read here, not everyone had AdSense?

6:01 am on Aug 3, 2005 (gmt 0)

Full Member

10+ Year Member

joined:May 4, 2005
posts:306
votes: 0


I randomly selected websites in

[dmoz.org...]

and I found that 2 out of 10 are not in Google.
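
For anyone who wants to repeat that kind of spot check, a rough sketch (the category file is assumed to hold one URL per line, and is_indexed is a placeholder - Google offered no public API for this, so the actual test is up to you):

import random

def is_indexed(url):
    # Placeholder check: here we just ask the operator, e.g. after running
    # a site: query by hand. Replace with whatever test you trust.
    return input(f"Is {url} in Google? [y/n] ").strip().lower() == "y"

def spot_check(category_file, sample_size=10):
    with open(category_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    sample = random.sample(urls, min(sample_size, len(urls)))
    missing = [u for u in sample if not is_indexed(u)]
    print(f"{len(missing)} of {len(sample)} sampled URLs not in the index: {missing}")

Ten URLs is a tiny sample, of course; 2 out of 10 could easily be noise.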

6:35 am on Aug 3, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 4, 2004
posts:801
votes: 0


"I'm not posting to this thread any more. If people need a hand delivered personalized letter informing them how to build a website...."

Oh, man, you're making good sense contractor, don't blame you though, gets annoying I know. But good posts anyway.

6:36 am on Aug 3, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 7, 2005
posts:79
votes: 0


That is very interesting... If the 2 out of 10 sites that are no longer in Google are in the ODP, they should be in Google's directory clone of the ODP data, and thus indexed in the engine itself.

Hmmm... it also seems very hypocritical to me that Google:
1) Uses duplicate content (the ODP)
2) Has duplicate sites with duplicate content (the numerous data centers, IP addresses, and country sites)
3) Scrapes pages for its own content (what do you think SERPs are - nothing but scraped pages!)
4) Duplicates and copies content via the news.google site and News search page
5) Saves and copies other sites' original content on its own servers (cached pages) without permission

Not that any of that matters; they have the power to do what they want. A case of "do as I say, not as I do".

I don't have as much of a problem with algo changes, but by completely removing sites, they are taking away the democracy of allowing sites a chance to compete, and denying end users the chance to decide for themselves which sites they want to visit and whether they are relevant. And in trying to decide which sites to censor, they took out many good sites in what is being referred to as "collateral damage".

Google used to say that it used democracy via links (among other things) to decide which sites were important. Now there is no way they can say that, as sites are being completely censored without reason - a very undemocratic process.

On another note, I heard a rumor of at least one TV news outlet reporting on Google censoring sites, and I believe they said Google "wouldn't return our calls for comment". It was probably the site of one of the reporter's family members, but they mentioned the site name on the news, and it definitely was not a scraper.

6:57 am on Aug 3, 2005 (gmt 0)

Senior Member

joined:Oct 27, 2001
posts:10210
votes: 0


I don't have as much of a problem with algo changes, but by completely removing sites, they are taking away the democracy of allowing sites a chance to compete, and denying end users the chance to decide for themselves which sites they want to visit and whether they are relevant.

But it's the search engine's job to show users which pages are relevant to a query. Users want search engines to filter irrelevant or low-quality pages.

And in trying to decide which sites to censor, they took out many good sites in what is being referred to as "collateral damage".

I've been a victim of collateral damage myself, so I can sympathize with people whose pages or sites may have been harmed by algorithmic hiccups or overzealous filters. But equating editorial opinion (which is what search rankings are) with "censorship" is just plain silly, and collateral damage--which inevitably occurs with every update--is likely to get fixed, at least in cases where the affected content meets Google's criteria for what should be indexed.

This 588-message thread spans 20 pages.