Forum Moderators: Robert Charlton & goodroi
If you run a web directory, feel free to post your experience here.
Anyway, if they were a responsible, accountable company, what they would do is test-apply all of the index changes, and then email all of the adversely affected (i.e. banned) people before the change is applied. (They surely have all the emails listed on your webpage.)
Logistics would make that impossible.
Here's a more practical suggestion: Ask Google to create an e-form that Webmasters could use to request manual quality evaluations of their sites (maybe for a fee) before problems arose with the SERPs. The e-form could have checkboxes for "Duplicate content," "Empty directory categories," "Autogenerated template-based pages," "Scraper pages," and other topics of concern. Although Google wouldn't be able to guarantee a clean bill of health and continued rankings in the SERPs, Webmasters would at least be alerted to potential problems, and they'd have peace of mind if their Google reviews didn't identify any trouble areas.
(I'm assuming that everyone reading this is "white hat" and would have nothing to hide from Google.)
>>I think the real problem here is people have gotten away with reproducing content for so long, they think it is their right to reproduce it, then SEO it, then rank for it.<<
The top of the SERPs I'm dealing with is STILL junk after six months of continuous updating, reshuffling, and tweaking. Try running a query for free online advertising and see the top sites for yourself. Is that something Google engineers should be proud of?
I think the real problem here is people have gotten away with reproducing content for so long, they think it is their right to reproduce it, then SEO it, then rank for it.
Right! I have seen ODP-dumped directories with AdSense slapped into them to try and earn a buck. That alone is no different from a scraper site. It is like: Hey! I'm going to scrape your content and make money on it.
moftary: In relation to your problem, which applies to a few others here:
Google does not have to update their TOS because they say it already in Webmaster Guidelines:
Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
This does not mean dump or scrape content into your site.
EFV is only being sarcastic in his last sentence.
Although it is a good idea from EFV, you will just have companies permanently pushing the borderline and asking Google all the time: Is this too much? Google says no, so they push a bit harder. Is this too much? Google says yes, so they launch another site just below the threshold. And obviously Google will want to change aspects all the time.
So it will probably be too hard to implement.
europeforvisitors, if you aren't being sarcastic
Well, maybe a little tongue-in-cheek, but not really sarcastic. :-) And while it may or may not be a good idea, I don't think it's likely to happen. To use a term that's often associated with Google, it probably isn't "scalable" (at least not easily).
Ask Google to create an e-form that Webmasters could use to request manual quality evaluations of their sites (maybe for a fee) before problems arose with the SERPs.
It could be done even easier. There are HTML and CSS verifiers on the net, why not a search engine verifier? I don't think Google will make such a service available--because with a Google certified verifier it would be too easy for SEOers to reverse engineer Google's algorithm--but an independent service tuned with knowledge from people on this board could be made very accurate IMHO.
There are many smart programmers around here. It could be a nice project to set up such a system. Parameters that could be taken into account include "percentage of reciprocal links," "percentage overlap with ODP," "number of content pages vs. number of directory pages," etc. By feeding this system enough of both healthy and banned websites, I believe it is possible to tune it to give fairly accurate answers.
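To make the idea concrete, here is a minimal sketch of such a verifier as a rule-based checker. Every metric name and threshold below is invented for illustration; a real system would tune them against known healthy and banned sites, as suggested above.

```python
# Hypothetical "search engine verifier" sketch. The metric names and
# thresholds are assumptions for illustration only, not Google's rules.

def evaluate_site(metrics):
    """Return a list of warning flags for a site, given simple ratio metrics."""
    flags = []
    # Too many reciprocal links may suggest link-exchange schemes.
    if metrics.get("reciprocal_link_ratio", 0.0) > 0.5:
        flags.append("High percentage of reciprocal links")
    # Heavy overlap with ODP/DMOZ suggests a directory dump.
    if metrics.get("odp_overlap_ratio", 0.0) > 0.3:
        flags.append("Large overlap with ODP/DMOZ content")
    # More directory pages than content pages suggests a thin link site.
    content = metrics.get("content_pages", 0)
    directory = metrics.get("directory_pages", 0)
    if directory and content / directory < 1.0:
        flags.append("More directory pages than content pages")
    return flags

site = {
    "reciprocal_link_ratio": 0.7,
    "odp_overlap_ratio": 0.1,
    "content_pages": 40,
    "directory_pages": 200,
}
print(evaluate_site(site))
```

A tuned version could replace the hand-picked thresholds with weights fitted on the labeled healthy/banned examples the post describes.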
Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
This does not mean dump or scrape content into your site.
Maybe it also means, don't create a directory. By definition a directory is just a list of links, with brief descriptions. In many ways this is in direct competition with Google for the eyeballs of searchers. Google has an incentive to use the results of unique directories to order its results, but less of an incentive to display those directory pages in its search results.
More importantly, directories do not generally present "natural" links, which is to say links in the context of articles, blogs or other writings. Here's a theory: The greater the quantity of links, the less likely it is that these were compiled by an expert in the subject. Perhaps this is Google's thinking behind this latest move?
Google is a _search engine_. It doesn't need to list directories; they're the same thing it provides, at one remove. It has a billion pages; losing a few million pages which point to other pages (which it already indexes) is no problem. To Google.
The ODP, and Google's copy of it, is a useful resource to them: a list of sites a human being has given the 'once-over' to. It saves them paying someone to do it.
A Webmaster's copy of the same is a waste of their hard-disk space.
A lot of people on WebmasterWorld have complained about scrapers and pseudo or 'me-too' directory sites clogging up the SERPs; is it the case that a different set of people are complaining now that these are booted?
I got banned too. I know why. I'll never put up substandard pages again. No more link pages of semi-relevant 'me too' sites either. Probably no reciprocal links pages either.
Do a reinclusion request if you can, hand on heart, say yours is a great site, or even a good one, offering unique content.
If you can't, please don't gripe about what Google should do here, or how bad you think they are. I doubt they're listening. It just means I have to scroll through it to find the juicy tips WebmasterWorld is famous for.
Did I really read one poster suggesting Google set up a form where one could ask 'Is my auto-generated site spam?'
[giggles insanely]
I want to read tips on what _we_ should do to get back in the index.
That's not true at all. The people can easily be taken out of the equation; it's just a diff program against the two indices and an emailer notice. The point is twofold:
1. everyone with legitimate sites gets some warning -- meaning that objections occur at a slower rate, rather than all at once (i.e. people know what is going on).
2. with a simple voting script to be included on website owners' pages, they can get immediate feedback from distinct IPs about the quality of the site being removed -- so it potentially is never removed, and if it is, the owners still have had reasonable warning to object.
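The "diff the two indices" step above can be sketched in a few lines. The data structures and names here are invented for illustration; a real pipeline would run against actual index snapshots and send mail rather than print.

```python
# Hypothetical sketch of "diff two indices and notify the owners."
# Index snapshots are modeled as dicts mapping domain -> contact email;
# the domains and addresses below are made up for illustration.

def sites_dropped(old_index, new_index):
    """Return {domain: email} for sites in the old index but not the new one."""
    return {d: e for d, e in old_index.items() if d not in new_index}

old_index = {
    "example-directory.com": "owner@example-directory.com",
    "goodsite.com": "admin@goodsite.com",
}
new_index = {"goodsite.com": "admin@goodsite.com"}

for domain, email in sites_dropped(old_index, new_index).items():
    # A real system would send an email here; we just print the notice.
    print(f"Notify {email}: {domain} will be removed in the next index update")
```

Whether this is workable at Google's scale is a separate question, but the diff itself is cheap: a single pass over the old index.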
I'm sick of you Google supporters who are so willing to take their side on account of "being in their shoes." There's nothing special about what they do, or the way they do it, or the size of their index. The only thing they have is a bunch of webmasters willing to throw their own hard work out the window rather than stand up and object. They are not out to help you, or anyone else for that matter. Google is out to help themselves. They make the index so they can maintain control of the information retrieval channel, not to help you or me make the web a better place, or help us find things. It's a sad day when you are willing to give up your or your neighbor's work out of deference to an entity that does not have your interests at heart.
Maybe this will put it in perspective: Google's market cap is $80+ billion.
It's obvious that they are trying to pander to the almighty Google for reinclusion. I thought this thread was for legitimate discussion of non-scraper, non-SEO'd sites that do normal business but were arbitrarily banned by mistake (i.e. because of being a web directory).
It's funny to hear people say that the reason their site should not be banned is that it had never been banned before. (Not a good reason.) I hate to admit it, but after re-reading the re-inclusion request I sent days ago, that seems to be my best reason as well.
Can anyone tell me if they've even gotten a response from their re-inclusion request? (Not the auto-response, a human response.)
A little reality: we do not write the rules, so we have to work to find out exactly what they are and then play by them...
Its a sad day when you are willing to give up your or your neighbors work out of deference to an entity that does not have your interests at heart.
I guess perspective and context of posts is important here - I do not consider publishing a DMOZ dump hard work, nor do I believe scraping is very tough.
Would those who are complaining about being lost in the Google shuffle, please add a link (right at the top is fine) on every one of your pages to my site(s)? No?
Why not?
But, my site deserves to be listed on your site...
My site is better than all the rest of the ones you have links to now...
How will I continue to make my living if you don't do it?
You are not excluding my hard work from your directory are you?
Why would you do that?
Justin
Google removed my PageRank 6 domain, without explanation or warning, and now refuses to respond to inquiries concerning it.
without explanation or warning, and now refuses to respond to inquiries concerning it.
Why should they? Because they are big? Because it's the right thing to do? Why exactly should they give any reason to anyone?
If they do Bonus!
If I do not include a site in my links list, directory, or anywhere else after a request I do not feel an obligation to respond with why I did not add them. Do you? Do you tell every single site owner that requests to be added to your directory, but is not, exactly why they were not added and what to do to get added?
There was no reply to my request to be added to the sites that were dumped; why not? What do I have to do to be added to your directory? I asked, so I should at least get a response, right? People owe that to me, don't they?
Justin
Expecting some sort of professionalism is not asking too much. If they banned your site, they know why and could respond with a simple "Yes, your site is banned, because...". This would give mistakenly exterminated site owners a chance to get back in.
I did have about 50 pages with syndicated articles that I put Adsense on, but other than that, I had 400 pages of original content. No ODP directory either.
From what I can tell, there is no rhyme or reason to the ban. I have several sites, and the biggest and the best was the one that was banned. It was also the only one that brought in any decent Adsense revenue whatsoever. That was my suspicion, but from what I read here, not everyone had Adsense?
Hmmmm .... it is also very hypocritical to me that Google:
1) Uses duplicate content (the ODP)
2) Has duplicate sites with duplicate content (the numerous data centers, IP addresses, and country sites)
3) Scrapes pages for its own content (what do you think SERPs are - nothing but scraped pages!)
4) Duplicates and copies content via the news.google site and News search page
5) Saves and copies other sites' original content on its own servers (cached pages) without permission
Not that any of that matters, they have the power to do what they want to do. A case of "Do what I say and not what I do".
I don't have as much of a problem with algo changes, but by completely removing sites, they are taking away the democracy of allowing sites a chance to compete, and denying the end user the chance to decide for themselves what site they want to visit and whether it is relevant. And in trying to decide which sites to censor, they took out many good sites in what is being referred to as "collateral damage".
Google used to believe that they used democracy via links, (among other things), to decide which sites were important. Now, there is no way they can say that - as sites are being completely censored without reason - a very un-democratic process.
On another note, I heard a rumor that at least one TV news outlet reported on Google censoring sites, and said that Google "wouldn't return our calls for comment". It was probably the site of one of the reporter's family members, but they mentioned the site name on the news, and it definitely was not a scraper.
I don't have as much of a problem with algo changes, but by completely removing sites, they are taking away the democracy of allowing sites a chance to compete, and denying the end user the chance to decide for themselves what site they want to visit and whether it is relevant.
But it's the search engine's job to show users which pages are relevant to a query. Users want search engines to filter irrelevant or low-quality pages.
And in trying to decide which sites to censor, they took out many good sites in what is being referred to as "collateral damage".
I've been a victim of collateral damage myself, so I can sympathize with people whose pages or sites may have been harmed by algorithmic hiccups or overzealous filters. But equating editorial opinion (which is what search rankings are) with "censorship" is just plain silly, and collateral damage--which inevitably occurs with every update--is likely to get fixed, at least in cases where the affected content meets Google's criteria for what should be indexed.