| 6:37 pm on Oct 7, 2005 (gmt 0)|
I have no doubt that Google favors spammers - just look at all the duplicate content issues in the G index. It's a free-for-all. Read my post [webmasterworld.com...] #976
| 8:30 pm on Oct 7, 2005 (gmt 0)|
I'm as big a critic of Google as anyone, but I have to disagree. I don't see them favoring spammers in any way, and in fact find that they have been pushing new filters to stop spammers almost too far.
I think your issue pertains more to Google's recent efforts to take out directories they feel hold no value. Was your directory any different from the 10,000 that just sold links? Did you have empty categories?
Directories have been public enemy number one in the eyes of Google for some time now. In their eyes, a directory is a glorified link farm. How many times have you actually used a general directory to find something you were looking for?
As for Google telling you why you were banned, that isn't their responsibility. Why would they want to tell you how to game their system?
| 8:55 pm on Oct 7, 2005 (gmt 0)|
Actually, Google has recently announced a trial of informing legitimate website owners of the reason their site was banned. This seems like a classic case where Google should be doing just that. Google has to realise that they hold immense power and that they have to use it wisely in how they treat website owners. I also disagree that Google favours spammers. In fact, they are so infatuated with fighting spam that innocent sites get caught wrongly and all sites that step out of line are treated the same way. Not all spammers deserve the same heavy-handed treatment.
| 9:00 pm on Oct 7, 2005 (gmt 0)|
>>Google seems to be gradually losing the spam war.<<
It seems so, unfortunately. I guess Matt Cutts' WebSpam team wouldn't be happy to hear that, though ;-)
Having said that, reading GoogleGuy's and Matt's recent posts, it seems that they are really doing what they can to eliminate spam. But it's a very difficult and complex war, and I don't know how many resources have been assigned to that task.
| 10:38 pm on Oct 7, 2005 (gmt 0)|
You have just said in perfect English what I have been saying for months since having the same incident. Unfortunately my English is not good enough to express the situation.
I second what you say, and can give examples showing that this policy "unintentionally" favours spammers.
Ironically, Google will not give you any reasons for delisting your site because that would give spammers "hints" about what could send their sites to jail. A very weird way of thinking for such a major empire. Personally, if I spammed and found that my spammy site was delisted for a stated reason, I would stop using that spammy technique, and day by day I would stop spamming permanently. Conversely, if I didn't spam and found my site banned for an unknown reason, I surely would not waste months playing trial-and-error with Google - and even if I had the time, they don't have the patience to keep responding to my "I have fixed a few things on my site, please take a look now!" requests. Instead I would invest the time creating several spammy sites, for revenue and for revenge as well :)
I don't know whether it's just you and me who are affected by that social-engineering methodology, but then again I'm sure it is certainly not their intention to favour spammers - who, BTW, know much more than legitimate site owners about Google's backdoors, and definitely don't need to wait for a hint from the Google team.
Just my thoughts
| 10:49 pm on Oct 7, 2005 (gmt 0)|
|Google seems to be gradually losing the spam war. |
To be fair to Google, they're facing an exponential growth in spam. I can think of recent posts here by members who have dumped millions of script-generated pages onto the Web, with most of those pages having little or no real content. Google's "do no evil" philosophy may be keeping them from being as aggressive as they need to be to win the war quickly. (If they didn't care about collateral damage, they'd probably be winning more battles.)
|I suspect they are putting most of their effort into diversifying into email, maps, video, etc. |
Is there any evidence that they're pulling engineers from their search team to work on e-mail, maps, video, etc.? If not, how would those other corporate pursuits interfere with their efforts to improve search quality?
| 10:55 pm on Oct 7, 2005 (gmt 0)|
|it seems that they are really doing what they can to eliminate spam |
I believe that they are tightening their algorithms and just adding more spam filters that eliminate many legitimate sites relative to spammy sites. As a ratio, if you get me...
Let's say that I run an email provider company and protect my customers from spammers using SpamAssassin with a low score threshold (that is a tight algorithm). A customer would then receive very little spam in his inbox, while losing some important emails (ham) on the other hand. I don't believe that any customer would put up with losing *any* ham, but they can survive with some spam. I believe that Google would filter your 100 emails, containing 90 spam and 10 ham, as follows: eliminate 10 spam and 5 ham, leaving your inbox with 80 spam and 5 ham - so instead of having 90% spam you would now get roughly 94% :)
*thinking* why not let the Gmail team run the search engine? ^o)
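The threshold tradeoff in that SpamAssassin analogy can be sketched like this (a toy illustration - the scores, threshold values, and inbox mix are invented, not real SpamAssassin defaults):

```python
# Illustrate the spam/ham tradeoff of a score-based filter.
# Each message is a (score, is_spam) pair; anything scoring at or
# above the threshold is discarded. Tightening (lowering) the
# threshold removes more spam but also loses legitimate mail.

def filter_inbox(messages, threshold):
    """Return (spam_kept, ham_kept) after discarding scores >= threshold."""
    kept = [m for m in messages if m[0] < threshold]
    spam_kept = sum(1 for _, is_spam in kept if is_spam)
    return spam_kept, len(kept) - spam_kept

# Hypothetical inbox: 90 spam and 10 ham messages with made-up scores.
inbox = ([(6.0, True)] * 50 + [(4.0, True)] * 40 +
         [(5.5, False)] * 3 + [(2.0, False)] * 7)

print(filter_inbox(inbox, 7.0))  # lenient: (90, 10) - everything survives
print(filter_inbox(inbox, 5.0))  # tighter: (40, 7) - less spam, but ham lost too
print(filter_inbox(inbox, 3.0))  # brutal:  (0, 7)  - no spam, 3 ham gone
```

The point of the analogy survives the toy numbers: the aggressive threshold never loses spam without also losing some ham.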
| 11:03 pm on Oct 7, 2005 (gmt 0)|
|To be fair to Google, they're facing an exponential growth in spam. |
If I were Google I would buy the ODP, dedicate my team to it as editors/reviewers, convert it into a commercial service for a small annual fee, and limit the search engine to its listings. IMHO, a combination of a search engine and directories seems the only way to eliminate spam. Otherwise I would stay as is and have my team respond to banned website owners with clear reasons for the ban.
| 11:06 pm on Oct 7, 2005 (gmt 0)|
I was looking for some technical info today. Many of the top results had zero content except adverts, including adsense. Adsense is encouraging people to take a scatter-gun approach and just create as many pages as possible with no content simply in the hope that people will click on google-ads. So, far from fighting spam, Google is indirectly creating it.
| 11:21 pm on Oct 7, 2005 (gmt 0)|
>>I believe that they are tightening their algorithms and just adding more spam filters that eliminate many legitimate sites relative to spammy sites. As a ratio, if you get me... <<
I guess you are right. Or at least that's what we have seen until now.
However, I think that they are still testing and testing... and testing, and haven't settled yet on the algo they want.
| 11:37 pm on Oct 7, 2005 (gmt 0)|
bears5122: It has occurred to us that Google might be blocking access to our directory merely because it is a directory or for some other "editorial" or "business" (competitive) reason having nothing to do with abuse. Their web site states that they only ban for "abuse" but also says that a site can be delisted for secret reasons and that they can decide that something is "abuse" even if they have never disclosed anywhere that they consider that particular practice to be "abuse".
However, if Google has an editorial policy, shouldn't they say so? Wouldn't it be even better if they told us what it was? People use search engines precisely because they want unfiltered, unbiased, and uncensored access to Internet information, and would generally be unhappy to find that a search engine was handpicking the information they were allowed to see. Secretly doing so is, to me, dishonest.
Our directory does not charge for listings. Unlike DMOZ (Google indexes 14.7 million DMOZ pages), we don't have empty pages or pages with few links. The many people bookmarking it obviously feel it provides value to them. There is no possible way that our pages have less value to the public than the vast majority of the 8.2 billion pages Google continues to index!
I think the argument that disclosing why a site has been banned helps spammers is very weak. By now most spamming techniques are well known. The spammers do not appear to need any help. However, refusing to disclose why a site has been banned could be used to cover up the fact that Google is banning sites for non-abuse reasons. I, for one, would therefore trust Google more if they did disclose why they ban sites.
I am not saying that Google isn't working hard to stop spam. However, their policy regarding how they treat webmasters is not helping and is actually hurting. The non-disclosure and the punishment periods seriously hurt legitimate site owners while not hurting spammers. I have seen many posts by site owners to the effect of "I am tired of trying to deal with Google and trying to figure out by mental telepathy what I'm supposedly doing wrong; I am going to get a new domain name." Voila, witness the birth of a new spammer.
| 11:39 pm on Oct 7, 2005 (gmt 0)|
Sites come and go with all of the search engines, not just Google. That's just the way it is. If you only have one site and are relying on free traffic to make an income from that site, then perhaps you need to reconsider your business model instead of blaming all of your problems on Google.
You can have multiple sites without being a spammer. Having multiple sites helps to spread your risk from algo changes. If you only want to have one site, then it is a good idea to make that site profitable without relying on free search engine traffic, and just treat any free traffic as a bonus that could disappear tomorrow.
Here is one of my favorite posts on the Subject from Brett: msg #92
| 12:09 am on Oct 8, 2005 (gmt 0)|
As usual I agree with Jane_Doe.
I comment from time to time (not always) when I see things I believe the SE's could be doing better. But hey, the SE's are trying to cope with a tidal wave of spam.
Most SEO's I speak with these days say it's harder than ever to get spam pages into the G SERP's. Y too, for that matter. Some of Y's filters are brutal.
The SE's have an incredibly hard job, and it gets harder all the time as spam gets more sophisticated and the number of Web pages grows geometrically.
My issue with G right now is the amount of collateral damage they're accepting. My issue with Y right now is that they're too often showing the wrong page from a given site for a given search. Both of these problems have to do with how each SE is fighting spam.
I do agree that in the current environment it is becoming increasingly important either to understand SEO or to have someone on your team or on call who does. Too many innocents are being taken out.
But that is quite different from thinking that G or any of the SE's favor spam.
| 12:20 am on Oct 8, 2005 (gmt 0)|
|We made some changes to make the site more search engine friendly such as intentionally delisting (via robots.txt or noindex tags) pages that had little persistent value (news, etc.) and implementing a sitemap. We also attempted to fix anything that might be perceived as "abuse" (such as removing keyword tags) |
Why would news pages have little value? In my experience, news pages linked from the homepage rank extremely well and then decline over time. Of course, it's my site's news, written by me.
What is "keyword tags"? If you mean keywords metatags, they do no harm if used to describe what's on page, and there are believers that they count somewhat in Google even more in other SEs.
Wow, I've learned how to put a quote in a box... wishing a great weekend to all of you boys and gals at WebmasterWorld ;)
| 1:15 am on Oct 8, 2005 (gmt 0)|
Maybe the "tidal wave of spam" would be smaller if it weren't sponsored by Goooooogle.
Kind of hard to empathize with the big G when they are stupid, greedy and/or lazy enough to pay the very people who degrade the G SERPs and poison our legitimate domains.
| 3:04 am on Oct 8, 2005 (gmt 0)|
|I believe that they are tightening their algorithms and just adding more spam filters that eliminate many legitimate sites relative to spammy sites. As a ratio, if you get me... |
"Many"? I think that's probably overstating things. And the e-mail comparison doesn't work, because users will tolerate missing pages in SERPs a lot more than they'll tolerate missing e-mails from friends or business associates. (After all, how many people go beyond the first 10 or 20 results for a search anyway? Whether a formerly top 10 or 20 page has dropped in the results or disappeared altogether is immaterial to the user except under rare circumstances.)
|Maybe the "tidal wave of spam" would be smaller if it weren't sponsored by Goooooogle. Kind of hard to empathize with the big G when they are stupid, greedy and/or lazy enough to pay the very people who degrade the G SERPs and poison our legitimate domains. |
Apples and oranges. Google is a big corporation, and divisions or units of large organizations often work at cross-purposes. (Sometimes they even compete with each other: think VW vs. Skoda vs. Audi, or one Kellogg's breakfast cereal vs. another.) I think we can be confident that the Google Search team isn't responsible in any way for the flood of "made for AdSense" sites, and I'd guess that most Google search engineers dislike AdSense in the same way that Microsoft software programmers used to dislike MSN (and maybe still do).
|I think the argument that disclosing why a site has been banned helps spammers is very weak. By now most spamming techniques are well known. |
Maybe, but why should Google want to make the spammers' job any easier by disclosing its techniques?
|However, refusing to disclose why a site has been banned could be used to cover up the fact that Google is banning sites for non-abuse reasons. |
That's pretty farfetched, and if I were Google, I wouldn't bother responding to paranoia, conspiracy theories, or baseless accusations by disgruntled Webmasters and SEOs. On the other hand, I'd listen to legitimate reinclusion requests, but only if they were phrased politely.
| 4:02 am on Oct 8, 2005 (gmt 0)|
Because Google's one hand doesn't know what the other hand is doing, that makes sponsoring spammers and intellectual property thieves OK? I fail to see how exonerating the "search team" makes the problem either excusable or non-existent.
Ninety-nine percent of scraper sites carry Google ads. If G doesn't have to take responsibility for their own mistakes, who should?
| 6:41 am on Oct 8, 2005 (gmt 0)|
Atticus, you're talking about blame and I'm talking about practical realities. (For what it's worth, I've been unhappy and openly critical of Google's lax approach to AdSense since the beginning.)
| 7:11 am on Oct 8, 2005 (gmt 0)|
Take responsibility how? How does this apply to "favoring"?
Google still "favors" high quality, content rich web resources. That doesn't mean they don't screw up every single day or sometimes smile on the wrong site(s).
| 8:01 am on Oct 8, 2005 (gmt 0)|
There is a blindingly simple solution to this.
All Google needs to do is introduce some kind of 'premium membership' in return for a fee. They could then use the money this raises to finance a team of support staff who would manually 'hand review' individual websites promptly.
Anyone with a serious online business would be happy to pay this - spammers would be happy to pay too - but their sites would not stand up to a manual review.
Google could then crank up the spam filters in their algo - knowing that serious sites will not be hit - this would also solve the sandbox issue - premium members would have their sites added to the index immediately.
In my opinion, most of Google's spam problems are caused by their obsession with automating everything in their algo - if you do that, you will always throw out some good sites with the bad.
The best possible SERPS would come from a combination of automation and manual hand editing.
* Spammers would find life a lot harder
* Important sites would find life a lot easier
* Surfers would see much better SERPS
* Google would earn extra income from premium membership
* Google would earn a LOT of extra income from having a far cleaner, more user friendly search engine
* Some unemployed people would get jobs
| 8:32 am on Oct 8, 2005 (gmt 0)|
Well said. That's just what I think too.
| 8:34 am on Oct 8, 2005 (gmt 0)|
I'm glad Google has gotten smarter on catching directories.
Directories make the Web appear as a huge dump of useless resources.
I include DMOZ and all of its "licensed" copies in the same category of wasted bandwidth.
| 8:52 am on Oct 8, 2005 (gmt 0)|
Something else ...
Directories were relevant in the early days of the Web when CPU processing power, storage capacity, and modem speed were limited.
Remember 1994? 90-100 MHz processors, 722 MB-850 MB hard drives, and 90 percent of people connected at 14.4 Kbps.
So there was a place and a time for a few good fellas cataloging the Web for the rest of us.
Today, I'd rather put my trust in the power of technology than in those good ole fellas.
| 8:55 am on Oct 8, 2005 (gmt 0)|
So Altair, it's time to put that old machine to rest. :)
| 9:43 am on Oct 8, 2005 (gmt 0)|
If any search engine with decent existing traffic were able to implement a site rating system, à la the product review websites, to sit alongside and complement the algorithm-based search, it would clean up.
Let users register and let them vote, perhaps once per week, on how they rate a site for a particular SERP.
Possibly open to abuse but then, as we are seeing, so is Google right now.
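A minimal sketch of how such once-per-week voting might be tracked (the class name, storage layout, and scoring rule are all made up for illustration - this is one possible design, not a description of any real system):

```python
import datetime

# Allow one vote per registered user, per (site, query) pair, per ISO week.
class SerpVoting:
    def __init__(self):
        self.last_vote = {}   # (user, site, query) -> (ISO year, ISO week) of last vote
        self.scores = {}      # (site, query) -> running score

    def vote(self, user, site, query, rating, today=None):
        """Record a rating; reject a second vote in the same ISO week."""
        today = today or datetime.date.today()
        week = tuple(today.isocalendar()[:2])   # (ISO year, ISO week number)
        key = (user, site, query)
        if self.last_vote.get(key) == week:
            return False                        # already voted this week
        self.last_vote[key] = week
        self.scores[(site, query)] = self.scores.get((site, query), 0) + rating
        return True

v = SerpVoting()
d = datetime.date(2005, 10, 8)
print(v.vote("alice", "example.com", "widgets", +1, d))  # True: first vote counts
print(v.vote("alice", "example.com", "widgets", +1, d))  # False: same user, same week
```

Keying on the (user, site, query) triple is what enforces "once per week per SERP"; it also shows why the abuse concern is real - one person with many registered accounts gets many keys.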
| 10:47 am on Oct 8, 2005 (gmt 0)|
|All google needs to do is introduce some kind of 'premium membership' in return for a fee. They would then be able to use the money that this raises to finance a team of support staff that would manually 'hand review' individual websites promptly. |
The problem with a premium membership system is that Google would not be able to keep up with demand. In addition, those whose sites remained banned would be very unhappy.
Nevertheless, there is merit and logic in this idea and I would not be surprised if Google implemented something along these lines.
| 1:37 pm on Oct 8, 2005 (gmt 0)|
|Let users register and let them vote, perhaps once per week, on how they rate a site for a particular serp. |
Google already implemented this feature a long time ago as the Vote buttons in the toolbar. But then again, manual voting can be abused in the same way that SEOs now abuse the holes in Google's ranking algorithms.
Going back to the original post, I doubt that Google is favoring spammers at the moment. For a long time I have followed a specific keyword search in the SERPs which returns fewer than 1000 results. This low number of matching pages makes it possible to see all the results, so it gives a good impression of the way Google fights spam. Until a year ago, about 50% of the listed pages were spam. At this moment the share of spam pages has been reduced to less than 15%, which is mainly accounted for by one group of spam sites. Most of these spam domains stay in the SERPs for a few weeks and are then banned. New domains pop up at the same rate as old ones are deleted.
I must say that not only spam pages are effectively removed by Google for this particular keyword search, but also a number of--less valuable--regular pages, mainly pages with "my links" and forum pages. I would estimate that about 10 to 15% of legitimate pages have been removed from the SERPs as collateral damage for this particular keyword search.
As a searcher I am happy with these figures: the share of spam pages reduced from 50% to 15%, with the collateral damage rather low and mainly hitting unimportant pages. Google's policy is certainly not in favor of spammers, IMHO. However, I understand the pain the OP feels when, because of Google's actions, a domain that is legitimate in his eyes gets banned.
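The bookkeeping behind those figures is easy to reproduce for any fixed keyword sample (a minimal sketch; the counts are the approximate ones quoted above, not exact data):

```python
# Compare the spam ratio of a fixed keyword search at two points in time.
def spam_ratio(spam_results, total_results):
    """Percentage of SERP results judged to be spam."""
    return 100.0 * spam_results / total_results

# Approximate figures from the post: a sample of ~1000 results.
year_ago = spam_ratio(500, 1000)   # about 50% spam a year ago
now = spam_ratio(150, 1000)        # under 15% spam today

print(f"spam a year ago: {year_ago:.0f}%, now: {now:.0f}%")
```

Keeping the keyword (and so the sample) fixed is what makes the two percentages comparable over time.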
| 3:50 pm on Oct 8, 2005 (gmt 0)|
|All google needs to do is introduce some kind of 'premium membership' in return for a fee....Google could then crank up the spam filters in their algo - knowing that serious sites will not be hit - this would also solve the sandbox issue - premium members would have their sites added to the index immediately |
That might be a fine idea if Google were only a commercial index. But I don't think Google is ready to favor e-commerce and affiliate sites over academic, reference, hobby, and other information sites that wouldn't be able to justify paying extra for Business Class perks.
| 5:42 pm on Oct 8, 2005 (gmt 0)|
Jane_Doe: Google never did provide very much traffic to our directory, because the PageRank system naturally doesn't favor directories (I think it was 13 percent of our search engine referrals). This is an interesting tradeoff between what is good for people and what is good for Google ranking. More outgoing links on a directory page (up to a point) are good for people. Fewer outgoing links are better for rank. There are pages on the ODP that have NO outgoing links, and are therefore completely useless, yet rank higher than pages that do have listings.
We did get enough traffic that we do have to make some effort to get it back. We are not blaming "all our problems" on Google. For now, Google is the best search engine out there. I do think that as industry leader, they should act more responsibly.
rytis: News, weather reports, etc. change daily, and therefore having them indexed in a search engine might not provide much benefit to the searcher - the searched-for term might not be there by the time the searcher arrives. News that is more persistent (it happened last month) might be better. I think the sitemap is a great idea, allowing webmasters to define the relative importance of pages without completely delisting them.
We eliminated meta keyword tags on the off chance they were the cause of the problem. This is an example of the types of changes site owners have to make even though they detract from the site and probably won't fix the problem.
lufc1955: My understanding of Google's disclosure plan is that they are going to send emails to SOME site owners that DO NOT inquire, disclosing the reason for a ban but will still refuse to disclose the reason to site owners that DO inquire. (If this is not right, maybe Googleguy or Matt Cutts could set me straight.) This allows them to disclose obvious spamming techniques (invisible text, etc.) while still keeping it secret if some site owner should somehow stumble on some new unknown method for spamming Google (and also keeping it secret if they are in fact banning sites for non-abuse reasons). Even so, I think this plan is a good idea. It just doesn't solve our problem.
europeforvisitors: Regarding "farfetched" and "paranoia". Somebody once said that if you eliminate all the likely explanations, then you have to consider unlikely explanations, no matter how farfetched. Many site owners (including us) think they have eliminated all the more likely possibilities and are still banned.
Google has a perfect right to ban sites for purely arbitrary reasons and to keep these arbitrary reasons secret if they so choose. The New York Times doesn't publish its editorial policy, and Google doesn't have to either. However, if people wanted their information filtered through an editorial filter, they could get much higher quality data from the Times. People use Google because they DON'T want their data filtered and are willing to put up with some level of garbage to get it. It is dishonest for Google to pretend that they have no editorial filter and that their search engine is a mechanical, unbiased device if it is not.
zafile: Our directory specializes in providing local information for small towns. We try to consolidate a lot of links on one page so someone bookmarking that page has one-click access to a lot of information they frequently use. There are a lot of directories that provide a similar "niche" service. Nobody has to use a directory but shouldn't they be able to if they do want to?
Picking and choosing which directories or other sites people are going to be allowed to see is an "editorial policy" or possibly just the individual preference of a Google employee. The trouble with an editorial policy or individual preference is where does it stop? Would it be OK to block access to sites about Barney? About UFOs? About Republicans?
| This 92 message thread spans 4 pages |