Forum Moderators: Robert Charlton & goodroi
If we are doing something wrong we would obviously appreciate any clue as to what that thing is. How big a deal could it be to tell us?
It gets worse. Fixing the problem and requesting reinclusion does not result in immediate reinstatement. I understand that reinstatement occurs after the expiration of some "punishment period". The length of the punishment period is a secret and apparently varies depending on the alleged "offense". A "fix something and see if that worked" approach could therefore take years, so legitimate web site owners are forced into a "shotgun" approach in which they make multiple, expensive changes in the often futile hope that one of them will eventually get them reinstated.
Spammers have none of these problems. Domain names are cheap. A spammer can serve up the same (or nearly the same) information under 20 different domain names and get 20 times the exposure of a legitimate site. If Google eventually finds and bans some of these domains, more are easily added. Spammers don't have to worry about registered trademarks, brand recognition, business cards, print ads, etc. Google policy harasses legitimate site owners and has no effect on hard-core spammers, who are laughing all the way to the bank. Google should disclose specific reasons why a site has been delisted and discontinue the childish punishment period. Google's policy of treating legitimate web site owners as the enemy is creating spammers, not helping with the spam problem.
I have noticed that the quality of Google searches has been declining and there are ever more garbage sites popping up in high ranking results. Google seems to be gradually losing the spam war. I suspect they are putting most of their effort into diversifying into email, maps, video, etc.
Questioning their motives is just trash talk; it doesn't contribute to the discussion. And expecting them to manually check millions of sites just isn't realistic: Google is a spidered search engine that was built on algorithms, not a Yahoo Directory or a DMOZ. Trying to impose a new search model on a company that is committed to scalable algorithms isn't likely to be a productive use of anyone's time.
Trash talk eh? You are of course entitled to your opinion but it is only your opinion. I think there are others who would agree that it does contribute to the discussion. I am not alone in my views on this as has been demonstrated whenever the subject has been raised over the last year or two.
With regard to Google's commitment, Polaroid was committed to instant film while all around them people were developing digital cameras. Look where they are now. Burroughs were committed to mechanical adding machines while all around them people were developing pocket calculators. Google started off as a non-profit company, but things change, and those who are unwilling to change always fail.
And I am not trying to impose anything on anyone. Google's scalable algorithm is mainly designed to determine the order of the SERPs. I am not suggesting that this should change. What I am suggesting is that the quality of their results could be dramatically improved by manually removing scrapers and offending sites. I don't think any right-minded person can dispute this. Their algorithm may have been right six or seven years ago, but the Internet is a vastly different place nowadays.
Just watch the first effective, new search engine that does this take off. A hybrid search engine with no scrapers or offensive material that uses a combination of algorithms and manual editing? Count me in!
What I am suggesting is that the quality of their results could be dramatically improved by manually removing scrapers and offending sites.
Aren't they already doing that to some degree? They just aren't trying to implement it on a large scale.
For the benefit of further discussion of the subject, I'm recalling (as usual ;-) ) a relevant post from our fellow member GoogleGuy.
[webmasterworld.com...]
==================================================
GoogleGuy
Senior Member
joined-Oct 8, 2001
posts:2829
msg #:43 9:22 am on June 2, 2005 (utc 0)
inbound, we're pretty allergic to the suggestion when people suggest that we take money for reviewing sites (or reviewing sites for reinclusion). Our skin starts to break out and my shoulder feels all itchy. Several very smart people have suggested it, and I'm not saying we'd never do it, but the whole notion of pay-for-inclusion or pay-for-review or pay-for-rereview-after-you've-caught-me-spamming isn't something that I foresee us doing anytime soon. It's a slippery slope that could easily lead to a conflict of interest.
========================
Enjoy ;-)
Aren't they already doing that to some degree? They just aren't trying to implement it on a large scale.
Exactly! My point is that they should do it on a large scale, and I know what GoogleGuy said about this. I think the cryptic phrase, "I'm not saying we'd never do it", tells its own story.
It must at least have been discussed at a level so there is still hope for those of us who would like to see the SERPs effectively cleaned up.
There is a blindingly simple solution to this.
All Google needs to do is introduce some kind of 'premium membership' in return for a fee. They could then use the money this raises to finance a team of support staff who would promptly 'hand review' individual websites.
Anyone with a serious online business would be happy to pay this - spammers would be happy to pay too - but their sites would not stand up to a manual review.
Google could then crank up the spam filters in their algo, knowing that serious sites will not be hit. This would also solve the sandbox issue - premium members would have their sites added to the index immediately.
In my opinion, most of Google's spam problems are caused by their obsession with automating everything in their algo - if you do that, you will always throw out some good sites with the bad.
The best possible SERPS would come from a combination of automation and manual hand editing.
* Spammers would find life a lot harder
* Important sites would find life a lot easier
* Surfers would see much better SERPS
* Google would earn extra income from premium membership
* Google would earn a LOT of extra income from having a far cleaner, more user friendly search engine
* Some unemployed people would get jobs
Everyone wins!
ALL of this is true until you bring in human corruption. What you are saying is that we should let people do what machines make mistakes at doing. I hate to tell you this, but with online business worth so much in the real world, there would be corruption on a greater, not lesser, scale here.
Business as an entity is just about legal, and immoral. You show me a 100% legitimate, moral, and honest multimillion-turnover business and I will show you what granny missed out of that great book, "Sucking Eggs: The Dos and Don'ts".
I hate to tell you this, but with online business worth so much in the real world, there would be corruption on a greater, not lesser, scale here.
Sorry, but it is crazy to suggest that this proposal is not viable because of this. We already have a manual editing operation at Google that works despite any threat of corruption. It's called AdWords! Corrupt people in AdWords admin could ensure that certain businesses get unlimited exposure, but this does not prevent AdWords from functioning.
Business as an entity is just about legal, and immoral. You show me a 100% legitimate, moral, and honest multimillion-turnover business and I will show you what granny missed out of that great book, "Sucking Eggs: The Dos and Don'ts".
What's your point, caller?
I see a few people saying this only helps commercial sites, and that not-for-profit info sites will lose out.
I think the real problem is that users would lose out, because Google would then be displaying search results for a limited pool of sites and pages.
EFV, I don't understand? Are you saying that Google is better for having the scrapers and spammers?
No, I'm saying the index would be worse if only registered sites (whether paid or unpaid) were included. Why? Because some high-quality information sites wouldn't be aware of the registration program or wouldn't be concerned enough about traffic to register.
I would have thought that this was extremely unlikely.
I think it would be extremely likely.
Also, it's important to remember that spam is a big problem for some keywords or keyphrases but not for others. It wouldn't make sense to gut the index of useful information results for thousands or millions of topics just so that spam results were being minimized in searches for home mortgages, anti-impotence drugs, and hotels.
No, I'm saying the index would be worse if only registered sites (whether paid or unpaid) were included.
EFV, I don't know if I've understood the pay-for-review issue being discussed here. But I thought it was only a review by Google, for webmasters who think their sites are not indexed or ranked the way they should be. Payment would not be a guarantee of inclusion or rankings, but would buy a more reliable, human site-feedback service.
I also think that reviews would make life easier for SEOs (both "black hat" and "grey hat") by letting them know what they could or couldn't get away with. In effect, Google would be helping SEO clients gain an advantage over non-SEO'd sites. I just can't see Google wanting to do that.
Finally, algorithms and filters are in a constant state of evolution, and what might get a reviewer's blessing today could easily run afoul of a search-engine tweak next month or the month after. So offering reviews might be a bad idea--not just on practical grounds, but also in terms of legal exposure.
EFV, I think you mischaracterize the function of rules. If a player knows the rules, and acts in such a way as to receive no penalty and "get away with" a particular behavior, then that is an example of the player following the rules -- not breaking them.
Not a fan of manual reviews. However, a lack of transparency is never a good thing. When all players know the rules of the game, it is more likely that qualified players will excel. Simple as that.
Not a fan of manual reviews. However, a lack of transparency is never a good thing. When all players know the rules of the game, it is more likely that qualified players will excel. Simple as that.
The thing is, there's no reason to believe that Google wants to help SEOs "follow the rules." Unless I'm greatly mistaken, Google would prefer organic sites, and manual reviews would encourage and favor inorganic sites.
Most websites use one technique or another to signal to search engines and users what their pages are about.
If Google doesn't want to deal with that, maybe they should just stay home and bake cookies.
I think you are right there. But of course any site of any size will eventually start thinking about how to get the maximum number of visitors, and will want to make sure their site is set up so it is easy for search engines to find, given they all have their quirks.
Indeed, the new filters and whatnot that everyone goes on about are most likely to make people who don't pay much attention to such things start doing so. For instance, during this big flux just now (for us), I am seeing sites that link to, or even pinch, our copy ranking above us. If it lasts more than a month I will be setting up some separate sites to aggregate my own content. Yet up until lately I had not even thought about SEO or other 'tricks'.
Personally, I don't want, say, a link unless I am going to get traffic directly from it, or a title that matches the headline of the story because it looks better in the browser...
I agree on the problems with reviewing sites... a bottomless pit of maintenance, I would think. Pay for review, get accepted, then the site gets made all dodgy... when do you re-review, etc.?
I also have to say Google are unhelpful at times, very opaque...though perhaps they need to be to avoid abuse.
But for the moment I will wait for them to fix their side of things rather than changing anything on our site or launching a dozen spinoffs for no reason other than to get traffic back via a circular route.
Most websites use one technique or another to signal to search engines and users what their pages are about.
Sure, like the basic common-sense techniques that are covered in Google's Webmaster Guidelines. ("Make a site with a clear hierarchy and text links," "Make sure that your TITLE and ALT tags are descriptive and accurate," and so on.) But I don't think those are the techniques we're talking about here. What SEOs obviously want is for Google to tell them if they're going overboard on keyword density, if their crosslinking schemes are out of bounds, if they're pushing Google's limits of acceptability for duplicate content, etc.
It's hard to see why Google would want to do that, since the result would be to give a leg up in the SERPs to paying SEOs and commercial site owners at the expense of natural or organic results. (And what would be the next step? Google Search Submit Express?)
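For what it's worth, checks at that basic, guideline level are simple enough for any site owner to automate. Here is a minimal sketch in Python (standard library only; the sample HTML and the two specific checks are my own illustration, not anything published by Google) that flags an empty TITLE and images missing ALT text:

```python
from html.parser import HTMLParser

class GuidelineChecker(HTMLParser):
    """Collects the <title> text and counts <img> tags lacking alt text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img":
            # attrs is a list of (name, value) pairs; value may be None
            alt = (dict(attrs).get("alt") or "").strip()
            if not alt:
                self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_page(html):
    """Return a list of guideline-level problems found in an HTML page."""
    checker = GuidelineChecker()
    checker.feed(html)
    problems = []
    if not checker.title.strip():
        problems.append("missing or empty <title>")
    if checker.images_missing_alt:
        problems.append(f"{checker.images_missing_alt} image(s) without alt text")
    return problems

sample = '<html><head><title></title></head><body><img src="logo.gif"></body></html>'
print(check_page(sample))
# → ['missing or empty <title>', '1 image(s) without alt text']
```

Of course, this is exactly the point being made above: the mechanical basics are easy; it's the judgment calls (keyword density, crosslinking, duplicate content) that can't be settled by a script.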
EFV, once again, you mischaracterize the function of rules. They are a bright line. You can not push them if they are enforced by the entity issuing them.
Those who use every possible method to promote their sites without breaking the rules are by definition following the rules.
There is no way that a "black hat" can use well written and well enforced rules to break the rules.
There is no way that a "black hat" can "push Google's limit of acceptability for duplicate content." Only Google can do that.
Of course, the real problem is that Google is so buggy that sites are dropped for completely arbitrary reasons, some of which are outside the control of the site publisher. So Google can't publish rules, because then it would be obvious that sites which are innocent of breaking any rules have gone supplemental or been dropped from the index entirely.
We obviously disagree, so I won't belabor the point, and I'll let you have the last word. Over to you. :-)
Google should cooperate much better with website owners. Google discloses minimal guidelines on their site. Then they have semi-official people who post on sites like this under (apparently) their real names with more information. Then they have people posting under aliases (maybe for deniability), sort of leaking yet more information. Only the most sophisticated site developers have most of the information. Wouldn't Google and legitimate site owners be better served by posting all this information on the official site? What benefit do they obtain from all the dancing around? Do they think spammers don't read this forum? Maybe it is a legal issue.
Regarding spam: My understanding is that most email spam is sent by fewer than 100 very knowledgeable people. Most of the rest is extremely amateur. The amateur stuff is relatively easy to detect and filter. I suspect web spam represents a similar situation. The real spammers already know all the tricks.
It is not too big a task at all, and there is no reason it could not be not only self-financing but another highly lucrative source of income for Google. Many of us would be willing to pay our $nnn per annum, or whatever it would take, to be included in a "legitimate" index. Many of us whose sites are legit would also be happy to see the index purged of all the spam and scraper sites that are currently included. The algo is not fully effective at this, and it never will be as long as people like those of us in this forum are in the business of manipulating it.
Who can deny that manipulation of human-edited results is much more difficult? I cannot understand how some people cannot accept that the results could only be improved by this. To say that people would not bother to submit authority sites is just plain daft. Authority sites are created by intelligent people for the purpose of disseminating their information. The vast majority of them would do what it takes to get included. They are not stupid.
If the index were a tenth of its current size, but all of the sites that remained were essentially legitimate and good sources of information, how much more effective the whole Internet experience would be. Granted, it would not happen overnight, but all I am proposing is that we start to move towards this.
And there's the point.. many innocent sites are getting penalised due to the strict spam filters - I strongly agree, Google should actually consider doing something to stop this issue.
>>However, it's also true that with less content it's difficult to do well with Google.<<
Exactly! You need to have the searched-for content on a site, quality inbound links, etc., to rank high in the SERPs. Then you have a chance to make money from AdSense or any other PPC or affiliate program.
I think EFV is making a valid point here - I hadn't considered that possibility.
BUT - I repeat what I said earlier - isn't a system that's 99% perfect a lot better than the current system that's only 50% perfect?
I'm not claiming there isn't a loophole somewhere that spammers are going to try to exploit - of course they are going to try - and a few will get through the net, I'm certain.
Sure, there will be a few quality sites not savvy enough to realise that they can get their sites manually reviewed to avoid being filtered so heavily - but when you think of the vast amount of scraper junk that's going to be removed from the index, can anyone seriously suggest that the SERPS would not be vastly better for ordinary users?