Forum Moderators: open
peace
Do you actually believe Google are implementing filters? Do you have any examples? In theory it should work; why do you think it doesn't?
I believe Google is working hard to try to address some long-standing flaws in their algo. They have done, and still do, a decent amount of hand editing, but I think they understand that a spam control system that relies solely on human response simply won't work long term.
I also think that the majority of Webmasters never come close to understanding how enormous the task of policing spam at that scale actually is.
In my little world, the fact that a competitor has beaten me out for the term creative underwater basket weaving may be a life-altering event. But in the big picture, when it comes down to deciding which of the 150 million daily searches should be policed, my little complaint probably isn't worth the man hours involved to investigate and remove the offending content. The amount of time that would have to be allocated to a hand-review approach is better spent trying to prevent the content from showing up in the first place.
In the short term that may mean spam will be able to stick around for a while, but if they are successful at overcoming the issues causing the problem, everything will work itself out down the road.
In the future I see searching the web divided into:
Informational only - .edu, .gov, etc. (less reason to spam)
Everybody else - a combination of algo + $$$$ (no more free ride regarding advertising!). Just speculating....
Imagine the police used tactics similar to GG's - taking notes on how to improve security based on reported crime, but not stopping the reported crime - what would we have? A very criminalised world. Why? Simply because the fact that you might go to prison for committing a crime is the MAIN factor stopping people from crime. You will never stop all crime, but showing your effort may bring it down to a reasonably low level.
And don't tell me GG can't do it - you don't need expensive PhDs to read and check spam reports - it's a routine job.
If GG don't realise the importance of reacting to spam reports, more and more honest webmasters will convert to spammers, the relevance of SERPs will gradually decrease, and GG users will move on to other SEs. We have all seen this happen already to many great SEs (can't resist mentioning AV).
This topic is a great indication that more and more honest webmasters are considering becoming spammers to beat the spammers on GG - because for honest webmasters, GG is not just an important traffic source these days, it's the only significant one.
Fast and Google crawl the same pages with the same spam in them.
If Google is susceptible to that spam, surely that is an algo issue, not a marketshare issue?
And if we do blame the 75% marketshare, only the search results count for a user, right? So why should they accept spammed results just because the search engine has a big market share?
[added. I still love Google]
Imagine the police used tactics similar to GG's - taking notes on how to improve security based on reported crime, but not stopping the reported crime - what would we have? A very criminalised world.
Your comments suggest that Google has completely stopped removing spam by hand. That simply isn't the case. Just because they didn't remove the particular offender you or I may have reported doesn't mean that they don't do it.
It's just a matter of priorities. And it's very similar to how real law enforcement works. Here in Los Angeles County, the police no longer respond to home burglar alarms. The vast majority of them are false alarms, and responding to them consumes a huge amount of man hours each year. Those man hours are better spent focusing on more substantial crimes. But the fact that they don't rush out every time the cat trips the alarm doesn't mean they've stopped investigating burglaries altogether.
Google's hand-editing resources will always be focused on the areas that will have the largest impact on the overall user experience.
An adult site showing up for the word cancer will be seen by several thousand people every day. And there's a good chance that some negative press will develop over a SERP like that.
On the other hand, a guy who collects toys and happens to spam his way to the top for a term like johnny lightning slot cars isn't really having a negative impact on the overall user experience. He's just upsetting the other 9 toy collectors listed below him.
So given the fact that it would take about the same amount of time to research and penalize both sites, which one should get bumped to the top of the list?
If GG don't realise the importance of reacting to spam reports, more and more honest webmasters will convert to spammers, the relevance of SERPs will gradually decrease, and GG users will move on to other SEs.
I agree completely. For the most part, the majority of items listed as violations of Google's quality guidelines are not detected and punished by any automated process. The current level of quality Google enjoys is due in large part to the fact that the majority of webmasters follow the rules.
I think they have a relatively small window of opportunity to prove to the honest webmasters that their honesty will be rewarded, and the cheaters will be punished. Which is exactly why they must back off the hand editing and focus more resources on algorithmic solutions.
If they don't begin making substantial progress on shortening the lifespan of sites that violate their guidelines, more and more of the honest people will adopt the "If you can't beat 'em, join 'em" philosophy, and the quality of SERPs will suffer dramatically.
And don't tell me GG can't do it - you don't need expensive PhDs to read and check spam reports - it's a routine job.
There are over 3 billion pages in the Google index. Assuming even just .001% of those pages get spam reports each month, that would be 30,000 spam reports for Google to sort through monthly. If each report took one hour to process manually, that would mean 30,000 hours of spam work a month.
Assuming people work 160 hours a month, with 25% deducted for vacations, sick days, training and meetings, then that would leave 120 hours each month to process reports. So to process all of the reports by hand would take 250 people each month, not including any time to do anti-spam programming changes. Google has about 500 employees total.
It's simply not realistic to expect each spam report to get handled individually. It probably would not be feasible or cost effective for them to do so.
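The back-of-envelope arithmetic above is easy to check in a few lines. All the inputs here (3 billion pages, a .001% monthly report rate, one hour per report, 25% overhead) are the post's assumptions, not real Google figures:

```python
# Rough staffing estimate for fully manual spam-report review.
# Every input below is an assumption from the post, not a real figure.
indexed_pages = 3_000_000_000
report_rate = 0.00001            # .001% of pages reported per month
hours_per_report = 1

reports_per_month = indexed_pages * report_rate           # 30,000 reports
review_hours = reports_per_month * hours_per_report       # 30,000 hours

hours_per_employee = 160 * 0.75  # 25% lost to vacation, training, meetings
staff_needed = review_hours / hours_per_employee          # 250 people

print(int(reports_per_month), int(review_hours), int(staff_needed))
```

Against a company of roughly 500 employees at the time, half the workforce doing nothing but reading reports makes the point clearly enough, whatever the exact inputs.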
I think google does an unbelievable job at eliminating the type of spam actual users care about ...
[I do a lot of searches at Google, and sure, I see hidden text, redirects, etc. ... BUT I almost always end up at something that is relevant to what I searched for]
That is what would cause actual "users" to leave: not being able to find information on what they were searching for. As a user looking for a book about sand, I shouldn't get caught up in whether I hit
amazon.com
barnesandnoble.com
sand-books.com (an affiliate of amazon)
or
books-about-sand.com (an affiliate of barnes and noble)
Now if I'm selling sand books it is a different story, and I fully support sites that practice these techniques getting bounced ... BUT I don't think your point is valid that users will switch SEs because someone has hidden text on an otherwise relevant page.
My bet is 90%+ of the people filing "spam" complaints are webmasters whose site is below the offender, and not actual users.
What can an honest webmaster realistically expect from Google then? What amount of effort can we expect from them in return for use of the 3 billion web pages that make up their index?
The spam that is affecting my positions would be easy to detect with spam filters. I can't understand why it's still there, perhaps other types of spam are causing more of a problem?
I'd be interested to hear the views of those convinced that spam filters are the answer. Where do the problems lie? How much effort do Google need to put into the solution? Are they likely to do it?
Google is able to have a huge impact on spam, and is able to do so very quickly and very cost effectively. It's a matter of whether they see any commercial value in doing so.
Spam is not about technology, it's about human behaviour, and that is quite easy to change. It's human behaviour that drives a decision to spam in the belief that there will be a competitive advantage. It's human behaviour that drives a decision to keep doing it because there are no penalties being dished out.
When there is a clear and unambiguous message that spammers will be penalised, that same human behaviour will, in most cases, decide to avoid the pain... i.e. stop spamming.
If all Google did was issue a press release to the effect that a campaign of spam removal is to be launched, that would travel through the web world like a brush fire and would trigger an immediate spate of de-spammed sites, especially in the amateur spammer ranks ("I did it because everyone else is doing it"). That alone would remove a sizeable amount of crud.
Then they target just one particular type of spam... hidden links. I'd be amazed if their technology can't already detect them... let the algo loose on just that. Another huge chunk of spam disappears, and there is now no longer any doubt that if you mess with Google, you'll feel the pain.
In two pretty basic steps Google can alter a hell of a lot of human behaviour. Yes... I know, there are always the serial spammers with their disposable domains who will keep on spamming, and none of the above will deter them. That need not prevent making a start.
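The hidden-link detection described above is technically straightforward for the simple cases. A toy sketch (illustrative only - a real crawler would render the page, resolve external CSS, and catch far more tricks; the function name and signal list are invented here):

```python
# Toy hidden-link detector: flags <a> tags whose inline style suggests
# they are invisible to users. Illustrative only; real detection would
# render the page and resolve external stylesheets.
import re

HIDDEN_STYLES = ("display:none", "visibility:hidden", "font-size:0")

def hidden_links(html: str) -> list[str]:
    flagged = []
    for match in re.finditer(r'<a\b[^>]*>', html, re.IGNORECASE):
        tag = match.group(0)
        style = re.search(r'style\s*=\s*"([^"]*)"', tag, re.IGNORECASE)
        if style and any(h in style.group(1).replace(" ", "").lower()
                         for h in HIDDEN_STYLES):
            flagged.append(tag)
    return flagged

page = '<a href="/ok">fine</a><a href="/spam" style="display: none">x</a>'
print(hidden_links(page))  # flags only the second link
```

Even a crude filter like this would catch the amateur end of the technique, which is exactly the population a public enforcement campaign would scare off first.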
What is going on here!?
1.) Any competitor that ranks higher than me is spamming!
NOT!
2.) I have the ability to be totally objective in evaluating a competitor's web site (including how he places words).
NOT!
3.) I spend an inordinate amount of time obsessing over somebody else's ranks.
NOT?
Reporting someone 5 times is really just a waste of everyone's resources (including Google's); no wonder they don't act on them individually (would you?). What is the point?
Maybe you should design better, quit crying, get a life, etc, etc.
Once again, I *really* don't see much spam in google.
G made its name from the free results, but revenue is coming more from AdWords. Until AdWords starts tanking, G has no monetary incentive to take spam seriously.
Sure, it does. As long as e-commerce sites can use questionable SEO techniques to rank high on SERPs, there's less incentive for those sites to buy AdWords.
Also, complaints from webmasters are not seen as complaints from customers (they aren't BTW).
Google's forms for reporting spam and search-quality issues don't ask whether the complainer is a customer or a Webmaster. (Which makes sense, since Webmasters use search, too.)
In the future I see searching the web divided into:
Informational only - .edu, .gov, etc. (less reason to spam)
Everybody else - a combination of algo + $$$$ (no more free ride regarding advertising!). Just speculating....
IMHO, it's very unlikely that Google would use the domain suffix as an "information or commerce?" filter. Such an arbitrary distinction would drastically reduce Google's value to users. Let's say that I'm searching for tourist information on Elbonia. I'm more likely to find such information at Lonely Planet, Time Out, or the Elbonian convention and visitors bureau's visitelbonia.com than I am at a .gov or .edu site. And if I'm looking for information on something obscure (say, the Cathedral of the Holy Bones or the Bridge of Whispers in Elbonia City), I may not find such information anywhere except on a travel-related .com site.
I think it's far more likely that Google would add "e-commerce detectors" to its algorithm, which would allow it to give more weight to information pages and less to commerce pages in its search results. For example, it might look for shopping-cart links, e-commerce phrases, certain page-layout characteristics, etc. Such weighting would be in keeping with Google's stated mission, and--by skewing SERPS toward information pages--it would encourage e-commerce sites to buy AdWords.
Such an approach would be far easier (and better for users) than making arbitrary judgments based on a domain suffix. It wouldn't even be a bad thing for vendors. Affiliate and catalog pages selling "whatsit antivirus" might not come up in the top 10 for a search on that phrase, and neither would Whatsit Corporation's "order now" page--but Whatsit Corporation's home page for Whatsit Antivirus would have an excellent chance of ranking #1.
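The "e-commerce detector" idea sketched above could start as nothing more than scoring pages on commerce signals. A toy version (the signal list, weights, and threshold are invented purely for illustration, not anything Google is known to use):

```python
# Toy commerce-page scorer: counts simple e-commerce phrases in page
# text. Signals, weights, and threshold are invented for illustration.
COMMERCE_SIGNALS = {
    "add to cart": 3,
    "checkout": 3,
    "shopping cart": 2,
    "free shipping": 2,
    "buy now": 3,
    "$": 1,
}

def commerce_score(page_text: str) -> int:
    text = page_text.lower()
    return sum(weight * text.count(signal)
               for signal, weight in COMMERCE_SIGNALS.items())

def looks_like_commerce(page_text: str, threshold: int = 4) -> bool:
    return commerce_score(page_text) >= threshold

info_page = "History of basket weaving in Elbonia, with photographs."
shop_page = "Buy now! Add to cart - $19.99, free shipping at checkout."
print(looks_like_commerce(info_page), looks_like_commerce(shop_page))
```

A real classifier would look at layout and link structure too, as the post suggests, but even phrase counting separates a visitor guide from a storefront surprisingly often.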
If they did that, what would differentiate them from overture or any other pure ppc provider? People who search for consumer related items probably want to find consumer oriented sites, not ads.
Who said google was Funk & Wagnalls?
I'd rather buy the encyclopedia on CD and run it locally than do a search on any SE for the same info.
You take it up with google, if they don't respond, you shouldn't use this board as a place to "try again".
By "try again", I mean identifying specific industries/serps that you are hoping beyond all hope that the google team will penalize.
By the way, how does your site get better if Mister 100 cross linked domains gets a penalty anyway?
I wasn't born yesterday. If Google don't do anything with detailed spam reports, they aren't going to look through a whole sector just because somebody refers to it in a forum. Unless I'm mistaken, people often refer to sectors in these forums when discussing spam.
If the sites I refer to are removed from Google because they are breaking Google's rules then my site will be deemed more relevant by Google and I will receive more visitors. My site won't get any better but Google will rate it more highly.
If they did that, what would differentiate them from overture or any other pure ppc provider?
Surely anyone here can grasp the difference between a page of relevant search results and a page of PPC listings.
People who search for consumer related items probably want to find consumer oriented sites, not ads.
That's why AdWords are in the margins, not in the search results.
Google's stated mission is to "organize the world's information and make it universally accessible and useful." If achieving that goal makes AdWords more desirable to vendors (either because more people use Google or because non-information pages are pushed lower in the search results), what's wrong with that?
I have a site in my industry that employs some shady tactics and thus I reported them. Once I saw that nothing was done, I started thinking of ways to legitimately incorporate a similar strategy for my site.
I have a site in my industry that employs some shady tactics and thus I reported them. Once I saw that nothing was done, I started thinking of ways to legitimately incorporate a similar strategy for my site.
That may be a viable strategy if you're creating a "here today, gone (and replaced by another disposable domain) tomorrow" Web site. It's obviously unwise if your goal is to build a Web site--or a brand--that will thrive and prosper over the long term.
One competitor had hundreds of keyword-stuffed pages (in a list format) with the same layout and a few nonsense paragraphs. Imagine creating a page like that about red widgets, and then creating another identical page where you replaced every instance of red with blue. Then do this for every color possible.
Initially I viewed this as spam (although Google doesn't think so). Oh, and these pages are not navigable through the site. Most likely cloaked.
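Template pages that differ only by a swapped keyword ("red" → "blue") are exactly what near-duplicate detection is built to catch. A minimal sketch using Jaccard similarity over word 3-grams ("shingles") - a standard textbook technique, not a claim about Google's actual method:

```python
# Toy near-duplicate check: Jaccard similarity over word 3-grams
# ("shingles"). A standard technique; not Google's actual method.
def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

red = "red widgets are the finest widgets for every red widget fan"
blue = "blue widgets are the finest widgets for every blue widget fan"
print(similarity(red, blue))  # nonzero overlap from the shared template
```

Unrelated pages score near zero, while templated pages share every shingle that doesn't touch the swapped word, so a hundred color variants cluster together immediately.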
I decided to create an individual page for every product manufacturer on my site (originally they all appeared on a single manufacturer page). My users now have the option of using both navigation paths (the original being the most efficient), but I have 100 new pages for Googlebot.