Problem Results Reports


phish

8:58 am on Feb 21, 2003 (gmt 0)

10+ Year Member



I've been reporting a site for the last 5 updates... I sent them the EXACT link to the site. When you do a search and this particular site comes up, then you click on the "more from this site" link, 360 pages of keyword-stuffed junk pages come up, and they do nothing about it. BLATANT spam, and Google does zippo. Now, I looked through some of these for about an hour or so, and this guy is using text from competitors' sites in some of these pages... actual company names and stuff. Why don't they listen? I read the thread Brett started last night, and GG jumped in and asked what Google can do. "Give us some suggestions," he said. Well, if you look at Brett's original post, it was that when he does a search on a certain phrase he really has to dig deep into the results to find anything relevant. Why is this? Because of people like this. So if Google is looking for something to do instead of adding on more stuff, they should just take care of what they already have. Quit relying 100% on the algo, and do more human editing.

peace

WebGuerrilla

8:55 am on Feb 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Do you actually believe Google is implementing filters? Do you have any examples? In theory it should work; why do you think it doesn't?

I believe Google is working hard to try to address some long-standing flaws in their algo. They have done, and still do, a decent amount of hand editing, but I think they understand that a spam control system that relies solely on human response simply won't work long term.

I also think that the majority of Webmasters never come close to understanding how enormous the task of policing spam at that scale actually is.

In my little world, the fact that a competitor has beaten me out for the term creative underwater basket weaving may be a life-altering event. But in the big picture, when it comes down to deciding which of the 150 million daily searches should be policed, my little complaint probably isn't worth the man-hours involved to investigate and remove the offending content. The amount of time that would have to be allocated to a hand-review approach is better spent trying to prevent the content from showing up in the first place.

In the short-term that may mean spam will be able to stick around for awhile, but if they are successful at overcoming the issues causing the problem, everything will work itself out down the road.

1milehgh80210

9:05 am on Feb 27, 2003 (gmt 0)

10+ Year Member



Maybe a few things at work here:
G claims what, over 3 billion pages indexed? If only .1% are spammed, that's still 3 million pages to go over by hand.
G made its name from the free results, but revenue is coming more from AdWords. Until AdWords starts tanking, G has no monetary incentive to take spam seriously. Also, complaints from webmasters are not seen as complaints from customers (they aren't, BTW).

In the future I see searching the web divided into:
Informational only - .edu, .gov, etc. (less reason to spam)
Everybody else - a combination of algo + $$$$ (no more free ride regarding advertising!). Just speculating....

TylerDurden

10:08 am on Feb 27, 2003 (gmt 0)

10+ Year Member



WebGuerrilla and GoogleGuy

Imagine if the police used tactics similar to GG's: taking notes on how to improve security based on reported crime, but not stopping the reported crime. What would we have? A very criminalised world. Why? Simply because the fact that you might go to prison for committing a crime is the MAIN factor stopping people from crime. You will never stop all crime, but showing your effort may bring it down to a reasonably low level.

And don't tell me GG can't do it - you don't need expensive PhDs to read and check spam reports - it's a routine job.
If GG doesn't realise the importance of reacting to spam reports, more and more honest webmasters will convert to spammers, the relevance of the SERPs will gradually decrease, and GG's users will move on to another SE. We have all seen this happen already to many great SEs (can't resist mentioning AV).
This topic is a great indication that more and more honest webmasters are considering becoming spammers to beat the spammers on GG - because for an honest webmaster, GG is not just an important traffic source these days, it's the only significant one.

ikbenhet1

10:32 am on Feb 27, 2003 (gmt 0)

10+ Year Member



WebGuerrilla,

Fast and Google crawl the same pages with the same spam in them.
If Google is susceptible to that spam, surely that is an algo issue, not a marketshare issue?

And if we do blame the 75% market share - only the search results count for a user, right? So why should they accept spammed results just because the search engine has a big market share?

[added. I still love Google]

jamesyap

7:16 pm on Feb 27, 2003 (gmt 0)

10+ Year Member



Hope that the new FILTER won't be so aggressive that the good citizens also get kicked off.

WebGuerrilla

9:19 pm on Feb 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Imagine if the police used tactics similar to GG's: taking notes on how to improve security based on reported crime, but not stopping the reported crime. What would we have? A very criminalised world.

Your comments suggest that Google has completely stopped removing spam by hand. That simply isn't the case. Just because they didn't remove the particular offender you or I may have reported doesn't mean that they don't do it.

It's just a matter of priorities. And it's similar to how real law enforcement works. Here in Los Angeles County, the police no longer respond to home burglar alarms. The vast majority of them are false alarms, and responding to them consumes a huge amount of man-hours each year. Those man-hours are better spent focusing on more substantial crimes. But the fact that they don't rush out every time the cat trips the alarm doesn't mean they've stopped investigating burglaries altogether.

Google's hand-editing resources will always be focused on the areas that will have the largest impact on the overall user experience.

An adult site showing up for the word cancer will be seen by several thousand people every day. And there's a good chance that some negative press will develop over a SERP like that.

On the other hand, a guy who collects toys and happens to spam his way to the top for a term like johnny lightning slot cars isn't really having a negative impact on the overall user experience. He's just upsetting the other 9 toy collectors listed below him.

So given the fact that it would take about the same amount of time to research and penalize both sites, which one should get bumped to the top of the list?

If GG doesn't realise the importance of reacting to spam reports, more and more honest webmasters will convert to spammers, the relevance of the SERPs will gradually decrease, and GG's users will move on to another SE.

I agree completely. The majority of items listed as violations of Google's quality guidelines are not detected and punished by any automated process. The current level of quality Google enjoys is due in large part to the fact that the majority of webmasters follow the rules.

I think they have a relatively small window of opportunity to prove to the honest webmasters that their honesty will be rewarded, and the cheaters will be punished. Which is exactly why they must back off the hand editing and focus more resources on algorithmic solutions.

If they don't begin making substantial progress on shortening the lifespan of sites that violate their guidelines, more and more of the honest people will adopt the "If you can't beat 'em, join 'em" philosophy, and the quality of the SERPs will suffer dramatically.

Jane_Doe

9:33 pm on Feb 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



And don't tell me GG can't do it - you don't need expensive PhDs to read and check spam reports - it's a routine job.

There are over 3 billion pages in the Google index. Assuming even just .001% of those pages get spam reports each month, that would be 30,000 spam reports for Google to sort through monthly. If each report took one hour to process manually, that would mean 30,000 hours of spam work a month.

Assuming people work 160 hours a month, with 25% deducted for vacations, sick days, training, and meetings, that would leave 120 hours each month to process reports. So processing all of the reports by hand would take 250 people each month, not including any time for anti-spam programming changes. Google has about 500 employees total.

It's simply not realistic to expect each spam report to get handled individually. It probably would not be feasible or cost effective for them to do so.
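
Jane_Doe's back-of-the-envelope estimate can be checked in a few lines. Every figure below is an assumption taken from her post, not real Google data:

```python
# Back-of-envelope check of the staffing estimate above.
# All numbers are the post's assumptions, not real Google data.
INDEX_SIZE = 3_000_000_000       # pages in the index
REPORT_RATE = 0.00001            # .001% of pages reported per month
HOURS_PER_REPORT = 1             # manual review time per report
PRODUCTIVE_HOURS = 160 * 0.75    # 160 h/month minus 25% overhead = 120 h

reports = INDEX_SIZE * REPORT_RATE            # 30,000 reports per month
review_hours = reports * HOURS_PER_REPORT     # 30,000 review hours per month
reviewers = review_hours / PRODUCTIVE_HOURS   # 250 full-time reviewers

print(f"{reports:,.0f} reports/month -> {reviewers:.0f} reviewers")
```

Against a claimed headcount of roughly 500, a 250-person review team is the crux of the argument: the estimate is sensitive to the report rate, but even at these conservative numbers the math rules out purely manual handling.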

jbauder

10:35 pm on Feb 27, 2003 (gmt 0)

10+ Year Member



tyler ... have to disagree

I think google does an unbelievable job at eliminating the type of spam actual users care about ...

[I do a lot of searches at Google, and sure, I see hidden text, redirects, etc. ... BUT I almost always end up at something that is relevant to what I searched for]

That is what would cause actual "users" to leave: not being able to find information on what they were searching for. Me as a user looking for a book about sand shouldn't get caught up in whether I hit

amazon.com
barnesandnoble.com
sand-books.com (an affiliate of amazon)
or
books-about-sand.com (an affiliate of barnes and noble)

Now, if I'm selling sand books it is a different story, and I fully support sites getting bounced that practice these techniques ... BUT I don't think your point is valid that users will switch SEs because someone has hidden text on an otherwise relevant page.

My bet is 90%+ of the people filing "spam" complaints are webmasters whose sites are below the offender, and not actual users.

dafthead

11:44 pm on Feb 27, 2003 (gmt 0)

10+ Year Member



OK so it's not realistic to expect Google to deal with spam reports other than when it affects the relevancy of results.

What can an honest webmaster realistically expect from Google then? What amount of effort can we expect from them in return for use of the 3 billion web pages that make up their index?

The spam that is affecting my positions would be easy to detect with spam filters. I can't understand why it's still there; perhaps other types of spam are causing more of a problem?

I'd be interested to hear the views of those convinced that spam filters are the answer. Where do the problems lie? How much effort do Google need to put into the solution? Are they likely to do it?

austtr

1:28 am on Feb 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree 100% with those who say that spam removal can never be a manual process.... I disagree 100% with those who say do nothing because it's not really a problem.

Google is able to have a huge impact on spam, and is able to do so very quickly and very cost effectively. It's a matter of whether they see any commercial value in doing so.

Spam is not about technology, it's about human behaviour, and that is quite easy to change. It's human behaviour that drives a decision to spam in the belief there will be a competitive advantage. It's human behaviour that drives a decision to keep doing it because there are no penalties being dished out.

When there is a clear and unambiguous message that spammers will be penalised, that same human behaviour will, in most cases, decide to avoid the pain.... ie stop spamming.

If all Google did was issue a press release to the effect that a campaign of spam removal was to be launched, that would travel through the web world like a brush fire and would trigger an immediate spate of de-spammed sites, especially in the amateur spammer ranks ("I did it because everyone else is doing it"). That alone would remove a sizeable amount of crud.

Then they target just one particular type of spam.... hidden links. I'd be amazed if their technology can't already detect them.... let the algo loose on just that. Another huge chunk of spam disappears, and there is no longer any doubt that if you mess with Google, you feel the pain.
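
For what it's worth, the simplest cases of hidden links really are cheap to detect. A toy sketch (purely illustrative; nothing here reflects how Google actually works) that flags anchor tags styled to be invisible, either via hiding styles or text the same colour as the page background:

```python
# Toy hidden-link detector: flags <a> tags styled to be invisible.
# Crude on purpose -- a real engine would render CSS, check off-screen
# positioning, 1px fonts, stylesheet rules, etc.
import re

HIDDEN_STYLES = (
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
)

def find_hidden_links(html, page_bg="#ffffff"):
    hits = []
    for m in re.finditer(r'<a\b[^>]*style="([^"]*)"[^>]*>', html, re.I):
        style = m.group(1).lower()
        if any(re.search(p, style) for p in HIDDEN_STYLES):
            hits.append(m.group(0))
        elif re.search(r"color\s*:\s*" + re.escape(page_bg), style):
            hits.append(m.group(0))  # link text same colour as background
    return hits

page = '<a style="color:#ffffff" href="/loans">cheap loans</a>'
print(find_hidden_links(page))  # flags the white-on-white link
```

The point is not that this heuristic is robust (it is trivially evaded with external stylesheets), but that the cheapest and most common variants of the trick are mechanically identifiable.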

In two pretty basic steps, Google can alter a hell of a lot of human behaviour. Yes... I know, there are always the serial spammers with their disposable domains who will keep on spamming, and none of the above will deter them. That need not prevent making a start.

awoyo

8:56 pm on Feb 28, 2003 (gmt 0)

10+ Year Member



Just for grins, and really not trying to keep this dusty old thread alive, but... when I do a search for web hosting top ten, the number ten spot belongs to an auto mechanic site. Of course, there are quite a few hidden links to web hosting sites. Now, as a webmaster, I understand perfectly what's up. But as a "normal user" I might be a shade confused by these results. ;)

john316

6:22 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do a lot of google searches and really don't see that much spam at all.

What is going on here!?

1.) Any competitor that ranks higher than me is spamming!

NOT!

2.) I have the ability to be totally objective in evaluating a competitors web site (including how he places words).

NOT!

3.) I spend an inordinate amount of time obsessing over somebody else's ranks.

NOT?

Reporting someone 5 times is really just a waste of everyone's resources (including Google's). No wonder they don't act on them individually (would you?). What is the point?

Maybe you should design better, quit crying, get a life, etc, etc.

Once again, I *really* don't see much spam in google.

john316

6:55 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's a novel concept for acquiring a penalty:

Folks who spam the spam report should get a PR 2.

"We're sorry, but due to the fact that you have no respect for our time or resources, we have to lower your PR until you start behaving."

[edited by: john316 at 7:05 pm (utc) on Mar. 1, 2003]

dafthead

6:58 pm on Mar 1, 2003 (gmt 0)

10+ Year Member



john316,

come on, admit it - you're bored and you want to wind everybody up.

Of course you could be right. I can only speak for the UK mortgage and loan sectors, which are dominated by one company interlinking many domains with hidden links, resulting in their sites having over 1000 backlinks.

Yidaki

7:03 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>the UK mortgage and loan sectors which are dominated by
>one company interlinking many domains with hidden links
>resulting in their sites having over 1000 backlinks.

... it's ok, as long as the rates are as low as they tell me twice a day by email.

LOL

john316

7:03 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



dafthead,

You mention an industry in the UK, hoping to get the Google crew to apply some oversight on those SERPs (and hopefully get your competition penalized, I'm sure).

Who is winding up whom?

That stuff should be deleted by a mod.

europeforvisitors

7:10 pm on Mar 1, 2003 (gmt 0)



G made its name from the free results, but revenue is coming more from AdWords. Until AdWords starts tanking, G has no monetary incentive to take spam seriously.

Sure, it does. As long as e-commerce sites can use questionable SEO techniques to rank high on SERPs, there's less incentive for those sites to buy AdWords.

Also, complaints from webmasters are not seen as complaints from customers (they aren't, BTW).

Google's forms for reporting spam and search-quality issues don't ask whether the complainer is a customer or a Webmaster. (Which makes sense, since Webmasters use search, too.)

In the future I see searching the web divided into:
Informational only - .edu, .gov, etc. (less reason to spam)
Everybody else - a combination of algo + $$$$ (no more free ride regarding advertising!). Just speculating....

IMHO, it's very unlikely that Google would use the domain suffix as an "information or commerce?" filter. Such an arbitrary distinction would drastically reduce Google's value to users. Let's say that I'm searching for tourist information on Elbonia. I'm more likely to find such information at Lonely Planet, Time Out, or the Elbonian convention and visitors bureau's visitelbonia.com than I am at a .gov or .edu site. And if I'm looking for information on something obscure (say, the Cathedral of the Holy Bones or the Bridge of Whispers in Elbonia City), I may not find such information anywhere except on a travel-related .com site.

I think it's far more likely that Google would add "e-commerce detectors" to its algorithm, which would allow it to give more weight to information pages and less to commerce pages in its search results. For example, it might look for shopping-cart links, e-commerce phrases, certain page-layout characteristics, etc. Such weighting would be in keeping with Google's stated mission, and--by skewing SERPS toward information pages--it would encourage e-commerce sites to buy AdWords.

Such an approach would be far easier (and better for users) than making arbitrary judgments based on a domain suffix. It wouldn't even be a bad thing for vendors. Affiliate and catalog pages selling "whatsit antivirus" might not come up in the top 10 for a search on that phrase, and neither would Whatsit Corporation's "order now" page--but Whatsit Corporation's home page for Whatsit Antivirus would have an excellent chance of ranking #1.
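
The "e-commerce detector" idea sketched above could, in its crudest form, be a phrase-counting scorer. This is pure speculation in the spirit of the post (the signal list, threshold, and function names are all hypothetical, not anything Google is known to use):

```python
# Hypothetical "e-commerce detector": counts commerce phrases so an
# algo could down-weight commercial pages for informational queries.
# Signals, threshold, and names are illustrative assumptions only.
COMMERCE_SIGNALS = (
    "add to cart", "buy now", "checkout", "shopping cart",
    "order now", "free shipping", "price:",
)

def commerce_score(page_text):
    text = page_text.lower()
    return sum(text.count(sig) for sig in COMMERCE_SIGNALS)

def looks_commercial(page_text, threshold=2):
    return commerce_score(page_text) >= threshold

info_page = "The Cathedral of the Holy Bones dates from the 14th century."
shop_page = "Buy now! Add to cart. Price: $9.99 with free shipping."
print(looks_commercial(info_page), looks_commercial(shop_page))  # False True
```

A real system would weigh page layout, link patterns, and shopping-cart URLs rather than bare phrases, but the classification idea (score commerce signals, then adjust ranking weight) is the same.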

dafthead

7:13 pm on Mar 1, 2003 (gmt 0)

10+ Year Member



john316,

so you would prefer the "oh yes it is!", "on no it isn't!" form of debate?

OK, you don't see much spam, but nobody else has expressed that opinion in this thread, not even GoogleGuy.

john316

7:19 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Such weighting would be in keeping with Google's stated mission, and--by skewing SERPS toward information pages--it would encourage e-commerce sites to buy AdWords.<<

If they did that, what would differentiate them from overture or any other pure ppc provider? People who search for consumer related items probably want to find consumer oriented sites, not ads.

Who said google was Funk & Wagnalls?

I'd rather buy the encyclopedia on CD and run it locally than do a search on any SE for the same info.

john316

7:27 pm on Mar 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Actually dafthead, these threads always start innocently enough, but inevitably lead to some postings that are intended to damage the competition.

You take it up with Google; if they don't respond, you shouldn't use this board as a place to "try again".

By "try again", I mean identifying specific industries/serps that you are hoping beyond all hope that the google team will penalize.

By the way, how does your site get better if Mister 100 cross linked domains gets a penalty anyway?

dafthead

7:40 pm on Mar 1, 2003 (gmt 0)

10+ Year Member



john316,

I wasn't born yesterday. If Google doesn't do anything with detailed spam reports, they aren't going to look through a whole sector just because somebody refers to it in a forum. Unless I'm mistaken, people often refer to sectors in these forums when discussing spam.

If the sites I refer to are removed from Google because they are breaking Google's rules then my site will be deemed more relevant by Google and I will receive more visitors. My site won't get any better but Google will rate it more highly.

europeforvisitors

10:00 pm on Mar 1, 2003 (gmt 0)



If they did that, what would differentiate them from overture or any other pure ppc provider?

Surely anyone here can grasp the difference between a page of relevant search results and a page of PPC listings.

People who search for consumer related items probably want to find consumer oriented sites, not ads.

That's why AdWords are in the margins, not in the search results.

Google's stated mission is to "organize the world's information and make it universally accessible and useful." If achieving that goal makes AdWords more desirable to vendors (either because more people use Google or because non-information pages are pushed lower in the search results), what's wrong with that?

OneTooMany

10:26 pm on Mar 1, 2003 (gmt 0)

10+ Year Member



My philosophy on the whole spamming issue is that if you can't beat them, join them (or learn from them).

There is a site in my industry that employs some shady tactics, and thus I reported it. Once I saw that nothing was done, I started thinking of ways to legitimately incorporate a similar strategy for my site.

europeforvisitors

11:26 pm on Mar 1, 2003 (gmt 0)



There is a site in my industry that employs some shady tactics, and thus I reported it. Once I saw that nothing was done, I started thinking of ways to legitimately incorporate a similar strategy for my site.

That may be a viable strategy if you're creating a "here today, gone (and replaced by another disposable domain) tomorrow" Web site. It's obviously unwise if your goal is to build a Web site--or a brand--that will thrive and prosper over the long term.

OneTooMany

12:01 am on Mar 2, 2003 (gmt 0)

10+ Year Member



Let me clarify a bit:

One competitor had hundreds of pages that are keyword-stuffed (in a list format) with the same layout and a few nonsense paragraphs. Imagine creating a page like that about red widgets, and then creating another identical page where you replaced every instance of red with blue. Then do this for every color possible.

Initially I viewed this as spam (although Google doesn't think so). Oh, and these pages are not navigable through the site. Most likely cloaked.

I decided to create an individual page for every product manufacturer on my site (originally they all appeared on one manufacturer page). My users now have the option of using both navigation paths (the original being the most efficient), but I have 100 new pages for Googlebot.
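
Template pages that differ only by a swapped keyword, as described above, are exactly what near-duplicate detection catches. A toy sketch using word shingles and Jaccard similarity (illustrative only; production systems use minhash/simhash to do this at web scale):

```python
# Toy near-duplicate detector: word-shingle Jaccard similarity.
# Template pages differing only by one swapped keyword score high.
def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

red = ("our widgets are the finest available and every red widget "
       "ships with a lifetime warranty order today")
blue = red.replace("red", "blue")  # the "red -> blue" template swap

# Well above a typical near-duplicate threshold of ~0.5.
print(f"similarity: {similarity(red, blue):.2f}")
```

Only the shingles touching the swapped word differ, so the score stays high no matter how many colour variants are generated; an engine that clusters on this signal could collapse all of them to one page.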
