| 10:14 pm on Apr 16, 2003 (gmt 0)|
|'spam' has as much right to the top positions as any of your sites. |
Google obviously doesn't share your viewpoint. And that's what counts, isn't it?
| 10:28 pm on Apr 16, 2003 (gmt 0)|
It should be noted that Google doesn't encourage spam; they just can't stop all of it from getting through. Also, what some webmasters call spam, Google doesn't always agree is spam. For example, if a small e-commerce merchant is #1 with fuzzy-blue-widgets.com, and you are buried because you chose brandname.com, that ain't spam.
| 11:07 pm on Apr 16, 2003 (gmt 0)|
I felt very much the same as you on this topic.
That being said, here is what my experience has taught me.
1. Google may not be ignoring your spam reports. They may not consider what you reported as spam.
2. The machine grinds slowly. It may be that Google does think that what you have reported is spam and is just now adding the code they need to the algo to stop it.
A tip: turn in as detailed a record of the "spam" as possible. Fill in the report completely, include the tactic or code in question, and reference your nickname, WW, and GoogleGuy as well.
| 12:41 am on Apr 17, 2003 (gmt 0)|
I have to agree about how to build a number one site. I believe I could do it easily now and in a relatively short time. The trouble is, for me, that would be not practising what I preach. And I am worried about wanting more, and what that does to karma (don't laugh - karma pays you back). So I am waiting and hoping that Google is the tooth fairy after all.
I know that reporting Spam to Google is frustrating, although I am personally grateful to GoogleGuy and his help in getting some Spam dropped (please GG, drop those other 15 sites they have and that I found - can you believe that, 20 sites total for one area). I *really* wish Google had an effective Spam reporting mechanism that actually replied "Yes, this is Spam" or "Sorry, we do not agree". Nothing more. No explanation needed. Perhaps one where they insist upon a report by email from your real domain (relative to that industry). It is up to the Spam reporter to get their report right. If they mess up or are just trying it on, then tough.
| 1:00 am on Apr 17, 2003 (gmt 0)|
Hey mrguy, I'm showing that your spam report went through last week sometime--have you checked it recently? The search for widget verbication that used to bring up the cluster of sites that you complained about doesn't bring them up any more.
c1bernaught, I'm seeing that your spam report went through late last week too.
| 1:14 am on Apr 17, 2003 (gmt 0)|
Looking back at it, I think the last spam report I filed was a mistake, since as I zip through the webmaster guidelines I don't see "don't sign a zillion guestbooks" as a specific recommendation (although a generous reading of the doorway pages part might cover it).
I hope you guys are scaling some algorithms to deal with guestbooks, as a site is managing to rank #1 for a very competitive phrase basically just from guestbooks (their doorway pages seem to have been squished). I don't suggest penalizing; I suggest completely ignoring "vizbook.htm" type pages, if you can scale that mountain.
| 1:20 am on Apr 17, 2003 (gmt 0)|
Still waiting to see, GoogleGuy, if you squish that site using hidden text that I sent in a couple of days ago. You said you wanted hidden text examples. And c'mon. A massive block of text with color #000000 on a #000000 background? And even a hidden link to a child porn site that also has hidden text? (Although the child porn site has a grey toolbar; dunno if that's because it was banned or is new.) I'd have thought the algo could pick up things this obvious.
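For what it's worth, the black-on-black case really is trivially machine-detectable. A toy sketch of a same-color check (hypothetical, not Google's actual method; a real filter would parse CSS and computed styles, while this only looks at inline bgcolor/font attributes):

```python
import re

def find_same_color_text(html):
    """Flag <font> blocks whose color matches the page background color.

    Toy heuristic: checks only the inline bgcolor attribute on <body>
    (defaulting to white) against inline color attributes on <font> tags.
    """
    bg_match = re.search(r'<body[^>]*bgcolor=["\']?(#[0-9a-fA-F]{6})', html,
                         re.IGNORECASE)
    bg = bg_match.group(1).lower() if bg_match else "#ffffff"
    hidden = []
    for m in re.finditer(
            r'<font[^>]*color=["\']?(#[0-9a-fA-F]{6})["\']?[^>]*>(.*?)</font>',
            html, re.DOTALL | re.IGNORECASE):
        if m.group(1).lower() == bg:
            hidden.append(m.group(2).strip())
    return hidden

page = ('<body bgcolor="#000000">'
        '<font color="#000000">stuffed keywords here</font></body>')
print(find_same_color_text(page))  # ['stuffed keywords here']
```

Obviously spammers can dodge any single check like this (off-by-one colors, CSS classes, tiny text), which may be why even "obvious" cases take the algo a while.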
| 1:25 am on Apr 17, 2003 (gmt 0)|
I have tried to follow your advice regarding spam submissions and have reported 2 sites several times over the past several months. One with hidden links and the second with doorway pages used to automatically open a window to a site with a grey bar - obviously penalized so they created a doorway to get around the penalty. I will try to submit one more time using this nick. I would appreciate if you could take a look.
| 1:34 am on Apr 17, 2003 (gmt 0)|
>Looking back at it, I think the last spam report I filed was a mistake, since as I zip through the webmaster guidelines I don't see "don't sign a zillion guestbooks" as a specific recommendation (although a generous reading of the doorway pages part might cover it).
How do you know it wasn't one of this site's competitors that signed those guestbooks? A guestbook spamming service is even advertising on Google AdWords, offering to have their bots sign thousands of guestbooks for a low fee. Just whip out your credit card, and voila. This service will even spam message boards for a fee. Amazing that Google wants them as an advertiser.
| 1:40 am on Apr 17, 2003 (gmt 0)|
rfgdxm1, I know that one of the sites with high PR that you mentioned got checked out before the new crawl. I was keeping the widgetline.com and widget.tv sites out because they were good testing examples--hope you don't mind. steveb, we did tighten things up a little more for the next crawl. Remember that just because a site shows lots of guestbook backlinks doesn't mean that all those links are actually doing any good.
| 2:02 am on Apr 17, 2003 (gmt 0)|
|How do you know it wasn't one of this site's competitors that signed those guestbooks? |
That happened to me. I don't know if it was a competitor or just an idiot with an axe to grind.
| 2:11 am on Apr 17, 2003 (gmt 0)|
Hello Google Guy,
I just reported hundreds of identical domains containing a single set of links. The only difference between the sites was the colors of the pages and the order of the links. The funny thing is these sites have a gray PR bar but show up #1 in searches.
Also, GoogleGuy, could you make a new thread about what you consider blatant spam that should be reported? This would be a great reference thread for new people coming to the site...
| 2:19 am on Apr 17, 2003 (gmt 0)|
Since the sites I reported GG are not "competitors", I really don't care if they stay or go. That .tv one in particular is more a case of one where the issue is tightening up the algo. I was just surprised that something as obvious as that example could somehow make it by a filter.
As to europeforvisitors, the fact that there are people out there advertising to spam thousands of guestbooks for a low fee explains it. The fees are so low it doesn't have to be a competitor; someone who just doesn't like you can do it with minimal cash outlay. My guess is the days of guestbook signing paying off with Google are over, or near an end. These should be easy enough to spot with an algo. Since most guestbooks are standard scripts with obvious identifying features, the only guestbooks Google should have trouble spotting are those that run custom scripts the webmaster wrote himself.
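The "obvious identifying features" point is mostly URL pattern matching. A toy illustration (the script names and patterns here are common examples I'm assuming, not any engine's actual filter list):

```python
import re

# Typical off-the-shelf guestbook script paths (illustrative, not exhaustive)
GUESTBOOK_PATTERNS = [
    r'/guestbook\.(html?|php|cgi|pl)',  # guestbook.html, guestbook.cgi, ...
    r'/gbook\.',                        # gbook.php and friends
    r'/cgi-bin/guestbook',              # classic CGI install path
    r'/vizbook\.',                      # the "vizbook.htm" type mentioned above
]

def looks_like_guestbook(url):
    """Heuristic: does this backlink URL look like a stock guestbook page?"""
    return any(re.search(p, url, re.IGNORECASE) for p in GUESTBOOK_PATTERNS)

links = [
    "http://example.com/guestbook.html",
    "http://example.org/cgi-bin/guestbook.cgi?page=3",
    "http://example.net/articles/widgets.html",
]
print([u for u in links if looks_like_guestbook(u)])
```

A filter like this would catch the standard installs while, as noted, custom homegrown scripts would slip through.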
| 2:19 am on Apr 17, 2003 (gmt 0)|
I am going to nominate GoogleGuy for the Nobel Peace Prize.
As I see it, both parties benefit here: GG gets to refine Google's techniques by working with real-world examples, and the webmasters benefit too. Need I say more?
Thanks, GoogleGuy, for taking on some of the issues and being so proactive (I'm sure this could be a full-time job in itself).
| 2:20 am on Apr 17, 2003 (gmt 0)|
That is a very good suggestion, RE: Spam definitions thread.
| 2:25 am on Apr 17, 2003 (gmt 0)|
"How do you know it wasn't one of this site's competitors that signed those guestbooks?"
How do I know a turtle from a yak?
The site I'm talking about only has internal links, a standard dmoz link all its competitors have, and some doorways not showing up as backlinks. Yet it somehow manages to be #1 for a primo key phrase. Keyword-signed guestbooks vault it to the top.
| 2:33 am on Apr 17, 2003 (gmt 0)|
>The site I'm talking about only has internal links, a standard dmoz link all its competitors have, and some doorways not showing up as backlinks. Yet it somehow manages to be #1 for a primo key phrase. Keyword-signed guestbooks vault it to the top.
Then how can you know it wasn't some idiot he keeps flaming on Usenet who did this to ruin his online business? Spite, malice, and revenge are reason enough for many to do things like this.
| 3:25 am on Apr 17, 2003 (gmt 0)|
How many webmasters are hurt by spam?
Since G is by far the most popular engine, they might not see a big bottom-line incentive for spam reduction.
What if 5000 or so webmasters formed an anti-spam association?
Dues would be $30/month.
The association would make a deal with G: $75K/month of the pool would go to hiring basic anti-spam people (maybe 25-30?).
The other $75K/month would be divided in half.
For each spam report sent from a member to G that resulted in a TOS penalty that month, G would get a small bounty ($37.5K maximum/month total), and the member would get the same amount as a dues rebate (up to 1/4 of the $30 monthly).
An incentive for all to reduce spam.
I'm sure my idea has hundreds of holes I haven't thought about...
Are there 5000 non-spamming webmasters out there? :)
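For what it's worth, a quick back-of-the-envelope run of the proposal's numbers (the figures are the poster's; the variable names and the even split are my reading of them):

```python
members = 5000
dues = 30                              # dollars per member per month
pool = members * dues                  # $150,000/month collected in dues

staff_fund = 75_000                    # fixed payment to G for anti-spam staff
bounty_fund = (pool - staff_fund) / 2  # up to $37,500/month in bounties to G
rebate_fund = (pool - staff_fund) / 2  # up to $37,500/month back to members

max_rebate_per_member = dues / 4       # the "1/4 of the $30" cap: $7.50

print(pool, staff_fund, bounty_fund, rebate_fund, max_rebate_per_member)
# 150000 75000 37500.0 37500.0 7.5
```

Note that $37,500 spread across rebates capped at $7.50 each funds at most 5000 rebates per month, i.e. one successful report per member, so the caps at least line up.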
| 3:48 am on Apr 17, 2003 (gmt 0)|
I don't see any reason for a vigilante force or posse. Google doesn't need huge numbers of reports. It simply needs enough to provide a sampling of questionable SEO techniques that can be detected and neutralized by its algorithms. GoogleGuy himself has said that Google prefers algorithm-based filtering methods that are scalable. That makes sense, since Google is a search engine, not a human-edited directory.
| 4:01 am on Apr 17, 2003 (gmt 0)|
I've just sent you a spam report.
| 7:12 am on Apr 17, 2003 (gmt 0)|
There are still holes in the system. A PR9 non-adult site using a lame user-agent-plus-no-cache cloaking technique to link to a huge imperium of adult sites has apparently been ninja-banned; however, those linked sites kept their PR7. Moreover, even though they seem to have been "deepbot banned," they are picked up by freshbot over and over again, occupying 10 of 10 top positions on certain keywords. Also, that already greyed-out site seems to be picked up by freshbot regularly...
| 3:56 pm on Apr 17, 2003 (gmt 0)|
The problem sites have been fixed!
| 5:22 pm on Apr 17, 2003 (gmt 0)|
googleguy, I have sent in reports as per the outline here in WebmasterWorld, with my nick in the message and yours in the subject line, but it seems to no avail. They had disappeared for a while, but they are back, and their 'hidden' link scam managed to get a new, completely unrelated site of theirs some good PR. I have recently sent some new turkeys in to you. This is all blatant spam I'm talking about.
What I am wondering is, for you to find our messages, should we include our nick in the subject too?
(btw, they aren't present in the www-sj serps, which may be of interest to you)
| 5:41 pm on Apr 17, 2003 (gmt 0)|
Yes I did notice that the spam was gone after this most recent index.
I think everyone in that market appreciates the removal.
However, there are a couple of things that could be done that would alleviate much of the frustration you hear here. For one, you could send an automated email reply confirming that the spam report was received and is in the queue to be reviewed.
Secondly, an email that says the spam report has been reviewed and is either accepted or denied. You wouldn't have to give a reason.
These two things would "close the loop" on spam reporting. I can't imagine that this would be difficult to automate.
This would be a benefit to all Google users.
From my experience I can say that my frustration came from not knowing what was going on. It seemed as if Google just didn't care that the serps were being dominated by what I considered to be spam. I know differently now. However, having a different, more interactive spam reporting process, would have made this a much better experience.
| 6:44 pm on Apr 17, 2003 (gmt 0)|
I agree with c1bernaught. I've yet to have the loop closed on the
|Thank you for writing to Google. |
We read all of the email we receive and try to send personal responses to each message.
This note is just to let you know that we've received your letter, and you'll hear from us soon. We appreciate your taking the time to contact us.
statement. Though it has only been about six weeks...
| 9:02 pm on Apr 17, 2003 (gmt 0)|
I appreciate the suggestions. I understand that the current process feels open-ended; I'll see what I can do about that. The main thing to bear in mind is that we can't promise to take action on any particular complaint. I am excited about the quality of data that we're getting for testing the next generation of algorithms, though.
| 9:11 pm on Apr 17, 2003 (gmt 0)|
My suggestion would be to have an automated response to e-mails, and a page that pops up after filing the spam report, that tells people that the spam reports are generally used as data for improving the algo. And that manual intervention is only used in extreme cases. Don't promise people that you will get back to them until you have the manpower to actually do it.
Changing the expectations at the time the reports are filed, instead of trying to do it on WW weeks or months after the fact, would improve Google's PR.
| 9:43 pm on Apr 17, 2003 (gmt 0)|
I feel like the guy without a prom date.
Is my example spam?
Am I a whiner?
I want to "fight clean". To me the techniques of the site I filed a spam report for are definitely "dirty tricks":
"Don't create multiple pages, subdomains, or domains with substantially duplicate content."
That is what I see in the case I have outlined.
In general my websites have always performed excellently in Google and other SEs because I don't create pages optimized for "penguin popstar nude pictures" when I am really trying to promote "[lodging type] in [destination]". Nor do I create 100 separate pages with the same content on it in order to create high page rank from massive internal links from 6000+ pages.
Feedback very much appreciated.
kind regards -
| 9:48 pm on Apr 17, 2003 (gmt 0)|
Setting expectations is important, and a pop-up page after filing the report would be good. This page could also indicate that any action taken after the review of a spam report may take X weeks/months because of Google's desire to automate the process.
The next piece, after the report is reviewed, should be a second email simply saying "Your spam report has been reviewed". Further explanation would not be necessary.
The submitter would then simply wait to see if anything changes in the serps. At least the expectation has been set and the loop has been closed.
| 10:01 pm on Apr 17, 2003 (gmt 0)|
| 10:06 pm on Apr 17, 2003 (gmt 0)|
Hmmm, for what it's worth, I reread my question in msg#53, and oops, that was a dumb question:
|What I am wondering is, for you to find our messages, should we include our nick in the subject too? |
As if you're gonna go a searchin' for each individual nick. Whoo boy, time to get some actual sleep...
|I am excited about the quality of data that we're getting for testing the next generation of algorithms, though. |
Lookin' forward to it! How long until the next generation comes out to play? (what scale of time are we talking about?)