Forum Moderators: open
They will return; it is impossible to control completely. Though I will hang my head in shame and agree that Google strives to combat spam more than any other engine, and I'd suggest they do a far better job of it too.
I now know Google treats spam reports seriously.
Many, myself included, doubted this, and maybe took Google for granted.
But sit back and really analyse the situation: it cannot be easy to unscramble the flak.
I now realise it is unrealistic to expect spammers, or whatever you want to call them, to be booted overnight on a say-so.
So they eventually do get lost in the serps, but next month another comes to the forefront.
Perhaps like trying to empty the Atlantic Ocean with an egg cup.
The problem really falls to SEOs to explain to clients, who ask things like "Hey, this guy did this, did that... but you said don't do it!" etc. etc., which can be difficult to answer.
I will take a look at how we can add more info about what to expect on a spam report. Most of the effort will probably still be on using this data for our new algorithms, but I'd like people to know what to expect when they fill out a spam report.
I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site. Actually, that raises an interesting question. Suppose that we detect hidden text (for example). What's the right amount of time for that site to be penalized? Does anyone have thoughts/votes?
In general, I don't think we are all trying to put competitors out of business, but are instead trying to level the playing field.
I think if a site straightens up, they should be allowed back in the next update.
To make it less tempting to spam, if you're just going to be bumped from the index for a month or two, perhaps institute a two-strikes rule.
Get caught once, shame on you; clean it up and we will let you back in. Get caught twice, and you're banned for good, no exceptions.
I think that would stop a lot of the repeat offenders.
Thanks for taking the spam seriously, GoogleGuy!
My reasoning: why would a site deploy hidden text pointing to a non-relevant site, other than for a PR gain?
Way too low. I'd say one year at least.
I think it's too low though - no incentive not to try.
I tend to agree. If Webmasters believe that using hidden text, etc. is the equivalent of fouling someone and getting five minutes in the penalty box, they'll continue to think that the spammer-vs.-Google game is just that: a game.
OTOH, I can understand why Google might be more forgiving to an obvious clueless amateur who hadn't heard that a little knowledge is a dangerous thing, and that hidden text isn't the bright idea that some people thought it was back when THE COMPLETE NITWIT'S GUIDE TO MANIPULATING SEARCH ENGINES was written in the days of Windows 95 and FrontPage 1.1.
Bottom line: Google needs to use good judgment when applying manual penalties, and it can afford to be more gentle with mypersonalsite.com than with cheapo-herbal-viagra-and-hotel-bookings-disposable-domain-of-the-month.com.
I've actually seen this with clueless amateur sites. To the extent that I have seen them use big blocks of *visible* keyword stuffed text on the bottom of the page thinking that this is a good idea for search engines. Almost as if they think "It's my site, I can put on it what I want, and if search engines rank me higher than they should, that's their problem, not mine."
I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site. Actually, that raises an interesting question. Suppose that we detect hidden text (for example). What's the right amount of time for that site to be penalized? Does anyone have thoughts/votes?
I'm with the 'out for one update' brigade.
If they are using hidden text to get a high serp then they are pretty clueless anyway. When re-included, without the use of their hidden text, the likelihood is they would find themselves buried in the serps somewhere.
I think Google would also be more popular if it sent an offending site (whatever its misbehaviour or oversight) a prior warning with a clear, unambiguous message.
<begin awfully written example>
Dear webmaster
Your website is behaving in a way contrary to our TOS. Please be advised that Google seeks to provide a resource that is blah bla blah etc.....If you wish to remain in the google index then you should have a look at your code and ensure that it complies with our guidelines at google.url.
Google is unable to provide precise feedback on your domain blah blah etc....Webmasters are reminded that repeated abuses by a particular webmaster or SEO company may lead to the removal of all identifiable domains associated with the SEO company or webmaster.
Sincerely
The Google team
</end awfully written example>
I'm sure a little program at the 'plex could be developed to automate a remove-or-stay directive some 7 days later.
I know this view may be unpopular, but if a Google employee has taken the time to consider manually banning a domain, then it's not exactly a big job for them to send an email and give the offender one last chance.
Some people work damn hard on their sites and are often ignorant of certain facts, or for economic or peer-pressure reasons may be compelled to push things just that little bit too far. I'm not saying it's right that they do so; I'm just trying to say that it's not *always* cut-and-dried spam, and that some people may be a little ignorant. I don't think it's fair to treat all offenders as murderous criminals :) some have just parked illegally and didn't see the sign :)
My point is that by giving these people a final opportunity to look at their domains before a ban is applied, both Google and the webmaster have an opportunity to create a positive. Ill feeling is avoided, Google is seen as fair, the Google-webmaster relationship is enhanced, and the SERP is improved.
Sure, the algo way is a better way, but until a way is found of detecting simple hidden text, and indeed more complex methods of manipulation, then FWIW I think it serves all constituencies to be able to have a drink in the last chance saloon. :)
They should know the rules of the road before they get into the car, if not, then they should learn the hard way. If Google would write a drivers manual, I think these issues might go away; but of course you will always have the occasional speeder, drunk driver...
If I was a cheater, I would just park in the handicap spot all day.
That is not fair.
[edited by: Anon27 at 7:51 am (utc) on April 19, 2003]
A few months ago a spam thread would expire very quickly, now we have one over the 100 mark... there is a message there.
I'm sure that Google's algorithmic approach will prove to be quite effective in filtering out a lot of techniques that breach their guidelines... and I am quite sure there will be techniques that will slip past any form of automated detection simply because the computational effort required to find it could not be sustained.
I suspect many serial spammers bank on just that fact... it's their unpatrolled border crossing, by which they deliver their contraband into the marketplace.
What should the penalties be? Depends on the message you want to send. Perhaps 3 months as a wrist slap for minor infringements ramping up to 12 months for major league serial spammers?
GG... on the question of "impossible to detect" spam, there are spam reports under my name referred to you. The reports are several months old, and the sites are still there as of the latest update. Would appreciate it if you could give some feedback, as you have elsewhere in this thread.
I edit for DMOZ and I can tell you that as good as these ideas are they don't seem to ever get put into practice.
I think the best and simplest way is to make it clear to all that the spam days are over - clean up your site or it will be out for 12 months after which time you can resubmit.
The index won't suffer as there are plenty of good sites out there to take their place.
It would be great to have a "pay for" service where you can have your site checked by Google and it gets a "no spam" tick or you get advice on how to fix it.
Some mechanism would have to be in place to make sure that you stay "Google friendly". As long as the fee was reasonable, I'd be in it tomorrow, even just for the peace of mind.
Google could even sub contract this work out.
Commercial sites would be glad to pay, and non-commercial sites can just follow the guidelines - it's not rocket science, especially for a small non-commercial site.
Rookie league is small scale hidden text etc. An email warning would probably suffice. A stern warning from the Google spam police.
Major league is affiliates with 3,000 page sites, using expired reputable domain names, redirected with javascripted framesets to high paying affiliate sites. And several coming out of one IP, usually involving the travel/ prescription medicine, casinos, p0rn etc industries.
With the major league guys - the warnings should be focussed on the 'beneficial recipient' of the spam traffic. He gets a message telling him he needs to get his affiliates into line with the Google TOS - or else the only traffic he'll ever see is PPC.
I think that it's really about the magnitude of the issue - make the punishment fit the crime. Cleaning up the major league guys would have a major immediate impact on the quality of the serps. Just look at the travel example posted a few pages back in this thread by onionrep - one spammer (acting as an affiliate) had destroyed a huge chunk of the first 50 results. Only the spammer and his affiliate partner win - searchers and competitors playing by the rule book lose.
So - I think
1. Everyone gets a warning
2. Rookies - do it again - and you're out for an index cycle
3. Major league - immediate 3 month ban for the affiliate AND, after 3 affiliates get banned, a 3 month ban for the 'beneficial recipient' of the spam-based affiliate traffic. Unless - of course - you can establish the link from the affiliate to the 'beneficial recipient' - in which case 1 warning, then all gone for 3 months.
Assuming that you can algorithmically detect hidden text when the bot comes by, if it is found, then that domain doesn't get listed in the next update. Any PR that that domain provides to other domains is ignored. Next month, the bot comes back, if the offending text is removed, then you're back in.
This removes the incentive to use hidden text, as well as the question of whether the offender is a 'serious' spammer or not. I've had clients who have had hidden text on their sites, not with the intent to spam, but because it helped them maintain proper table column widths.
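For what it's worth, even a crude automated check can catch the simplest case - text styled the same colour as its background. Here is a minimal sketch of that idea (my own illustration, not Google's method; the function name and the regex are assumptions, and real detection would need full CSS and rendering context):

```python
import re

def find_hidden_text(html: str) -> list:
    """Flag elements whose inline style sets the text color equal to the
    background color - one crude signal of hidden text. A sketch only:
    it assumes color appears before background in the style attribute."""
    flagged = []
    pattern = re.compile(
        r'<(\w+)[^>]*style="[^"]*color:\s*(#?\w+)[^"]*'
        r'background(?:-color)?:\s*(#?\w+)[^"]*"[^>]*>(.*?)</\1>',
        re.IGNORECASE | re.DOTALL,
    )
    for tag, fg, bg, text in pattern.findall(html):
        if fg.lower() == bg.lower():
            flagged.append(text.strip())
    return flagged
```

Something this naive would obviously miss CSS classes, off-screen positioning, tiny fonts and so on, which is the point several posters make: the easy cases are algorithmically cheap, the rest are not.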
While I agree with the idea of penalizing the ultimate beneficiary of spammy techniques, I don't see how you can do something like this fairly without hand checks. Otherwise, I could put in a bunch of hidden links to my competitor, and wipe him out. If I did this from a domain I didn't care about, the cost to me would be small.
(a repost of an earlier q)
A question. A spam report has been sent in, the site shortly thereafter disappears from the serps, but returns now and again after freshbot visits, and since the last dance is solidly back in the serps - unchanged.
Is this normal behavior as a site is phased out? Should I re-send something to indicate that there is still a problem?
It would be nice to know up front what to expect, and I am sure this would benefit more than just me. If it is the norm that a site fluctuates in and out of the serps before its eventual demise from Google, then we could calm the panic-filled posters - myself included - the same way we say 'don't worry, it's just everflux'.
However, I second that de-penalizing a site once it is clean again is the best strategy. IMHO the technical resources needed are the same whether you give them a penalty time stamp or recheck their behaviour frequently. The moment a site is clean again, it should be back in the index and take the ranking it deserves to have without cheating.
Don't want to complain but, again, the first step is to automatically check and penalize spam - old-fashioned spam and the over-discussed guestbook phenomenon (also proof reported) are sooo easy to spot by algo... clueless why nothing happens...
PR does follow a 301. A couple of times it seems like it took 2 updates for the PR to follow it, but it usually makes it in one index.
BigDave,
This didn't work for me. 301 redirect at old site using .htaccess and PR7 dropped to PR5. New site can't see the 200+ back links I could see from the old site.
It's noncommercial, so not a big deal, but still disappointing.
- Ash
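For context, the mechanics being discussed boil down to one line of Apache configuration. A minimal .htaccess sketch (hypothetical domain; assumes Apache with mod_alias enabled):

```apache
# Permanently (301) redirect every path on the old host to the new domain
Redirect permanent / http://www.newdomain.example/
```

Whether PR then follows, and how quickly, is down to Google picking up the 301, which per the posts above can take an update or two.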
Google should flag offenders sites and check up on them periodically - run a script through their site or something - automated - to ensure that they aren't re-offending.
I like this idea. If the spam report were fed into a database, it could schedule periodic "spam scans" just like a googlebot run.
- Ash
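The scheduled re-scan idea combines naturally with the two-strikes rule floated earlier in the thread. A toy sketch of how such a loop might work (all names and the strike policy are my own assumptions, not anything Google has described):

```python
def rescan_reported(urls, fetch, is_spam, max_strikes=2):
    """Re-check reported URLs until each comes back clean or accumulates
    `max_strikes` failed scans (at which point it stays banned).
    `fetch` and `is_spam` stand in for a real crawler and spam classifier."""
    reinstated, banned = [], []
    for url in urls:
        strikes = 0
        while True:
            if not is_spam(fetch(url)):
                reinstated.append(url)   # clean again: back into the index
                break
            strikes += 1
            if strikes >= max_strikes:
                banned.append(url)       # second strike: out, no exceptions
                break
    return reinstated, banned
```

In practice each re-check would be spaced out by an index cycle rather than run in a tight loop, but the bookkeeping is the same either way.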
Just look at the travel example posted a few pages back in this thread by onionrep - one spammer (acting as an affiliate) had destroyed a huge chunk of the first 50 results.
Hi Chris_D
I didn't actually look at those pages for any evidence of spam, so can't comment on whether they were or not (they've now gone so I can't check them)
My point on that particular serp was that I found it difficult to understand why Google did not/could not/would not ensure that a single domain does not receive multiple positions for a single query.
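The behaviour being asked for here is essentially what later came to be called host crowding: capping how many slots any one host can occupy in a ranked result list. A minimal sketch of such a post-ranking filter (illustrative only; Google's actual mechanism and limits are not public):

```python
from urllib.parse import urlparse

def limit_per_host(ranked_urls, max_per_host=2):
    """Walk a ranked result list in order and drop any URL once its
    host has already appeared `max_per_host` times."""
    counts = {}
    kept = []
    for url in ranked_urls:
        host = urlparse(url).netloc
        counts[host] = counts.get(host, 0) + 1
        if counts[host] <= max_per_host:
            kept.append(url)
    return kept
```

A filter like this would not stop a spammer who spreads pages across many domains, which is why the thread keeps coming back to penalizing the beneficiary rather than the individual pages.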