Since Google introduced their new BigDaddy infrastructure, with its crawl mode that depends on incoming links, the spam in our guestbook has become nearly ten times worse.
Nearly every day there are spam entries on our page. I'm running a non-profit site for an animal shelter.
Did Google give any thought to this when they introduced that crawl mode? I would like Google to revert it and build a better algorithm instead of shifting the problem onto webmasters, who now do the work for Google.
The best way to stop guestbook spammers is to rewrite your form: substitute "giraffe" for "email" in the form field code, "cheesecloth" for "name", and so on ... autospam (which is most of it) either appears as empty posts - or doesn't appear at all.
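A rough sketch of that field-renaming idea in Python, assuming the submitted form arrives as a simple dict of field names to values; the "tablecloth" message field and the hidden decoy fields still named "email"/"name" (a honeypot) are my own additions to the tip, not part of the original suggestion:

```python
# The visible form uses the renamed fields, e.g.
#   <input name="cheesecloth">   <- really the poster's name
#   <input name="giraffe">       <- really the email address
#   <textarea name="tablecloth"> <- really the message
# plus hidden decoy fields still called "name" and "email" that real
# visitors never fill in but autospam scripts usually do.

def looks_like_autospam(fields: dict) -> bool:
    """fields maps submitted form field names to their string values."""
    message = fields.get("tablecloth", "").strip()
    if not message:
        return True   # bot filled the old field names, so the real message arrives empty
    if fields.get("email", "").strip() or fields.get("name", "").strip():
        return True   # hidden decoy fields were filled in - almost certainly a bot
    return False
```

Posts that trip this check can be discarded before they ever reach the guestbook.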
You can also add other anti-robot measures: have submitters enter a number that robots cannot read, etc.
Or have messages premoderated.
You've already installed nofollow, of course.
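For the nofollow part, the point is that any link a guest manages to post gets rel="nofollow" added when the page is rendered, so it passes no PageRank and the SEO incentive disappears. A minimal rendering sketch (the helper name is just for illustration):

```python
import html

def render_guest_link(url: str, label: str) -> str:
    """Render a guest-submitted link so it carries rel="nofollow" and can't inject HTML."""
    return '<a href="{}" rel="nofollow">{}</a>'.format(
        html.escape(url, quote=True), html.escape(label))
```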
Bottom line: BigDaddy - good, spammers - bad.
If you want your site to be ready for spam, it does not take much to kill it. Simply put a static image showing "222" on the guestbook form and ask the person to enter that number into an input box. If the submitted value does not match, it's a bot.
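A rough server-side sketch of that check, assuming the challenge number is generated when the form page is built, stored in the visitor's session, and rendered as an image (the function and session key names are made up for illustration). Rotating the number per visitor is a small design tweak on the fixed "222" image, so spammers can't simply hard-code the answer:

```python
import secrets

DIGITS = "23456789"  # skip 0 and 1, which are easy to misread in an image

def new_challenge() -> str:
    """Pick the number to render as the image and remember in the session."""
    return "".join(secrets.choice(DIGITS) for _ in range(3))

def challenge_passed(session: dict, submitted: str) -> bool:
    expected = session.pop("guestbook_challenge", None)  # one attempt per challenge
    return expected is not None and submitted.strip() == expected
```

When the form is served, store new_challenge() under "guestbook_challenge" in the session and draw it into the image; when the post comes back, drop it unless challenge_passed() returns True.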
I'm fed up with this ... Google are the ones with the millions, let them sort it out.
I don't get it. What's Google's responsibility in this matter? People spam your site, and they somehow should create some incredibly advanced AI to detect this instead of you keeping your site in order? Do you want them to sort your email too?
Search the web for "captcha" to stop robots polluting your guestbook.
Thanks for that - I can never remember that word!
I'm fed up with this ... Google are the ones with the millions, let them sort it out.
You can just as usefully ask Google to brush your teeth. As someone's said above, increasing guestbook spam is usually a result of the guest book being found more (kudos to Google). What you allow those visitors to do once they arrive is down to you.
In fact, blaming Google is like getting your bricks-and-mortar store listed in the Yellow Pages, getting many more customers, then blaming Yell for the shoplifters.
It's a tough life, isn't it? ;)
But back in the day they loved "fresh", which turned out to be stupid, so they set up the sandbox. The sandbox turned out to be stupid too, with freehost pages now being blog-spammed to a massive degree by the folks who used to set up new domains instead.
Until Google comes up with an algo based on quality rather than volume, they'll keep creating this back-and-forth cycle where their previous dumb thing is addressed by their next dumb thing.
Who'd have thunk there were so many ebbs and flows in a battle to remain mediocre...
Google said reciprocal links are dead (after Google said links were good).
Google now loves blogs and blog comments. What do you expect the spammers to do, Google? Keep exchanging links?
The truth is, before Google came along, links were good for their own sake. Google ruined that.
Now blog spam is endemic, and Google are responsible for that. Blogs will soon be useless, just like reciprocal links.
The game Google plays is always a step behind the spammers, and the only people who lose out in the long run are honest webmasters. The spammers know the risks and have already mitigated them.
Google need to spend some of that cash pile on improving spam detection without favouring certain types of sites... like wikis and blogs. All the favouritism does is move the spammers along to these vehicles, which were once useful but are now being ruined by Google.
Where link popularity is the main ranking factor over the quality of the site, you are going to get this scenario where spammers dump links left, right and centre. And as much as they say it's link quality over quantity, if that were the case then blog and comment spamming would not be an issue, as those links would be worthless. The fact that blackhatters and spammers seem to target them suggests otherwise.
So in my opinion, thanks to Google's algorithm we have old / stagnant data due to the sandbox, and many sites with inflated SERPs thanks to link spamming.
Thanks for the suggestions on how to deal with the spam entries.
But it seems that the entries are not inserted by bots but by humans.
It's not only the guestbook; they are also spamming the forms for searching for missing animals and the forms for looking at the animals we have taken in.
And I still think that if Google hadn't switched their algorithm to the new link-counting system, the spammers would not be doing such massive link spamming.
Google should crawl every site they find, like they used to, instead of crawling depending on how many links a site gets.
A good technique to deter human idiots is simply to set the form up with extra fields, and bounce the submission if every field is not filled in - including, for example, a working email address.
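As a rough illustration of that bounce-if-incomplete idea (the field names and the deliberately loose email pattern are just assumptions for the sketch):

```python
import re

# name, email, extra field, message - reusing the renamed fields from earlier in the thread
REQUIRED = ("cheesecloth", "giraffe", "town", "tablecloth")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # only a basic plausibility check

def validation_errors(fields: dict) -> list:
    """Return the reasons to bounce the post; an empty list means accept it."""
    errors = ["missing field: " + name
              for name in REQUIRED if not fields.get(name, "").strip()]
    email = fields.get("giraffe", "").strip()
    if email and not EMAIL_RE.match(email):
        errors.append("email address does not look valid")
    return errors
```

If the list comes back non-empty, re-show the form with the errors instead of saving the entry.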
Registration (where relevant) should always require the member to confirm their email address.
Serious posters will not be deterred by an extra field or two and safe registration - but most human form abusers are teenage idiots with little patience or aptitude for 'work'.
Also do be sure to have a simple 'delete member / ban member / ban IP' to help your moderators.
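A bare-bones version of the ban list those moderator buttons would maintain (the file name and one-entry-per-line format are my own assumptions):

```python
from pathlib import Path

BAN_FILE = Path("banned.txt")  # one banned IP address or member name per line

def is_banned(ip: str, member: str = "") -> bool:
    if not BAN_FILE.exists():
        return False
    banned = {line.strip() for line in BAN_FILE.read_text().splitlines() if line.strip()}
    return ip in banned or (member != "" and member in banned)

def ban(entry: str) -> None:
    """Called by the moderators' 'ban member' / 'ban IP' button."""
    with BAN_FILE.open("a") as f:
        f.write(entry.strip() + "\n")
```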
And you do have nofollow, of course.
If you use a cheap or free off-the-shelf guestbook program (as I do, and most people, probably), it's worth shopping around for one with better security built in.
But that's a discussion for another thread and doesn't change anything in regard to guestbook spam. Your options are pretty simple: human-moderate the posts, software-filter the posts, remove the guestbook, or ignore it and let them run wild. Whoever's fault it is, you still have to make the best choice for your site.
:)-