Forum Moderators: open
I submitted a spam report to Google a few months ago. Our main competitor was ignoring all of the Google guidelines: doorways, hidden text and links, a fully meshed, heavily interlinked set of 10 optimized entry domains, etc.
And now....
They have all been thrown out (all 10 domains) - PR greyed out and no keywords in the index!
What can I say --- thank you, Google!
Play fair:)
A note worth mentioning though: make sure your own site is now spotless, including all the sites you link to.
Not only are you red-flagging your domain/cat to Google but your competitor probably isn't stupid and if I were him I'd be out to get you...
Nick
[edited by: Nick_W at 11:39 am (utc) on Mar. 26, 2003]
How exactly are they to tell which domain belongs to the person reporting the spam?
>if I were him I'd be out to get you
You couldn't be him Nick, you don't do those things right? ;)
>but has it resulted in you moving greatly up the serps for your ideal keywords
I don't think that is the point here. If the person was cheating that badly, they deserved to be reported regardless of who benefits. Maybe the general public benefits from better serps.
>I'd almost bet it was not you turning the sites in, but the latest round of penalties handed out for crosslinking
Could be that or could be that multiple complaints have been lodged, enough to draw human attention. I've only used the spam form once but it was put there for a reason so there is nothing wrong with using it for a legitimate reason. Sorry to be so disagreeable with general consensus ;)
Seems to me, from reading this forum for nearly a year, that some people will jump on any flimsy excuse to try to oust someone who's ahead of them in the serps - even to the point of reporting one minor discrepancy on one page amongst possibly thousands of pages on a site that may be highly informative and far outdoing theirs.
What about the ma-and-pa affiliate sites that by a coincidence of keyword density etc. hit the top 10? Do you report them too because they don't understand jack about SEO and just happened to jag it?
All these posts of dobbing in others for one minor infringement GRRRRRRRR.
(Absolute blatant spam, fair enough - 50K of hidden keyword text, etc.)
All I can suggest is:
Get a life. Do your site better.
All these posts of dobbing in others for one minor infringement GRRRRRRRR
Do you seriously believe that Google hands out penalties for one minor infringement? It's hard enough to get their attention with the really big and bad stuff. :-)
Just like BETA testers for software, SEs rely on the power of the masses to find the bugs and crud they miss, thereby delivering a cleaner, cheaper product in the end.
Do you seriously believe that Google hands out penalties for one minor infringement?
Exactly. If you put a page through the rinser and it doesn't come out the other side, then, well, it's just not a good enough page. Simple as that.
Get a life. Do your site better.
lol. Should we cloak too? Oooo, I think I should generate a million pages of garbage and count how many rise to the top; then when people complain about how useless the web has become, I can just say "I told you so" ;)
On the other hand, filling out the report can be seen as a good thing because if enough reports are filled out regarding a certain tactic then G can tweak the algo to compensate.
On the other hand, I've been running into a great many "spammy" sites with hidden links and hidden text, but my belief is that these sites were done by amateurs and that they rank high in the serps in spite of their spam. In other words, they have relevant content (and their content is quite relevant), and G is putting them up in the serps for that, not for their spam.
So if you don't see a spammy result go away, it might be because the site is relevant to the search.
No, I'm not saying that at all. What I'm trying to say is: why report one minor link inadvertently misplaced, or one JavaScripted link, on, say, a 20,000-page site, just to achieve better ranking for your own site? From recent posts on here, it appears people are willing, at the drop of a hat, to report left, right and centre the slightest thing that THEY don't agree with.
I think the Google guidelines are a "rough" outline of how to lay out your site. Do you think the average Joe Blow who never reads this or similar forums, or never searches for guidelines on how to design a site, would actually even know about the Google guide to site design?
Hidden text when it is the same color as the background
That's not as straightforward as you may think. I use CSS to style colors, and I could see a bot looking at my code and perhaps thinking my links are hidden, though they're not.
And the reverse is true: you can have text out there that is styled to blend with the background, something the bot wouldn't be able to pick out.
In my view, CSS makes it harder to separate the spam from the legit.
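To make the point concrete, here is a minimal sketch of the naive check being discussed: flag text whose declared colour matches the declared background. Everything here (the function names, the tiny colour table) is illustrative only, not anything Google actually uses; the point is that the check only works when both colours are visible in the markup, so an external stylesheet defeats it in either direction.

```python
# Tiny subset of CSS colour keywords, just for the demo.
NAMED_COLOURS = {"white": "#ffffff", "black": "#000000"}

def normalise(colour):
    """Map a colour keyword to its hex value and lowercase it."""
    colour = colour.strip().lower()
    return NAMED_COLOURS.get(colour, colour)

def looks_hidden(text_colour, background_colour):
    """Naive check: text is 'hidden' when its colour equals the background's."""
    return normalise(text_colour) == normalise(background_colour)

print(looks_hidden("white", "#FFFFFF"))    # True  - white on white
print(looks_hidden("#000000", "#ffffff"))  # False - ordinary black on white
```

If either colour lives in an external stylesheet, a bot reading only the HTML has nothing to compare, which is exactly why this kind of check cuts both ways: it can miss real hidden text and falsely flag legitimate pages.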
I have my doubts about the efficacy of hidden text. I don't think repeating text has all that much effect, at least not compared with other, more important factors. Repetitive and hidden text, to me, is a sign of an amateur more than anything else.
As I said in my earlier posts, some of these sites rank well in spite of their spam. In other words, Google is ignoring it, and their score is based on the site's actual relevance, and not the spammy techniques.
As I said in my earlier posts, some of these sites rank well in spite of their spam. In other words, Google is ignoring it,
Ignoring certain types of spam is a perfectly good response on Google's part. Some known places where they do this would include meta keywords and comments. Why should they not do that with some other factors?
How exactly are they to tell which domain belongs to the person reporting the spam?
In my case they could tell several ways. I included my WW nick, and when you go to my profile you can just click on the link. I also put in my e-mail address which includes my domain.
I didn't report the site to improve my own traffic. The keyphrase is a manufacturer's name, and I get around 60 times as much traffic from the direct links on that manufacturer's home page as I do from Google searches on the manufacturer's name. That search accounts for 0.01% of my traffic.
The problem is that this was blatant spamming by one of the really big e-commerce sites. I got to check over half the boxes on the spam report form, including cloaking. The search term where I noticed them wasn't really important to them, but with the number of items they represent and the number of domains they use, they must be filling up thousands of other serps where many other people are trying to earn an honest living.
I also read that WebmasterWorld is one of the top 400 most popular sites on the web, and gaining?
A bunch of VERY.. SLOW.. LEARNERS? lol
If you play football and break the published rules of the game, you get penalised, concede a professional foul, or get sent from the field, depending on the severity of the breach. Your choice.
If you don't like the referee's or umpire's decision and punch him out, you get the book thrown at you and get banned for life.
We tend to expect that 'Google's rules' work the same way. But they don't - the Google referees are either asleep, looking the other way, or non-existent. So we start to think 'if you can't beat 'em, join 'em'.
I don't know the answer. I just think that there are so many posts about this topic because human nature says that the majority want to do the right thing.
After all - it is Google which has positioned itself as the 'moral SE'. Google has taken a stance on what is good, and what isn't.
I'm sure FUD (fear, uncertainty and doubt) also drives a lot of this 'compliance'. But recurring, unchecked and blatant spam in the index, plus a spam reporting system perceived to be 'broken' or non-existent - that is a path one other SE headed down...
My advice to Google - don't let history repeat - remember AltaVista? At the Sydney SES conference yesterday, one of the Google people was asked 'who is your greatest threat?' - Answer - 'none of the 'current players' - a little start up with a better way'...
Finally - remember the recent discussion [webmasterworld.com...] about Jacoob Nulson (isn't that how you spell it?) having a page of tiny text with his name misspelt? It's still there:
<FONT SIZE=1><SMALL CLASS=microtext>
Here are some common misspellings of my name (included to help search
users):
Jacob Nielsen, Jakob Neilsen, Jacob Neilsen, Jakob Nielson, Jacob Nielson,
Jakob Neilson, Jacob Neilson, Jakob Nilsen, Jacob Nilsen, Jakob Nelson,
Jacob Nelson</SMALL></FONT>
And that page is still PageRank 9 - no penalty. How many people submitted reports on that one!?! No action taken? I think that 'spam report' form is an office over a mineshaft...
That's my rant.
Chris_D (also often misspelt as 'britney spears nude', 'google' and sometimes as 'yahoo' - just to help any search users who are looking to find me....)
:)
Ignoring certain types of spam is a perfectly good response on Google's part. Some known places where they do this would include meta keywords and comments. Why should they not do that with some other factors?
Google ignores meta keywords and comments, period. So spamming of those items wouldn't have any beneficial effect if it weren't penalized (unlike hidden text and links, which could benefit the spammer).
I agree that it makes sense to ignore rather than to penalize in many if not most cases. Links in guestbook entries and forum posts are good examples: It wouldn't make sense for Google to penalize those, since doing so would enable Webmasters to sabotage their competitors' sites by indiscriminate posting of the competitors' URLs. But it wouldn't make sense for Google to credit those links, either, because the links aren't true "votes" for a site and can easily be created en masse for spamming purposes.
Another example would be text links that were bought to acquire PageRank (a topic that was discussed in another recent thread). Instead of penalizing sites that bought or sold PageRank via text links, Google could simply ignore links from suspect domains and let the market (and word of mouth) take care of the problem.
There are no tattle-tales in the business world.
Ha! Perhaps you are unfamiliar with lawyers... :) Lawsuits are a widely used weapon to harass competitors or to enrich oneself. Sometimes there is a good basis for a suit, in other cases a suit may be generated on the flimsiest of excuses. Any corporate transgression, perceived transgression, or imagined transgression can be the basis for a suit by someone claiming they were harmed.
This system is deeply flawed, but it does serve to enforce "rules". Most senior execs (so-called insiders) I know worry about selling any company shares they own, no matter how good their reason. They know that if the firm's stock drops in the months following their sale, they will almost certainly be sued by some lawyer claiming the sale was based on insider information. It makes spam reporting look rather benign... :)
If Google was able to identify, as part of the algorithm, which parts of the file were visible and which weren't, it would be a perfectly acceptable response on their part to only pay attention to those parts of the file that were actually visible to the user.
If the invisible text is at the end of a large page, it may actually have very little value in the grand scheme of things. I don't know. But if someone is attempting to spam Google with an ineffective method, why penalize them? Why not let them continue to put their effort into ineffective spamming instead of something that might be tougher to catch?
...if someone is attempting to spam Google with an ineffective method, why penalize them? Why not let them continue to put their effort into ineffective spamming instead of something that might be tougher to catch?
I see your point, and I agree that Google shouldn't penalize things that the Webmaster may have no control over (e.g., guestbook links), that are ignored by Google in any case (such as meta keywords), or that may or may not be attempts to subvert Google's search results (text links on a PR9 site).
But in cases where there clearly has been an attempt to subvert search results (e.g., hidden text, cloaking, or massive and artificial crosslinking patterns between domains), Google needs to use penalties or bans for a simple reason: If it simply ignores clear-cut attempts to subvert its search results, there will be no incentive for Webmasters and SEOs to honor its guidelines.
Let's say, for example, that I use white text with hidden links against a white page background. If I get a penalty for that, I may think twice before trying another spam technique. But if my trickery is simply ignored, I'm likely to experiment with a different trick, because I have nothing to lose by trying.
Simply put: They need to improve on their algo.
I think we can assume the algorithm is always being refined.
Basing a method to accurately represent the web on snitches and penalties is just no viable option in the long term.
"Snitches" and "penalties" are two different matters. Google doesn't need help from "snitches" (now, there's a loaded word) to assign penalties, just as it can welcome reports from "snitches" to help in refining its algorithm irrespective of penalties.
Sure... that's exactly what the spam report is for. It's actually feedback on temporarily not-quite-perfect search experiences, which altruistic members of the great internet family use to help a wonderful and equally altruistic company make our lives even more perfect. Cough.
No, this is not about whether we think Google is an excellent search engine - they are, without a doubt. It's about relying on webmasters ratting each other out as a means to fight spam.
Google uses this as a method to fight spam. Period.
Note that Google didn't need spam reports 12 months ago. No other search engine relies on such a huge, systematic employment of snitching.
Google's search quality is still top; the other engines can sometimes match them, but not surpass them.
But in order to stay at the top Google needs to be better, way better than the others simply for one reason: they are target #1.
Everybody knows exactly how to spam Google. It's a snap. You don't even need to be an SEO to know how to do it - think Google bombing.
It's all on the table, the links, the anchor text, and the basic onpage stuff.
Of course Google always tunes the algo; all engines do. But the large waves of penalties and the constant pitching for spam reports - those have become Google's main weapons in the fight against being outwitted.
No, sorry, but that doesn't sound like a viable long-term strategy. Just take a look at the front page here to see only one reason why not.
Google IMO needs to get back to where they came from: they need to be able to rely on their algo.
They have been untouchable. Those times are long gone.