Forum Moderators: Robert Charlton & goodroi
[google.com...]
However, no action was taken and the site is still happily #1.
Why doesn't Google act on spam reports?
What can you do in that case?
I've said it for years: 95% of the stuff we hear as "spam" is NOT spam to search engines.
SPAM: Sites Positioned Above Me...
The spam report appears to be one of those ways. The way I've learned to spot dirty tricks is by looking at the code of the sites above me in the searches. Little of what I've seen was unintentional.
I've seen Google appear to take action on a report, although it took several weeks. The site vanished and came back later with the same page minus the dirty tricks.
What would you call a site with a network of sites made purely to provide links to the main site? All are interlinked, and some of them contain only links to different pages of the network sites.
Links and links and only links.
This website ranks in the TOP 10 for every keyphrase it targets, and its SEOs boast of their knowledge.
One of their sites is a copied template of another well-known site in the same business.
Though reported, Google has still taken no action.
Regards
I've said it for years: 95% of the stuff we hear as "spam" is NOT spam to search engines.
SPAM: Sites Positioned Above Me...
I quite agree. What are people like? They sound like spoiled children crying, "Mummy, mummy, that site cheated!"
Google is a search engine; it collects a list of web pages and presents them as a searchable directory. It's not awarding public money to finance them!
Matt
;)
feel free to <b>spam</b>, use spammy disposable sites, feel free to hide spammy keywords. Just make sure the spam domain is disposable, and that it's more than a year old. Remember, <h2>spam can equal money</h2>. <h3>Spam is good</h3>. There's lots of reasons to spam, I mean, I can think of a hundred reasons to make this text pale on white, it's not spam, spam is a good thing, it helps the user, and spiders... but most of all it helps my on page optimization for keywords such as spam. So spam away, spam happily, this stuff works. LSI, don't make me laugh... google can't even catch pale on pale CSS...
Maybe you can think of a legitimate reason to hide text like the above block; I can't. silverbytes, all you can do is keep trying. Contact MSN and Yahoo; you might have more luck with at least one of them.
Search engines should work with webmasters more, so we can all reach a happy medium on how to create sites and the engines can have better results.
It's stupid of them to keep us all guessing at vague guidelines while innocent sites get banned for one little mistake in their code that triggered a penalty.
There are so many things that can cause a ban that I could type all night listing them. It's ridiculous!
However, no action was taken and the site is still happily #1.
Bad analogy alert:
As a programmer, I have an option of coding dozens of standalone applications, or incorporating them all into a decent enough finished project. I prefer the latter, even though it may give the appearance that I'm not doing anything.
End alert
Why doesn't Google act on spam reports?
What can you do in that case?
It's all one vicious cycle :)
<edit>Oops - realized I posted in Google News. I'll shut up again</edit>
[edited by: grandpa at 10:17 am (utc) on Feb. 23, 2005]
Search engines should work with webmasters more (...)
Search engines are a medium for the webusers, not the webmasters. A spam report mentioned in the first post is just one vote against a page. Google will look at it eventually, but the action to be taken is their decision, not ours. They will look at many factors, including but not limited to spam reports and the vote buttons in the toolbar.
As Brett already mentioned the definition of spam varies with search engine rank.
If a #1 position is held by a 100% spammy site, not only will you as a webmaster recognize it, but so will the average Joe Surfer. He will just ignore that entry and select the first interesting page in the SERPs instead.
Don't worry, you did what you felt needed to be done ... now it's Google's turn. You submitted the report, and they will eventually get around to looking at the page or pages reported.
If they really are doing something outside of Google's webmaster guidelines, it will likely be taken care of ... sometime, but as others have mentioned, Google has lots of other, slightly more important problems on their plate.
The problem is that every time Google has a major algo/filter change, some of these same pages/sites reappear in the SERPs. It's frustrating, I know, but people will figure out the "real" content sites and reward them by visiting, in spite of the inability of the search engines to filter out the crap.
I don't think they've made any secret that, for the most part, they prefer to use a filter to remove spam rather than do it by hand. They say they use the spam reports to get a picture of what technique is being used and then try to automate a filter for it.
I think soapystar got it right: chances are probably slim that Google will actually go in and do a customized nuke of the offending site (unless it's a DMCA issue where they're risking legal liability for complicity in copyright violation), but if you've brought the site to their attention, they might analyze it to see how it was able to rank so well, and then use that feedback in their algo tweakings.
I still think that if it wasn't for SEO, search engines wouldn't have results as good as they have now.
Search engines should work with webmasters more, so we can all reach a happy medium on how to create sites and the engines can have better results.
It's stupid of them to keep us all guessing at vague guidelines while innocent sites get banned for one little mistake in their code that triggered a penalty.
I agree completely. I have always held the opinion that the search engines would be much better off overall if they posted explicit guidelines as to what is acceptable and what is not. It would be much easier to spot true spam and eliminate it, because it would be no secret whatsoever what actually constitutes spam.
Webmasters would know where the line was and if they edged too close to it and stepped over it they could be dealt with harshly.
Of course the old stand-by come-back will pop up within the next couple of posts that telling spammers where the line is will simply help them find a way over it. I don't buy it. Never have and never will.
The spammers will ALWAYS try to take shortcuts, get caught, then move on with another domain. Clear guidelines wouldn't change that for the better OR for the worse.
Clear guidelines would, however, give most honest webmasters (the other 95%) an incentive to err slightly on the side of caution, which would actually reduce the level of spam overall. Clear guidelines would also make it a lot easier to detect spammers quickly, so they can be dealt with BEFORE they have time to rack up boatloads of revenue.
Automation is the name of the game...
Perhaps Google has already automatically applied a penalty to the page. It could be that even with a hidden-text penalty, that page is still better than the others for that query under Google's value system. :-)
I don't think so. This is an amateurish photography-related site that is number one for a particular search. My client's (clean) site is below it.
The guy is using techniques that should be dead easy to spot. There is a large space at the bottom of his home page that contains nothing but a massive list of keywords in the same colour as the background, written in plain HTML.
If I employed a real person to do this job and they couldn't find something as obvious as that, I would sack them! The algo is clearly incapable of catching some of the simplest spamming techniques.
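For what it's worth, the plain-HTML case described above (keyword text in the same colour as the page background) really is trivial to detect by machine. Here's a minimal sketch of that kind of check; it only handles the legacy `bgcolor`/`<font color>` attributes of the era, not CSS, and the page markup is a made-up example:

```python
from html.parser import HTMLParser

class HiddenTextDetector(HTMLParser):
    """Flags text runs whose <font> color matches the <body> bgcolor."""
    def __init__(self):
        super().__init__()
        self.bgcolor = None      # page background colour, if declared
        self.color_stack = []    # colours of currently open <font> tags
        self.hidden_runs = []    # text found in same-colour-as-background runs

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "body" and "bgcolor" in attrs:
            self.bgcolor = attrs["bgcolor"].lower()
        if tag == "font" and "color" in attrs:
            self.color_stack.append(attrs["color"].lower())

    def handle_endtag(self, tag):
        if tag == "font" and self.color_stack:
            self.color_stack.pop()

    def handle_data(self, data):
        text = data.strip()
        # Text is suspect when the innermost font colour equals the background.
        if text and self.color_stack and self.color_stack[-1] == self.bgcolor:
            self.hidden_runs.append(text)

# Toy page in the style being complained about: white keywords on white.
page = """<html><body bgcolor="#ffffff">
<p>Normal visible copy.</p>
<font color="#ffffff">cheap widgets best widgets buy widgets</font>
</body></html>"""

d = HiddenTextDetector()
d.feed(page)
print(d.hidden_runs)  # → ['cheap widgets best widgets buy widgets']
```

That a dozen lines of stdlib parsing can catch the crudest form of the trick is exactly the poster's point; the hard part for an engine is presumably doing it at web scale and handling CSS, images, and layering, not the colour comparison itself.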
People think they understand "SE spam" based on confusing, conflicting, and intentionally obtuse documentation from the search engines. Or they hear from someone who has studied SEs for an hour and put up a shingle reading "Search Engine Expert" but has no clue how to optimize.
Again, so called "hidden text" is not a huge issue to a search engine. 95% of it is innocent, but 95% of those who scream "hidden" text...are not.
<added>
fixed typo on is to isn't.
</added>
[edited by: Brett_Tabke at 8:33 pm (utc) on Feb. 24, 2005]
Smug comments like ...
If you are getting beat by hidden.text and doorway pages - you really suck at optimization
The fact is that Google tells everyone what is not allowed and asks people to submit reports when they see spammers, then goes on allowing it. That is the real issue.
What I was referring to was blatant spam. We are not talking about mentioning a few locations in your text to pick up some traffic. We are talking about blatant spam, and I know it when I see it.
Hidden text was one of the earliest and most basic ways of spamming the search engines. Google still cannot deal with it, even when it is reported to them. Why ask people to waste their time submitting spam reports if nothing is done about them?
Do I have this right?
We can't see the text, but the spiders will, so it's just like being visible anyway. So if the text is repetitive and against a spider's TOS, then it risks getting dropped eventually, and that's why it should become a non-issue to webmasters who come across it?