My question is a simple one ... did anyone see any results from reporting and if not what do you think Google is/will be doing with the information that was sent?
All the Best
The search results for one sector I'm interested in are totally dominated by one company. They have a standard layout on dozens of pages with loads of links at the bottom of each page split into four sections. Three sections are entirely internal links - the fourth links to around 50 utterly bogus domain names, each of which is a page branded with the main company name and with a dozen links pointing back to the main site - "Contact Us", that kind of thing. These 50 "sites" masquerade as high value links and lift the PR of the main site. Google has been notified twice and has done nothing, leading to the search results in this sector being as bogus as this site.
On the first page of SERPs, of the ten results presented, I reckon five are quite obviously infringing the guidelines, and possibly seven.
Linking off to dummy sites that link back seems popular - but two-man companies with 5,000 inbound links? Some are incredible, yet Google faithfully lists them all. My worry is that - if I compete with these guys using these techniques - I'll eventually get caught in the great roundup.
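To see why these satellite "sites" are worth faking, here is a toy PageRank power iteration on a hypothetical graph: one main site and 50 bogus satellites that only link back to it. The domain names and graph shape are made up for illustration; this is a minimal sketch of the classic PageRank formula, not Google's actual ranking.

```python
# Toy PageRank showing how a ring of satellite domains that all link
# back to one "main" site concentrates rank on that site.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = set(links) | {t for outs in links.values() for t in outs}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        # Every node starts each round with the teleport share...
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        # ...then each page splits its current rank among its outlinks.
        for src, outs in links.items():
            if outs:
                share = damping * rank[src] / len(outs)
                for dst in outs:
                    new[dst] += share
        rank = new
    return rank

# Hypothetical scheme: "main" links out to 50 bogus satellites,
# and each satellite links only back to "main".
satellites = [f"bogus{i}.example" for i in range(50)]
links = {"main.example": satellites}
links.update({s: ["main.example"] for s in satellites})

ranks = pagerank(links)
print(sorted(ranks.items(), key=lambda kv: -kv[1])[:3])
```

Run it and the main site ends up with many times the rank of any individual satellite, even though no outside page links to the network at all - which is exactly the lift the scheme is designed to manufacture.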
All the Best
And in the meantime, search users get fed crap. The first page of the results I was talking about is barely usable at present, and I suspect the same is true for many competitive keywords.
What's tough is trying to work out how to compete with the cheats - cheat yourself, and perhaps get caught with them?
I filled out the spam form a few weeks ago and included GoogleGuy in the subject and my WW ID as well.
The issue was a network of sites which were all created on Blogspot blogs. They look semi-legitimate at first, but when you look at the code, they are all linking to a phone sex site.
It has been quite a while and the links are still there, the sites are all still there and there has been no response.
I find turning in others for minor things is like being a Gomer. Let the search engines police their own territory in this area and make the necessary changes to minimize these problems.
Cases such as a site sending users to something completely outside of their search criteria, or similarly blatant misrepresentation, should be reported.
Really, who wants to be Gomer Pyle or Barney Fife? LuAnn was sexy in an odd way; I would choose being Barney.
Both reports were detailed and stuck to the facts.
Still a few others with no change but G did get the big, bad ones.
I'm sorry to say but I think your experience was coincidental and had nothing to do with the quality of the report you filed.
I too got caught up in the moment when Matt Cutts asked for spam reports on this forum. I twice reported a site that uses "nearly hidden text", as Matt himself calls it. My reports were very detailed and exactly per instructions, yet that site continues to enjoy top rankings.
Matt even wrote a post on his blog about "SEO mistakes" that specifically refers to "nearly hidden text."
Either the ax has yet to fall on these sites or Google is blowing smoke.
I did, as six of the top 10 were doing this, all redirecting to the same place for a certain keyphrase.
It's still happening too; they seem to remove the domain, but the sites just reappear exactly the same under another domain, all using the same IP, which leaves me baffled.
I've given up bothering.
First off, when you reach an advanced age you realize that kissing certain spots of other people's bodies won't get you anywhere. It might bring temporary results, but in the long run you are going to be a big loser. Better to kiss your loved ones and show them how much you care; that should be an ideal win-win "long-term strategy". I'm writing this post with that exact strategy in mind ;-)
As webmasters, we are all interested in a Google index with fewer spam sites. The less spam, the more chance for a whitehat site to reach the top of the SERPs. Let's call it a "mutual interest" between webmasters and the Google WebSpam Team.
When our kind fellow member GoogleGuy and later Matt "Inigo" Cutts called for reporting spam, they were in fact seeking co-operation with the webmaster community to deal with a very critical issue which the Google WebSpam Team hasn't been able to resolve alone. Therefore, I supported that call and posted "Spam-Reporting" spots here on forum 30, as you might recall.
Colin asked in his first post on this thread: "did anyone see any results from reporting and if not what do you think Google is/will be doing with the information that was sent?"
I recall some fellow members posting that their spam reports resulted in removing the sites they had reported.
Of course, that doesn't mean the Google WebSpam Team has been able to deal with ALL spam reports in accordance with the spam-reporters' expectations. My kind fellow member steveb has just mentioned in his post a few reasons for that. And I wish to add that we really don't know the size of the Google WebSpam Team, or whether there are enough members on the team to deal with spam reports in good time.
One thing is for sure: Google is paying more and more attention to spam, including spam in languages other than English. Matt recently wrote about that twice - first in a post and then in a remark.
Feedback: Webspam in 2006?
"It has been a successful “spam falling” week on the Pacific Rim. When I talked about going all i18n in 2006, people didn’t expect that we’d start out and go clockwise instead of counter-clockwise."
IMO, we should keep supporting such efforts from Google's side, even if our spam-reporting results don't show as fast as we wish.
Thanks for listening ;-)
Why should we be supporting Google and Matt Cutts? Google is a large commercial company, hell-bent on controlling the world's information. Your insistence that we cut them some slack about something they seem so bad at handling is naive.
Like a few people who have posted on this topic I too have reported sites for clear violation of Google's own rules. Since they have a "report spam" type page I fully expect it to be dealt with. To date this has not happened, with the worst offender in position 1 on all the major SEs. Therefore this is not an issue of Google not doing it as fast as we would like, rather it is Google seemingly doing nothing at all.
If we assume for a moment that the sites I am talking about, and the ones the other guys mentioned, were actually fairly clear-cut in terms of violations, then there shouldn't be any issue. Once discovered they should be dropped. I personally don't care if they do this with some kind of special investigative algorithm, or a real live human. It's their job, not ours. We have already gone to the trouble of reporting after all.
If SEOs and webmasters are prepared to go easy on things like this, then they will never get fixed. More importantly, the casual references to Google as if they were some kind of benevolent student experiment run by really nice guys are equally naive. They are a company who count their revenue in billions of dollars, so the idea that they might not have enough people to cope is pathetic. Would you use a bank that didn't have quite enough bank notes?
What this implies is a lack of will to address the issue (as seen with canonical issues) and, I would speculate, a lack of concern. I genuinely think they don't care. I have no doubt some of the data is handed over to the algorithm developers to see if they can do something, but beyond that I am sure it doesn't get looked at.
Interestingly, some of these issues have been seen with other big entities online over the last few years. Most notably, DMOZ and the Yahoo! Directory lacked the manpower they needed to do the job properly. As with Google, the issue wasn't so much in the detail, i.e. how good they were at dealing with it; rather, it was that they simply couldn't cope. Although webmasters still submit to them, they have slipped from their once-held position at the top of the tree. I doubt Google's demise will come anytime soon, and it won't happen because a few webmasters are annoyed about their inability to handle spam, but it is a sign that their focus is moving in different directions than they'd like us to believe.
Put simply, when someone hands you this data on a plate by submitting it directly to you, it should take a skilled individual about 5 minutes to decide whether it is spam or not. It would take most people on this forum about 30 seconds, after all. So why shouldn't we expect the world's biggest employer of PhDs to do a better job?
10 gets you 1 - the top hit on any SERP for highly competitive keywords (real estate, pensions, life assurance, car insurance, etc.) is most likely getting its position not on the merit of the company, but on the strength of its SEO.
So it's easy - once a day a Google employee carries out some of the most common searches - Google, after all, knows what they are - and checks out the top three for conformance to the guidelines.
I followed directions exactly. My example had 8,192 words below the fold (on their home page). Of those, over 3,000 were autogenerated and clearly spam.
I reported two sites - the second had blue on blue autogenerated text. They both continue to dominate the SERPs.
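"Blue on blue" is the crudest kind of hidden text to spot mechanically: the inline style sets the text color to the same value as the background. Here is a minimal sketch of that check, assuming colors are set via inline `style` attributes with literal values; real pages also set colors through external CSS, so this regex-based approach is an illustration of the idea, not a robust detector.

```python
import re

# Flag elements whose inline style gives text the same color as its
# background - the "blue on blue" trick described above.
STYLE_RE = re.compile(r'style="([^"]*)"', re.I)

def hidden_text_styles(html):
    flagged = []
    for m in STYLE_RE.finditer(html):
        style = m.group(1).lower()
        # Lookbehind keeps "color:" from matching inside "background-color:".
        color = re.search(r'(?<![-\w])color\s*:\s*([#\w]+)', style)
        bg = re.search(r'background(?:-color)?\s*:\s*([#\w]+)', style)
        if color and bg and color.group(1) == bg.group(1):
            flagged.append(m.group(1))
    return flagged

sample = '<div style="color:#0000ff; background-color:#0000ff">spammy keywords</div>'
print(hidden_text_styles(sample))
```

The point is the one made above about report handling: a check this shallow takes minutes to write, so "we lack the manpower" is a hard excuse to accept for the inline cases at least.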
Granted, Google may have a very good reason for ignoring my reports (although I cannot imagine why), but without any feedback mechanism, I'm wasting my time.
Not just that. It is a hidden link to an adult site from a network of sites run on Google's own spam engine, "Blogspot.com", and they have still not removed those sites from Blogspot or from their index, and the phone sex site is also ranking very well with them.