
Google News Archive Forum

This 193-message thread spans 7 pages; this is page 6.
Google Not Acting on Problem Results Reports
JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 6:09 pm on Apr 16, 2003 (gmt 0)

I keep seeing spam stealing the top positions, so I report it to Google time and again, and nothing happens. So what do I do?
If you can't beat them, join them.
This is my living we're talking about, after all. The bread on my table.
I'm not cloaking yet, but it may become necessary if Google doesn't start taking the spam reports more seriously.
Why do they ask for them and then take no action?
How about some serious answers?

 

chrisfoot

10+ Year Member



 
Msg#: 11856 posted 7:03 pm on Apr 21, 2003 (gmt 0)

I know the last one sounds like I am whining, but that was after a lot of e-mails using the spam forms. This was the last one I sent to one of the e-mail addresses listed on their site. I don't intend to send any more.

I always used the spam report page. I would send one, then wait a few weeks. I'd send another and wait a few weeks. One more - wait a month. I did it just like I was told by some folks here: state the facts, make it short, say which site it was and what the problem is, and that was it. After six months of SEO work I finally got to the first page. At least I won't get fired now. It still bothers me that everyone above me cheats, so I sent them this one and am giving up.

markusf



 
Msg#: 11856 posted 7:16 pm on Apr 21, 2003 (gmt 0)

I reported 101 duplicate domains, 30 of which are coming up in the first 100 results for multiple keywords. The pages all have a grey toolbar and are beating PR5-6 sites. Short of hand-editing these out, I don't see much that Google can do.

JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 7:16 pm on Apr 21, 2003 (gmt 0)

Chrisfoot,
This is exactly my point. Google effectively encourages spam by failing to take appropriate action. OK, so it costs a little and may need some bods on the ground doing the legwork if the algo cannot cope, but at least it would keep the results sweet and ensure Google stays ahead of the game - which it surely will not do if the results continue to be corrupted.
Over the years there have been a few pre-eminent SEs, but they have all come a cropper when the new kid on the block brings out his toys, and that may well be Google's fate if they don't get their act together, and fast.

markusf



 
Msg#: 11856 posted 9:09 pm on Apr 21, 2003 (gmt 0)

JudgeJeffries, Google has over 3 billion pages. Google will never be able to come up with an algorithm that is FAST and can check 3 billion pages to see if they are spam. The problem is that the computer needs to know what the definition of spam is.

GrinninGordon



 
Msg#: 11856 posted 12:49 am on Apr 22, 2003 (gmt 0)

GoogleGuy

"I agree, iJeep. One year would be too much."

I think the ban needs to take account of the "crime". In particular I refer to:

a) Duplicate content sites. What is the point of only banning these sites for a certain period? The ban gets lifted, and back they come in the search results!

b) Cloaked sites. This means webmasters who know what they are doing. Hardened "Spammers". Is it wise to allow their known guises back into the community at all?

I agree, if people keyword-stuff, etc., that is a different thing, and a ban should get their attention. Allow them to make changes, and then lift the ban after your bots/algo check the site out again. And if they reoffend on that domain, make the ban longer next time.

But there should be a HUGE differential here between the bans. Please do not allow duplicate content and ex-cloaked sites back into the fold. They will only degrade your search results and cause your staff more work (having to ban them again).

BigDave

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 11856 posted 1:12 am on Apr 22, 2003 (gmt 0)

a) Duplicate content sites. What is the point of only banning these sites for a certain period? The ban gets lifted, and back they come in the search results!

I don't think of duplicate content as a penalizing offence; I think of it as keeping the SERPs clean. As soon as they put unique content on the page, bring it back in. There are even some very good, valid reasons to have mostly duplicate content for the users. But Google just doesn't want to be showing duplicate content in the results.

b) Cloaked sites. This means webmasters who know what they are doing. Hardened "Spammers". Is it wise to allow their known guises back into the community at all?

Of all the things that can draw a penalty, cloaking bothers me the most. Yes, there are obvious cases where cloaking is a Really Bad Thing, but it is almost always combined with other spammy techniques. I would consider anything *added* to the page using cloaking to be hidden text, but I would like to be able to use cloaking to take things off the page that I serve up to Google. Instead of using JS links for pages that I would like not to have crawled, I would just not include those links when I send them to Googlebot.
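[Illustrative sketch, not from the original post: one way the "leave the links out of the copy sent to Googlebot" idea could look, using only Python's standard library. The page content, port, and simple user-agent check are assumptions for illustration; note that Google's guidelines generally treat serving different content to Googlebot as cloaking.]

from http.server import BaseHTTPRequestHandler, HTTPServer

# Links the webmaster would rather not have crawled (hypothetical example content).
UNCRAWLED_LINKS = '<p><a href="/printer-friendly">Printer-friendly version</a></p>'

class SelectiveHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        body = "<html><body><h1>Widgets</h1><p>Catalogue copy here.</p>"
        # Leave the extra links out only when the requester identifies as Googlebot.
        if "Googlebot" not in agent:
            body += UNCRAWLED_LINKS
        body += "</body></html>"
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), SelectiveHandler).serve_forever()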

Instead of identifying specific techniques that should get the death penalty, I would concentrate more on the recidivism of the spammer.

GrinninGordon



 
Msg#: 11856 posted 1:20 am on Apr 22, 2003 (gmt 0)

BigDave

"I don't think of duplicate content as a penalizing offence"

So a small band of stetson-hat wearers has 20 sites, all selling the exact same thing, often using the same HTML. That is not cause for a ban?!

And then they put a few different words here and there, change the images on each site, but still sell the same things at the same prices. And you think they should remain in the search returns, right?!

Please, if you ever build your own search engine, call it spamnotham.com.

BigDave

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 11856 posted 1:31 am on Apr 22, 2003 (gmt 0)

No, it is reason to remove the duplicate pages from the SERPs. If they come up with unique content for the new site, then I am all for letting them back in. I will leave it up to google to decide how they will define duplicate.

daroz

10+ Year Member



 
Msg#: 11856 posted 1:41 am on Apr 22, 2003 (gmt 0)

GrinninGordon,

One point I'd make regarding dupe content...

Sometimes webmasters duplicate content innocently (a .com, .net, and maybe .us version of the domain, or two). This is especially important as some/many webmasters don't know to use a 301 and instead use 302 redirects.
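[Illustrative sketch, not from the original post: a minimal Python standard-library example of the 301-vs-302 point above. The canonical hostname widgets.com and the port are made up; in practice this would usually be a one-line web-server rule rather than a script.]

from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL = "http://www.widgets.com"  # hypothetical canonical domain

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 ("Moved Permanently") tells crawlers the canonical copy lives at the
        # target URL; a 302 ("Found") marks the move as temporary, so an engine may
        # keep treating the duplicate hostname as a separate page.
        self.send_response(301)
        self.send_header("Location", CANONICAL + self.path)
        self.end_headers()

if __name__ == "__main__":
    # Run this on the duplicate .net / .us hosts so every request lands on the .com.
    HTTPServer(("", 8080), RedirectHandler).serve_forever()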

The ability to have a ban lifted (and traffic directed to the proper site) is important.

Also, it's one thing to penalize a site when it's acting improperly, but after it corrects its ways the ban should be lifted.

markusf



 
Msg#: 11856 posted 2:44 am on Apr 22, 2003 (gmt 0)

I think when people create 100 duplicate domains, etc., they should be banned forever.

It costs you nothing to go
domain.us
keyword.domain.us
keyword2.domain.us
keyword3.domain.us
keyword4.domain.us
keyword5.domain.us

chamade

10+ Year Member



 
Msg#: 11856 posted 10:26 am on Apr 22, 2003 (gmt 0)

A scenario for you.

A webmaster comes across a stumbling block with his hosting company. They have a policy that limits traffic, and he's approaching that limit.

e.g.:
Hosting plan 1 - 1 GB/month - $50.00
Hosting plan 2 - 10 GB/month - $350.00

1. He could change to the higher plan to allow much more bandwidth.

2. He could break his site down into more manageable chunks, stretch it over multiple plan 1s - which would cost a lot less than the higher plan - and assign a relevant domain name to each plan.

Now, say for example he hosts widgets.com and that is approaching the traffic limit. He opts for the cheaper hosting option and breaks his site down into three parts: widgets.com, aimed at the uncoloured widget market, and red.widgets.com and blue.widgets.com, aimed at the specific coloured widget markets.

He links them to each other, with one link on the bottom of the index page that says "If you want red widgets go here, or here for blue widgets", and so on. Only one link per site per index page.

Googlebot comes along, follows the new links, sees that the sites conform to its algo, and includes the new sites. Note: he didn't submit the sites; Googlebot added them.

widgets.com is popular and currently ranks a PR8. The dance happens and here's what results.

Because he's good at optimising pages, searches for "red widgets" mean that red.widgets.com comes up #1, as does blue.widgets.com for searches on "blue widgets". However, various factors, including the PR of the original site and the good page optimisation, mean that a search for "widgets" returns widgets.com at #1, red.widgets.com at #2, and blue.widgets.com at #3.

From previous posts it appears some think this site should be banned because he turns up on the first 3 listings and they're all owned by the same company.

Yet these sites don't have the same content; they are specific about the type of widgets they sell, and for the end searcher who's looking to buy widgets, the pages give them what they want. He's not doing anything against Google's TOS - no hidden text, no cloaking, or any other tricks beyond good page optimisation.

Is the guy a spammer who should be banned for life or given some other penalty? Is having multiple domains a breach of Google's TOS? Or should he have gone with the first option - keeping the huge traffic on one domain and upgrading his hosting plan?

onionrep



 
Msg#: 11856 posted 12:18 pm on Apr 22, 2003 (gmt 0)

Chamade

With the proviso that each site is unique in content and widgets.com doesn't contain the same content as red.widgets.com or blue.widgets.com, I don't see a problem.

As for the need to do this because of the bandwidth agreement, I'd suggest that the fellow either find another host, renegotiate what he has, or rethink his business model so that he converts his traffic to sales and can afford the increased bandwidth charges.

JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 11:49 pm on Apr 22, 2003 (gmt 0)

If I sell the same product as another site - say a very technical widget - and we both *must* use the same technical description (i.e. power, capacity, construction, etc.) as supplied by the manufacturer, then how can duplicate content be avoided, to a large degree, and will one or all of us selling the same product be penalised?
This is a real scenario as I need to quote some large chunks of statutes on some of my pages and I know that others in my field do exactly the same.

steve128



 
Msg#: 11856 posted 12:31 am on Apr 23, 2003 (gmt 0)

JudgeJeffries
That is where PR comes into it my friend, you are not alone.

Use pop-ups/unders, Java, etc. if you must make copies of other sites' info.

lindavh

10+ Year Member



 
Msg#: 11856 posted 1:15 am on Apr 23, 2003 (gmt 0)

I think this thread is great!
I have been very concerned about sites coming up the ranks that are built primarily as directories for certain industries. I don't think they are considered "spam", but who can beat all those inbound links they gather from all the sites that pay to be included in their directory?
Anyone with any insight or experience with this, I'd love to hear what your thoughts are and a sticky is always welcome.
Now....I know this may seem very obvious, but what is a "Nick" and where do you use it when reporting spam?
Thanks all...Linda

peterdaly

10+ Year Member



 
Msg#: 11856 posted 1:30 am on Apr 23, 2003 (gmt 0)

Thanks GoogleGuy!

I have been watching this thread with great interest, but have yet to post. I have put in a couple of spam reports for my only targeted keyword, and action has been taken against the offending site. Not only that, a positive side effect is that my site has rolled onto the first page of results.

The number 1 site consisted of a couple of small pages (this is an e-commerce area). They had no online catalog, or even a simple product listing. It was basically a splash page for a bricks-and-mortar business with a phone number to call for more info.

All in all, a pretty poor first result content-wise. It hit number 1 through hundreds of keyword-stuffed alt tags. It did OK PR-wise from chamber-of-commerce-type links; the site's been around for a while. It was very hard not to just "join the spam club" after a couple of Google dances and a spam report from me showed no progress.

After the last Google dance, the site was still at number 1. Granted, I am not sure it had finished settling. Today I checked, and the site is no longer on the first results page.

For all of the "maybe I should join the spam club" people, the spam report has paid off for me. Whether the page change will translate into profits, we will see.

Anyway, Google _is_ acting on these things, and I without a doubt came out better for reporting and keeping to non-spammy pages rather than joining the dark side.

THANK YOU GoogleGuy!

-Pete

chamade

10+ Year Member



 
Msg#: 11856 posted 2:13 am on Apr 23, 2003 (gmt 0)

lindavh,

Your "nick" is the nickname you use on WW. In your case lindavh. Googleguy prefers to have your nick somewhere in the subject or body of the spam report so he can search for your specific request amoungst the millions of others.

CCowboy

10+ Year Member



 
Msg#: 11856 posted 2:33 am on Apr 23, 2003 (gmt 0)

When you're on top you're clean as a newborn baby... When you're not, it's spam...

Damn the luck!

GoogleGuy

WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



 
Msg#: 11856 posted 4:24 am on Apr 23, 2003 (gmt 0)

peterdaly, glad to hear that the spam report went well for you. Other people may be seeing the results of automatic hidden-text detection algorithms soon. One of the Google engineers (Matt Cutts) might talk about it more at the WebmasterWorld pub conference in Boston on Saturday.

gilli

10+ Year Member



 
Msg#: 11856 posted 4:53 am on Apr 23, 2003 (gmt 0)

--serious--
Coming in very late in the game here, but in answer to the question about how long bans should be imposed for...

Minimum ban for a minor first offense - 3 months
Minimum ban for a more blatant first offense - 6 months

Ban for subsequent offenses - 12 months or permanent

Exceptions:

If it appears to be the result of an accident or ignorance rather than a deliberate policy. I have gumbies editing pages who don't really know right from wrong - they might put <h1> around a paragraph, for example. This is up to the discretion of Google.

This may sound harsh, but seriously, look at the posts in this thread ("if you can't beat them, join them", "it's all about who plays the game the best") - people are more than willing to take the risk at the moment. The penalties need to be a serious incentive to stay clean.

--light hearted--
Recent developments with a large telco in Australia (51% owned by the government, mind you) resulting in 18 million dollars' worth of programming contracts being shipped off to India have given me the idea that Google could simply hire an army of web monkeys from the developing world to review all pages in the index by hand.

I know this is likely to be slightly more expensive than maintaining a large flock of pigeons, but I believe the quality of work is likely to be better. Or at least I hope so, because if it's not, telecommunications services in Australia are going to be in serious trouble.

aus_dave

10+ Year Member



 
Msg#: 11856 posted 1:20 pm on Apr 23, 2003 (gmt 0)

This interesting thread has re-inspired me to report some more spam :). I filled in a spam report, but after hitting the submit button I get the default Google search page coming up.

I seem to remember there being a confirmation screen previously. Can anyone confirm this for me, please?

mrguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 3:02 pm on Apr 23, 2003 (gmt 0)

- - they might put <h1> around a paragraph for example. --

Since when did using <h1> tags become spam? Just because some people have no clue or eye for design does not mean it's spam.

If we start down that road, I think Google will have to hire quite a few people to review all the sites that get dinged for poor design.

roundabout

10+ Year Member



 
Msg#: 11856 posted 3:24 pm on Apr 23, 2003 (gmt 0)

Why does there need to be a time duration for penalties? Wouldn't the best way to handle this be for the algorithm simply to ignore the offending feature? For example, if Google just ignored hidden text so that it didn't help a site's SERPs in any way... odds are the hidden-text problem would go away.

All this talk of penalties makes Google seem more like a cop and less like a tech company.

mipapage

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 3:42 pm on Apr 23, 2003 (gmt 0)

GoogleGuy said:

Other people may be seeing the results of automatic hidden-text detection algorithms soon.

I am happily seeing new results! Though a couple still have to go, they were nothin' compared to the *r*p that is history!

Good job, and keep it up.

BTW, one of the bad ones has had its PR removed and is no longer in the directory (although the new sites it had hidden links to maintained their PR)
- is this all done with the algo?

[edited by: mipapage at 5:02 pm (utc) on April 23, 2003]

mrguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 3:52 pm on Apr 23, 2003 (gmt 0)

aus_dave

Yes, after filling out a spam report you get dropped back to the Google home page.

It works!

I've seen the results first hand!

RawAlex

10+ Year Member



 
Msg#: 11856 posted 4:12 pm on Apr 23, 2003 (gmt 0)

MrGuy: Yeah, poor design policing would be the end all... there are some seriously ugly sites out there!

GoogleGuy: Does google do regular manual "human eye" checks on the SERPs? Is there any software in place to see if a domain (or domain owner) comes up too often in individual SERPs?

As for penalties, as long as the site offends, it should be blocked out. It should then take 1 or 2 more updates to prove the crap is gone before it should make it back into the SERPs... maybe a PR1 just to show it exists again... then a real PR and real listings a couple of updates later?

Alex

Chris_D

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 4:14 pm on Apr 23, 2003 (gmt 0)

Geez - Peterdaly - You've just made me feel like a leper!

I have submitted numerous detailed spam reports over the past two years. I don't make up the rules about 'spamming the SERPs' - Google does. Who is supposed to enforce these rules? We are. It's like telephoning the police to tell them someone's parked across your driveway - and nothing ever happens.

I'm now getting close to where the Judge is - being made a fool of isn't a great way for Google to continue to 'win friends and influence people'. I'm going to have to start letting down tyres - or start parking across other people's driveways. Pretty poor state of affairs when we have to take justice into our own hands, isn't it?

I've just spent another heap of hours trying to be a good netizen - the vast majority of my spam reports appear to have been ignored by Google, or the spammers I've reported are back, and they are committing exactly the same blatant breaches of the published Google TOS as before.

Rules are only as good as the citizens. The last spam report I sent 10 days ago was personally addressed to GG - and referred specifically to a bunch of scumbags - who often use the names of disabled institutions in sports scams. These results are still in Google - so far GG hasn't even acknowledged that I submitted these reports - and Google hasn't yet acted on them - or the others I submitted. And - like all my spam reports - they are reasonably comprehensive.

I'd be over the moon if I got a heads up - "hey - read your report - thanks - we'll look into it".

I give up heaps of my personal family time to try to make the internet a better place. Why? Because I really get upset when I see, e.g., blatant commercial abuse of charitable institutions' names - and I selfishly hope, for my 8-year-old daughter's sake, that there is soon a better answer than this manual reporting of cheats to compensate for an algorithm which clearly can't 'self-police'.

Pissed off? Yes I am. But I'll get over it.

So happy ANZAC Day to all of you, from Australia, for ANZAC Day on Friday. And if you don't know what ANZAC Day is - and if Google doesn't put a slouch hat on one of the "L"s and explain it - I'm sure that FAST or .....

Chris_D
Trying to be a good guy - but currently pretty pissed off.

Bio4ce

10+ Year Member



 
Msg#: 11856 posted 5:08 pm on Apr 23, 2003 (gmt 0)

Thanks GG for taking care of that spam report. Also, I sent you about 10 cloaking sites that you might be interested in looking at. :)

See, everyone? Google does act on spam reports!

JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 7:04 pm on Apr 23, 2003 (gmt 0)

Makes you wonder how many spam reports GG is now receiving with WW nicks on them.
It's obviously impossible for the algo to be updated quickly enough to cover every possible situation, so why don't they just employ a few more geeks specifically to sort out the spam reports until such time as they are totally on top of it? It amazes me that it's taken so long just to deal with the hidden-text problem, so how long will it be before the rest is mastered? Perhaps Google should just accept the reality and spend some cash to blast the spammers.

Jakpot

10+ Year Member



 
Msg#: 11856 posted 7:49 pm on Apr 23, 2003 (gmt 0)

"automatic hidden-text detection algorithms"

Must be working - the sites above me in 2 different SERPs are history.

Tamarick

10+ Year Member



 
Msg#: 11856 posted 10:05 pm on Apr 23, 2003 (gmt 0)

What about standards-compliant sites? My site is 100% standards-compliant, and I use a hidden <div style="display:none;"> to tell "non-compliant browser users" to update their browser. This is an industry norm and is invisible to standards-compliant browsers.

Will Google look for this stuff?
If so, will people like Jeff Zeldman, Eric Meyer, and Tantek Celik be banned from Google?

These guys are the leading web designers in the world... If they say "do it" I usually do...
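[Illustrative sketch, not from the original post: nobody outside Google knows what its hidden-text detection actually checks. The deliberately naive Python snippet below only shows why a display:none browser-upgrade notice could look like hidden text to a crude scanner; the sample markup is hypothetical.]

import re

# Flag elements whose inline style hides them, together with the text they contain.
HIDDEN_BLOCK = re.compile(
    r'<(\w+)[^>]*style="[^"]*(?:display\s*:\s*none|visibility\s*:\s*hidden)[^"]*"[^>]*>(.*?)</\1>',
    re.IGNORECASE | re.DOTALL,
)

page = ('<div style="display:none;">Please upgrade your browser.</div>'
        '<p>Visible catalogue copy.</p>')

for tag, text in HIDDEN_BLOCK.findall(page):
    print(f"possible hidden text in <{tag}>: {text.strip()}")
# prints: possible hidden text in <div>: Please upgrade your browser.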
