
Google News Archive Forum

This 193 message thread spans 7 pages: < < 193 ( 1 2 3 [4] 5 6 7 > >     
Google Not Acting on Problem Results Reports
JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 6:09 pm on Apr 16, 2003 (gmt 0)

I keep seeing spam stealing the top positions, so I report it to Google time and again and nothing happens. So what do I do?
If you can't beat them, join them.
This is my living we're talking about, after all. The bread on my table.
I'm not cloaking yet, but it may become necessary if Google doesn't start taking the spam reports more seriously.
Why do they ask for them and then take no action?
How about some serious answers?

 

steve128



 
Msg#: 11856 posted 12:34 am on Apr 19, 2003 (gmt 0)

GrinninGordon
My thoughts exactly; that's the way it is, and the way it will remain.
Spam never goes off, because it is not 100% real meat.

They will return, and it is impossible to control completely, though I will hang my head in shame and agree that Google strives to combat spam more than any other engine, and I would suggest they do a far better job than any other.

I do now know Google treats spam reports seriously.
Many, myself included, doubted this, and maybe took Google for granted.
But sit back and really analyse the situation: it cannot be easy to unscramble the flak.
I now realise it is unrealistic to expect spammers, or whatever you want to call them, to be booted overnight on a say-so.

So they eventually do get lost in the SERPs, but next month another comes to the forefront.

Perhaps it's like trying to empty the Atlantic Ocean with an egg cup.

The real problem is for SEOs to explain the situation to clients, who ask "Hey, this guy did this, did that, blah... but you said don't do it", etc., which can be difficult to answer.

GoogleGuy

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 1:03 am on Apr 19, 2003 (gmt 0)

PatrickDeese, I checked out the two examples you sent in. One was pretty clear (the one with redirects, where we detected CSS hidden text on the destination page)--they'll be gone soon. The other might be better to handle algorithmically.

I will take a look at how we can add more info about what to expect from a spam report. Most of the effort will probably still go into using this data for our new algorithms, but I'd like people to know what to expect when they fill out a spam report.

I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site. Actually, that raises an interesting question. Suppose that we detect hidden text (for example). What's the right amount of time for that site to be penalized? Does anyone have thoughts/votes?
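As an aside on what detecting "CSS hidden text" can mean in its simplest form: a first-pass check can just look for inline styles that render text invisible. This is a purely hypothetical sketch, not Google's actual detector (whose workings were never public); all names here are invented for illustration:

```python
import re

# Inline-style heuristics for hidden text: same foreground and
# background colour, or styles that remove the element entirely.
HIDDEN_PATTERNS = [
    re.compile(r"display\s*:\s*none", re.I),
    re.compile(r"visibility\s*:\s*hidden", re.I),
]

def same_fg_bg(style):
    """True if one style attribute sets identical text and background colours."""
    fg = re.search(r"(?<!background-)color\s*:\s*(#?\w+)", style, re.I)
    bg = re.search(r"background(?:-color)?\s*:\s*(#?\w+)", style, re.I)
    return bool(fg and bg and fg.group(1).lower() == bg.group(1).lower())

def looks_hidden(html):
    """Scan every inline style="..." attribute for hidden-text tricks."""
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.I):
        if same_fg_bg(style) or any(p.search(style) for p in HIDDEN_PATTERNS):
            return True
    return False
```

A real crawler would of course also have to resolve external stylesheets and positioning tricks, which is where the "better to handle algorithmically" cases get hard.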

PatrickDeese

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 1:07 am on Apr 19, 2003 (gmt 0)

Thank you very much GG.

Hooray. I got a date to the prom!

:)

PS Now help me beat out my legit competitor :P

mrguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 1:11 am on Apr 19, 2003 (gmt 0)

I think if you detect hidden text, they should be penalized until the hidden text is gone.

In general, I don't think we are all trying to put competitors out of business, but are instead trying to level the playing field.

I think if a site straightens up, they should be allowed back in the next update.

To make it less tempting to spam if you're just going to be bumped from the index for a month or two, perhaps institute a two-strikes rule.

Get caught once, shame on you; clean it up and we will let you back in. Get caught twice, and you're banned for good, no exceptions.

I think that would stop a lot of the repeat offenders.

Thanks for taking the spam seriously, GoogleGuy!

steve128



 
Msg#: 11856 posted 1:24 am on Apr 19, 2003 (gmt 0)

In my mind, if hidden text is used and points to a relevant site, no problem: a minimum ban or none at all (naive webmaster).
But hidden text used to create a PR increase for a non-relevant site is not on.

My reasoning: why would a site deploy hidden text pointing to a non-relevant site, other than for a PR gain?

rfgdxm1

WebmasterWorld Senior Member rfgdxm1 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 1:28 am on Apr 19, 2003 (gmt 0)

>I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site. Actually, that raises an interesting question. Suppose that we detect hidden text (for example). What's the right amount of time for that site to be penalized? Does anyone have thoughts/votes?

Way too low. I'd say one year at least.

dodger

10+ Year Member



 
Msg#: 11856 posted 1:29 am on Apr 19, 2003 (gmt 0)

<<I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site>>

I've heard of people being banned for 12 months or more - but if that's the rule and it's applied consistently across the board, OK.

I think it's too low, though - no incentive not to try.

Anon27

10+ Year Member



 
Msg#: 11856 posted 1:37 am on Apr 19, 2003 (gmt 0)

Steve128, I agree with you.

On the latter of the two, I think the offending webmaster should be tarred and feathered for life if they are taking food off my table.

europeforvisitors



 
Msg#: 11856 posted 1:43 am on Apr 19, 2003 (gmt 0)

> I think it's too low though - no incentive not to try.

I tend to agree. If Webmasters believe that using hidden text, etc. is the equivalent of fouling someone and getting five minutes in the penalty box, they'll continue to think that the spammer-vs.-Google game is just that: a game.

OTOH, I can understand why Google might be more forgiving to an obvious clueless amateur who hadn't heard that a little knowledge is a dangerous thing, and that hidden text isn't the bright idea that some people thought it was back when THE COMPLETE NITWIT'S GUIDE TO MANIPULATING SEARCH ENGINES was written in the days of Windows 95 and FrontPage 1.1.

Bottom line: Google needs to use good judgment when applying manual penalties, and it can afford to be more gentle with mypersonalsite.com than with cheapo-herbal-viagra-and-hotel-bookings-disposable-domain-of-the-month.com.

dodger

10+ Year Member



 
Msg#: 11856 posted 2:06 am on Apr 19, 2003 (gmt 0)

Google should flag offenders sites and check up on them periodically - run a script through their site or something - automated - to ensure that they aren't re-offending.

3 months is a slap on the wrist and if that's the only penalty why not give it a shot?

rfgdxm1

WebmasterWorld Senior Member rfgdxm1 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 3:30 am on Apr 19, 2003 (gmt 0)

>OTOH, I can understand why Google might be more forgiving to an obvious clueless amateur who hadn't heard that a little knowledge is a dangerous thing, and that hidden text isn't the bright idea that some people thought it was back when THE COMPLETE NITWIT'S GUIDE TO MANIPULATING SEARCH ENGINES was written in the days of Windows 95 and FrontPage 1.1.

I've actually seen this with clueless amateur sites. To the extent that I have seen them use big blocks of *visible* keyword stuffed text on the bottom of the page thinking that this is a good idea for search engines. Almost as if they think "It's my site, I can put on it what I want, and if search engines rank me higher than they should, that's their problem, not mine."

dodger

10+ Year Member



 
Msg#: 11856 posted 3:40 am on Apr 19, 2003 (gmt 0)

At the end of the day it's still spam, and it's too difficult to distinguish between the bad spammers and the ignorant spammers.
Google should have very clear instructions on the Add URL page that plainly state what is acceptable, what isn't, and the penalties.
If they break the rules, they should do the time.

cindysunc

10+ Year Member



 
Msg#: 11856 posted 4:06 am on Apr 19, 2003 (gmt 0)

Maybe on the home page of Google there could be a link to WebmasterWorld, so all webmasters can have a fair chance at getting some quick action on sites that resort to cheating methods? Or maybe there could be more Google reps visiting other forums, so webmasters elsewhere don't have to wait months for somebody to look at spam reports? I would be very happy the day all the sites with hidden text and other cheating methods are removed from the index.

mrguy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 5:59 am on Apr 19, 2003 (gmt 0)

--Maybe on the home page of google there can be a link to WebmasterWorld so all webmasters can have a fair chance at getting some quick action on sites that resort to cheating methods?--

Wow! I bet Brett would like that!

A PR10 home page link from the Google Gods themselves!

GoogleGuy

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 7:04 am on Apr 19, 2003 (gmt 0)

europeforvisitors, I strongly agree with your sentiment. The issue is that I'm looking for automatic criteria--if there's no human judgment involved, it's much more consistent and fair. How would you decide how long a penalty to apply?

Anon27

10+ Year Member



 
Msg#: 11856 posted 7:26 am on Apr 19, 2003 (gmt 0)

GG: I would put in place a point system:

Hidden text, with no links = 1 month.
Hidden text, with no links, but packed with keywords = 3 months.
Hidden text, with links = 3 months.
Hidden text, with links, blatantly trying to push a false PR = 6 months.

etc., etc.
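Tiers like these amount to a small lookup table: collect the offence flags for a page, then apply the harshest tier that matches. A hypothetical sketch of that idea (the flag names and structure are invented for illustration, not anything Google uses):

```python
# A hypothetical lookup for the graduated penalties above; the first
# (harshest) matching tier wins. Durations are in months.
PENALTY_TIERS = [
    ({"links", "false_pr"}, 6),  # hidden links blatantly pushing false PR
    ({"links"}, 3),              # hidden text with links
    ({"keywords"}, 3),           # hidden text packed with keywords
    (set(), 1),                  # plain hidden text, nothing else
]

def penalty_months(flags):
    """Return the ban length in months for a set of offence flags."""
    for required, months in PENALTY_TIERS:
        if required <= flags:  # tier applies when all its flags are present
            return months
    return 0
```

The appeal of a table like this is exactly what GoogleGuy asked for: no human judgment at penalty time, so the outcome is consistent for every offender.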

onionrep



 
Msg#: 11856 posted 7:35 am on Apr 19, 2003 (gmt 0)

> I do think that 3 months is pretty reasonable to allow someone back in if they clean up their site. Actually, that raises an interesting question. Suppose that we detect hidden text (for example). What's the right amount of time for that site to be penalized? Does anyone have thoughts/votes?

I'm with the 'out for one update' brigade.

If they are using hidden text to get a high SERP, then they are pretty clueless anyway. When re-included, without the use of their hidden text, the likelihood is they would find themselves buried in the SERPs somewhere.

I think Google would also be more popular if it sent an offending site (whatever its misbehaviour or oversight) a prior warning that delivered a clear, unambiguous message.

<begin awfully written example>

Dear webmaster

Your website is behaving in a way contrary to our TOS. Please be advised that Google seeks to provide a resource that is blah bla blah etc.....If you wish to remain in the google index then you should have a look at your code and ensure that it complies with our guidelines at google.url.

Google is unable to provide precise feedback on your domain blah blah etc....Webmasters are reminded that repeated abuses by a particular webmaster or SEO company may lead to the removal of all identifiable domains associated with the SEO company or webmaster.

Sincerely

The Google team

</end awfully written example>

I'm sure a little program at the 'plex could be developed to automate a remove-or-stay directive some 7 days later.

I know this view may be unpopular, but if a Google employee has taken the time to consider manually banning a domain, then it's not exactly a big job for them to send an email and give the offender one last chance.

Some people work damn hard on their sites and are often ignorant of certain facts, or, for economic or peer-pressure reasons, may be compelled to push things just that little bit too far. I'm not saying it's right that they do so; I'm just trying to say that it's not *always* cut-and-dried spam, and that some people may be a little ignorant. I don't think it's fair to treat all offenders as murderous criminals :) some have just parked illegally and didn't see the sign :)

My point is that by giving these people a final opportunity to look at their domains before a ban is applied, both Google and the webmaster have an opportunity to create a positive. Ill feeling is avoided, and Google is seen as fair. The Google-webmaster relationship is enhanced, and the SERP is improved.

Sure, the algo way is a better way, but until a way is found of detecting simple hidden text, and indeed more complex methods of manipulation, then FWIW I think it serves all constituencies to be able to have a drink in the last-chance saloon. :)

Anon27

10+ Year Member



 
Msg#: 11856 posted 7:37 am on Apr 19, 2003 (gmt 0)

I'll add to my post that this is my issue: hidden text (keyword packing) and hidden links for the purpose of gaining false and deceptive PR.

Other people might have other spam issues.

Anon27

10+ Year Member



 
Msg#: 11856 posted 7:47 am on Apr 19, 2003 (gmt 0)


I agree, except when that car is illegally parked, I have to pay Google Ads to get my parking space back.

They should know the rules of the road before they get into the car; if not, then they should learn the hard way. If Google would write a driver's manual, I think these issues might go away - but of course you will always have the occasional speeder or drunk driver...

If I was a cheater, I would just park in the handicap spot all day.

That is not fair.

[edited by: Anon27 at 7:51 am (utc) on April 19, 2003]

rfgdxm1

WebmasterWorld Senior Member rfgdxm1 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 7:48 am on Apr 19, 2003 (gmt 0)

For hidden text, GG: one year. However, in the case of amateur sites only, if they clean up their site, e-mail Google repenting all sins, and, declaring that even though they are unfit to walk the Earth, beg forgiveness from Google, then some mercy in lessening the penalty is reasonable.

austtr

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 8:18 am on Apr 19, 2003 (gmt 0)

Arriving late to this particular party... seems we now have a thread on spamming that is almost as big as some of the Google update threads. Maybe more and more folks are finally starting to recognise the extent of the problem.

A few months ago a spam thread would expire very quickly, now we have one over the 100 mark... there is a message there.

I'm sure that Google's algorithmic approach will prove to be quite effective in filtering out a lot of techniques that breach their guidelines... and I am quite sure there will be techniques that will slip past any form of automated detection simply because the computational effort required to find it could not be sustained.

I suspect many serial spammers bank on just that fact... it's their unpatrolled border crossing, by which they deliver their contraband into the marketplace.

What should the penalties be? Depends on the message you want to send. Perhaps 3 months as a wrist slap for minor infringements ramping up to 12 months for major league serial spammers?

GG... on the question of "impossible to detect" spam, there are spam reports under my name that were referred to you. The reports are several months old, and the sites are still there as of the latest update. I would appreciate it if you could give some feedback, as you have elsewhere in this thread.

dodger

10+ Year Member



 
Msg#: 11856 posted 8:47 am on Apr 19, 2003 (gmt 0)

<<Im sure a little program at the plex could be developed to be able to automate a remove or stay directive, some 7 days later >>

I edit for DMOZ and I can tell you that as good as these ideas are they don't seem to ever get put into practice.

I think the best and simplest way is to make it clear to all that the spam days are over: clean up your site or it will be out for 12 months, after which time you can resubmit.

The index won't suffer as there are plenty of good sites out there to take their place.

It would be great to have a "pay for" service where you can have your site checked by Google and it gets a "no spam" tick or you get advice on how to fix it.

Some mechanism would have to be in place to make sure that you stay "Google friendly". As long as the fee was reasonable, I'd be in it tomorrow, even just for the peace of mind.

Google could even sub contract this work out.

Commercial sites would be glad to pay, and non-commercial sites can just follow the guidelines - it's not rocket science, especially for a small non-commercial site.

Chris_D

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 10:22 am on Apr 19, 2003 (gmt 0)

Regarding the penalty system, I agree with the sentiment that there should be two 'leagues' of penalty: call them the rookie league and the major league.

Rookie league is small scale hidden text etc. An email warning would probably suffice. A stern warning from the Google spam police.

Major league is affiliates with 3,000-page sites, using expired reputable domain names, redirected with javascripted framesets to high-paying affiliate sites - often several coming out of one IP, usually involving the travel, prescription medicine, casino, p0rn etc. industries.

With the major league guys - the warnings should be focussed on the 'beneficial recipient' of the spam traffic. He gets a message telling him he needs to get his affiliates into line with the Google TOS - or else the only traffic he'll ever see is PPC.

I think it's really about the magnitude of the issue - make the punishment fit the crime. Cleaning up the major league guys would have a major, immediate impact on the quality of the SERPs. Just look at the travel example posted a few pages back in this thread by onionrep - one spammer (acting as an affiliate) had destroyed a huge chunk of the first 50 results. Only the spammer and his affiliate partner win - searchers and competitors playing by the rule book lose.
So I think:
1. Everyone gets a warning.
2. Rookies - do it again and you're out for an index cycle.
3. Major league - immediate 3-month ban for the affiliate, AND after 3 affiliates get banned, a 3-month ban for the 'beneficial recipient' of the spam-based affiliate traffic. Unless, of course, you can establish the link from the affiliate to the 'beneficial recipient' - in which case one warning, then all gone for 3 months.

jonrichd

10+ Year Member



 
Msg#: 11856 posted 11:38 am on Apr 19, 2003 (gmt 0)

How long should a hidden text penalty last? I would say the site is banned until none of its pages contain any hidden text.

Assuming that you can algorithmically detect hidden text when the bot comes by, if it is found, then that domain doesn't get listed in the next update. Any PR that that domain provides to other domains is ignored. Next month, the bot comes back, if the offending text is removed, then you're back in.

This removes the incentive to use hidden text, as well as the question of whether the offender is a 'serious' spammer or not. I've had clients who have had hidden text on their sites, not with the intent to spam, but because it helped them maintain proper table column widths.

While I agree with the idea of penalizing the ultimate beneficiary of spammy techniques, I don't see how you can do something like this fairly without hand checks. Otherwise, I could put a bunch of hidden links to my competitor on a domain I didn't care about and wipe him out, at very little cost to me.

mipapage

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 12:03 pm on Apr 19, 2003 (gmt 0)

Hey Googleguy,

(a repost of an earlier question)

A question: a spam report has been sent in, the site shortly thereafter disappears from the SERPs, but returns now and again after freshbot visits, and since the last dance is solidly back in the SERPs - unchanged.

Is this normal behavior as a site is phased out? Should I re-send something to indicate that there is still a problem?

It would be nice to know what to expect, and I am sure this would benefit more than just me. If it is the norm that a site fluctuates in and out of the SERPs before its eventual demise from Google, then we could calm the panic-filled posters - myself included - the same way we say 'don't worry, it's just everflux'.

Yidaki

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 11856 posted 12:20 pm on Apr 19, 2003 (gmt 0)

IMHO, discussing the time frame of a penalty is the second step. The first step is to act on reports and refine the algo. It's useless to discuss the second step before the first step is done. Maybe my reports automatically go to nirvana instead of going into the Google reports queue - but in the cases I have reported, I currently don't see big actions (by algo) taken against spam. To me it looks like *if* a site ever gets a penalty, the system just puts it back into the index after a set time frame - no matter whether it's clean or not (proof reported). Others don't even get penalized (also reported).

However, I second that de-penalizing a site once it is clean again is the best strategy. IMHO the technical resources needed are the same whether you give the site a penalty time stamp or recheck its behaviour frequently. The moment a site is clean again, it should be back in the index and take the ranking it deserves without cheating.

I don't want to complain but, again, the first step is to automatically check and penalize spam - old-fashioned spam and the much-discussed guestbook phenomenon (also proof reported) are sooo easy to spot by algo... clueless why nothing happens...

anallawalla

WebmasterWorld Administrator anallawalla is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 12:27 pm on Apr 19, 2003 (gmt 0)

BigDave,

> PR does follow a 301. A couple of times it seems like it took 2 updates for the PR to follow it, but it usually makes it in one index.

This didn't work for me: a 301 redirect at the old site using .htaccess, and PR7 dropped to PR5. The new site can't see the 200+ backlinks I could see from the old site.

It's noncommercial, so not a big deal, but still disappointing.

- Ash

anallawalla

WebmasterWorld Administrator anallawalla is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 12:45 pm on Apr 19, 2003 (gmt 0)

Dodger,

> Google should flag offenders' sites and check up on them periodically - run a script through their site or something - automated - to ensure that they aren't re-offending.

I like this idea. If spam reports were fed into a database, it could schedule periodic "spam scans", just like a googlebot run.

- Ash
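The database-driven "spam scan" idea sketches out naturally: each reported domain carries a next-check date, and re-check results adjust the interval. A purely hypothetical illustration (the class, method names, and intervals are all invented, not any real Google mechanism):

```python
import datetime

# Hypothetical "spam scan" queue: every reported domain gets a
# next-check date; clean sites graduate to a longer interval,
# re-offenders are rechecked more often.
class SpamScanQueue:
    def __init__(self):
        self.next_check = {}  # domain -> date of the next scheduled scan

    def report(self, domain, today):
        """A fresh spam report schedules a scan for the next day."""
        self.next_check[domain] = today + datetime.timedelta(days=1)

    def due(self, today):
        """Domains whose scheduled scan date has arrived."""
        return [d for d, when in self.next_check.items() if when <= today]

    def record_result(self, domain, clean, today):
        """Re-schedule: clean sites monthly, offenders weekly."""
        days = 30 if clean else 7
        self.next_check[domain] = today + datetime.timedelta(days=days)
```

The point of the re-check loop is the one jonrichd raised earlier in the thread: a site stays penalized exactly as long as it stays dirty, which removes both the incentive to spam and the need to pick an arbitrary ban length.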

anallawalla

WebmasterWorld Administrator anallawalla is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 11856 posted 12:54 pm on Apr 19, 2003 (gmt 0)

Onionrep,

I could not reproduce your air tickets search - lots of .uk domains, but no single one hogging the SERPs. Perhaps it was sorted?

onionrep



 
Msg#: 11856 posted 12:55 pm on Apr 19, 2003 (gmt 0)

Chris_D said

Just look at the travel example posted a few pages back in this thread by onionrep - one spammer (acting as an affiliate) had destroyed a huge chunk of the first 50 results.

Hi Chris_D

I didn't actually look at those pages for any evidence of spam, so I can't comment on whether they were or not (they've now gone, so I can't check them).

My point on that particular SERP was that I found it difficult to understand why Google did not/could not/would not ensure that a single domain does not receive multiple positions for a single query.

onionrep



 
Msg#: 11856 posted 12:59 pm on Apr 19, 2003 (gmt 0)

anallawalla

Yes, it must have been fixed. It was there since the update.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved