

Another bit of Google FUD I had missed before.

Why not come out and say it?

         

rfgdxm1

12:04 am on Sep 25, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



[google.com...]

We will not comment on the individual reasons a page was removed and we do not offer an exhaustive list of practices that can cause removal.

Like, wouldn't it be a whole lot easier to follow the rules *if we knew them*?

[edited by: WebGuerrilla at 12:33 am (utc) on Sep. 25, 2002]
[edit reason] trimmed quote [/edit]

WebGuerrilla

12:39 am on Sep 25, 2002 (gmt 0)

Publishing the rules will never happen because all it would do is make Google's job 10 times as hard.

Search engines only penalize the techniques that are successful at exploiting a weakness in the algorithm. Publishing a list of those techniques would be the equivalent of writing a "How to spam Google" manual.

rfgdxm1

12:51 am on Sep 25, 2002 (gmt 0)

Baloney, and I don't refer to the sausage. What if the same technique is A-OK at other search engines, but Google is the only one to object? This is the old "there is always some collateral damage" argument.

dvduval

1:04 am on Sep 25, 2002 (gmt 0)

I would add that many techniques involve a "threshold".

If you have 2 domains linking to one another on every page, that's usually not a problem, but 18 domains is a problem.

or

repeating a keyword 12 times on a page is probably OK, but how about 400 times?

To reveal the exact thresholds would lead to people being as "spammy" as possible. I think it's best to keep things grey.
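The threshold idea above can be sketched in a few lines. This is purely illustrative: the cutoff values and signal names below are invented for the example, and nobody outside Google knows whether its real filters use fixed numeric limits at all.

```python
# Hypothetical sketch of threshold-based spam filtering.
# All cutoffs here are invented; they are NOT Google's real limits.

def keyword_repetitions(page_text, keyword):
    """Count case-insensitive occurrences of a keyword in page text."""
    return page_text.lower().split().count(keyword.lower())

def looks_spammy(page_text, keyword, interlinked_domains,
                 max_repeats=50,       # invented threshold
                 max_interlinks=10):   # invented threshold
    """Flag a page when any one signal crosses its (hidden) threshold."""
    return (keyword_repetitions(page_text, keyword) > max_repeats
            or interlinked_domains > max_interlinks)

# Per the post: 12 repeats and 2 interlinked domains pass...
print(looks_spammy("widgets " * 12, "widgets", interlinked_domains=2))
# ...while 400 repeats or 18 interlinked domains trip the filter.
print(looks_spammy("widgets " * 400, "widgets", interlinked_domains=18))
```

Keeping `max_repeats` and `max_interlinks` secret is exactly the point of the post: publish them, and everyone optimizes to sit one unit below the line.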

rcjordan

1:07 am on Sep 25, 2002 (gmt 0)

>What if the same technique is A-OK at other search engines, but Google is the only one to object?

That's always been a problem, no matter which matrix of engines you were working. Cloaking used to be the answer, or you took your lumps with one-size-fits-all SEO and hoped it averaged out.

A punchlist of what not to do would be great, because everything else would be legit, right? And what about the "effective date" of the particular technique/exploit in question? "But GG, I built this site on July 12th, and it was legal then!"

Sasquatch

1:14 am on Sep 25, 2002 (gmt 0)



So, should Google be limited to penalizing sites only after they have updated their official list of no-nos?

They actually do have a list. Concentrate on improving your content and getting listed in Yahoo and DMOZ. Get legitimate links because your content is so good, not links that are meant just to improve PR. Anything more is risky.

Just look at what has happened with all the games people play with PR since they told the world about that and gave them the little green bar.

Now what would be real fun (I hope GG is listening) would be if Google's penalties for bad behavior were applied to the appropriate portion of their calculation. Messing with link farms affects your PR, but keyword stuffing the title would get your title value zeroed. Same goes for alt attributes.

That way people could be looking at their PR7 site and wondering why the heck they ain't showing up till page 9. Then they can try some more tricks and bury themselves even deeper.

Then you add the relevance calculation in before transferring PR to the other sites.
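The per-component penalty idea could be sketched like this. Everything here is hypothetical: the factor names, weights, and the idea of a simple weighted sum are invented for illustration and bear no relation to Google's actual scoring.

```python
# Hypothetical sketch: penalize only the ranking factor that was abused,
# instead of applying one global penalty. Factor names and weights are
# invented; this is not Google's real calculation.

def score(page, penalties):
    """Combine ranking factors, zeroing any penalized component."""
    weights = {"pagerank": 0.5, "title": 0.3, "alt_text": 0.2}  # invented
    return sum(weights[factor] * page.get(factor, 0.0)
               for factor in weights
               if factor not in penalties)

page = {"pagerank": 7.0, "title": 9.0, "alt_text": 4.0}

full = score(page, penalties=set())
# Keyword-stuffed title: only the title's contribution is zeroed, so the
# visible PR stays high while the ranking quietly drops.
stuffed = score(page, penalties={"title"})
print(full, stuffed)
```

This matches the scenario in the post: the site still shows a PR7 in the toolbar, but its effective score has dropped, and the owner has no idea which trick caused it.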