Forum Moderators: Robert Charlton & goodroi
Would this enable a site to effectively side-step a lot of the "lightweight" Google filters aimed mainly at spammers and scrapers (provided the site stays on topic, or near it)? That is, would you avoid being penalized for the odd "gray" bit of SEO, or in other words "inexperienced SEO": too many keywords everywhere, and so on?
The reason I ask is that I'm sure many people, like myself, go back and eradicate the outdated on-page SEO they did in the past when less experienced (what I call gray-hat) as they learn more and more about what works for them.
I'm seeing a couple of sites that should perhaps be penalized (a little) but aren't. The only difference is that they have thousands of backlinks (and are, on the whole, well built, apart from a bit of inexperienced SEO that might attract filters).
Having said that, some filters seem to be yes/no, while others may be triggered only above a certain level of bad practice; the number of reciprocal links may be one such area, but I wouldn't gamble on it.
I'm one of those who never bothered to create individual, unique meta descriptions, because 'it didn't matter'. Now that Google has to deal with sites churning out spam and cr*p pages by the million, they DO matter.
As I had good, clean pages, all with unique content, I still didn't worry - until I realised my pages were going 'supplemental'. I'm now three quarters of the way through going back and writing meta descriptions; far enough into the task to know it is worth the effort, and that I was a fool to take the shortcut.
No one can force you to keep up to date with SE guidelines; but take it from me, they ain't there for fun. Your site will be the richer for following them.
Think about it; imagine the scene ...
Once upon a time there were two sites, yours and your deadliest rival's. She keeps up to date with latest SE guidelines. You don't think it matters. Take a wild guess who'll do better in Google serps ;)
OK, best of three? :)
[edited by: Quadrille at 2:49 am (utc) on Feb. 22, 2007]