Everywhere we turn we read pages and pages from authorities like Google telling us what good SEO practice looks like.
Myself, I really try to follow them as closely as possible. I feel content when I manage to finish a project that is user friendly and well optimized for search.
But many times (too many) I stumble upon competitors who make an absolute joke of SEO and still rank extremely well.
Just recently I was researching rankings for a very competitive keyword in my region; several million results come up.
The top 5 are sites that are a real grotesque from an SEO, code, and design standpoint.
Let me just address a few details I found at number 1:
- the title tag is a whole paragraph that doesn't even fit on screen, and the main keyword appears in it 58 (!) times
- the keywords meta tag is a whole novel of terms
- the main page has at least 3 identical duplicates (!), with no 301 redirects or anything of the sort
- there is hidden text all over (font color matching the background color)
- the majority of the markup is sloppy table-based layout
- URLs are not rewritten and are stuffed with dynamic IDs, variables, and whatnot
- not a single image has an "alt" attribute, and the same goes for "title" attributes on links
So how can this monstrosity rank so high? Apparently the site was created back in 2001.
Can that single factor outweigh all the flaws I just listed? It seems Google thinks so.
I presume these guys also put a lot of "effort" into making this page, but I just can't bring myself to rat them out by sending bad-practice reports to Google.
And we shouldn't have to - why should we play local sheriff for blackhat SEO?
Googlebot should see and understand such obvious things on its own!
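And obvious they are: just to illustrate the point, here is a minimal sketch (my own toy scanner, not anything Googlebot actually runs) showing that a few of the red flags above are machine-detectable with nothing but Python's standard library. The keyword, the sample HTML, and the crude "same color as background" heuristic are all made up for the example.

```python
# Toy scanner for a few of the red flags listed above:
# keyword stuffing in <title>, images without alt, crude hidden-text check.
from html.parser import HTMLParser

class RedFlagScanner(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.in_title = False
        self.title_text = ""
        self.images_missing_alt = 0
        self.hidden_text_spans = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "img" and "alt" not in attrs:
            self.images_missing_alt += 1
        # very crude hidden-text heuristic: inline style where the text
        # color equals the background color (here hard-coded to white)
        style = attrs.get("style", "").replace(" ", "").lower()
        if "color:#fff" in style and "background:#fff" in style:
            self.hidden_text_spans += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title_text += data

    def report(self):
        return {
            "keyword_in_title": self.title_text.lower().count(self.keyword),
            "images_missing_alt": self.images_missing_alt,
            "hidden_text_spans": self.hidden_text_spans,
        }

# Hypothetical offending page, condensed.
html = """<html><head><title>widgets widgets widgets buy widgets</title></head>
<body><img src="a.jpg"><img src="b.jpg" alt="a widget">
<div style="color: #fff; background: #fff">widgets widgets widgets</div>
</body></html>"""

scanner = RedFlagScanner("widgets")
scanner.feed(html)
print(scanner.report())
```

If a few dozen lines of stdlib code can flag this, a crawler with Google's resources certainly can.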
Don't know about you guys, but these situations really bring me down. They make it almost impossible for us one-man-army programmers to have any success on the web.
Sure, a big company would tackle this by pumping millions into ads, but what about us little guys?