Forum Moderators: open
Do you think a book that has no cover, dust jacket, bio, credits, or table of contents is weighted by a reviewer or reader as better?
Of course not.
However, do I think the same book, over-optimized just to grab attention (i.e., trying to make the book find the reader, rather than the reader finding the book), is often frowned on? Yes.
Optimizing a site is good; it improves quality for the WM, the SEO, and the average searcher. It's sort of like writing a paper with proper composition. I think there is a grey line where a person can over-optimize, and thus destroy any decency the page in question originally had.
Therefore, I think not, by definition.
Now, whether or not Google may penalize you for some of the stuff people do to their pages in the name of improving their rankings is another question. I think it's pretty obvious that they do.
As you can see from the current update, many optimizers just got a much different result.
Optimization to one SEO may be SPAM to another SEO
Define Over Optimization [webmasterworld.com]
This kind of discussion makes me smile, because it draws attention not to what we shouldn't do, as I am sure Google would like, but to what we can do, will continue to do, and are always learning more about.
I like this comment…
Optimizing a site is good, it improves quality for the WM, SEO, and average searcher. It's sort of like writing a paper with proper composition. - argusdesigns
But then this…..
I think there is a grey-line where a person can over-optimize, and thus destroy any decency the page in question originally had. - argusdesigns
While I may agree with this, I consider it a subjective call. Once again, how do we, and how do search engines, determine over-optimization, and is that even possible?
From this discussion we have been known to jump to ethics. Now that's fun. Eventually those discussions get crushed because we can't control ourselves, and we start pointing fingers and pushing each other instead of focusing on the theories.
I think it comes down to what boundaries you want to set on the work you do, what you consider risky, and when you feel compelled to inform your clients of those risks. Staying informed, which is what the Webmaster World addiction actually comes down to, is of course the key then to making those decisions.
Let's keep talking about this. As an industry, these discussions can bring us closer, and perhaps that knowledge will help raise our personal standards.
But then this…..
I think there is a grey-line where a person can over-optimize, and thus destroy any decency the page in question originally had.
and..
While I may agree with this, I consider it a subjective call. Once again, how do we, and how do search engines, determine over-optimization, and is that possible?
You are definitely right about this; it is a subjective call. I guess the point I am trying to make is this:
A search engine is basically being programmed to get as close as possible to what an actual searcher deems to be the best SERPs. The algos will constantly be improved as new ideas and methods are engaged. The reason you cannot simply state a black-and-white set of rules (or clear boundary) for what is or is not proper is fairly clear: you cannot predict fractal geometry or the basis of chaos theory in ANY way. You can only approximate and hope to get close (i.e., the SERPs). Whether this is the method employed by bots, I do not know… and it misses the point to simply try to solve a search engine's behavior or algo.
The point is that it may be safest to focus on the searcher, and what types of SERPs they would find most useful. The optimization then becomes a bit more clear. It is fairly easy to know when an SEO, or a simple webmaster, has overdone it. But again, like paynt mentioned, this is a subjective call, with boundaries that differ amongst us.
BTW Paynt - awesome post, I definitely agree with everything you mentioned. :)
That is just plain common sense, and how people had been writing good documents for years before Tim Berners-Lee was even born.
My feeling is: where SEO principles and good publishing principles converge, use them. I believe that is 90% of good "SEO", and that is where most efforts should go for long-term success. Another 5% is other stuff that naturally takes advantage of the hyperlinking nature of the Web. The final 5% is being smart without trying to deceive the robots, but these tricks usually only work short term.