|does google award points for non-optimization?|
Do you think Google awards weight for non-optimization?
Do you think a book that has no cover, dust jacket, bio, credits or table of contents is weighted by a reviewer or reader as better?
Of course not.
However, do I think the same book, over-optimized just to grab attention (i.e. trying to make the book find the reader, rather than the reader finding the book), is often frowned upon? Yes.
Optimizing a site is good, it improves quality for the WM, SEO, and average searcher. It's sort of like writing a paper with proper composition. I think there is a grey-line where a person can over-optimize, and thus destroy any decency the page in question originally had.
My understanding is that optimization is defined as stuff done to pages or sites that improves the ranking, so anything that improves the ranking is optimization by definition, and anything that doesn't is not.
Therefore, I think not, by definition.
Now, whether or not Google may penalize you for some of the stuff people do to their pages in the name of improving their rankings is another question. I think it's pretty obvious that they do.
I absolutely agree with amoore. But we are talking about cause vs. effect here. If you make the optimization of pages or sites match the content (without over-emphasizing or exaggerating), you have made it easier for a searcher and/or search engine to find you. Thus (effect), you tend to rank higher. If you do not optimize at all, it is like having a blank card catalog in a library (back when they used to do that). No one will be able to find you unless they shuffle through the books on a shelf individually. You have in effect created a website in the middle of the desert with no roads leading to it.
>My understanding is that optimization is defined as stuff done to pages or sites that improves the ranking, so anything that improves the ranking is optimization by definition, and anything that doesn't is not.
As you can see from the current update, many optimizers just got a much different result.
Optimization to one SEO may be SPAM to another SEO
Optimizing for non-optimization - ONO ;)
If you haven’t read it yet, this was a pretty good discussion on what we think Google means when they say they discourage ‘over-optimization’, ha.
Define Over Optimization [webmasterworld.com]
This kind of discussion makes me smile because it draws attention not to what we shouldn’t do, as I am sure Google would like, but to what we can do, will continue to do, and are always learning more about.
I like this comment….
|Optimizing a site is good, it improves quality for the WM, SEO, and average searcher. It's sort of like writing a paper with proper composition. - argusdesigns |
But then this…..
|I think there is a grey-line where a person can over-optimize, and thus destroy any decency the page in question originally had. - argusdesigns |
While I may agree with this I consider it a subjective call. Once again, how do we and how do search engines determine over optimization and is that possible?
From this discussion we have been known to jump on to ethics. Now that’s fun. Eventually those discussions get crushed because we can’t control ourselves and we start pointing fingers and pushing each other instead of focusing on the theories.
I think it comes down to what boundaries you want to set on the work you do, what you consider risky, and when you feel compelled to inform your clients of those risks. Staying informed, which is what the Webmaster World addiction actually comes down to, is of course the key then to making those decisions.
Let's keep talking about this. As an industry these discussions can bring us closer and perhaps that knowledge will help raise our personal standards.
Optimisation is dual Usability [webmasterworld.com] - thanks for making that ethically acceptable, Tedster!
|But then this….. |
|I think there is a grey-line where a person can over-optimize, and thus destroy any decency the page in question originally had. - argusdesigns |
|While I may agree with this I consider it a subjective call. Once again, how do we and how do search engines determine over optimization and is that possible? |
You are definitely right about this, it is a subjective call. I guess the point I am trying to make is this:
A search engine is basically being programmed to get as close as possible to what an actual searcher deems to be the best SERPS. The algos will constantly be improved as new ideas and methods are engaged. The reason you cannot simply state a black and white set of rules (or a clear boundary) for what is or is not proper is fairly clear. You cannot predict fractal geometry or the basis of chaos theory in ANY way. You can only approximate and hope to get close (i.e. SERPS). Whether this is the method employed by bots, I do not know... and it misses the point to simply try to solve a search engine's behavior or algo.
The point is that it may be safest to focus on the searcher, and what types of SERPS they would find most useful. The optimization then becomes a bit more clear. It is fairly easy to know when an SEO, or a simple webmaster, has overdone it. But again, like paynt mentioned, this is a subjective call, with boundaries that differ amongst us.
BTW Paynt - awesome post, I definitely agree with everything you mentioned. :)
My feeling is that 90% of what SEO people call optimization (the part that works) is just traditional publishing good sense. A Web designer does not need to know or have heard anything about "SEO" to use descriptive titles, link to relevant sites and get links back, put summary info at the top of the document and at the bottom, have a clear navigational system, and use headings to organize data around the key keywords.
That is just plain common sense, and it is how people have been writing good documents for years before even Tim Berners-Lee was born.
My feeling is: where SEO principles and good publishing principles converge, use them. I believe that is 90% of good "SEO", and that is where most efforts should go for long-term success. Another 5% is other stuff that naturally takes advantage of the hyperlinking nature of the Web. The final 5% is being smart without outright trying to deceive the robots, but those tactics usually only work short term.
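The "publishing good sense" items above (descriptive title, summary at top and bottom, clear navigation, headings, relevant links) can be sketched as plain page markup. This is a minimal, hypothetical illustration - every title, phrase, and URL here is made up for the example, not a formula from the discussion:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Descriptive title: says what the page is about, not a keyword pile -->
  <title>Caring for Indoor Ferns: Light, Water, and Soil</title>
  <meta name="description"
        content="A practical guide to keeping indoor ferns healthy, covering light, watering, and soil.">
</head>
<body>
  <!-- Clear navigational system -->
  <nav>
    <a href="/">Home</a> | <a href="/plants/">All plants</a> | <a href="/plants/ferns/">Ferns</a>
  </nav>

  <h1>Caring for Indoor Ferns</h1>
  <!-- Summary info at the top of the document -->
  <p>Ferns thrive indoors with indirect light, evenly moist soil, and decent
     humidity. The sections below cover each in turn.</p>

  <!-- Headings organize the data around the natural key words -->
  <h2>Light</h2>
  <p>Bright, indirect light works best; direct sun scorches the fronds.</p>

  <h2>Water and soil</h2>
  <p>Keep the soil moist but not soggy; a peat-based mix drains well.</p>

  <!-- Link to relevant sites (and earn links back by being useful) -->
  <p>For species identification, see this
     <a href="https://example.org/fern-guide">fern guide</a> (placeholder URL).</p>

  <!-- Summary info repeated at the bottom -->
  <footer>
    <p>In short: indirect light, moist soil, good humidity. Nothing here is
       done for the robots; it is just well-organized writing.</p>
  </footer>
</body>
</html>
```

Note that nothing in this sketch is an "SEO trick": each element exists for the reader first, which is exactly the point being made about good publishing and good optimization converging.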