The adjustment of HTML page entities and content for the express purpose of ranking higher on search engines. In other words: Search Engine Optimization is the manipulation of search engine ranking systems.
Those optimizers who have been barred or removed from various search engine advertising programs need to reread the previous two sentences. That is why you were denied. Search engine optimization is against the TOS of most major search engines.
I bring this up because I've been reading a great deal lately from SEO "experts" who are confused about what we do for a living.
I'm sorry, I thought you knew.
The definition implies that any change to the HTML to improve the site could be defined as search engine optimisation and therefore could get the site thrown out of a Search Engine.
If what we do can get our clients' sites thrown out of most major search engines, then why do we do what we do?
On the other hand, most people who come to us have sites where it doesn't matter whether they are thrown out or not - they don't get listed where anyone can see them in the first place.
Another thing I have never read is that search engines allow, or even help, some website owners to optimize their pages to the point of completely failing to meet the searcher's query, because those owners pay for it in some way, while other website owners, whose sites meet the searcher's interest perfectly, are not even allowed into the search engine because they don't pay. It is still true.
>eg: Search Engine Optimization
>is the manipulation of search
>engine rankings systems.
.... that isn't really correct. We don't actually manipulate the SE's 'ranking systems'. We can only manipulate what we control.... our pages (or pages on other domains.) In other words, we don't manipulate the algorithms, just how we rank when applied to a standard....
It does not imply 'manipulation' to me. Of course, you can cheat by cloaking or other 'risky' methods, and that is clearly manipulation.
My most successful 'optimisation' (at least as far as increased traffic is concerned) consisted of inserting correct titles on 200 pages and making sure that the keyphrase was mentioned roughly the right number of times.
In this case I was merely helping the search engine to correctly identify the content of the site.
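The kind of optimisation described above (correct titles, keyphrase mentioned roughly the right number of times) can be checked mechanically. Here is a minimal sketch of such an audit; the sample page, the thresholds, and the tag-stripping heuristic are all hypothetical, not anything a particular engine publishes:

```python
# Rough audit in the spirit of the post above: check that a page has a
# descriptive <title> containing the keyphrase, and that the keyphrase
# appears a "reasonable" number of times in the visible text.
# min_hits/max_hits are illustrative guesses, not engine rules.
import re

def audit_page(html, keyphrase, min_hits=2, max_hits=10):
    """Return a list of warnings for one HTML page."""
    warnings = []
    m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = m.group(1).strip() if m else ""
    if not title:
        warnings.append("missing or empty <title>")
    elif keyphrase.lower() not in title.lower():
        warnings.append("keyphrase not in <title>")
    # Strip tags to approximate the visible text a robot would index.
    text = re.sub(r"<[^>]+>", " ", html)
    hits = len(re.findall(re.escape(keyphrase), text, re.IGNORECASE))
    if hits < min_hits:
        warnings.append(f"keyphrase appears only {hits} time(s)")
    elif hits > max_hits:
        warnings.append(f"keyphrase appears {hits} times - possible stuffing")
    return warnings

page = ("<html><head><title>Blue Widgets</title></head>"
        "<body><p>Blue widgets for sale. Our blue widgets ship fast.</p>"
        "</body></html>")
print(audit_page(page, "blue widgets"))  # a clean page returns []
```

Run over 200 pages, a script like this is just "helping the search engine correctly identify the content" at scale - nothing covert about it.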
Sure, I manipulate sometimes, but not always.
An 'optimised' page is one that performs as well as it is able. If 'manipulation' techniques have an associated risk of banning, it could be argued that those pages are not optimised.
Even if there were, would this make things any easier to find? Nope. We've seen the whole ###1 AAA company stuff in Yahoo! directories...and this system is somewhat similar to a library classifying by the alphabet...or author.
So, how can any search engine have any problems whatsoever with SEO? Again, my view on that is that for years, before I got into the business, many SEO firms made lots of money... the search engines, eg, Inktomi, Excite, Altavista, Lycos, got greedy, and wanted that kind of money for themselves. Now, look what we've got: some engines don't care that SEOs make money, and they actually have some relevance a lot of people like, eg Google, Wisenut, Teoma.
If each engine came forward and said, "this is exactly what you can do to rank higher with us" then our industry would be gone. They can't, because many of them like to pretend that they are creating an algo based on some inherent structure in the web...how much of that very structure they use to create an algorithm was made by the search engine optimization community? I believe probably a lot of it.
Again, nice post. I am hoping that some search engines will read this, think twice about their ways and realize it's not us that causes them problems: it's their perspective that's doing it, and their adversarial stance, eg, "we don't want those guys manipulating our results".
We couldn't 'manipulate' them if we had nothing to work with. And some engines leave that doorway too wide open for us not to just stroll right through.
1) The dumbing down of the technology of a web site that a search engine views.
2) Translation of the language used in a web site to reflect the manner in which people perceive and search for that which the site offers.
3) Getting it linked and listed in appropriate related sources of information.
The definition that you supply here implies the changing of HTML to improve rankings, and therefore completely ignores the fact that often the best optimisation comes from setting up a new domain with completely new HTML pages, which by the definition given would fall completely outside of search engine optimisation.
I like this thread - all the posts have been thought provoking. Just perfect for a nice Saturday afternoon think.
1. Developing a website to be in harmony with the Search Engine.
2. Developing a website to fool the Search Engine.
I imagine the SEs like SEOs who do No. 1, as it makes the SEs' results more accurate - and that the SEs hate SEOs who do No. 2, as it makes their results less accurate.
I see search engine optimization as a positive thing and far from being against the Terms of Service of any major search engine.
Provided webmasters do not use any of the tactics or tricks which the search engines *expressly advise are against their Terms of Service*, the manipulation of one's *site* (as opposed to ranking systems) in order to achieve higher rankings can only serve the purposes of the search engines, the searchers and the entire WWW.
I see optimization as:
1) Providing *in depth content* relative to the subject matter of the products being sold or the topical theme of the site.
2) Ensuring that individual pages are designed in such a way as to let the robots know exactly what information is to be found within that specific page by using pertinent keywords in the title and within the first few sentences of the page.
The dilemma webmasters and search engine optimizers face is making the choice between providing relevant content or employing some of the dubious practices available which SEs have (as yet) been unable to detect and punish.
The use of cloaking and other unseen *tricks* which serve to water down the relevancy of search results have forced those of us with *clean sites* to keyword load our pages in order to be found. I find it distasteful to keyword load ... but at least it is done openly.
One method of SEO is covert and the other overt. The covert method allows webmasters to design exactly what they want, regardless of current robot technology and its inherent limitations ... and to make up the *relevancy factor* through the use of tactics specifically designed to tell the robots what the page is all about, despite the fact that the searcher cannot see it and the robots cannot detect it.
The overt method of SEO is to have everything in plain view for the surfers (and robots alike) to read what is on the page.
To my way of thinking, if a webmaster or search engine optimizer is using any covert methods in order to tell a robot to read that which does not exist on the page a surfer sees ... then that site deserves to be punished or banned.
It's true that current robot technology does not allow the creative webmaster a free hand in his or her design ... which is understandably frustrating for many. However, if the point of a web site is to sell something, disseminate information or get an idea across, then why should it be so difficult to accept robot limitations and work within the guidelines outlined by the search engines?
SE warnings of things which "might" get a site banned really rot my socks, because there is no follow through on their part. I do not employ any of the methods they *say* they don't like ... but it is clear they have no way to detect and punish. If ratting out your competitor doesn't work (and it hasn't in my case) ... then what are the alternatives?
I think it is ultimately up to the individual webmaster. Does he/she want to risk being banned for the sake of higher rankings or not? I realize that the majority of you would scream a resounding YES. I don't agree ... but that is my choice.
As jeremy_goodrich so rightly pointed out ... >there is no standard of web publishing, no standard algorithm, and no standard way of classifying information.<
That is why this forum exists. That is why SEO exists. Once robot technology catches up to design technology (if that day ever comes) ... we will all know the hard and fast rules and Terms of Service of the various SE's. Until then, it is anybody's game and the arbitrary banishment from SE's as well as new and improved ways to trick the SE's into seeing that which is not there will go on and on and on ...
Optimization: The procedure or procedures used to make a system or design as effective or functional as possible, especially the mathematical techniques involved.
Manipulate:
1. To move, arrange, operate, or control by the hands or by mechanical means, especially in a skillful manner
2. To influence or manage shrewdly or deviously
3. To tamper with or falsify for personal gain
I think it boils down to personality types:
1) Some people enjoy creating within a structure.
2) Others enjoy pushing the envelope and rebelling a little bit.
3) And some are extremely self-centered and only care about achieving their own purposes, no matter what.
I think I'm a 2, although I've been accused of being number 3! (It all started with my dad, LOL)
Tedster, I really enjoyed your point of view.
I think of it this way..... Search Engine Optimization has one definition, but the definition of it changes on a case-by-case basis.
If you can optimize your client's site and adhere to the SE's rules so that there are no risks involved, and no chance of one of the SE's catching on and banning you, you will do it. If you can optimize the HTML, put in all of the titles, apply all of the things that you need to, but not use any spammy techniques that the SE's have taken a stance against, and know that the site will be received well by the SE's....then why do anything else?
But......if you get a new client in, with a site that is all Flash or has tons of images, then I think that you are going to have to pull some tricks. Let's say that all of the content is in Flash or images, but this client is a leader in the market. The user should be able to get to this site.....it is in the SE's best interest to feed this site to the user. But, the search engines can't find it because all of the content is in the Flash or image files. It is your job to bridge the gap between the user and the website.
This should not be against the SEs' TOS. I understand that some of the things that we do as optimizers can be a bit sneaky sometimes, and sometimes the definition of SEO can be to manipulate the results of a search engine. But I also think that as optimizers it is up to us to draw the line in the sand that we cannot cross. If your client's website is a commerce-enabled shoe store, and he wants to be found for "shopping", this is okay. But if this same client wants to be found for pants or some other irrelevant keyword, it is the job of the optimizer to tell the client no.
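Bridging the gap for an all-Flash site, as described above, usually means pairing the movie with equivalent HTML text that both robots and non-Flash browsers can read. A minimal sketch of that idea follows; the filename and the copy are hypothetical placeholders, and the fallback text must honestly mirror what the movie actually says, or it slides into the covert tricks discussed earlier in this thread:

```python
# Wrap a Flash movie in an <object> whose body is plain, indexable HTML
# text. Robots and browsers without the plugin see the text; everyone
# else sees the movie. Filenames and copy here are made up for the demo.
from html import escape

def flash_with_fallback(swf_url, fallback_text, width=550, height=400):
    """Return an <object> embed whose body is plain HTML fallback text."""
    return (
        f'<object data="{escape(swf_url)}" '
        f'type="application/x-shockwave-flash" '
        f'width="{width}" height="{height}">\n'
        f"  <p>{escape(fallback_text)}</p>\n"
        f"</object>"
    )

print(flash_with_fallback(
    "intro.swf",
    "Acme Widgets: hand-made blue widgets since 1902."))
```

The same pattern covers image-heavy pages: the visible text lives in ordinary HTML, in plain view, rather than hidden where only a robot would find it.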
I think that this is where Tedster's comment comes in.....
- More High Band Width connections.
- Video (as high band width becomes the norm).
- 3D (that actually works).
- More Pay for inclusion.
- SEs providing SEO for a fee.
- New and more complex OS (with new weaknesses).
- New viruses.
I know we have some of these in a basic form now, but imagine what it might be like in 2006. SEO and the SEs will evolve through harmony and conflict with each other.
This analogy holds true to the extent that the SEO expert is encouraged to take maximal "legal" means to have his/her page ranked well for a given keyword, e.g., ensuring that the keyword appears in the TITLE tag, ensuring that there's ample visible text with the keyword in it, ensuring that there are sufficient overt links from that page to other related pages on or off the site, etc. It's stretching the analogy, but you might call this "irrelevancy avoidance". There's also a range of techniques which most SEs concur constitute "illegal" means for high ranking, including cloaked page-jacking, non-visible links, "cookie-cutter" doorway pages, etc. You might call this "irrelevancy evasion".
Using this analogy, an SEO expert's role is to prepare a client's Web pages for maximal keyword "irrelevancy avoidance" while scrupulously avoiding keyword "irrelevancy evasion".
The big difference: tax consultants have reams of official rules and examples they can learn and consult. SEO consultants can only reverse engineer the systems by constant experimentation or by using 3rd-party resources such as WebMasterWorld.com.
You may not use the Google Search Services to sell a product or service, or to increase traffic to your Web site for commercial reasons...
Be very careful about allowing an individual consultant or company to 'optimize' your web site. Chances are they will engage in some of our "Don'ts" and end up hurting your site.
Sounds like they consider us Tax Evaders to me Winooski.
'chances are' and 'be very careful' are the key phrases here.
If my Mum was planning a site to provide a nice pension, I would give her the same advice.
Pick an SEO from the Net at random and the 'chances are' you are running those risks.
Pick one from our fora, and the 'chances are' you get a better SEO.
'Be very careful' (ie pick ME) and you are guaranteed to get a good one :)
1) those who have their own website which they focus their efforts on entirely.
2) those of us who do SEO for a living, work with sites we didn't build, can't alter the structure too much and are under an obligation to increase traffic asap.
now i don't deny there might be some crossover between what people in each camp may think, but it's quite simple:
those who do their own website answer to themselves.
those who do other people's websites answer to a client.
and therein lies the rub, and the qualitative difference between personal optimisation and professional optimisation.
personally, as long as the phrases are directly related to the content, I don't think there's any ethical "manipulation" involved. technically we could talk about "manipulation" until the cows come home.
I've used all that, but I've also realised that as search engine algos have improved, they reward good content in exactly the same way good content is rewarded in any book or printed document: a good structure. When writing a book I make sure I have a good structure, with good citations, subject-based chapters, an introductory chapter and a summarising, concluding chapter. I bold important material I want people to remember.
I have some sympathy for brett's view, but tend to agree with others in that if "optimizing" is aimed at helping the reader and making as clear as possible the actual content and benefits of a site, you are helping both search engines and readers. If that's being a spammer, I'm pleased to be one!
The first time you change anchor text from More Info to More Widget Info because you know the engines are looking for Widgets in anchor text, you are no longer optimizing for the surfer. According to the SEs, that is Spam. If you have something to be said that can be said in a paragraph, but you know you need two-three pages of content to get your paragraph ranked well in the engines, and you develop that content, no matter how useful it ends up being, you're spamming.
If you take almost any book about search engine optimization, you could change the title to How To Spam And Not Get Caught and the title would fit the book. :) Wouldn't sell as many copies, but the title would be accurate. It's all about perception. SEO is noble, Spam is, well, Spam.
SEO is guiding clients through the process of creating content that is useful to visitors, and providing clients with information about what their clients are searching for.
SEO is creating streamlined HTML pages so that they load quickly, contain useful content, and give the visitor a reason to stay.
Finally, SEO is beating the crap out of the competition on the search engines by simply doing a better job at building websites.
If that's not what you're doing, then you're not in it for the long haul.