I am sure there are a lot of us like me who do SEM & SEO professionally. The thing is, unlike being a doctor or a lawyer, we have no recognition. Just like doctors and lawyers, our profession can easily be damaged by BAD EGGS – e.g. doctors using their license to supply illegal drugs to the public. Because theirs is a recognized profession, they cannot get away with it.
G has been asking for, and is open to, ideas on how to improve its SERPs. The fundamental problem is that it relies on either off-page or on-page factors to rank pages. Black hat SEOs have exactly the same ability as white hat SEOs to work out, over a period of time, what G's algorithm favors – in fact the black hat SEOs probably have an advantage in that they have more capital available to meet G's countermeasures (like 10-year registrations) than white hat sites do. G's answer to this has been to use methods like DATE to clean out the SPAM – not an ideal situation, because it's taken everything out of the SERPs and prevents fresh content from becoming available in them.
Some sort of G qualification or verification is necessary, in my opinion. Some sort of license needs to be created, where a verification code can be placed on sites that have – written, paid for, and passed an exam? – or that submit IP addresses, domains, etc. under a QUALIFIED license. Just like doctors who practice illegally, SEOs would be at risk of losing their license. Part of the responsibility of owning a license would be to report BLACK HAT sites, to safeguard and protect your profession. To a degree G is already encouraging us to do this via SPAM reports. To a certain extent this has been attempted by dmoz – G has, fortunately, not relied on dmoz completely, as I am sure they must recognize how corrupt that directory really is. The first place I would look for black hat spammers would be the dmoz editors who don't list sites.
What I am asking for in this topic is ideas to help meet G's goal of listing only quality sites. Maybe it's not a registration process. That is the problem – what is the solution?
What on-page and off-page algorithm methods are necessary to prevent webmasters from gaming Google's algorithm? Remember, whatever you suggest, look at it again from a black hat perspective – i.e. "OK, so that's what they are doing – now how can I fiddle it and make my site appear to conform?"
In principle not a bad idea, as I think most of us don't want to try to hoodwink Google or the other engines, but just want to be able to maintain a good, honest site. Trouble is, it's a bit like trying to drive on a road system where you're not quite sure how fast to go and you're not told the highway code. All we are told is to drive safely.
There are some third parties that are establishing training programs and certifications. I actually attended Bruce Clay's SEO ToolSet training and found it very helpful. He does training in the Los Angeles area, and it is a 2- or 3-day seminar. You can check it out at <snip>.
You are learning his methods and his software, so it is not exactly what you are talking about. Google does have an AdWords Professional program, though. So who knows – maybe we'll see an SEO professional certification coming out.
[edited by: lawman at 12:40 am (utc) on Nov. 3, 2005] [edit reason] No URLs Please [/edit]