
General Search Engine Marketing Issues Forum

Search Engines' Stance Against SEO Is Shaky At Best
agerhart
msg:232665
6:23 pm on Feb 26, 2002 (gmt 0)

It is an ongoing saga, known only to those on the side of the search engines or the side of the search engine optimization professionals. On one side you have the SEO professional, whose main goal is to work hard and do whatever is possible, ethically one hopes, to obtain higher rankings within the search engines. On the other side you have the search engines, whose job is to provide their users the most relevant results. Somewhere along the line (long before I came into the picture, I assume) the relationship between the two professions went sour. Why did this happen? The search engines see SEOs as spammers.

The first case I saw of this was back in September, when Brett Tabke found the Inktomi spam database. This caused quite a stir within the SEO community, and with good reason. Inktomi had compiled a database of over 1 million URLs that were blacklisted from the Inktomi index, classified as spammers for one reason or another.

So, what was Inktomi trying to accomplish with this spam database? Inktomi was trying to clean up its index, eradicating spammers, porn sites, spam domains and sites, affiliate sites, and SEO companies. It is conceivable that Inktomi did not wish to eliminate all SEO companies, only the ones that were spamming the index heavily. During this crusade Inktomi would identify a site it considered a spammer, according to its guidelines, and would ban that site as well as other sites within the same Class C IP block. Inktomi was attempting to eliminate every site produced by each detected spammer.
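To make the "same Class C IP block" idea concrete: two hosts share a Class C block when the first three octets of their IP addresses match. A minimal sketch of how such guilt-by-association banning might work (my illustration only, not Inktomi's actual code):

```python
def class_c_block(ip: str) -> str:
    """Class C (/24) prefix of a dotted-quad IP, e.g. '203.0.113.7' -> '203.0.113'."""
    return ".".join(ip.split(".")[:3])

def expand_ban(flagged_ips: set[str], site_ips: dict[str, str]) -> list[str]:
    """Ban every site whose IP shares a /24 block with any flagged IP.
    site_ips maps site hostname -> IP address."""
    banned_blocks = {class_c_block(ip) for ip in flagged_ips}
    return [site for site, ip in site_ips.items() if class_c_block(ip) in banned_blocks]

# One flagged spammer takes down every neighbour on the same shared host.
sites = {"spammer.example": "203.0.113.7", "innocent.example": "203.0.113.9",
         "elsewhere.example": "198.51.100.4"}
print(expand_ban({"203.0.113.7"}, sites))  # ['spammer.example', 'innocent.example']
```

The collateral damage is easy to see: shared hosting crams hundreds of unrelated sites into one /24 block, so one spammer on the box can take down all of its neighbours.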

Let's remember that Inktomi's algorithm was based on older, on-page criteria. The algorithm weighed title tags, description tags, some on-page content, some keyword density, and link popularity. An algorithm of this nature makes the rankings very easy to manipulate.
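As a rough picture of how gameable that is, consider a toy scorer in this spirit. The weights and factor names here are hypothetical; Inktomi never published its formula:

```python
import re

# Hypothetical weights, for illustration only.
WEIGHTS = {"title": 3.0, "description": 2.0, "body": 1.0, "link_pop": 0.5}

def density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that are `keyword`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

def score(page: dict, keyword: str) -> float:
    """Toy on-page relevance score: weighted keyword densities plus a raw link count."""
    return (WEIGHTS["title"] * density(page["title"], keyword)
            + WEIGHTS["description"] * density(page["description"], keyword)
            + WEIGHTS["body"] * density(page["body"], keyword)
            + WEIGHTS["link_pop"] * page["inbound_links"])

page = {"title": "cheap widgets", "description": "cheap widgets for sale",
        "body": "widgets widgets widgets are cheap", "inbound_links": 12}
print(score(page, "widgets"))
```

Every term except the inbound link count is fully under the page author's control, and even raw link counts can be farmed - which is exactly why rankings built this way were so easy to manipulate.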

What was the outcome of all of this? One thing that happened, for good or ill, is that SEO companies and professionals became more alert to the fact that they need to be careful on the Internet. Thousands upon thousands of sites were banned from the Inktomi index, and all banned as a result of what guidelines? None. Sites were being added to the blacklist from outside sources. So, what happens to the site that has been on the Internet for years and has accumulated a number of domains, tons of links, and the other things Inktomi was using as criteria? You go on one of two lists: the blacklist or the whitelist. The blacklist is the spammers' list, and the whitelist is the untouchables.

You are probably wondering why I am bringing up this stale topic. Stay with me.

Another situation has arisen lately, not of the same magnitude as the Inktomi spam database incident, but closely related. Over the past few months, Google has been testing a way of penalizing websites that are over-optimized, or just plain optimized to rank well. Google. The "king of search engines", the search engine that was supposed to be so "webmaster friendly". So, how was this discovered?

GoogleGuy, a member at WebmasterWorld and a Google employee, recently stated in a thread [webmasterworld.com]: "Let me address that point a little more directly. We tried a new way to detect site optimization. It nabbed plenty of bad guys, but it also caught lots of smaller people who read SEO boards. We're slowing backing off some of that particular penalty now... There is an important message here though, especially for the smaller web master. Let's define that as: you manage a single-digit number of domains, or you read here to promote your own personal site, or you don't do SEO for a living. The message is pretty simple, and it's one that full-time SEO's should already know: SEO can at times be dangerous to the health of your site. Please be careful out there on the net, alright?"

Google admits to hunting down optimized sites, sites optimized by SEO companies, and, although this Google representative did not say it, probably SEO companies' own websites. Google then admits that during its hunt for optimized sites that had an edge over their competitors, innocent bystanders were caught in the crossfire. One thing differentiates the Inktomi hunt from the Google hunt: Google was not only going after spammers and porn sites, it was going after legitimately optimized sites. While the Google representative declined to say exactly which techniques they were looking for, the techniques could not have been overly complex if ordinary webmasters and legitimate SEOs were caught in the hunt.

At WebmasterWorld, new member after new member appeared wondering what had happened to their website. Why was their PageRank taken away? What happened to their links? What happened to their rankings? When Google implemented this new technology to detect optimized sites and nab "bad guys", many innocent webmasters and companies that had hired SEO firms were hit with the same penalty.

Why does history repeat itself like this? Why do the search engines feel the need to hunt down optimized sites?

Compare two companies. Company A and Company B are in all respects the same. Company A does not pursue a search engine optimization campaign, or any online or search engine marketing for that matter. Company B, realizing the potential, hires a search engine optimization firm for its optimization campaign, as well as its online and search engine marketing needs. The firm employs only ethical practices, focusing on the site's content and structure, obtaining quality links, and reworking the titles and keyword density of the pages. Company B's website then gets hit with a penalty from Google, while Company A's website sits untouched on the 20th page of the search results. Why? Would this happen in offline marketing or advertising? Would Company B get hit with a penalty or fine for pursuing a traditional marketing campaign?

The answer is obviously No.

What is the solution to these problems? In the Inktomi case, the obvious and easy solution would be for Inktomi to review sites more carefully or to build a stronger indexing and ranking algorithm. In the Google case, the solution is a bit more difficult. One thing Google could have done to prevent the situation was to narrow the scope of its target: instead of targeting websites that are properly optimized according to search engine and browser standards, why not target only the websites that use spam techniques? Another solution is for Google to change its indexing process. The latter would be much more difficult, and would involve implementing either paid submissions or some new technology.

I have an even simpler solution that would have applied to both situations: realize that SEO is an important part of every website and of marketing your website and company on the Internet. Lately the term SEO has been giving way to SEM, or search engine marketing. Why the change? There are not many free search engines left on the Internet; free inclusion is being replaced with paid-submission, paid-spidering, and paid-placement options. Let me break it down: SEO = SEM. Let me break it down even further: offline marketing company = SEO or SEM company, and the marketing avenue (magazine, newspaper, etc.) = search engines.

Copyright 2002 Top Site Listings [topsitelistings.com] - First usage rights donated to WebmasterWorld.

 

chiyo
msg:232695
5:00 am on Feb 27, 2002 (gmt 0)

Paynt, your question, I thought, was absolutely central to all this: "Define optimization!" I forgot that was one excellent question to follow GG's bombshell.

Somehow I don't think GG will respond. The massive subjectivity of the statement works to Google's advantage if their aim was to scare many away from optimization. I agree very much with you in WANTING to know, however.

For example, putting descriptive words about your content in the title is definitely optimization. Putting a summary at the top (as AltaVista themselves suggest) is optimization. Reducing code bloat is optimization. Even theming is optimization! And navigation systems and internal links are optimization. Providing links to other relevant sites is optimization.

All the techniques above also increase readability for the reader, and are the basis of any good document, on- or offline.

But what is over-optimization, for god's sake? We don't know, do we? And GG's exhortation "not to optimize" (really, if you read his posting, he was suddenly inserting "optimizing" where he had previously used "spamming" - even more general than "over-optimization") was supremely useless for us, but sublimely useful for Google if the hidden agenda was to get rid of a good proportion of pesky optimization by scaring the Buddha outta us...

digitalghost
msg:232696
6:07 am on Feb 27, 2002 (gmt 0)

I don't think anyone will ever get an answer about what Google, or any of the other engines, considers "over-optimization".

I do know that when a group of SEOs begins to critique a site's optimization methods, it doesn't take long for terms like "questionable" and "spam" to start popping up. This indicates that while SEOs might not know what Google considers those terms to mean, SEOs tend to have a way of defining them. Obviously this doesn't apply to all sites, as SEOs will just as quickly agree that a site has been "optimized well".

I also tend to believe that the SEs have no desire to punish sites that are optimized, but a definite desire to weed out the "spammers". The crux is how closely the engines' definition of spam agrees with the SEOs'.

The sheer number of variables involved makes it nearly impossible for us to check all the information the engines incorporate into their algorithms. As a result, we develop an "algorithm voodoo".
Relying on experience, we relate the information we gather to others, either by optimizing sites or in forums like this one. Most of the information we pass on is tried and tested; unfortunately, that variable problem keeps us from stating much in the way of absolute fact. We use "general rules", "rules of thumb", "conventional wisdom", etc. to present the information assimilated into our algorithm voodoo as "safe SEO practices".

The engines delineate what they consider to be spam, and sometimes we run afoul of that definition. Sometimes this is the fault of the SEO, and in some cases, the search engine. The PR0 debacle is a case of a search engine making a mistake. The problem I see is that they created the need for link popularity, and then had to change their own view of that policy as soon as it became apparent to SEOs that links, not content, were driving the rankings. Link farms sprang up all over and suddenly link popularity was skyrocketing.

So, they changed the algo. Tomorrow if they decide that titles don't count nearly as much as they originally thought, they'll change the algo again. Far from shifting the focus to developing content and ignoring optimization, what they are doing is driving the SEO business.

People will continue to pay others that have the time and desire to keep track of all the search engine changes to keep their sites listed near the top of the SERPs.

If Google decides tomorrow that headers and titles are meaningless for determining relevance a large number of sites will spiral downward in ranking. The focus of optimization will shift to whatever is then deemed pertinent. Clients will call, business will boom and optimization will go on. We'll all gather up our chicken bones and continue with the voodoo. Grumbling, but getting paid well while we grumble.

I certainly don't want Google defining optimization, or explaining how their ranking works. (At least not publicly; a private email would be great. ;))

No one visits the voodoo guy unless he has something mysterious to offer and search engine optimization is indeed a mystery to the uninitiated.

Now, I'm off to gather up some stray chicken bones and a few candles.

DG

vitaplease
msg:232697
6:25 am on Feb 27, 2002 (gmt 0)

There has never been a time with more need for professional SEO than now; the only pity is that, for the moment, it mainly has to be geared towards Google.

I am very glad the hunting down of "over-optimised" sites has happened - who wants to revert to the mess of one or two years ago?

It is because of this hunting down that you need a professional SEO, or months of learning from this forum. And part of the learning has to do with stopping over-enthusiastic webmasters from ruining their sites with tricks.

Never mind calling it optimisation. If a spade is a spade, call it a spade.

A journalist knows how important titles/headings and body text are.
An advertising copywriter knows how important catchy text is.
A scientist knows his reputation rests on how many people cite (link to) him.
A librarian needs to develop a clever searching and lookup/navigation system.
A marketing manager has to figure out where to advertise, or in which business directories to be listed.

In my opinion an SEO'er has to be all of the above on the WWW and more - and indeed, to figure that out is quite a specialised kind of optimisation.

Everyone here knows that you can still rank highly within, e.g., Google with a good deal of the above effort - even though there are no quick fixes.

If there is any valid reason for SEO'ers to complain, it is that it can be difficult to explain to your clients, or yourself, that this takes time, the above-mentioned effort, and quality content.

All the PPC advertising etc. not only reinforces the value of good SEO, it even puts a dollar value on it.

Concerning navigation:

Having every page link to every page will not be a problem within Google for a small site. For a bigger site (>50 pages) it will be, and rightly so, because who is helped by 50 links on one page? If you still need it, use a pull-down menu or a JavaScript navigational menu, which is more helpful to the surfer/searcher anyway.
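A rough sketch of that threshold check (the 50-link figure is the rule of thumb used above, not a published limit):

```python
def fully_interlinked(pages: list[str]) -> dict[str, list[str]]:
    """Site where every page links to every other page."""
    return {p: [q for q in pages if q != p] for p in pages}

def link_heavy(site: dict[str, list[str]], max_links: int = 50) -> list[str]:
    """Pages carrying more internal links than the threshold."""
    return [page for page, links in site.items() if len(links) > max_links]

# On a 60-page fully interlinked site, every page carries 59 links and gets flagged.
site = fully_interlinked([f"/page{i}.html" for i in range(60)])
assert len(link_heavy(site)) == 60
```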

chiyo
msg:232698
7:56 am on Feb 27, 2002 (gmt 0)

Copy of email correspondence today:

Enquirer: Hey, where is your link to blah.com? Can you give me the link? Why did you change ***.com?

A: We took it off, as we are worried that some search engine indexes penalise your rankings if you link to other related sites owned by the same group.

Enquirer: That is crazy.

*******************
The problem so interestingly shown here is that if we get too "scared", we actually reduce the value and usability for the reader, and diminish the hyperlink-driven nature of the Web that is at its very basis. While PageRank in theory supports this basis of the "web" (and I have been a regular advocate of its value in many posts), it has got to a stage where, as practised by Google, it may actually attack the very basis of hyperlinking that makes the Web a web.

incywincy
msg:232699
9:36 am on Feb 27, 2002 (gmt 0)

Don't mean to Google-bash, but there's so much hypocrisy with respect to spam. I reported a site that had over 7,000 cookie-cutter doorway pages, each identical to one another except for a town name.

I reported this to Google using their spam report page, and guess what: they're still ranked number 1 for <generic search term> <town name> for each of those 7,000 pages.

So why bother developing sophisticated filters and hurting small-time webmasters, while ignoring reports of commercial spam?
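Detecting this kind of template spam is not even technically hard, which makes the inaction more puzzling. A toy sketch (an illustration, not Google's actual filter): mask the one variable token and fingerprint what remains, and 7,000 doorways collapse into a single cluster:

```python
import hashlib
from collections import defaultdict

def template_fingerprint(body: str, variable_terms: list[str]) -> str:
    """Mask the per-page variable terms (here, town names), then hash what's left.
    Pages stamped out of one template collapse to a single fingerprint."""
    for term in variable_terms:
        body = body.replace(term, "{SLOT}")
    return hashlib.md5(body.encode()).hexdigest()

def cookie_cutter_clusters(pages: dict[str, str], towns: list[str], min_size: int = 100):
    """Group URLs by template fingerprint and return suspiciously large clusters."""
    clusters = defaultdict(list)
    for url, body in pages.items():
        clusters[template_fingerprint(body, towns)].append(url)
    return [urls for urls in clusters.values() if len(urls) >= min_size]

towns = ["Leeds", "York", "Bath"]
pages = {f"/widgets-{t}.html": f"Cheap widgets in {t}. Call now!" for t in towns}
print(cookie_cutter_clusters(pages, towns, min_size=3))  # one cluster of three doorways
```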

There, that feels better!!!

backus
msg:232700
9:55 am on Feb 27, 2002 (gmt 0)

Okay, let's look at the past SEO scams... the old switcheroo, hidden text, etc. Let's say a firm does that, and then gets banned. All of a sudden, they have the option to make themselves number one thanks to the PPC engines. It doesn't matter whether the site is good or not, so long as they have money. So the smaller companies, with sites on engines such as Fast and Google, have to do something to work their way to the top. These companies don't have the money to compete on Overture. Their only solution is SEO, and everything that goes with it, cloaking etc.

Then let's look at another side of SEO: a site designed as a presentation, such as a fully Flash site. Okay, all of you are thinking, "You shouldn't build a site fully in Flash!" Yeah, well okay, you tell that to my boss; he won't listen to me! The boss wants a Flash presentation, not a sales site. I then have to promote that site. The only way is cloaking. Now, in that cloaked section of the site, I don't put anything irrelevant; I just build an HTML version of the Flash site, with news sections, etc. However, it all remains cloaked because the boss doesn't want people to see that; he wants people to see the Flash site. Why? For the shock element at the beginning. I would be extremely peeved if Google came along and said, "Sorry mate, I know it's all relevant, and I know there is no other way for you to get to the top because we don't have a way of spidering Flash, but we can't accept cloaked sites."

What is left for me? PPC is too expensive. I could pay for Inktomi, but as said before, what would that give me? I still wouldn't be guaranteed a top position.

Like it or not, the search engines have to realise that the world turns. Everyone has a job to do, and one of those jobs is SEO. Take away the job and what are we left with? PPC. Google will die, because we will be left with unoptimised websites sitting in dormant positions, and the only alternative will be PPC.

I understand the stand against spam and scams, but that applies to severe cases, such as porn sites etc. Google should be supporting the professional SEO trade, because it is us keeping it alive, providing the relevant results etc., not them; we are doing their work for them.

chiyo
msg:232701
11:05 am on Feb 27, 2002 (gmt 0)

because it is us keeping it alive, providing the relevant results etc., not them; we are doing their work for them.

I agree with everything you say, Backus. We sympathise with you and are in the same position, coming from a developing country where paying in US dollars for anything is beyond us 99% of the time.

That is, until that last sentence! I believe that some search engines, notably Google, now have the technology to intelligently rank sites for relevance using a combination of valid external citations, freshness, sophisticated text analysis, and maybe theme awareness. They don't need the SEO guys to help them. SEO guys work for their clients, not the search engines. Another argument is that search engines were once dependent on webmasters registering URLs for their databases. That was repeated all the time in "the other forum" a couple of years back. Now they can spider themselves silly, and webmasters registering URLs or optimizing sites just get in the way. Now they can ignore submissions (as I guess almost all do - 95% of it is spam), but they have to find other ways to filter out sites that rank well not because of relevance or content, but because of our "clever optimization". I understand their problem. It is no good being a non-commercially ranked index if rankings depend on how much you can pay an SEO. In the end that becomes almost a PPC-driven index.

Sorry for playing devil's advocate.

nutsandbolts
msg:232702
11:10 am on Feb 27, 2002 (gmt 0)

It would be good if Google could introduce some sort of pay-per-index system. Although my main site will be lifted from its penalty in the next update, I have many other sites still affected.

But what can I do? Do I go out and buy some new domain names to be in the clear? Or do I keep e-mailing Google hoping for a reply?

GoogleGuy was very kind to look at one of my sites, but I cannot expect him to do the same for all of my PR0 sites. Perhaps pay-per-index is therefore a good idea, if it means a quicker refresh of my sites and a possible lifting of this horrible penalty.

gethan
msg:232703
12:12 pm on Feb 27, 2002 (gmt 0)

Welcome incyWincey. > over 7,000 cookie-cutter doorway pages each identical to one another except for a town name

I think the days of this type of page ranking No 1 are numbered.

Google's (or other SE's) logic will be along these lines:

If this is happening on this site, how many others are using the same tactic? Lots.
Can we tweak our algorithm/filters so that this type of practice doesn't help anyone? Yes - test it and put it in place. No - if it's a really horrendous spam on the index, then ban the site and bring out the rocket scientists to work on the tweak for next month.

Now, with that in mind, an experienced SEOer will shy away from doing the same with an established site. An inexperienced guy is likely to copy the tactic because "it works for them". I think this is a prime example of what GoogleGuy is referring to in his SEO health warning: "SEO can at times be dangerous to the health of your site."

The same principle applies to link farms, cloaking, keyword stuffing, and CSS tricks. The SEs will try to root out anything that lowers the quality of the SERPs... and they will try to do it in as generic a way as possible. This is where some sites will be hit by friendly fire.

But there is a game we can play - use every trick in the book, knowing that the results will be temporary but bring in huge returns in the short term. Or play it safe and aim for consistent long-term performance. The first option is really only available to those webmasters who can produce/afford disposable websites. Me, I play it safe... it's less work and worry ;)

mr_dredd2
msg:232704
5:08 pm on Feb 27, 2002 (gmt 0)

But guys, isn't the main thing here the lengths to which Google has gone in telling us what a proper/good site is?

Making sure our site is deemed honest and true to its nature now consists of:

how many repetitions of which words you should put on a page

how you should design your own navigation

whom you should not link to, and how you should link to them

what techniques you should use on your site, over and above plain HTML

This isn't just criteria for a good listing, which would be fine; it now seems to be verging on criteria for INCLUSION.

Doesn't this seem slightly ridiculous to anyone? I.e., shouldn't they be crawling and documenting the web instead of telling people what kind of stuff to put on the net for them to find?

It's rubbish to say "just follow a good format and you will do fine" anyway. The format changes for each sector. Just make an appealing, well-designed site for "insurance" and I will take my hat off to you if you get within the first 200. SEO isn't only good design; more than that, it's about COMPETITIVENESS. Sure, the vast majority of websites haven't been touched by all the filters, but that's not the point at all. Most of the competitive sectors and keyphrases ARE fairly heavily SEO-laden, and this is where most of the interest is in terms of bucks for all.

The point is this: Google wants an objective index, but - and this isn't unprecedented - they are attempting to subjectively alter what makes it into the index in order to get it "objective".

Where will it stop?

Slud
msg:232705
5:45 pm on Feb 27, 2002 (gmt 0)

Does anyone else out there pray for the end of effective SEO, so we can spend our time making pages for *users* instead of *robots*?

Too many times I've had to trade off user experience for SEO.

Bring on the reputation management and collaborative filtering to end all this page-optimizing trickery!

Brett_Tabke
msg:232706
1:21 am on Feb 28, 2002 (gmt 0)

>trade off user-experience for SEO.

I've rarely found that to be a problem. Convincing clients otherwise is the problem.

The data on Flash, DOM, and JavaScript does not show an increase in user satisfaction or "user experience" - it shows just the opposite. Hence Yahoo's and Google's long-term viability.

>to end all this page-optimizing trickery!

Not going to happen at all. It can't happen until data retrieval engines are able to abandon text as their prime data set. Text-based SEs are here for 10+ years to come - at least until video search engines are perfected and net bandwidth accommodates them. At that point, we move out of text-based websites into all of us basically having TV stations (ya... I know - wild concept, but that is where it is headed).

Michael Weir
msg:232707
11:38 pm on Feb 28, 2002 (gmt 0)

Wow... that was a lot of information. I don't know how relevant my reply is going to be, but here goes:

1. How long has SEO been considered spam? I consider spam to be two different things: Internet content that is unwanted (such as pop-up ads and unsolicited pornographic email), and optimizing a page by doing things the easy way, i.e., hiding text (repeated search phrases) and loading up meta tags. IMO the former should be called spam, and the latter should be called something else, but unfortunately it has become known as spam.
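For what it's worth, the "easy way" tricks are also the easiest to detect mechanically. A crude sketch of flagging the classic white-on-white hidden text (an illustration only, not any engine's actual filter):

```python
import re

def hidden_text_colors(html: str) -> list[str]:
    """Flag <font> colors that match the page's bgcolor -- the classic
    white-on-white hidden text trick of the era. Crude, for illustration only."""
    bg = re.search(r'bgcolor\s*=\s*["\']?(#?\w+)', html, re.IGNORECASE)
    if not bg:
        return []
    pattern = r'<font[^>]*color\s*=\s*["\']?' + re.escape(bg.group(1))
    return re.findall(pattern, html, re.IGNORECASE)

page = '<body bgcolor="#ffffff"><font color="#ffffff">widgets widgets</font></body>'
assert hidden_text_colors(page)  # white text on a white background gets caught
```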

2. How does someone in the SEO field go about getting their sites ranked with so many rules? It seems like getting your site(s) out where people can see them is about as easy as walking through a minefield blindfolded. You're going to run into trouble along the way, obviously - or you can cheat and remove the blindfold (optimize pages for the SEs). Where do they draw the line and consider it spamming? Is there a comfortable place where you can optimize safely and still get good rankings? Does luck play a major role here - is it possible to have a winning strategy that is both safe and beneficial in all of this chaos? Since I'm still very new, it's almost overwhelming how many dos and don'ts and how many rules there are. Not only that, but things seem to change so often... These are all rhetorical questions I asked myself as I read through this topic.

finally...

3. Search engine trends. In my year in this line of work I've noticed that not only is it getting tougher to effectively and safely optimize sites, but marketing costs have increased: Overture/GoTo increasing minimum bids, Yahoo and LookSmart increasing their inclusion fees, and Yahoo sneaking in their yearly fees. Have I missed any?

I just thought of this: with the search engines being so picky about what is right and what is wrong, no wonder people resort to shady techniques to get good listings...
