Clean site versus Spam - What to do?

Do the directories care about hidden text?

2:45 am on Aug 31, 2001 (gmt 0)

Senior Member

joined:Sept 19, 2000
posts:2501
votes: 27


Hidden text? I am so frustrated!!! More and more of my competitors have sites listed in each of the major directories that would serve as textbook examples of spam, and I just don't understand why this abuse is not detected by any of the directories ... or do they even care?

These listings have been in the databases for so long that they have become permanent and seemingly immovable fixtures across the web. It seems impossible for *clean* pages, written without trickery of any kind, to earn better rankings.

How does one play fair when the playing field is not level and the rules do not apply equally to all sites? Do you risk playing the game their way and waiting for the axe to fall? Should I buy another domain, rewrite the whole site (with a completely different design) using every underhanded trick I can think of, and leave the clean site alone ... in the hope that some day it will be rewarded?

Why is it that, when all the search engines claim spam will not be tolerated, the practice of using hidden text (which is easily seen in the source code) is consistently rewarded with high rankings?
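
For anyone who has never seen the trick, this is roughly what shows up in these sites' source (a made-up sketch; the keywords and colours are invented for illustration):

    <body bgcolor="#FFFFFF">
        <!-- The page the visitor actually sees goes here. -->

        <!-- Hidden text: white text on a white background, invisible
             in the browser but plainly readable in the source. -->
        <font color="#FFFFFF">
            cheap widgets best widgets widget store buy widgets online
        </font>
    </body>

Viewed in a browser, the text is invisible; viewed in the source, it is obvious.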

I have written to Google twice in the last year and have reported three of these sites. The answer I got back was that the Google detection system was not yet perfect, but that they were working on it. I got the same answer a year ago. The timeline for detecting this sort of thing is agonizingly slow. In the meantime, those of us following the written policies of the SEs sit and wait to reap the benefits of clean living.

Since most search engines use one directory or another for basic data, I figured I should find out whether anything can be done at the source.

I have pretty much given up on the ODP because there doesn't seem to be anyone there I can write to for my categories. I submitted one of my sites back in May, asking for a review, a new description and a category change, because the content had *completely* changed and the old description no longer applied. The only thing that remains of the original site is the company name and the URL. No joy to date.

So I guess the real question is: do Yahoo or LookSmart care about this sort of thing, and do they act on sites reported? Obviously, it had no effect with Google ... these sites are all still in the top 5 rankings for all major keyword searches, and their Google PR is very high due to tons of inbound links collected over the years. It's a catch-22 situation. As long as they continue to use hidden text and get away with it, they will continue to get top rankings, and more and more sites will link to them. How do you compete without breaking the rules?

Help!

3:54 am on Aug 31, 2001 (gmt 0)

Senior Member

joined:Sept 30, 2000
posts:803
votes: 5


>I just don't understand why this abuse is not detected by any of the directories ... or do they even care?

Hidden text, in and of itself, should not affect listings in any directory. If a site has content, it should (theoretically) be listed.

However, as with any human-edited directory, you want to make sure that you don't irritate the editor, since editors are, after all, human and have prejudices just like you or me. Hidden text, spammy titles, Flash, frames, glaring colors, distracting animations, embedded sounds: any of these COULD be something an editor doesn't like, and that could be a danger if you get an editor who either thinks your site is borderline as far as content is concerned or who weighs stylistic details heavily.

I will say that I have 2 family members and at least 5 very good friends who edit at various directories; none of us looks at source code when considering whether to list a site.

>Why is it that, when all the search engines claim spam will not be tolerated, the practice of using hidden text (which is easily seen in the source code) is consistently rewarded with high rankings?

Search engines are a different matter entirely, and you will probably want to address them in one of the SE-specific forums here at WmW.

9:59 am on Aug 31, 2001 (gmt 0)

Senior Member

joined:Sept 19, 2000
posts:2501
votes: 27


>I will say that I have 2 family members and at least 5 very good friends who edit at various directories; none of us looks at source code when considering whether to list a site.<

Thanks Laisha ... I guess I have the answer to my burning question. It comes down to technology then and who can hide the longest.

1:15 pm on Aug 31, 2001 (gmt 0)

Senior Member

joined:Sept 30, 2000
posts:803
votes: 5


>It comes down to technology then and who can hide the longest.

Not at all.

We're speaking of directories, where human beings review sites.

The whole reason the SEO / SE "war" continues is that the SEs are trying to determine relevance with software. Obviously, software cannot discern the actual content of a web page, so it uses words and formulas (algorithms) to make the determination. In effect, the software emulates human reviewers as best it can.
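
To put "words and formulas" into concrete terms, here is the kind of naive scoring a simple engine might do (purely hypothetical; real algorithms are far more involved and closely guarded):

    // Hypothetical, naive relevance formula: score a page by how often
    // the query term appears relative to the page's total word count.
    function relevanceScore(pageText, queryTerm) {
        var words = pageText.toLowerCase().split(/\s+/);
        var term = queryTerm.toLowerCase();
        var hits = 0;
        for (var i = 0; i < words.length; i++) {
            if (words[i] == term) {
                hits++;
            }
        }
        return words.length > 0 ? hits / words.length : 0;
    }

    // Stuffing extra copies of a term into the page - visibly or
    // hidden - raises this score without changing anything a human
    // reviewer would notice.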

On the other hand, SEOs (often) use manipulation in order to boost rankings, which is why hidden text was invented in the first place.

The reason hidden text is now "illegal" is that the SEs don't want artificial means interfering with what they are trying to accomplish.

Directories, on the other hand, are doing what SEs are only imitating. They are using human beings who can evaluate and describe sites. If a human being looks at a site, the fact that keywords are stuffed into a title or that text is hidden does not affect the human's discernment one way or the other.

A site with excellent content about poodles is relevant to poodle-lovers whether it has hidden text or not. The hidden text neither adds to nor subtracts from the content. Take away the hidden text and you have exactly the same relevance.

goldm

7:02 pm on Aug 31, 2001 (gmt 0)

Inactive Member
Account Expired

I would tend to agree with laisha, and I am really not sure that I would confuse the inclusion of text on a web page (hidden or otherwise) with spam.

A directory editor looks at every site that they add. If I were investigating a complaint from another website submitter, like the one you have brought up here, I might want to know whether the hidden text somehow interfered with the user's experience of the site. Unless the text took the form of some JavaScript routine that whisked you off to another domain, I doubt it would have any bearing on whether the site remained listed.
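
To be clear about the sort of routine I mean, it would look something like this (a hypothetical sketch; the destination address is invented):

    <script type="text/javascript">
        // The page the editor reviewed loads, then this line
        // immediately whisks the visitor off to another domain.
        window.location.replace("http://www.example.com/");
    </script>

Visitors never see the page that was actually reviewed, which defeats the point of the listing.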

As far as I know, the only time ODP editors investigated the source code of pages submitted to the directory was when they were trying to establish some connection between one page and another. There is really little or no reason to do that, as laisha has pointed out.

Liane, if there is any encouragement I can give, it might be in reply to your comment, "These listings have been in the databases for so long, that they have become permanent." There is *nothing* permanent about a directory listing. They are constantly reviewed by multiple editors. If an editor ever determined that a site was inappropriate for a category where it was listed, they have many options available, including un-reviewing, deleting, re-editing, discussing with other editors, etc.

4:10 am on Sept 1, 2001 (gmt 0)

Senior Member

joined:Sept 19, 2000
posts:2501
votes: 27



Hi Laisha, and thanks once again for your thoughtful response. It is truly wonderful to have a forum where we can all learn from other people's experiences, and speaking directly to a directory editor is even more helpful. I apologize in advance for this epic post, but I have been thinking about what you had to say all day ...

<We're speaking of directories, where human beings review sites. The whole reason the SEO / SE "war" continues is that the SEs are trying to determine relevance with software. Obviously, software cannot discern the actual content of a web page, so it uses words and formulas (algorithms) to make the determination. In effect, the software emulates human reviewers as best it can.>

I understand the importance of the human-edited directories and appreciate their invaluable contribution to the internet. What I meant by my comment, "It comes down to technology then", was that since the editors do not concern themselves with hidden text, and do not consider it relevant, it is up to the technology of the individual SEs (which do not use humans) to identify the offending sites. This technology seems to be a very long time coming, and in the meantime, those of us who are trying to cooperate with the rules are losing business. How long do the SEs figure owners of clean sites will wait before we begin to fight fire with fire?

<The reason hidden text is now "illegal" is that the SEs don't want artificial means interfering with what they are trying to accomplish.>

You have just described a real conundrum. Humans can see that which the search engines cannot, but editors do not consider the use of hidden text a criterion for exclusion or punishment. The SEs, on the other hand, do not want sites to mess with the algorithms that evaluate page rank, and have therefore condemned the use of hidden text; some have stated that they will penalize or ban sites using these practices ... which they do not seem able to detect. Hmmm.

If SEs don't want us to use artificial means (hidden text) to interfere with what they are trying to accomplish, and if editors do not consider the use of hidden text to be of any particular importance when evaluating a site, then the two are working at cross purposes (on this one issue), and it raises the question: why is nothing done when an offending site is reported to an SE? It seems to me that they should encourage such reports. I am convinced the problem would disappear within weeks, as the public at large would solve their algorithm problems for them! Of course, they would have to employ humans to investigate the reports, and they would initially face a deluge of them, but as soon as word got out that the public was being encouraged to submit the URLs of offending sites, most of those sites would rush to clean up their act, lest they be banned or penalized!

<Directories, on the other hand, are doing what SEs are only imitating. They are using human beings who can evaluate and describe sites. If a human being looks at a site, the fact that keywords are stuffed into a title or that text is hidden does not affect the human's discernment one way or the other.>

Very true! However, since human beings award the sites a page rank, and since all editorial opinions are subjective by nature, the actual site content can be, and often is, less deserving of a high page rank than that of other sites.

Case in point: one site (which gets top ranking across the board in my business category) uses cartoon characters and cute little animations on its index page. (I find them annoying beyond belief ... but obviously the DMOZ editor who reviewed the site disagrees with me.) The site content is not in and of itself anything special, well presented or even information rich. There is little copy on the index page for the user to see, but the source code is loaded up with hundreds of keywords.
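
By "loaded up with hundreds of keywords" I mean source along these lines (a hypothetical reconstruction; the keywords are invented for illustration):

    <head>
        <title>Widgets Widgets Widgets - Cheap Widgets - Widget Store</title>
        <!-- A stuffed keywords tag: never seen by the visitor, but
             still read by several engines at the time. -->
        <meta name="keywords" content="widgets, cheap widgets, best
            widgets, widget store, buy widgets, widgets online,
            discount widgets, widget shop, widget sale, widget deals">
    </head>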

<A site with excellent content about poodles is relevant to poodle-lovers whether it has hidden text or not. The hidden text neither adds to nor subtracts from the content. Take away the hidden text and you have exactly the same relevance.>

True ... a rose by any other name is still a rose, but does it smell as sweet? In other words, does the site really have excellent content, or did the cute little cartoon characters so amuse the editor that he/she awarded it an astronomical page rank because it touched his/her funny bone?

I have no idea what criteria an editor uses to determine the value or relevance of any given site, but I cannot understand why this particular site has a Google page rank of 6 while mine and many other well-written sites have a page rank of 3, with *nothing* in between.

Makes my head hurt. I will keep working at it and trying to find the secret to a successful web site, but it is unbelievably difficult to know what is what when the SEs do not enforce their own rules and keep the playing field level. Regardless of what the human editor may think of a site, I am convinced that hidden keywords in the source code have an astoundingly positive effect in the SEs, giving the offending sites a more than marginal leg up in the rankings.

So another question comes to the forefront: do humans really do it better, and if so, for how much longer? If human editors actually looked at some of these problems the SEs are dealing with, perhaps there wouldn't be the need for such complicated algorithms to begin with, and maybe we wouldn't have to pay the ridiculous sums required to be listed in the various search engines.

In response to goldm:

<Unless the text took the form of some JavaScript routine that whisked you off to another domain, I doubt it would have any bearing on whether the site remained listed.>

Funny you should mention it! After reading your post, I checked one of the three sites I had my doubts about ... and sure enough, it did just that, in addition to shamelessly using hundreds of hidden keywords on its index page. So what is one supposed to do in this case?

So far, all I can glean from this discussion is that, in comparison to these other sites (which all share one common trait: hidden text), there is a good possibility that my site is poorly written, poorly presented and without original content ... but everyone is too polite to say so.

<There is *nothing* permanent about a directory listing. They are constantly reviewed by multiple editors. If an editor ever determined that a site was inappropriate for a category where it was listed, they have many options available, including un-reviewing, deleting, re-editing, discussing with other editors, etc.>

It IS encouraging that listings are "constantly reviewed by multiple editors" ... this at least gives me some hope for my business site, as well as for my *purely informational site*, which remains in all directories (and therefore most SEs) with a very inaccurate and outdated description. I understand that Yahoo and LookSmart want me to pay to have the description changed (which won't happen, as it is a free site for all businesses related to tourism in my area) ... but what about DMOZ? I have written to them twice with no response and no change. Does anyone know what the *normal* time frame is for a site to be re-edited?

Perhaps if there were a DMOZ editor for my category, it wouldn't be quite as frustrating?

8:27 am on Sept 1, 2001 (gmt 0)

Senior Member

joined:Aug 8, 2001
posts:926
votes: 0


<Perhaps if there were a DMOZ editor for my category, it wouldn't be quite as frustrating? >

Sounds like you should apply for the position ;)

I did that for the regional category that most of my sites appear in.

Plus side - sites I produce get included in the ODP quickly, with good descriptions.

Negative side - I have to look around for competitors' sites and include them as well. I also have to avoid "optimizing" descriptions in any way - grounds for dismissal.

<editors - award page rank>

Doesn't happen. The only ways editors can distinguish between sites are via 'cooling' and by writing descriptions.

A site with limited content will have a shorter, less inviting description, as there's less to describe.

Tips on applying: declare any commercial interests, and write honest, well-written descriptions of a few sample sites to show that you're an expert on the topic.

As far as I know, the goal of the ODP is to have the largest number of sites with original content accurately described and catalogued. So sites of pure "spam", redirects and affiliate links will be rejected, but everything else will be catalogued.

The best place to report scams, mis-catalogued sites, changes, and anything that doesn't adhere to this philosophy is the category's editor (or the editor of the directory above, if none is listed).

11:39 am on Sept 1, 2001 (gmt 0)

Senior Member

joined:Sept 19, 2000
posts:2501
votes: 27


Hi Gethan,

I have too much work to do (since I have a business to run and am my own webmaster) to be a DMOZ editor too! I think it would be a terrible conflict of interest as well.

But thanks for the info about page rank. I thought that was in the editor's domain. That means the hidden keywords have even more influence than I had thought!

Thanks also for the tip about the editor in the directory above ... but I already did that. Still nothing. Oh well, patience is a virtue, they say! :)

3:05 pm on Sept 1, 2001 (gmt 0)

Senior Member

joined:Apr 8, 2001
posts:1402
votes: 0


Liane, sorry I am a little late in on this one. This is a subject close to my heart at the moment.

I understand your frustration at the lack of reaction when you report the 'cheats'.

I had a similar problem with one of my customers.

His site was in competition with 3 other companies, all of whom were breaking every rule in the book - cloaking, javascript redirects, hidden links, hidden text, bait and switch - the lot.

They were using multiple domains and, between them, had effectively taken over the top ten for all his main phrases across the board.

I gave him the choice of playing them at their own game or playing a 'straight bat' and notifying the search engines about each of their transgressions.

He chose the 'straight bat' option. We reported those that we could identify, with mixed results.

Lycos removed all the cloaked sites and some of the JavaScript redirects (and even replied to me - kudos, guys). Google removed only some of the cloaked pages. Inktomi didn't react.

You win some, you lose some.

It has, however, created a gap for his site in the top rankings, and his 'clean' site now has a few top tens and one No.1.

I believe you are morally justified in 'shopping' any tactics that are classed as 'illegal' by the relevant search engine.

I know this might be controversial, but for most people this is business, and their livelihood - not a game.

From an SEO point of view, it does sometimes look like a game - I even saw it that way myself most of the time.

However, I can totally understand my customer's point of view in being prepared to 'shop' people who are spamming, in the same way that he would 'shop' a competitor who is selling stolen goods to undercut his prices.

I am paid to represent my customer, and it is my duty to stop seeing it as a 'game' and fight for him as best I am able.

Likewise, if I suspect that someone has used a 'bait and switch' to get a Yahoo or DMOZ entry, I will report it (only if it affects one of my customers, though; I am not a mad vigilante scouring the directories for the bad guys!).

Sometimes they react, sometimes not.

If they are cheating the directories, I say keep reporting them - you are doing yourself a favour and doing the directories a favour at the same time.
(Liane, see your stickymail for a useful tip on how to report them)

Now, don't get me wrong here, y'all: I am no saint, and I have some sites with 'naughty' tricks on them - but none of my new ones are like that. I just try to give the engines what they want now.

If I feel the need to 'cheat' again (as I may well need to at some stage), I will do it knowing that I am taking a risk, will brief my customer accordingly and will take whatever precautions are wise. If I get 'shopped', all credit to the professional who shopped me.

However, if you need to fight on their ground (and if this isn't too hypocritical after my previous comments), I would recommend that you get another domain, build another site, and target the directories with one and the engines with the other. Make the content different, and you should be OK.

Sorry if this got a bit evangelical - I've been wrestling the Devil on this for a while now.

Right, I've pinned the target on my chest, my blindfold is on - fire away :)

4:01 pm on Sept 1, 2001 (gmt 0)

Senior Member

joined:Sept 19, 2000
posts:2501
votes: 27


Hi 4eyes,

Glad to hear that somebody else understands the moral issues as well as the business side of the problem. Another one of my competitors (not one of the three main offenders) has three sites! They are all very different, but they are all selling the exact same thing. That too is a frustration, but I have been considering building another site as an option for a while now.

The new site would use all the dirty tricks my competition uses, and when the axe falls (if it falls) ... my clean site will remain and hopefully win the coveted 1st place for a while!

What a world we live in, huh? I hate skullduggery, but one does what one has to do, I suppose!

Kind regards,
Liane

6:17 pm on Sept 1, 2001 (gmt 0)

Senior Member

joined:Apr 8, 2001
posts:1402
votes: 0


Go for it!

(and give us feedback)