| 7:48 pm on Apr 20, 2002 (gmt 0)|
I would remove all of that code.
| 8:41 pm on Apr 20, 2002 (gmt 0)|
Best not to cloak like that. You will be dropped faster than noodles in a noodle factory.
| 8:51 pm on Apr 20, 2002 (gmt 0)|
It would be advertising to the spiders that you were doing one of their least favorite practices.
| 9:02 pm on Apr 20, 2002 (gmt 0)|
I know this is probably a dumb question --
how in the world would they (the spiders) know?
For me it would be worth knowing --
I have put a lot of work into this project and would place tremendous value on any advice.
| 9:19 pm on Apr 20, 2002 (gmt 0)|
Do you know how to play bridge? Then maybe you know what a 'practice finesse' is.
That's what this is.
UA cloaking is not going to work. These same search engines CAN and DO send out the same spider with a cloaked user agent, so you don't really know whether it is Googlebot or not.
On top of that, AFAIK, Google doesn't read the tags in the head section the way you've given them. But it might still check whether you're faking the tags for other search engines.
Best is to make the best <head> section tags that you can, and live with it.
Definition: a practice finesse in bridge is taking a gamble that you don't need to; you could achieve the same results without finessing.
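A minimal sketch of the blind spot described above, in Python (the function and page names are hypothetical, just for illustration): the server can only trust the user-agent string the client chooses to send, so a spider that reports a browser UA sails straight past the check and gets the "normal" page, letting the engine compare the two versions.

```python
def looks_like_googlebot(user_agent: str) -> bool:
    """Naive user-agent check: it trusts whatever string the client sends."""
    return "googlebot" in user_agent.lower()

def serve_page(user_agent: str) -> str:
    """Pick which set of head tags to serve, based only on the claimed UA."""
    if looks_like_googlebot(user_agent):
        return "spider-optimized tags"
    return "normal tags"

# A spider spoofing a browser UA is indistinguishable from a real visitor:
print(serve_page("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))
print(serve_page("Mozilla/4.0 (compatible; MSIE 6.0)"))
```

Nothing in the request tells the server whether the second caller is really a browser or a spider in disguise; that is the whole weakness.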
| 9:25 pm on Apr 20, 2002 (gmt 0)|
If a search engine wants to check whether you are cheating, it can visit you from another IP address or with another user agent such as MSIE or Mozilla. Don't try to trick or fool them; they are smarter than you think, and you will only hurt yourself. Make your content proper: use good titles, use relevant wording, etc. Cloaking is a game you don't want to play unless you know what you are doing.
There are good reasons to cloak, but most newbies don't understand the power and the pitfalls involved. Stay far away from cloaking. The disadvantages outweigh the advantages 25:1.
Cloaking should be for applications only: PDAs, user preferences, geographic location, referer optimizing, and so on. Not for search engine optimization.
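The legitimate kind of cloaking mentioned here can be sketched as simple content negotiation (a hypothetical Python example; the marker list and template names are made up, and real device-detection lists of the era were far longer): the same content goes to everyone, only the template changes for handheld clients.

```python
# Illustrative handheld markers; not an exhaustive real-world list.
PDA_MARKERS = ("windows ce", "palm", "avantgo")

def pick_template(user_agent: str) -> str:
    """Serve a slimmed-down template to handheld clients; same content either way."""
    ua = user_agent.lower()
    if any(marker in ua for marker in PDA_MARKERS):
        return "pda.tmpl"
    return "desktop.tmpl"
```

The key difference from search-engine cloaking is that nothing is hidden: every client gets the same substance, just formatted for its screen.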
| 9:31 pm on Apr 20, 2002 (gmt 0)|
I think Lisa and bobriggs have said it well.
That only leaves one thing to add: I know how to cloak, but I don't consider it an option. It is entirely too easy to get caught unless you are on the ball 24/7, and even then it can happen. I don't entertain any delusions that I can outsmart the spiders forever.
Unless, as Lisa said, there are usability issues that call for altering content, just make good content and one solid set of tags, stay on top of it, and make alterations as needed; you can do a lot more for your site that way than if you cloak. If you get caught cloaking, it will be VERY difficult to use that domain to get traffic from then on.
| 10:26 pm on Apr 20, 2002 (gmt 0)|
Obviously an interesting topic, and everybody has an opinion!
I have reverted to my initial goal of serving a browser-based menu at the top of the page, since Opera does not seem to support layers, and for some reason the links (coded to the base directory URL for each category) that trigger the layer events do not work.
Visit the site and let me know if it seems to work across applications.
In the past I have lived with my competitors simply copying my hard work and sometimes, using exactly the same tags, coming up ahead of me...
Thanks for the help.
(edited by: engine at 9:28 am (utc) on April 21, 2002)
| 10:39 pm on Apr 20, 2002 (gmt 0)|
PS - if I were to try this tag cloaking, the tags would be identical except for the inclusion of the cities in which we are looking for returns...
Any thoughts on cloaked tags that are the same except for the addition of 3 to 5 words for the spiders? The inclusion of "tri city area" or maybe "greater Chicago area", for instance?
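Per the advice earlier in the thread, the alternative to cloaking those 3 to 5 words is simply to include them in the one set of tags everybody sees. A hypothetical Python sketch (the base title and region names are placeholders taken from the examples above):

```python
def build_title(base: str, regions: list[str]) -> str:
    """One title for every visitor; the regional phrases ride along for spiders too."""
    return f"{base} - {', '.join(regions)}"

print(build_title("Widget Returns", ["tri city area", "greater Chicago area"]))
# -> Widget Returns - tri city area, greater Chicago area
```

Since the regional wording is relevant to human visitors as well, there is nothing to hide and nothing for a spoofed-UA spider to catch.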