I'm checking on the rumor that this software helps solve problems related to DHTML web pages and search engines.
Does anyone have any info?
In particular, it detects browsers and sends back a page
that's browser friendly to your particular needs. The idea I've heard is that it recognizes spider robots too, and sends back static HTML pages whenever a robot visits.
This is the part I'm interested in knowing more about.
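For context, the spider-recognition behavior described above is usually just a User-Agent check on the server. A minimal sketch of the technique (this is my own illustration, not BrowserHawk's actual code; the bot tokens are examples):

```python
# Minimal sketch of User-Agent-based robot detection as described
# above. Hypothetical token list -- real products ship much larger
# and regularly updated ones.

KNOWN_BOT_TOKENS = ("googlebot", "slurp", "scooter", "lycos")

def is_robot(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a search spider."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def select_page(user_agent: str) -> str:
    """Serve a static HTML page to spiders, the DHTML page to browsers."""
    return "static.html" if is_robot(user_agent) else "dhtml.html"
```

Whether serving that static page to spiders counts as cloaking is exactly the question at hand.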
It's a cloaking script, Kim. I feel you can do more good for your site with quality W3C-validated HTML than trying to get around browser specifics at the server level.
That's what I said it was, but I was shot down by the person suggesting it be used. Scary thing is that it's for a corporate site. I decided to check up and get some facts.
I would need a reference from somewhere that BH is considered cloaking, in which case they would reconsider using the software for the purpose of engines. They already bought it for its browser detection features.
Thanks for your fast response. I'm all ears for any further leads :)
Kim, I'm just back from the two day Internet Marketing Conference in Stockholm, where the topic of cloaking was discussed intensively in the presence of executives from several major SE companies. I will be posting a lengthy article later today or tomorrow in the Cloaking Forum.
Let me just say this at this time after having discussed the matter at length with some of the world's leading gurus - both publicly and in private:
The advice Brett gave you is good advice. That is exactly what I would tell a serious corporate customer. If you want one more super guru on the same issue: Detlev Johnson. He has exactly the same view as Brett on this and has said it in public on many occasions.
But if the customer was in the so-called "adult industry," offering pornographic pictures for paid viewing, and thus having no or very little text content on the site, I would recommend a cloaking expert - of which there are many. The same advice would go to anyone who has absolutely nothing to lose, but has a desperate need for ranking and lots of money to spend.
Having said that, I should also say that the one single person on this planet who has probably done more cloaking for more porn customers than anyone else says that he has never been caught and thus incurred no penalties from SEs.
But the jury is still out and the SEs are only now getting ready to debate a common policy, so I am not prepared to give cloaking the benefit of the doubt. Not yet.
>recognizes spider robots too, and sends back static HTML pages
If the pages differ from those that a human visitor [browser] is served then that's the classic definition of cloaking. If that is what they want to do there are many better ways of doing it.
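To illustrate that classic definition: a page is cloaked if the body served to a spider differs from the body served to a browser, which anyone (including an engine) could check by fetching the same URL with two different User-Agents. A hedged sketch - the `fetch` callable and the User-Agent strings are assumptions for illustration, and a real engine's check would be more involved:

```python
# Sketch of the "classic definition" of cloaking: the same URL
# yields different content depending on who appears to be asking.
# `fetch(url, user_agent)` is a hypothetical callable returning the
# page body as a string.

def looks_cloaked(fetch, url: str) -> bool:
    """Compare the page a spider sees against the page a browser sees."""
    spider_view = fetch(url, user_agent="ExampleBot/1.0")
    browser_view = fetch(url, user_agent="Mozilla/4.0 (compatible; MSIE 5.5)")
    return spider_view != browser_view
```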
Whilst in no way suggesting that cloaking is a "bad" practice, I would be uncomfortable with this technique on a corporate site, particularly bearing in mind the number of searches made for www.corporatename.com. It is absolutely essential that the site is at least listed at all the major SE's.
The simple fact is that if they want to rank well at the SE's they need to build a SE friendly site from the ground up, not just build and then bodge a fix afterwards. I am sure that you have dealt with these frustrating corporate sites far more than me, yet I guess that it doesn't get any easier to convince them that SEO *must* become part of their culture.
Good luck. ;)
This is buried on the features page
"Search engine placement: BrowserHawk enables you to achieve higher placement in search engine results by detecting the visiting crawler and returning content optimized for getting you listed above your competitors!"
at [browserhawk.com...] with a link that leads to [browserhawk.com...]
You all are helping me a great deal. Thanks! Now, the question put to me is how risky cloaking is if the intent is not to abuse the feature, but simply to deliver engine-friendly, non-DHTML pages.
I've suggested they simply submit a static version of the site to engines, but haven't won that battle yet. I've also promoted the idea of static doorways, which I can show them how to do properly.
Other than people turning in suspected cloaked pages, how in the heck do engines determine a cloaking situation? I've never known the answer to this...
Essentially, I'm being asked to assign a "risk factor" to cloaking a corporate site.
I can do the research if you guys have leads I can follow.
I'll also check out Fantomaster's site.
Thanks again! I knew I came to the right place :)
We are talking two different types of cloaking:
a) cloaking as a type of personalized (browser sniffing) delivery
b) cloaking for SE promotion purposes.
I don't care for A, but do accept B.
I don't like personalized delivery, because it is the rare exception that they get it right. God help us if this program is successful, as it could ruin compatibility for dozens of third-party browsers - there is zero chance of a program getting all the exceptions right. However, there is a 100% chance of getting it right if you write W3C-validated code from the word go.