|Google and Cloaking|
What do you all recommend, cloak for Google or don't cloak?
Google says "Don't do it." But then again, that's what those SE's always say, don't they?
Been doin' it pretty regular myself. That cache thing is a real pain but I've worked around it.
All you gotta do is build your site around what Google wants, cuz everyone else is copying them anyway. Then just feed Google a nice fat site map. Googlebot will take it from there. You really don't even need to cloak to do this.
Good point. But I've been hearing all these threats supposedly coming from Google about them kicking our behinds and banning us if we cloak. So I was just wondering.
Just wondering if you could help me out and let me know what you did to get around it.
I've been doing this for the cache thingy
<meta name="GOOGLEBOT" content="NOARCHIVE">
But don't know if it really works
It will work as far as the cache goes. But I wouldn't use it if you are using spamming techniques. It will invite investigation.
(edited by: littleman at 9:10 pm (gmt) on Aug. 10, 2001)
Nope, spamming's out of the question.
I'm just trying to rank well on the SE's. But I need to use cloaking cause our site is VERY code heavy.
But nope, staying away from spamming.
Here's a thought. Use user_agent cloaking like this:
Make two versions of your site, the "text version" and the "full blown code-heavy HTML/flash etc. version."
Any Mozilla type user, or unidentified user, gets the full-blown version. But Lynx users get the text version. So do Googlebot, Scooter, and the whole spider nest. And 3 out of 4 SEOs agree, a site optimized for Lynx does pretty darn well in search engines.
And what I love about it most of all is, it's NOT cheating. I don't try to suppress the Google cache; I've got nothing to hide. It's an attempt to make my site 100% compatible across the board, a technique used by the biggest of big-dawgs.
And when you're done, try cruisin' your site from Lynx!
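Here's a minimal sketch of that user_agent check, written in Python for illustration. The spider name list and the two file names are assumptions, not a definitive roster — in practice you'd keep a much longer, regularly updated list of spider tokens:

```python
# Sketch of user-agent cloaking: spiders and Lynx get the lean
# "text version", everyone else gets the full code-heavy site.
# Spider tokens and file names below are illustrative assumptions.

SPIDER_TOKENS = ("googlebot", "scooter", "lynx", "slurp", "ia_archiver")

def pick_version(user_agent):
    """Return which page file to serve for a given User-Agent string."""
    ua = (user_agent or "").lower()
    # Known spiders (and Lynx itself) get the text version...
    if any(token in ua for token in SPIDER_TOKENS):
        return "text_version.html"
    # ...any Mozilla-type or unidentified user gets the full-blown version.
    return "full_version.html"

# Example lookups:
print(pick_version("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))
print(pick_version("Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)"))
print(pick_version(None))  # no User-Agent header at all
```

Note the fallback direction: an *unknown* agent gets the full version, so a new browser never accidentally lands on the stripped-down page.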
I take the page my visitors will see, tables, graphics, everything, and optimize it for the SEs. That way the visitor's page and any cached pages look pretty much the same. Links that I have to cut from the optimized page become plain underlined text or images; they look like they could be links, but they're dead.
Adding text is simple, but when I have to reduce text, I place what text I need on a low quality graphic.
NO SPAM and keeping the file as close to the same size, I believe, is key.
Not trying to initiate a whole other discussion here, but I believe the issue with SEs and cloaking is the fact that people abuse cloaking by spamming. If that were not the case, what would the problem be? What the SEs want to eliminate is spamming, but they're going after the cloaker out of fear that he or she *may* be spamming. The funny (or not-so-funny) part is that they still don't have a handle on those who spam the heck out of them and don't even use cloaking.
I agree with you awoyo, good point to make.
Oh, and yeah, that's a really good idea Bolotomus.
Thank y'all for the pointers.