
IP Cloaking for Gbot

Use frames to avoid "Cached"?



12:44 pm on Jan 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Hi folks
I'm doing UA & IP cloaking for very competitive KWs.

I've got an external JavaScript file on the optimised page, so when users click Google's "Cached" link it automatically redirects them to the normal home page (thanks to "cache" being in the URL). That takes care of normal users.
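The redirect script described above might look something like this. This is a sketch only: the function name, the redirect target `www.example.com`, and the exact cache-URL pattern are assumptions, not the poster's actual code.

```javascript
// Hypothetical sketch of the external redirect script described above.
// It checks whether the page is being viewed via Google's cache (whose
// URLs contain the string "cache") and, if so, sends the visitor on to
// the normal home page. Names and URLs are placeholders.

function isCachedView(url) {
  // At the time, Google cache URLs looked roughly like
  // http://216.239.xx.xx/search?q=cache:www.example.com/ so a simple
  // substring test on "cache" is enough for this purpose.
  return url.indexOf("cache") !== -1;
}

// In the browser, redirect as soon as the script runs from a cached copy.
if (typeof window !== "undefined" && isCachedView(window.location.href)) {
  window.location.replace("http://www.example.com/"); // placeholder target
}
```

Note that this only works for visitors with JavaScript enabled, which is exactly the limitation raised next.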

But when JavaScript is switched off, of course, you can scroll down the page and see everything. Is there any way to prevent this using frames?

A fairly recent SEO subscription service mentioned one of WW's long-time members. I had a look at the URL mentioned: they're framing with UA cloaking only, but as the whole site is framed, not all the targeted KWs are doing well.

I'd like to frame JUST the cloaked pages to prevent them being viewed, but I can't figure it out. Would anyone be kind enough to offer some suggestions or sticky-mail me a URL?



7:27 pm on Jan 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

If you frame the page, you will need to put the cloaked page's code within the frameset, which means someone can still get the code simply by viewing the source of the cached page.

So why not just use <meta name="robots" content="noarchive">?


5:32 pm on Jan 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

WebGuerrilla, Thx for that. So I have my frameset containing two pages: a blank one set at 0% and the other being the cloaked page.

Do I use the noarchive meta tag on the internal cloaked frame or the actual frameset itself?

I want the cloaked page spidered, but only the blank frameset (with noframes text) to show up in Google.
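For what it's worth, the setup described might look like the sketch below. File names are placeholders, and the thread doesn't settle where the tag belongs; but since the robots meta tag applies per document, noarchive would normally go in the &lt;head&gt; of whichever file you don't want cached (e.g. the cloaked page).

```html
<!-- Hypothetical frameset along the lines described: a blank 0% frame
     plus the cloaked page. blank.html and cloaked.html are placeholders. -->
<html>
<head>
  <title>Example</title>
  <!-- noarchive here affects only THIS frameset document; -->
  <!-- to stop the cloaked page being cached, put the same tag in cloaked.html -->
  <meta name="robots" content="noarchive">
</head>
<frameset rows="0%,100%" frameborder="0" border="0">
  <frame src="blank.html" scrolling="no" noresize>
  <frame src="cloaked.html">
  <noframes>
    <body>
      Text for non-frames browsers and spiders goes here.
    </body>
  </noframes>
</frameset>
</html>
```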


