
Example of Cloaking Code?

     
2:36 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts: 10
votes: 0


Hi,

Would like to see an example of cloaking code.

I have uploaded a .html page and want to make sure I have coded it properly.

Appreciate it.

T

3:02 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


Most cloaking doesn't happen in HTML files (which are interpreted by the client), but at server level (e.g. deciding which HTML file to send for a given request).
In HTML itself, you'd probably just use JavaScript to show things you don't want the robots to see.

Be aware that it might get you banned if Google sees it.

3:05 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


OK... isn't the cloaked page an .htm or .html file, which no one sees, with the JavaScript redirect code in it?

Confused now...

3:24 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


That's up to your definition of cloaking.
To me, cloaking means serving different content at the same resource to users and bots, without a redirect.
Redirecting users with JavaScript is far easier for bots to detect, and outdated. That said, it still works (or worked: I've recently seen a page that ranked well and used that approach, though I can't remember which query I used).
3:44 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


OK... that's what I was afraid of.

How is it then done without a redirect, based on your interpretation?

This is where I'm confused.

thanks!

3:48 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


Based on either the user agent (plus other data from the request) or, the better way, the originating IP, a script decides whether the visitor is a robot or a user and sends the appropriate content. There are a few commercial providers out there from whom you can obtain such scripts and the IP data they rely on.
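A minimal sketch of the user-agent half of what's described above (the weaker of the two approaches, since the User-Agent header is trivially spoofed). The bot tokens and file names here are illustrative assumptions, not taken from any real cloaking script:

```python
# Hypothetical user-agent based selector; the token list and file names
# are illustrative assumptions, not from any real cloaking product.

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "slurp")

def is_probable_bot(user_agent: str) -> bool:
    """Crude check: does the User-Agent header mention a known spider?
    Trivially spoofed, which is why IP-based checks are considered stronger."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def select_content(user_agent: str) -> str:
    """Decide which version of the page this request should receive."""
    return "spider_version.html" if is_probable_bot(user_agent) else "user_version.html"
```

An IP-based version would replace `is_probable_bot` with a lookup against a maintained list of spider IP ranges, which is exactly the data the commercial providers sell.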
3:57 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


I'd like to use the IP approach, and would rather write my own script than use a commercial one (I'm not too trusting of those).

Is this script easy to code?

I'm wondering whether my current script is correct or not.

7:48 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


any help?

thanks!

8:07 pm on May 21, 2009 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member

joined:Aug 28, 2002
posts:993
votes: 2


You aren't going to find someone to just hand you cloaking code. It can be quite complex.

If you are dead set on it, which I wouldn't recommend anyway, do some searches on Google for associated phrases and you'll likely find some things; however, most are not going to be free.

8:09 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


philosopher,

Can I email you my code so you can tell me whether it's OK to use?

thanks

8:09 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


I draw the line at supplying code for something that damages user experience and the web. :)
8:14 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


lol... the user has no clue, and the SEs love them! :)
8:26 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


Does the code also include every IP of every SE spider?
8:41 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 31, 2008
posts:661
votes: 0


Your questions can easily be answered by using the very search engines you want to fool...
8:48 pm on May 21, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


lol... and that's how I ended up here. :)

So... questions do not get answered here? lol

8:56 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


[webmasterworld.com...] -- 2.6 million prior posts says otherwise.
10:26 pm on May 21, 2009 (gmt 0)

Senior Member from ZA 

WebmasterWorld Senior Member 10+ Year Member

joined:July 15, 2002
posts:1720
votes: 1


The hardest part of writing your own cloaking script (I would imagine) is keeping the IP address list up to date, adding new addresses as they come online. It would be a continuous, ongoing task, and certainly not an easy one.

I can only think of one cloaking script that has taken a lifetime to perfect!
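One way to sidestep hand-maintaining a full IP list, at least for engines that support it, is the double reverse-DNS check Google documents for verifying Googlebot: resolve the visiting IP to a hostname, confirm the hostname ends in the engine's domain, then resolve that hostname forward and confirm it maps back to the same IP. A sketch of that logic (the resolver functions are injectable purely so it can be tested without live DNS):

```python
import socket

# Trusted suffixes per Google's "Verifying Googlebot" guidance.
TRUSTED_SUFFIXES = (".googlebot.com", ".google.com")

def verify_spider_ip(ip, suffixes=TRUSTED_SUFFIXES,
                     reverse=socket.gethostbyaddr,
                     forward=socket.gethostbyname):
    """Double reverse-DNS check: the PTR record must end in a trusted
    domain, and the forward lookup of that hostname must map back to
    the same IP. Returns False on any DNS failure."""
    try:
        host = reverse(ip)[0]  # e.g. "crawl-66-249-66-1.googlebot.com"
    except OSError:
        return False
    if not host.endswith(suffixes):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False
```

This still costs a DNS round trip per unknown visitor, so real deployments typically cache the verdict per IP; spiders from engines that don't publish a verification domain would still need a maintained list.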

10:36 pm on May 21, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Sept 29, 2000
posts:12095
votes: 0


Newbie cloaking primer [webmasterworld.com]

Another cloaking how-to-do primer [webmasterworld.com].

And some cloaking FAQ [webmasterworld.com].

does code also include every ip of every SE spider?

How about maintaining a complete, up-to-the-minute IP database, including the ones they sneak out to catch you?

am wondering if my current script is correct or not....

Try it out, like on a spare domain. Make two pages that are substantially the same, including graphics, with very minor differences that only you know will be there, like a minor insignificant wording change: one for your browser and the other for spiders. That doesn't mean you'll have all the IP numbers you need, but it's a start.
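The spare-domain experiment above can be scripted: fetch the same URL twice, once with a browser User-Agent and once spoofing a spider, and check whether the responses differ. A rough sketch (the URL and User-Agent strings are assumptions, and this only exercises user-agent cloaking, not IP-based cloaking, since your test machine's IP isn't a spider's):

```python
from urllib.request import Request, urlopen

BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1)"            # assumed browser string
SPIDER_UA = "Mozilla/5.0 (compatible; Googlebot/2.1)"  # assumed spider string

def fetch(url, user_agent, opener=urlopen):
    """Fetch a URL with a given User-Agent; opener is injectable for tests."""
    req = Request(url, headers={"User-Agent": user_agent})
    with opener(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def pages_differ(url, opener=urlopen):
    """True if the server sends a spider a different page than a browser."""
    return fetch(url, BROWSER_UA, opener) != fetch(url, SPIDER_UA, opener)
```

If `pages_differ` comes back True for your two test versions, the serving logic is at least switching on the User-Agent as intended; Marcia's minor-wording trick tells you which version is which.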

Disclaimer: I don't know anything but the risk and some common sense. Cloaking has never been an option for me; if I don't keep up with doing my dishes, I certainly wouldn't keep up with spiders, and that's a recipe for disaster.

[edited by: Marcia at 11:11 pm (utc) on May 21, 2009]

3:52 pm on May 22, 2009 (gmt 0)

New User

5+ Year Member

joined:May 21, 2009
posts:10
votes: 0


Thanks Marcia!

This should give me some good reading material for the weekend...lol..:)