Cloaking Forum
Cloaking for beginners
How best to use cloaking for the first time
engine
msg:677469
9:45 pm on Jul 1, 2000 (gmt 0)

If I wanted to start to employ cloaking, what are the basics to get things moving?

 

rcjordan
msg:677470
10:18 pm on Jul 1, 2000 (gmt 0)

Good man, engine! Me, too. I didn't want to be the "Tommy" on this one.

littleman
msg:677471
12:06 am on Jul 2, 2000 (gmt 0)

The basics are something like:
1 Internally redirect pages to a script. This is usually done through SSI, or by renaming an executable script so it has a normal "htm" or "html" extension.

This step isn't truly necessary to cloak, but most people do it to make things less obvious.

2 Do a couple of environment checks to determine whether a visitor is a spider or not. This can be done through user agent, host, or IP lookup. There are other techniques, but those are less common. Most people use either IP alone or a combination of IP and user agent lookup.

3 Deliver a page depending on who (you think) is knocking on the door. (A minimal sketch of all three steps follows below.)
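
A minimal sketch of those three steps, done as Perl CGI under Apache. Everything here is illustrative: the .htaccess line assumes CGI execution is allowed for the directory, and the filenames, spider IPs, and agent substrings are examples, not a real or current list.

In .htaccess (step 1):

    AddHandler cgi-script .html

Then keyword1.html itself is the script:

    #!/usr/bin/perl
    # keyword1.html -- looks like a normal page, but runs as CGI
    use strict;

    my $ip = $ENV{'REMOTE_ADDR'}     || '';
    my $ua = $ENV{'HTTP_USER_AGENT'} || '';

    # Step 2: crude spider screen by IP and user agent (example values only)
    my %spider_ip = map { ($_ => 1) } ('204.123.2.1', '209.73.164.1');
    my $is_spider = $spider_ip{$ip} || $ua =~ /Scooter|Slurp|Gulliver/i;

    # Step 3: serve whichever page fits the visitor
    my $file = $is_spider ? 'keyword1_spider.htm' : 'keyword1_human.htm';
    print "Content-type: text/html\n\n";
    open(my $fh, '<', $file) or die "cannot open $file: $!";
    print while <$fh>;
    close($fh);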

The coding for a basic script is quite simple; there are even a couple of free scripts out on the web. That being said, I don't think cloaking is something anyone should take lightly: doing it right is a part-time job. You are either going to spend time or money. And there is a REAL risk of getting your site banned if you are not sensible. Air posted some guidelines which I believe are sound. I have had a half dozen domains banned from Ink and about 25 kicked out of AV, but I do not follow those guidelines. I've treated (some) domains like double-wicked candles, and they've burned bright enough while they were alive to more than pay for their replacements.

Anyway, if you are going to dip your toes - be sensible, or treat the domain like it is a speculative commodity.

Sorry for getting a bit off track...



rcjordan
msg:677472
12:15 am on Jul 2, 2000 (gmt 0)

> Deliver a page depending on who (you think) is knocking on the door.

That touches upon a long-time question of mine: in general, how many different pages do you try to maintain for a page that is being cloaked? I know this can vary by client and budget; I'm just looking for an average, in your estimation.

littleman
msg:677473
12:57 am on Jul 2, 2000 (gmt 0)

I currently feed nine different pages per subpage. However, I know successful SEOs who only serve two -- Human and Machine. This strategy will probably become more common as search engines and SEOs become more obsessed with links.
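
For illustration, a per-engine split like that might be wired up in Perl along these lines; the agent substrings and filenames are hypothetical, and a plain two-page Human/Machine setup would shrink the table to a single entry:

    use strict;

    # Hypothetical map: each targeted crawler gets its own page version.
    my %page_for = (
        'Scooter'  => 'keyword1_av.htm',   # AltaVista
        'Slurp'    => 'keyword1_ink.htm',  # Inktomi
        'Gulliver' => 'keyword1_nl.htm',   # Northern Light
    );

    my $ua   = $ENV{'HTTP_USER_AGENT'} || '';
    my $file = 'keyword1_human.htm';       # default: the visitor page
    for my $bot (keys %page_for) {
        $file = $page_for{$bot} if $ua =~ /\Q$bot\E/i;
    }
    # $file now names the version to serve for this request.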

Smokin Joe
msg:677474
7:41 pm on Jul 12, 2000 (gmt 0)

I'm trying to cloak page to page.

My question is this..... if I have [cloakme.com...] on my actual website, would my redirect script be named [cloakme.com...]?

I mean, the script would obviously serve the optimized pages depending on the visitor's IP, but as far as link pop goes, would something like this work?

Air
msg:677475
11:18 pm on Jul 12, 2000 (gmt 0)

SJ,

That should work fine, provided that the URL stays as www.cloakme.com/keyword1.htm both for your regular visitor page and for the one you feed to each of the spiders you are targeting.

Smokin Joe
msg:677476
3:44 pm on Jul 13, 2000 (gmt 0)

I'm sorry, could you clarify?

I obviously have to name the pages a bit differently in order to keep them in the same web folder, unless of course I make a separate directory. But then the URL would change a bit, making the cloaking more obvious.

How exactly do links get counted? What I'm asking is: if I were to have [cloakme.com...] (actual page) and [cloakme.com...] (spider redirect), I would obviously have all the pages linking to the redirect page, right? The spider then should automatically assume that even though it is directed elsewhere (alta/keyword1_av.htm), it still counts all the incoming popularity toward the URL of my redirect page....

BUT! Considering my outgoing links (hopefully to relevant sites), I put those on my spider-food pages so they can be properly spidered.

Ok, now tell me if this is close. Although this may be obvious to some, I had to figure this out on my own.... am I warm?

Air
msg:677477
1:41 am on Jul 14, 2000 (gmt 0)

> I obviously have to name the pages a bit differently in order to keep them in the same web folder, unless of course I make a separate directory. But then the URL would change a bit, making the cloaking more obvious.

Sorry, I actually missed the fact that the two URLs you used as examples were not the same. This is really a function of the script you are using. Optimally, the script should, for the same URL, determine what content to display depending on who the visitor is, without any "redirection". This is the only way to maintain link pop. In other words, the only URL for keyword1.html should be [cloakme.com...], with the script determining what to display based on who is asking.

> How exactly do links get counted? What I'm asking is: if I were to have [cloakme.com...] (actual page) and [cloakme.com...] (spider redirect), I would obviously have all the pages linking to the redirect page, right? The spider then should automatically assume that even though it is directed elsewhere (alta/keyword1_av.htm), it still counts all the incoming popularity toward the URL of my redirect page....

This is only useful for having the spider follow links to the cloaked pages; it does not help link pop much at all. That is why I keep on mentioning that the URL must not change between the spider and visitor pages.

> BUT! Considering my outgoing links (hopefully to relevant sites), I put those on my spider-food pages so they can be properly spidered.

Remember though, while these count a little towards relevancy, they do not count towards link popularity; it is incoming links you want, with those from relevant and prominent sites being the most desirable.

> Ok, now tell me if this is close. Although this may be obvious to some, I had to figure this out on my own.... am I warm?

I'll let you decide if you are warm ... it would be interesting to hear from a few of the others that have experience with cloaking how they deal/have dealt with this. Anyone? (I feel like I'm in that Ferris Bueller movie :) )


redzone
msg:677478
6:03 am on Jul 15, 2000 (gmt 0)

I guess I'll come clean here, as I've never mentioned being involved with "cloaking" at SEF...

I've been cloaking for about 3.5 years now, have gone through 4 system "re-writes", and probably tried about everything... :)

We laugh about GreenFlash, 'cause they got a lot of bad press while we continued to increase our client base and have nice deposits at the bank... :)

Our current pathing methodology utilizes a root folder that only contains information for the root mirror domain, and then a subfolder inside the root folder, for "every" keyword we optimize for a client.

This keeps organization easy, and allows for complete automation of creating folders/files, rather than any type of manual file/folder creation.

Our first two revs were in Cold Fusion, but scripting languages are terrible on server resources, and though I help administer a current system in PHP, my latest rev is written in Visual Basic 6.0. The VB DLL stays loaded hot in server memory, the executable that kicks it off is less than 75K, and it is lean and mean... An executable will always process faster than a script, because no line-by-line interpretation is performed by a script-processing engine....

We don't use static pages/templates of any kind. Everything is served dynamically through database manipulation... If you have a lot of clients/keywords you are optimizing, this is the only way to fly.....

We brought our latest system up around 4/15/2000, and with a staff of 4 have optimized over 27,000 keyword phrases in 90 days...

The really nice thing about dynamically serving the crawlers from a database is employing link strategies. We can test multiple theories simultaneously, and tracking is all automated.

Now for the downside: not many SEO individuals have the resources or programming knowledge to design and implement such a system, and therefore usually try one of the many Perl scripts that are out there...

These scripts will get your feet wet, and I know littleman and a couple of others have refined them...

So, start simple, but stay organized in your file and pathing structures.

redzone
msg:677479
6:08 am on Jul 15, 2000 (gmt 0)

One thing I forgot to mention, we never submit:

[anydomain.com...] or
[anydomain.com...]

We always submit the root or folder:

[anydomain.com...] <- That's obvious, but:
[anydomain.com...]

Again, this works well for organizing one folder per keyword per root mirror domain....
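
To make that structure concrete (the actual URLs above are elided, so these paths are invented):

    www.anydomain.com/                    <- submit: root mirror domain
    www.anydomain.com/keyword1/           <- submit: one folder per keyword phrase
    www.anydomain.com/keyword1/page.htm   <- never submitted directly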

Brett_Tabke
msg:677480
12:48 am on Jul 17, 2000 (gmt 0)

We used cloaking for almost two years. I've been trying to break the habit here this spring. I took it all off a couple of months ago, but that lasted all of about 60 days. We had some very highly ranked pages get jacked, and rankings went down within a week (about the time I noticed it). So, I'm trying to just cloak for the higher-value/ranked stuff that is worth the time.

I've done cloaking in everything from straight C to shell scripts, most of the time ending up back in Perl simply because it is easiest to maintain.

The thing that drives me nuts is the maintenance. It is double the work overnight, especially when you start feeding separate engines entirely customized pages. It is like having to maintain 4-5 websites instead of one. I don't mind on client sites, where they can do the user content thing and we'll do the SEO pages, but on my own sites, where it is me-myself-and-I doing both, it's a major pita to maintain.

Currently, I'm using a Perl frontend with a C backend logger/tracker. Generally, I try to get away with as much hostname tracking as I can, because IP table loading and comparison in Perl is slower than the name resolution on the IP. I used to look at the agent name first and then the IP, but with the SEs playing games, it is wise to look at the IP only anymore.
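
A sketch of that hostname-based check in Perl; the engine domains in the pattern are illustrative only, not a maintained list:

    use strict;
    use Socket;   # inet_aton and AF_INET

    my $ip = $ENV{'REMOTE_ADDR'} || '';

    # Reverse-resolve the visitor's IP, then match the crawler's domain,
    # instead of loading and scanning a big IP table in Perl.
    my $host = $ip ? (gethostbyaddr(inet_aton($ip), AF_INET) || '') : '';
    my $is_spider = ($host =~ /(?:av|inktomi|googlebot)\.com$/i);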

If you are going to start cloaking now for the first time, I think it would be wise to get your feet wet on a 'throw away' domain that you can play around with and not have any perm damage done if it should get banned.

rz, did you ever get any domains banned by Alta?

littleman
msg:677481
2:58 am on Jul 17, 2000 (gmt 0)

Speed is always something I keep my eye on. In my opinion, mod_perl offers the best of both worlds as far as speed vs. maintenance goes.
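
For anyone curious, a minimal sketch of what a mod_perl (1.x style) handler for this might look like; the package name, config lines, filenames, and agent pattern are all hypothetical:

    # httpd.conf:
    #   <Files ~ "\.html$">
    #     SetHandler perl-script
    #     PerlHandler My::Cloak
    #   </Files>

    package My::Cloak;
    use strict;
    use Apache::Constants qw(:common);   # OK, NOT_FOUND, ...

    sub handler {
        my $r  = shift;     # code stays compiled inside Apache, no fork per hit
        my $ua = $r->header_in('User-Agent') || '';

        my $file = ($ua =~ /Scooter|Slurp/i) ? 'spider.htm' : 'human.htm';

        open(my $fh, '<', $file) or return NOT_FOUND;
        $r->content_type('text/html');
        $r->send_http_header;
        $r->print($_) while <$fh>;
        close($fh);
        return OK;
    }
    1;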

redzone
msg:677482
4:38 am on Jul 17, 2000 (gmt 0)

Brett,
That's why we dynamically serve templates on the fly from a database. We optimize a template that ranks well in a specific engine and store that file in a table; then we have unique template data (keywords, title tag, desc, etc.) that is stored for every keyword in another table...

When the spider comes a-crawlin', the pieces get fit together on the fly, and there is absolutely minimal maintenance.... We're in the middle of forming an international partnership with this system, so I hope it works as well for our "non-US" partners as it has performed for us to date....
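
redzone's system is VB against a database, but the assembly step he describes might look roughly like this in Perl DBI terms; the database name, tables, columns, and placeholder tokens are all invented for illustration:

    use strict;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:cloakdb', 'user', 'pass',
                           { RaiseError => 1 });

    # One table holds an engine-tuned template, another the per-keyword data;
    # the page only exists once the two are merged at request time.
    sub build_page {
        my ($engine, $keyword) = @_;

        my ($tpl) = $dbh->selectrow_array(
            'SELECT html FROM templates WHERE engine = ?', undef, $engine);
        my ($title, $desc) = $dbh->selectrow_array(
            'SELECT title, description FROM keywords WHERE phrase = ?',
            undef, $keyword);

        $tpl =~ s/%TITLE%/$title/g;   # fit the pieces together on the fly
        $tpl =~ s/%DESC%/$desc/g;
        return $tpl;
    }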

Air
msg:677483
4:44 am on Jul 17, 2000 (gmt 0)

> It is like having to maintain 4-5 websites instead of one

Yeah, that is both the advantage and disadvantage of cloaking. It does feel like a lot more work at times ('cause it is), but in return you get flexibility. I guess you have to decide when to trade one for the other.

littleman:
I would guess a lot of hosts still do not offer mod_perl. Want to talk a little about the speed difference and whether it would be worth switching hosts for? Probably worth a separate thread.

redzone:
That is a very interesting setup you have, thanks for outlining it here. If you have encountered any problems with any of the engines, it'd be great (obviously only if you want to) if you posted in the thread located
HERE [webmasterworld.com]

jaz
msg:677484
6:58 pm on Jul 21, 2000 (gmt 0)

Has anyone used ASP, checking IP addresses with SQL statements against a database, to cloak?

