Let's consider what you would need to do to cloak properly:
1. At least 100 URLs to play around with - keep in mind that it's better to use dummy URLs, because you don't want your mother URL banned or reported.
2. Multiple IPs - I don't use more than 3-5 domains per IP. IPs are cheap; don't pack all of your domains onto one IP, or you'll get caught a lot faster than you think.
3. Don't submit directly to Google - I've found that when I don't submit directly to Google, it may take a bit more time than usual to get indexed properly, but the sites almost never get banned. They'll just be removed from that month's index, which means you can list them in a different engine again without worrying about the blacklist. I think the reason is that because the site was never submitted directly to Google, you don't exactly have to follow their guidelines, and therefore can't be penalized for it.
4. Put keywords and paragraphs from your cloaked pages inside the index of the main cloaked URL to fool competitors into thinking there are no cloaked pages, just a keyword- and content-saturated website. They'll probably try to out-do you for a few months before they catch on, but that's well worth the time.
I've been successfully cloaking pages since 1997-98 or so. I've had loads of URLs banned by jealous or pissed-off competitors, but I'm still doing it, because even though it's tough at times, it's WELL worth it. As long as you don't get greedy with the keywords, you'll stay under the radar longer than normal. Depending on the industry you are cloaking for, you can make a lot of money if it's done properly. Sites of mine that were made years ago are still in top spots, and still making a lot of money.
[edited by: Blue_Gravity at 4:42 pm (utc) on May 5, 2003]
I do want to give cloaking a try on a small scale. Here's what I was planning.
1) Select the desired keyword.
2) Go to Overture and get a list of suggested terms for that keyword.
3) On my cloaked domain create a page for each of the suggested terms from step 2.
Now I have a choice. I could take term #2 from the Overture list, create another domain, and repeat.
Or I could use the same keyword list from the first domain but work the content, titles, and keyword densities a little differently.
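Step 3 of the plan above is really just templating. As a rough illustration only (the terms, file names, and page template here are invented placeholders, not anything from the thread), generating one page per suggested term might look like:

```python
from pathlib import Path

# Toy sketch: one HTML page per suggested term.
# The template and slug scheme are made up for illustration.
TEMPLATE = """<html><head><title>{term}</title></head>
<body><h1>{term}</h1><p>Content written around "{term}".</p></body></html>"""

def build_pages(terms, out_dir="pages"):
    """Write a simple page for each term; return the file names created."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for term in terms:
        slug = term.lower().replace(" ", "-")
        (out / f"{slug}.html").write_text(TEMPLATE.format(term=term))
    return sorted(p.name for p in out.glob("*.html"))
```

The real work, of course, is making each page's content, title, and keyword density genuinely different, which no template will do for you.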
How's the plan sound so far?
What are your thoughts regarding cross-linking between cloaked domains?
From the domain farm you build. ;) Seriously, people set up entire networks of sites solely to build PR. A typical domain farm might have 300 sites in it, 50 of which are PR passers. That gives them 250 sites to direct PR with. Very few of those sites will be heavily interlinked. Coupled with the links that are purchased and acquired through link trades, PR7 and PR8 sites can be built and maintained.
The sites that pass PR pass it using perfect anchor text. It requires a lot of work, but the return is there. Once the domain farm is established, it doesn't matter if the target sites get burned occasionally. The trick is creating a domain farm that doesn't look contrived.
The 50 sites are very clean sites that people don't mind linking to. You use these 50 sites for the link trades, and perhaps buy links to feed PR to these 50. I divide the 50 into groups of 10, and I only moderately cross-link between sites, almost never between sites in different groups of 10. I then use these 50 sites to focus PR onto a target site or sites.
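The grouping rule described above (split the clean sites into groups of 10 and keep cross-links inside a group) can be sketched as a toy script. The site names, group size, and per-site link counts below are placeholders of my own, not anything from the post:

```python
import random

def plan_links(sites, group_size=10, links_per_site=3, seed=0):
    """Toy link plan: split sites into fixed-size groups and pick each
    site's link targets only from inside its own group, never across."""
    rng = random.Random(seed)
    groups = [sites[i:i + group_size] for i in range(0, len(sites), group_size)]
    plan = {}
    for group in groups:
        for site in group:
            others = [s for s in group if s != site]
            plan[site] = rng.sample(others, min(links_per_site, len(others)))
    return plan
```

Keeping the link counts low and the groups isolated is what makes the pattern look less contrived than a fully interlinked farm.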
I suppose you can also filter the traffic from these 50 and use the traffic to feed toplists but that's another topic.
In your scenario how do the 250 sites come into play?
Are these only linked to from the 50?
It sounds like your cloaked domains are not in this group of 250 sites. Hence if a cloaked domain gets burned you disconnect it from the collective. Or are the target cloaked domains in the 50?
>>Why not try both?
And a question to all-
Is it just as effective to create 50 folders on a domain and treat each one as a separate feeder site? Let's say you have some crazy domain name like qwerty.com and you create sites, which may or may not be related, in subfolders like
The benefit is that by using includes you can easily make changes across all 50 sites in the blink of an eye.
As most of us here are experienced enough to know, the #10 position on page 1 of the Google SERPs for some of the competitive keywords is worth at least $6000+/month, so if you can grab that #10, it has to be said that cloaking is worth the effort. With some good commercial scripts available, it does not take any extraordinary effort either.
Blue_gravity listed some of the most important precautions but missed an important one:
- Keep your list of spiders updated.
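As a rough illustration only (the signature substrings and function here are my own invention, not from any particular commercial script), a spider check against a maintained list might look like the Python sketch below. Note that a plain user-agent test is exactly what a spoofing tool defeats, which is why serious scripts also match known spider IPs:

```python
# Toy sketch of a spider check against a maintained signature list.
# A user-agent string is trivial to spoof, so real cloaking scripts
# also check the visitor's IP against known spider ranges.

SPIDER_SIGNATURES = {
    "googlebot",  # Google
    "slurp",      # Inktomi / Yahoo
    "msnbot",     # MSN
    "teoma",      # Ask Jeeves
}

def is_spider(user_agent: str, signatures=SPIDER_SIGNATURES) -> bool:
    """True if the user-agent contains any known spider substring."""
    ua = user_agent.lower()
    return any(sig in ua for sig in signatures)
```

"Keeping the list updated" then just means keeping `SPIDER_SIGNATURES` (or whatever file it is loaded from) current as engines change their crawlers.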
Any pointers guys?
>>With some good commercial scripts available around it does not take any extraordinary effort also.
Is there a way to disguise or rename the cgi-bin?
Right now it seems to me that such a URL, by appearances, is a cloaking URL and will result in detection.
One way is to add some code to your .htaccess file to allow .htm files to be run as CGI even outside the cgi-bin.
Here's what mine looks like.
AddType application/x-httpd-cgi .htm
This may not work for you depending on your host and how they have the server set up.
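For what it's worth, on many Apache setups the handler form plus ExecCGI is what's needed instead. A hypothetical .htaccess sketch, assuming the host's AllowOverride permits Options and FileInfo (it often doesn't, which is the "depending on your host" caveat above):

```apache
# Hypothetical alternative: enable CGI execution in this directory
# and hand .htm files to the CGI handler. Requires AllowOverride
# Options,FileInfo on the host's side.
Options +ExecCGI
AddHandler cgi-script .htm
```

If the server returns a 500 error or serves the raw script source, the host most likely disallows these overrides, and a different host is the easier fix.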
MrSpeed has a great method, but if you cannot get it working, just get yourself a new virtual host for $10 a month. Make sure that any cloaking script or program you use can work from outside the cgi-bin; otherwise it's pretty detectable - do you really see a lot of high-ranked /cgi-bin/ sites?
Most of the mainstream scripts and programs do the same thing as one another. If you're looking to buy one, try it before you buy it. If there's no demo available, compare the product's features with its competitors'.
Cloaking for ranking: is it still worth the effort?
We have to handle thousands of products, and the best way to do it is to cloak our pages and redirect directly to our customer's pages.
Aren't you afraid of getting spam-reported when you redirect to the customer's page? Even if your redirection is server-side (the way cloaking works), competitors can see that the text in the SERP is not the same as on your client's site. How do you handle this?
If cloaked properly, the only way a competitor can see that a page is cloaked is to view the cache (if the search engine has one, like Google) or to use some sort of tool that spoofs the site into thinking it is a search engine.
In google there are ways to make it more difficult for a competitor to determine if you are cloaking. I think the answers have been pointed out in this thread and this one.
I think most would agree, however, that you do not need to cloak in Google to reach a top SERP.