| 4:13 pm on May 5, 2003 (gmt 0)|
At least 100 URLs to play around with - keep in mind that it's better to use dummy URLs, because you don't want your mother URL banned or reported.
Can you explain that? I just don't get what you mean...
| 4:21 pm on May 5, 2003 (gmt 0)|
100 URLs to play with means someone should have an arsenal of at least 100 URLs for the keywords they are targeting. When one goes down, you still have 99 more in the index.
Dummy URLs refers to creating cloaked-only URLs that point to your main URL.
| 4:23 pm on May 5, 2003 (gmt 0)|
So you mean 100 domains?
| 4:31 pm on May 5, 2003 (gmt 0)|
Yes, URLs & domains are the same to me ;]
| 4:36 pm on May 5, 2003 (gmt 0)|
Now I'm confused:
|I don't use more than 3-5 domains per URL |
|Yes, URLs & domains are the same to me |
So, taking those 2 together yields:
"I don't use more than 3-5 domains per domain"
Could you elaborate on my obvious misunderstanding?
| 4:42 pm on May 5, 2003 (gmt 0)|
Ahh, I'm sorry, that was a typo. It's:
"I don't use more than 3-5 URLs/domains per IP"
I'll change it.
| 4:46 pm on May 5, 2003 (gmt 0)|
Let me sum up what I think you're saying:
You have a target set of KWs for 'blue widgets'.
You set up one main site that sells blue widgets and do nothing dodgy on it.
You then set up 99 'garbage sites' - Sites with no real content, possibly just auto generated nonsense.
These 99 sites link to your main site for good KWs.
Is that what you're saying?
| 5:01 pm on May 5, 2003 (gmt 0)|
Pretty much, but I wouldn't list them as pure garbage sites, because they all have SOME content on them; they aren't pure spammy crap. They could definitely be categorized as regular sites if you removed the cloaking tools.
| 5:06 pm on May 5, 2003 (gmt 0)|
Care to expand on that?
Whether you do or not, thanks for a fascinating thread ;-)
| 5:24 pm on May 5, 2003 (gmt 0)|
Cloaking tools = scripts, methods, approaches, tactics.
People who just use scripts and artificial page generation are bound to get banned or caught much quicker than someone who uses specific tactics for specific engines, different keyword approaches, custom methods and scripts, and less greed for particular keywords. Do those, and you will have your domains on the engines for many, many months to come.
| 5:27 pm on May 5, 2003 (gmt 0)|
Not to sound like a complainer or anything, but if others can put up their own tips, maybe we can sum up cloaking as a whole - tips, tricks, do's and don'ts - in one thread, without anyone actually revealing their secrets, rather than have many threads to sift through. I just think it would help out the people thinking about cloaking, and perhaps clean up the cloaking aspect of SEO little by little.
| 5:30 pm on May 5, 2003 (gmt 0)|
I agree, unfortunately I have nothing to share. I think you'll find that your willingness to share knowledge will not be reciprocated. It's a paranoid area by definition.
Having said that, though, you never know what may happen, and I'd love to hear some other experiences...
| 5:32 pm on May 5, 2003 (gmt 0)|
Yeah you're probably right. Perhaps I'll release new tips and tricks every month or so, to keep everyone in check.
| 5:36 pm on May 5, 2003 (gmt 0)|
Let's not forget this:
| 5:54 pm on May 5, 2003 (gmt 0)|
If I get a new client, I often get a kw-kw.com domain on a unique IP and set up about 100 manually optimised cloaked pages (with manually inserted filler text) for the top 100 keywords in that area, based on what Overture/Wordtracker/AdWords are saying.
Every cloaked page is only targeted at that 1 phrase which is highly relevant to the client's actual product/service.
There's no X-linking between cloaked sites, which are each on a unique IP.
The "real" pages often lie on top of the cloaked pages, and if someone clicks a cloaked page ranking high in Google, they usually get taken through to the client's home page - either on the kw-kw.com domain, or redirected to the main domain using a combination of meta refresh & js redirect. I don't use frames cos I find them too much hassle.
I also use a js "cache-buster" so if "cache" appears in the address bar (as on G), the user is taken to the correct page. I don't use <noarchive>
The 100 cloaked pages are split into 3 categories: Primary KWs, secondary & other. The linking on the cloaked pages is arranged to maximise the PR of the top 6 KWs.
I also build 3rd party links to the cloaked site.
Have you ever had sites banned that you didn't submit to Google? Is a manual editor seriously likely to x-check the database of submitted URLs if they're looking at a spam report? Also - Can you elaborate on your 4th point? Not sure I'm with you?
<edit>Last sentence added</edit>
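For historical context, the meta refresh & js redirect and the js "cache-buster" described above might be sketched like this. This is an illustrative reconstruction, not the poster's actual code: the function names, the doorway markup, and the simple "cache" substring test are all assumptions.

```javascript
// Illustrative sketch of the 2003-era techniques described above.
// All names and details here are assumptions, not the poster's code.

// A doorway page: a meta refresh fallback plus a JS redirect, so a
// human visitor landing on the cloaked page is passed through to
// the client's real page.
function doorwayHtml(targetUrl) {
  return [
    '<html><head>',
    // meta refresh fires even with JavaScript disabled
    '<meta http-equiv="refresh" content="0;url=' + targetUrl + '">',
    // JS redirect fires first in ordinary browsers
    '<script>window.location.replace("' + targetUrl + '");</script>',
    '</head><body></body></html>'
  ].join('\n');
}

// "Cache-buster": if "cache" appears in the address (as it did when
// viewing a page through Google's cache), send the visitor to the
// live page instead. Written as a pure function so the decision is
// easy to check; a real page would call it with window.location.href
// and assign any non-null result back to window.location.
function cacheBusterTarget(currentUrl, liveUrl) {
  return currentUrl.indexOf('cache') !== -1 ? liveUrl : null;
}
```

On a real page the second piece would run inline, something like `var t = cacheBusterTarget(window.location.href, 'http://www.example.com/'); if (t) window.location.replace(t);`.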
| 5:58 pm on May 5, 2003 (gmt 0)|
1. Don't Use Automated Page Generators
2. Don't neglect your cloaked pages, change them a bit now and then.
3. Don't create a 50k page for viewers and a 3k cloaked page. ;)
4. Don't use an out of the box solution as is. Change directory and file names.
5. Don't burn 100 domains on one target domain. Spread them around a bit.
1. Do use the nocache tag for Google and don't forget about Gigablast's cache feature.
2. Do spend the time necessary to create cloaked pages that resemble real pages.
3. Do create a "clean" version of the site on a clean domain and have it ready.
4. Do create a noindex, nofollow site for Google and target other SEs with it.
5. Do remember to create outbound links on your cloaked pages to sites that aren't in your shadow neighborhood.
Expect to get caught now and then and prepare for it. Once a domain is burned remove all references for that domain from all your sites and do it quickly. Switch hosts occasionally. Toss in a few free host sites. Let a few sites die natural deaths.
Change your links around. Remove a few, add a few. Create a few off-topic sites and link them to your target. Don't get carried away with perfect anchor text. Use Kartoo now and then to see what your neighborhood looks like. ;)
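The "nocache" and noindex/nofollow items in the do-list above come down to robots meta tags. A minimal sketch follows; the helper name and the engine labels are my assumptions, not part of the original post.

```javascript
// Sketch of the robots meta tags behind "do" items 1 and 4 above.
// The labels and this helper function are illustrative assumptions.
function robotsMetaTag(engine) {
  // Item 1: stop Google from showing a cached copy of the page.
  if (engine === 'google-cache-off') {
    return '<meta name="googlebot" content="noarchive">';
  }
  // Item 4: a site Google should neither index nor follow,
  // leaving it free to target other engines of the day.
  if (engine === 'google-keep-out') {
    return '<meta name="googlebot" content="noindex,nofollow">';
  }
  // Default: an ordinary indexable page for everything else.
  return '<meta name="robots" content="index,follow">';
}
```

Using the `googlebot` meta name (rather than `robots`) scopes the directive to Google alone, which is what "target other SEs with it" relies on.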
| 6:25 pm on May 5, 2003 (gmt 0)|
Obviously I am a newbie. I've never used cloaking - I even have to ask this question:
What is cloaking?
Could you masters explain a little bit?
Thanks much for your time .....
| 6:42 pm on May 5, 2003 (gmt 0)|
Let's not forget this:
| 6:54 pm on May 5, 2003 (gmt 0)|
Thanks Quinn.... and Air!
| 7:04 pm on May 5, 2003 (gmt 0)|
Awesome post! You definitely filled in a lot of the gaps that I forgot.
| 7:10 pm on May 5, 2003 (gmt 0)|
I use Kartoo [kartoo.com] to get a visual picture of link partners and strive to make sure that the neighborhood looks "natural".
If you compare your carefully crafted cloaked sites to other "natural" sites you can sometimes spot glaring differences in link patterns. If your links don't appear natural it is well worth spending the time fixing the problem.
| 7:14 pm on May 5, 2003 (gmt 0)|
Thanks for showing me Kartoo. I was using similar Java-type sites, but the Flash stuff just makes it easier and more informative.
| 9:21 pm on May 5, 2003 (gmt 0)|
In my opinion, for what it's worth, there's no need at all to cloak to get good rankings. Having said that we manage over 200 sites and successfully cloak them all.
Brett Tabke's 12-month plan is all you need, but if you've read it you'll realise it takes hard work and effort to make a single website successful. More than a few websites? Cloak - but be careful; it has to be done correctly, and if not you could lose everything.
| 10:22 pm on May 5, 2003 (gmt 0)|
Aren't you contradicting yourself when you said this:
"In my opinion, for what it's worth, there's no need at all to cloak to get good rankings. Having said that we manage over 200 sites and successfully cloak them all."
Then why do you cloak?
| 11:19 pm on May 5, 2003 (gmt 0)|
|Aren't you contradicting yourself when you said this |
I also said - more than a few websites, then cloak. We have over 200 - we couldn't possibly manage them all each month using Brett's 12-month guide without some form of automation or cloaking. With one or two websites you don't need to cloak; any more and you'll find yourself looking for shortcuts, and like I said, you've got to be careful.
| 12:26 am on May 7, 2003 (gmt 0)|
|I think you'll find that your willingness to share knowledge will not be reciprocated. It's a paranoid area by definition. |
I find it's that way at the affiliate board too. Too bad, since there is plenty to go around for everyone. I may not be working the same niche as the next guy, and what I lack in one area I may excel at in another. To paraphrase a stupid corporate slogan...
TEAM - Together Everyone Achieves More
|1. Do use the nocache tag for Google and don't forget about Gigablast's cache feature. |
Some people claim that throws up a red flag.
| 12:30 am on May 7, 2003 (gmt 0)|
>>Some people claim that throws up a red flag.
That rumor was certainly prevalent a year ago but I have seen absolutely no evidence to support it. There are valid reasons for not allowing Google to cache your pages.
What I know for certain is that if you don't use the nocache tag you better make sure your pages are so similar that no one can tell the difference. ;)
| 4:07 am on May 7, 2003 (gmt 0)|
I don't think it throws up any "red flag". I've actually asked people about this today, and all agreed with me that it does nothing of the sort, and that it was probably just a rumor started to scare people into not using it, so they could see each other's SEO methods. I use it, and I think everyone who cloaks should use it too. It's there for a reason.
| 9:50 am on May 7, 2003 (gmt 0)|
Anyone spot the post from Googleguy approx 4/5 months ago (on this forum I think) saying that they'd looked at a cross-section of sites using the nocache/noarchive tag and 95% were cloaking?
Take that in combination with the chat WebGuerrila(?) had with Matt at pubcon last week, which was along the lines that they're looking to implement a semi-automated system to check spam reports, and using <nocache> might not be so smart.
A competitor could easily spot the absence of a cache link and then report you for spam, and these reports would be automatically scanned.
In addition, couldn't any new system as suggested also be applied to every site that uses <nocache> & then automatically compared with results through a browser?
If someone can explain to me why it's hard for Google to spot UA & IP cloakers using nocache, I'd love to know.
I'm just a bit worried about implementing it with all I've read to date. Any thoughts?