|Themes & Cloaking|
A basis of understanding where similar tactics are useful...
| 1:53 pm on Jul 6, 2000 (gmt 0)|
I'm new to the themes deal. I mean in a sense I always used them, but never as an optimization technique. Lately I've noticed that some of you, Brett for example, have put a lot of stock into this being the greater part of site rankings these days.
Here is my question. I have 7 manipulated pages which serve different SE's via IP delivery. If I were to cross link these pages and build some sort of theme from the words used on each page in the title, metas, and headlines, would this be a significant help?
In some cases my primitive cloaking has done me good. (I'm from Long Island, why am I speaking hick?) Anyway, I've achieved SOME good rankings, but never consistently. All of the pages I have made are optimized on KW density and KW placement, but that doesn't seem to cut it.
The only other criteria, aside from the themes thing, is link pop, and I'm not too sure how to achieve that with my non-visible spider-served pages... any insight?
| 12:21 am on Jul 7, 2000 (gmt 0)|
I think it would, Joe. The difficult thing is adapting an existing site, because you are already dealing with pages that are indexed. What I've tried with Alta over on SEW is feeding 404's on existing pages and trying to establish a new theme. As always, I've had mixed luck over there, to the point that I'm starting to take it personally.
Fresh domains are another story. If you are starting with a new domain that has never been indexed before, that is the time to establish a theme from the get-go. I go so far as to set the thing up before it is open to the public and not have a root page. Normally your root page will walk on theme because it is diverse by its very nature.
I think a 10-page minimum for Alta and a 12-page minimum for Ink. A few less just doesn't seem to stick with Ink. I think there is something going on at Alta - I look for a change real soon and don't know how themes are going to work with the new algo I've heard is coming shortly.
Rolling your own link pop should be done the old-fashioned way. Go out and beat on some doors and get some links before you submit. I think there is a lot of prep work to be done on the domain before submitting. Also, don't be afraid to link from your other domains, even if the new site is off topic for the other domain. Just make sure that you use keywords of the new domain in the inbound link.
| 1:35 am on Jul 7, 2000 (gmt 0)|
For achieving link pop with your cloaked pages, you should treat the cloaked pages as a "sub site"; this will also allow you to "theme" the cloaked site. Notice I don't say "pages" but "site" - treating your cloaked site as individual pages is a mistake IMO.
For example, suppose the script you use has a directory for each engine you will cloak for. So that pages for AV are in one directory, pages for Ink are in another, pages for human visitors in yet another, and so on ... treat each of those directories as a sub site with links and all.
The script must use the same URL for each given page no matter which visitor (spider or person) it is serving. This is very important: link pop cannot be achieved if the URL is different per engine for the same page. Whew, hope that makes sense.
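A rough sketch of the idea AIR describes - one directory per engine, one for humans, and a single URL for everyone. This is not any particular cloaking script; the spider IP prefixes and directory names below are made up purely for illustration (a real setup would use your own log data):

```python
# Hypothetical IP-delivery sketch: the URL never changes,
# only the directory the content is pulled from.
# These crawler IP prefixes are invented for the example.
SPIDER_DIRS = {
    "av": ["204.123."],    # pretend AltaVista crawler range
    "ink": ["209.185."],   # pretend Inktomi crawler range
}
HUMAN_DIR = "human"

def resolve_dir(client_ip):
    """Pick the content directory based on the visitor's IP."""
    for engine, prefixes in SPIDER_DIRS.items():
        if any(client_ip.startswith(p) for p in prefixes):
            return engine
    return HUMAN_DIR

def resolve_file(client_ip, url_path):
    """Same public URL for every visitor; only the served file differs."""
    return f"{resolve_dir(client_ip)}/{url_path.lstrip('/')}"
```

Because every variant of a page hangs off the same URL, a link pointing at that URL counts toward it no matter which version the linker (or the spider) saw.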
| 3:18 pm on Jul 7, 2000 (gmt 0)|
Ok, so let me get this straight.
The cross linking would help a little at worst. That's good to know.
As for the link pop, well, you kind of confused me a bit AIR. You say I should treat my pages as a cloaked site... so in other words....?
As of now, I submit my script pages, which determine which IP gets what. My spider-fed pages are on a different domain, a sub web of a misspelled domain that we registered by accident. Now, in order to gain link popularity, I would have to assign the links to point to the redirect pages on the "dummy subdomain", right?
Excuse me if the questions seem simple, but the obvious has been eluding me since I started this cloaking stuff. On that note, I'm going to the cloaking for beginners thread.
| 4:53 pm on Jul 7, 2000 (gmt 0)|
Well sometimes you have to make the best of what you inherit or got stuck with.
What I am describing is a domain where the real site exists, and for each search engine you cloak for, the relationship is one real page to one cloaked page, per search engine.
If we introduce redirects on separate domains, with "doorwayish" pages, and a ratio of many pages to one, then all bets are off, because you don't have to cloak to get into trouble with these methods. Nor can you build link pop, 'cause I don't imagine you'll get too many people willing to link to the "promotional" domain.
| 5:23 pm on Jul 7, 2000 (gmt 0)|
Wow. After all my gibberish, you deciphered that and actually gave the answer I needed.
Welp. So basically there is no way I could cloak effectively unless I had the cloaking taking place on the actual domain.
I know it was obvious, but I need someone to say it.
There is one case, though, where I found this method used - how effectively, well, I've searched for them on a couple of engines and found nothing. Perhaps they have a bogus testimonial, which surprises no one. These are the results I got from ALL THE WEB after searching on the URL.
Now most of these are redirects. How exactly, I don't know, but I would imagine that they would be cloaking.
| 12:14 am on Jul 8, 2000 (gmt 0)|
If they are cloaking, and I don't believe they are, then they are serving up some pretty unattractive pages :-) It looks to me like they have just set up a bunch of third-level domains, used frames on some pages, and just plain made big text with their name in it. But their link popularity is pretty much non-existent. Do they rank well for any useful keywords in that category?
| 2:44 pm on Jul 9, 2000 (gmt 0)|
I am still a bit confused:
|The script must use the same URL no matter which visitor (spider or person) it is serving to for each given page. This is very important, link pop cannot be achieved if the URL is different per engine for the same page. Whew, hope that makes sense.|
Whoa, hmmmm . . .
All browser visible pages (for all SE's) are in one folder.
All SE-visible pages are in different folders (one for each SE).
Now, do you mean that the IP-visible pages must be located in a folder under the same URL? (Hmmm, all my SE-visibles are in my cgi-bin.)
Or do you mean that all browser visible pages must use the same URL for each SE?
|For example, suppose the script you use has a directory for each engine you will cloak for. So that pages for AV are in one directory, pages for Ink are in another, pages for human visitors in yet another, and so on ... treat each of those directories as a sub site with links and all.|
This means cross linking the pages in the browser visible folder, right?
I am cloaking "one-to-one", but when using [u]one[/u] browser visible folder, I can't cope with giving each browser visible page the same size as the SE-visible page. How should I manage that?
Thanks in advance!
Edited by: Val
| 5:20 pm on Jul 9, 2000 (gmt 0)|
|Or do you mean that all browser visible pages must use the same url for each SE? |
I mean that each browser visible page must use the same URL for each SE. This way, any links to your visible pages also count towards your cloaked pages. And as a spider follows links on your cloaked pages, it can weave in and out of cloaked pages if need be - i.e., some of the pages are cloaked and others are not - maintaining link integrity throughout.
|I can't cope with giving each browser visible page the same size as the SE-visible page. How should I manage that? |
This is only important from the point of view of thwarting snoopers; the SE's don't do anything with it. The way to handle it is to simply make the pages around the same size - the engines don't report the exact size. But I would not sacrifice well-ranking pages in order to match page sizes.
It is not an absolute that if page sizes don't match you must be cloaking. Especially with the SE's indexing schedules these days, it would take many months of tracking to determine it was not simply a page updated between indexings.
| 6:04 pm on Jul 9, 2000 (gmt 0)|
Tx Air, for your advice. I always wondered if the SE's were looking at that ....
Tx again :-)
| 2:27 am on Aug 3, 2001 (gmt 0)|
I'm still a little confused by this :(
I need to use a cloaked domain external to the target domain, for various reasons.
Could someone who has experience with this approach explain how I can get link pop happening on the cloaked domain?
| 7:21 pm on Aug 15, 2001 (gmt 0)|
Hi - We build different themed web sites, get them indexed in the directories, and then hang our special pages off of them, cross linking all themed KW/pages. We manage many domains, making sure that all their pages are thematically cross linked to every appropriate page.
We make sure that the client links to each of these special pages; otherwise they void the contract. Hope this helps....
Brett, Air or anyone, if you can find any holes in this please let me know. thanks much
| 8:27 pm on Aug 15, 2001 (gmt 0)|
>Could someone who has experience with this approach explain how I can get link pop happening on the cloaked domain?
You have to build it yourself. Be careful not to string too many sites together from the same class C.
| 8:38 pm on Aug 15, 2001 (gmt 0)|
Good point, Littleman - we make sure that the first three octets in the IPs of the linking pages are not the same.
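For anyone unsure what "first three octets" means in practice: two IPv4 addresses are on the same class C (the same /24 block) when their first three dot-separated numbers match. A tiny sketch of the check being described - the IPs used below are made up for illustration:

```python
def same_class_c(ip_a, ip_b):
    """True if two IPv4 addresses share their first three octets (same /24)."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

def class_c_clashes(ips):
    """Return pairs of hosts that sit on the same class C.

    Useful as a pre-flight check before interlinking a set of domains:
    any pair returned here shares a /24 and is the kind of footprint
    the posts above warn about.
    """
    seen = {}
    clashes = []
    for ip in ips:
        key = tuple(ip.split(".")[:3])
        if key in seen:
            clashes.append((seen[key], ip))
        else:
            seen[key] = ip
    return clashes
```

An empty result from `class_c_clashes` means no two hosts in the list share the same first three octets.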
| 9:07 pm on Aug 15, 2001 (gmt 0)|
Sevraypr, there is the road map danger. If you heavily cross link your client pages but don't have pages linking outside of your domain web, then someone could track down all your domains easily. I would mix your clients' links in with links to domains outside of your sphere -- blend them in.
| 10:11 pm on Aug 15, 2001 (gmt 0)|
Thanks for your replies
This different-octet approach - can you do that from one ISP? i.e., can you host multiple, different IPs for the purpose you describe and still be effective, or would I need to create these sites at different ISPs?
| 2:52 am on Aug 16, 2001 (gmt 0)|
Thanks Littleman, I've actually made this request to our team earlier today. You guys are sharp!
I learned yesterday that Google did ban a cloaked domain and ended up following the domain's outbound links (that were related to the domain) and banning all of them too.... the person said that their outbound links were not theme related and that he suspected that someone turned him in... This person said that their cloaked pages were also "machine generated", so there were a few areas here that probably helped these domains get busted. More fun and games....
Anyone else developing multiple personalities dealing with all this stuff ?
| 3:06 am on Aug 16, 2001 (gmt 0)|
>Anyone else developing multiple personalities dealing with all this stuff ?
Absolutely! (Sorry Littleman, couldn't resist.)
| 3:58 am on Aug 16, 2001 (gmt 0)|
>...or would I need to create these sites at different ISPs?
Generally, unless you are with a huge ISP that has many ranges of IPs (multiple class C's), you'll probably want multiple ISPs. Note that it isn't enough to just have different IPs; the starting octets should differ as well, not just the ending octet.
Of course littleman's caveat on linking just your domains together still stands, even with substantively different IP's.