| 12:01 pm on Oct 16, 2001 (gmt 0)|
Well, if the spider uses its stock user agent, then it will see the page intended for the SE using that UA; if it uses some other user agent, then it will likely see the page intended for regular (non-SE) visitors.
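In script form, that dispatch logic looks roughly like this. A minimal sketch only: the agent strings and file names are illustrative (Googlebot, FAST-WebCrawler, and Scooter were the stock agents of Google, FAST, and AltaVista in this era), and a real script would also check IP addresses.

```python
# Hypothetical minimal UA-based cloaking dispatch.
# Substring match against a hand-maintained list of stock spider agents.

KNOWN_SPIDER_AGENTS = {
    "Googlebot",        # Google's stock agent
    "FAST-WebCrawler",  # FAST / AllTheWeb
    "Scooter",          # AltaVista
}

def page_for(user_agent: str) -> str:
    """Return which page a request with this user agent should receive."""
    if any(agent in user_agent for agent in KNOWN_SPIDER_AGENTS):
        return "optimized.html"  # page intended for the engine
    return "visitor.html"        # page for regular visitors

print(page_for("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))
# -> optimized.html
```

Note this is exactly why UA spoofing works against it: anyone who sends `Googlebot` in their User-Agent header sees the spider page.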
| 12:13 pm on Oct 16, 2001 (gmt 0)|
Okay, but let's say Google, FAST, and AV...what are their spiders likely to see?
| 2:05 pm on Oct 16, 2001 (gmt 0)|
Given that they use their stock agents, they would probably see the cloaked page. And with Google, it would also cache the page unless the noarchive tag is used. The Google agent is pretty consistent, whereas the FAST and AV agents appear to change monthly, often with only minor changes.
Beware that user-agent spoofing is simple, and "others" could find the pages too.
| 2:06 pm on Oct 16, 2001 (gmt 0)|
I don't mind, because those pages are still relevant.
| 6:25 pm on Oct 19, 2001 (gmt 0)|
Worst case, some pages might be shown to the wrong target. Depending on the source of the request, it may not even be noticed. If it's effective, though, it may draw attention from the competition...
I picked up two more AV agents the past few days:
| 6:27 pm on Oct 19, 2001 (gmt 0)|
Also meant to add a new FAST agent: FAST-WebCrawler/3.3
| 8:45 pm on Oct 19, 2001 (gmt 0)|
So the cloaking is only as good as the UAs or IP addresses it depends on to deliver the content. Which leads me to ask: where can an up-to-date list of UAs and IPs be found, or is this more of a "home brew" where you add your own as you catch them?
Sounds like too iffy a proposition without good lists.
| 9:28 pm on Oct 19, 2001 (gmt 0)|
Do yourself a favor and just buy a real cloaking script. The one I use is top notch and I have complete confidence in it...it actually notifies me of new agents and IPs (how's that for service).
I get a kick out of all the non-cloaking "doomsayers" who think it's certain death to cloak a page. Nonsense. The people who fear cloaking the most (and are usually most vocal against it) are the ones who don't understand how it works. Granted, without the script I use, I wouldn't cloak. But understanding how it works and what it does...I can cloak with assurance that if in fact I do get busted, somebody went to great lengths to do so.
| 9:46 pm on Oct 19, 2001 (gmt 0)|
The technology/scripting is pretty straightforward, I'm just wondering where you find a reliable list.
I've seen the question asked here before and the response is usually ahhh...silence.
| 9:47 pm on Oct 19, 2001 (gmt 0)|
>>>>I've seen the question asked here before and the response is usually ahhh...silence
Do yourself a favor and just buy a real cloaking script...the rest will follow.
| 9:48 pm on Oct 19, 2001 (gmt 0)|
I already have a script that works fine, I think you are suggesting more of a "subscription" than a script charge.
| 9:56 pm on Oct 19, 2001 (gmt 0)|
It's a club...we even have a secret handshake.
| 9:59 pm on Oct 19, 2001 (gmt 0)|
So I guess you are doomed unless you buy the secret decoder ring and engage in ritualistic behaviour.
| 10:20 pm on Oct 19, 2001 (gmt 0)|
>>>secret decoder ring
reading the log files
You're catching on very quickly :)
| 10:26 pm on Oct 19, 2001 (gmt 0)|
awww...I was looking for something easier than that.
Thanks for the info.
| 10:29 pm on Oct 19, 2001 (gmt 0)|
>people that fear cloaking the most(and are usually most vocal against it) are the ones who don't understand how it works.
Yup. I used to HATE cloaking before I could do it. Now that I can do it, I LOVE it. (Can't yet claim to totally understand it, mind...working on it though.)
| 3:00 pm on Oct 20, 2001 (gmt 0)|
The biggest success or failure of cloaking will come down to how current the agent and IP address lists are. Commercial products typically include a subscription service that provides updates for agents and addresses.
Without such a service, one would have to build the list manually. There are a few resources available to start an initial list; however, as toolman points out, only diligent review of the log files and use of reverse lookups will keep the data current.
An intelligent script might be able to add new agents and addresses on the fly. If carefully written, it could cross-reference existing data and compare items such as DNS names, user agents, and IP addresses. For example, if a spider used a new UA and the IP address was already recorded, the UA would be inserted as a new spider UA. If a new IP address was detected but the UA matched, then a simple class C match against existing addresses could be used to determine its authenticity.
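The cross-referencing heuristic just described can be sketched as follows. The seed data is purely illustrative, not a real spider list, and "class C" here means the first three octets of the address.

```python
# Sketch of the on-the-fly cross-referencing heuristic: trust a new UA
# from a recorded IP, and trust a new IP with a known UA when its
# class C network matches a recorded address. Seed data is made up.

known = {
    ("216.239.46.20", "Googlebot/2.1"),  # illustrative entry only
}
known_ips = {ip for ip, _ in known}
known_agents = {ua for _, ua in known}

def class_c(ip: str) -> str:
    """First three octets -- the class C network."""
    return ip.rsplit(".", 1)[0]

known_class_cs = {class_c(ip) for ip in known_ips}

def classify(ip: str, ua: str) -> str:
    """Decide whether a request looks like a genuine spider."""
    if (ip, ua) in known:
        return "known spider"
    if ip in known_ips:
        return "new UA for a recorded IP -> record it"
    if ua in known_agents and class_c(ip) in known_class_cs:
        return "new IP, known UA, class C matches -> record it"
    return "unverified"
```

So a spider switching agent strings (as FAST and AV apparently do monthly) gets picked up automatically, because its addresses are already on file.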
| 8:32 am on Oct 23, 2001 (gmt 0)|
I'm working on it Junior!
| 9:23 am on Oct 23, 2001 (gmt 0)|
nicebloke (or others)>>How about making a cloaking HOWTO?
I know nothing about cloaking; I don't hate it, don't love it.
I would love to know how to do it, what tools to use, and so forth.