
Cloaking Forum

New ASP.NET feature for cloaking
johnhamman
10+ Year Member
Msg#: 332 posted 10:41 pm on Apr 3, 2002 (gmt 0)

Hi all, a question:
ASP.NET has some great new features, and I was wondering if anyone has any detailed information about the new HttpBrowserCapabilities.Crawler property, which 'Indicates whether the client browser is a web-crawler/spider from a search-engine.' Does anyone have any idea how reliable it is? I can see great potential if it really does weed out the crawlers.
Anyone else out there know about this?
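For anyone who wants to see what the call looks like, here's a minimal sketch (the class name is mine); Crawler is just a boolean on the HttpBrowserCapabilities object that Request.Browser returns:

using System;
using System.Web;
using System.Web.UI;

// Minimal sketch of a page that checks the Crawler flag.
public class CrawlerCheckPage : Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        // Request.Browser is an HttpBrowserCapabilities instance built
        // from the user-agent filters in machine.config.
        HttpBrowserCapabilities caps = Request.Browser;

        if (caps.Crawler)
        {
            Response.Write("Identified as a search engine crawler.");
        }
        else
        {
            Response.Write("Identified as a regular browser.");
        }
    }
}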

 

Air
WebmasterWorld Senior Member 10+ Year Member
Msg#: 332 posted 12:07 am on Apr 4, 2002 (gmt 0)

Hello johnhamman, welcome to wmw.

It will be no more or less effective than any other user-agent sniffing method. HttpBrowserCapabilities relies on the filter definitions in the configuration to detect browser capabilities, and it identifies search engines the same way. I guess the short answer is that it will be as good as the filter definitions; add the normal misses attributable to user-agent spoofing, and we're pretty much where we started :)
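To put that in code: conceptually the framework is just doing something like the following against the User-Agent header, so a spoofed UA sails straight through. The patterns below are illustrative only, not the actual machine.config filter definitions:

using System.Text.RegularExpressions;

// Rough equivalent of what the framework does: match the User-Agent
// header against a list of filter patterns.
public class UaSniffer
{
    // Illustrative patterns; the real definitions live in the
    // browser-capabilities section of machine.config.
    private static readonly string[] crawlerPatterns = {
        "Googlebot", "Slurp", "Scooter", "ia_archiver"
    };

    public static bool LooksLikeCrawler(string userAgent)
    {
        if (userAgent == null)
            return false;

        foreach (string pattern in crawlerPatterns)
        {
            // Same idea as Crawler == true: the UA matched a filter.
            if (Regex.IsMatch(userAgent, pattern, RegexOptions.IgnoreCase))
                return true;
        }

        // A spoofed user agent is not caught here at all.
        return false;
    }
}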

johnhamman
10+ Year Member
Msg#: 332 posted 2:59 am on Apr 4, 2002 (gmt 0)

So really, what I need to do is add an IP filter to this built-in service, and then it might be more effective?

digitalghost
WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 332 posted 3:13 am on Apr 4, 2002 (gmt 0)

I always recommend UA and IP identification for cloaking.
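Roughly, the combination looks like this (a sketch only; the IP prefixes below are hypothetical placeholders, and you'd need to maintain your own up-to-date list of known spider addresses):

using System.Web;

// Sketch of combining the UA check with an IP check.
public class CloakCheck
{
    // Hypothetical crawler IP prefixes, for illustration only.
    private static readonly string[] knownSpiderPrefixes = {
        "64.68.", "216.239."
    };

    public static bool IsVerifiedCrawler(HttpRequest request)
    {
        // Step 1: the user agent must claim to be a crawler.
        if (!request.Browser.Crawler)
            return false;

        // Step 2: the request must also come from a known spider IP,
        // which defeats simple user-agent spoofing.
        string ip = request.UserHostAddress;
        foreach (string prefix in knownSpiderPrefixes)
        {
            if (ip != null && ip.StartsWith(prefix))
                return true;
        }
        return false;
    }
}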

As for effectiveness, that is dependent on what the purpose of the cloak is. If you are just sniffing to redirect for specific browsers then simple techniques can be effective.

If you're interested in serving up highly optimized pages to search engines while delivering different pages to end users, then scripts need to be written, templates created for specific engines, rendering times evaluated, and pages carefully created to ensure that human review can't determine that cloaking is being utilized.

I'm not sure what techniques the SEs are using as their de facto method to determine cloaking, but competitors that want to report cloaked sites use a variety of different methods. It pays to be familiar with all of them before cloaking a site.

DG

johnhamman
10+ Year Member
Msg#: 332 posted 3:43 am on Apr 4, 2002 (gmt 0)

Great, I appreciate that. Good info. I'm working on making a DLL that will carefully filter requests and redirect them to the appropriate page for SEO.
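One shape that DLL might take (just a sketch; the class and path names here are made up) is an IHttpModule that inspects each request and rewrites crawlers to a different page before the normal handler runs:

using System;
using System.Web;

// Sketch of a reusable module that could live in its own DLL.
public class CrawlerRedirectModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += new EventHandler(OnBeginRequest);
    }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // Send identified crawlers to the optimized version of the
        // page; everyone else gets the normal page.
        if (app.Request.Browser.Crawler)
        {
            app.Context.RewritePath("/optimized" + app.Request.Path);
        }
    }

    public void Dispose()
    {
    }
}

You'd wire it up with an <add> entry in the <httpModules> section of web.config so it runs on every request.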
