Msg#: 4058654 posted 9:57 pm on Jan 11, 2010 (gmt 0)
Hi. I posted a similar topic recently, but this question applies much more to the sitemap.xml file than to any other file.
I'm looking for a way for me and other webmasters to SHOW (obviously) the sitemap to Google's (and other SEs') bots, but to HIDE it from public (competitors'!) viewing. Otherwise it seems like we're essentially paying for our competitors' SpyFu etc. accounts.
I don't want our competitors to see our whole list of keywords, which are built into the URLs — it's a very common thing for webmasters to structure URLs that way. Publishing the sitemap is like handing your entire private keyword list and competitive business information over to your competitors. It's like having all your private bank accounts and PINs :) posted publicly.
Msg#: 4058654 posted 12:03 pm on Jan 12, 2010 (gmt 0)
It's easy to spider a site to get a list of all the URLs and titles, so hiding the sitemap only raises the bar. You might consider responding with a 403 based on user agent, but user-agent spoofing is also fairly simple. If you want to get more sophisticated, the Search Engine Spider and User Agent Identification forum Charter [webmasterworld.com] has information about verifying Googlebot, and it is a similar process for the other search engines.
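For readers who haven't seen the verification process mentioned above: the usual approach is a two-step DNS check — reverse-resolve the requesting IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. A rough sketch in Python (the function name and the injectable lookup parameters are my own illustration, not any official API; in a real handler you'd call this before deciding whether to serve sitemap.xml or return a 403):

```python
import socket

def is_verified_googlebot(ip,
                          reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                          forward_lookup=socket.gethostbyname):
    """Return True only if `ip` passes the reverse + forward DNS check.

    Step 1: reverse-DNS the IP; the hostname must end in
            .googlebot.com or .google.com (a spoofed UA can't fake this).
    Step 2: forward-resolve that hostname; it must map back to the
            same IP, otherwise the PTR record itself could be forged.
    The lookups are injectable so the logic can be tested offline.
    """
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

The key point is that the forward lookup closes the loophole: anyone can set a PTR record claiming to be `crawl-xx.googlebot.com`, but only Google can make that hostname resolve back to the requesting IP.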
Msg#: 4058654 posted 5:23 am on Feb 2, 2010 (gmt 0)
AWESOME! Yeah, I thought I saw something like that a few weeks ago, but I guess it sort of went "in one eye and out the other". Hey, thanks — sometimes the simple ideas are the best. For other readers: as phranque said, competitors can still spider your site and then fetch each of those URLs one by one (more likely with a script), but right now yours is the best idea. As always I'm open to new ideas, and it would be cool to see additional solutions here. Thanks so much.