

deny a spider from visiting a single page

   
9:34 pm on Dec 24, 2003 (gmt 0)

10+ Year Member



Hi

I'm new to experimenting with robots.txt files, and I want to try something.

I've found that optimizing the content on a site for the most part works effectively across the search engines - except, obviously, not Google. Not anymore. :(

I recently de-optimized a client's index page after it took a hit during the Florida update, in an attempt to make the content seem more organic and less obviously optimized.

Unfortunately, the rankings for that page then slipped on the other engines. So what I did was make a clone of the index page and name it index2. I plan to re-optimize index2 back to the way the original was, and to deny Googlebot visitation rights to it. Obviously, I don't want to keep Googlebot from visiting the rest of the site, though.

Is it possible to write a robots.txt file that keeps Googlebot from visiting just the index2 page?

Or is this just a stupid idea to begin with? Like I said, I have some room for experimenting here so I'm curious. Thanks in advance!

Michael

10:15 pm on Dec 24, 2003 (gmt 0)

WebmasterWorld Senior Member jdmorgan, 10+ Year Member



Michael,

Well, that would be:


# Block Googlebot, and only Googlebot, from the index2 page:
User-agent: Googlebot
Disallow: /index2.html
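
If you want to sanity-check that rule before trusting it, Python's standard library ships a robots.txt parser. Here's a minimal sketch, assuming the file above is the site's entire robots.txt and using example.com as a placeholder host:

import urllib.robotparser

# Parse the robots.txt rules shown above.
rules = [
    "User-agent: Googlebot",
    "Disallow: /index2.html",
]
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot is barred from index2 only; every other crawler falls
# through to the implicit default, which is "allow everything".
print(rp.can_fetch("Googlebot", "http://www.example.com/index2.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))   # True
print(rp.can_fetch("Slurp", "http://www.example.com/index2.html"))      # True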

The other alternative is cloaking. Personally, I wouldn't recommend either approach, unless it's a throw-away domain. I'd simply find a way to integrate a second, different, and useful page that would rank well where the other one does not. Others may disagree.
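
For the record, "cloaking" here just means serving different content depending on the requesting User-Agent. A bare-bones sketch of the idea, as a hypothetical Python WSGI app (the page bodies are placeholders, and this is exactly the sort of thing that gets a site penalized if Google catches it):

import wsgiref.simple_server

def app(environ, start_response):
    # Serve the de-optimized copy to Googlebot and the optimized
    # copy to every other visitor, keyed on the User-Agent header.
    ua = environ.get("HTTP_USER_AGENT", "")
    if "Googlebot" in ua:
        body = b"<html>[de-optimized copy for Google]</html>"
    else:
        body = b"<html>[optimized copy for the other engines]</html>"
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]

if __name__ == "__main__":
    wsgiref.simple_server.make_server("", 8000, app).serve_forever()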

The main problem I see with this two-page robots approach is that you will be splitting your incoming links across two pages. Although the engines other than Google don't use the PageRank concept per se, some of them do take "link popularity" and clustering into account. So you'll take a hit from that angle as well. :(

Jim

3:08 pm on Dec 29, 2003 (gmt 0)

10+ Year Member



Thanks for your insight, Jim! I'll try the robots.txt approach and see what happens.
 
