Blocking Includes (SSI)

direktor1

2:22 pm on Jan 7, 2004 (gmt 0)

Hi,

I'm having trouble using server-side includes for a drop-down menu on my site. When a robot crawls a page, it reads everything in the drop-down menu, and since the menu is on every page, this lowers the relevancy of search results.

If I add a line:
Disallow: /include.inc

Or

Disallow: /includes/

to my robots.txt file, will that block the robot from looking at the included file on the page, or will it still read the include after the code has been processed?
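
For context, each page pulls the menu in with a standard SSI directive, something like this (the file names are just examples):

<!--#include virtual="/includes/menu.inc" -->
<p>Unique page content here...</p>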

If that does not work, can anyone suggest a way to stop the robot from looking at a drop-down menu in an include, other than using client-side scripting?

Thanks

Adrian

PCInk

3:46 pm on Jan 7, 2004 (gmt 0)

No, that won't block it. Your server will merge the included file into the page (say index.shtml) and send the result back to the spider, and the spider will index the full file (index.shtml has not been excluded by robots.txt). The spider knows nothing about the includes, as your server processed them prior to 'despatch'.
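
To illustrate (a rough sketch of what the spider receives, assuming a simple <select> menu inside the include):

<select name="menu">
<option>Home</option>
<option>Products</option>
</select>
<p>Unique page content</p>

There is no trace of the include directive left in the output, so the Disallow rule never applies: the spider never requests /includes/menu.inc itself.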

Client-side scripting is one of the few possibilities, but that amounts to cloaking. Google do not like cloaking and can (and will) drop you out of their index for this tactic.

Some people seem to think that using CSS to position the menu where you want it, whilst keeping the drop-down menu code at the end of your HTML file, will help. I haven't tried this yet, so I cannot comment on whether it is true. It rests on two assumptions: that search engines give more weight to words higher up in the HTML source, and that they only index a fixed amount of the code, ignoring whatever comes after.
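
If you want to experiment, the idea would look roughly like this (an untested sketch; the id and file names are just examples):

<style type="text/css">
#menu { position: absolute; top: 0; left: 0; }
</style>

<p>Main page content comes first in the source...</p>

<div id="menu">
<!--#include virtual="/includes/menu.inc" -->
</div>

The menu markup sits at the end of the source but is drawn at the top of the rendered page.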

direktor1

4:03 pm on Jan 7, 2004 (gmt 0)

Thanks for the reply. I will look into doing that.
 
