
Home / Forums Index / Search Engines / Sitemaps, Meta Data, and robots.txt
Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

    
How to block original + localized pages?
robots.txt localized
vanguard2000
msg:3603148
6:19 pm on Mar 17, 2008 (gmt 0)

Hi all,

I have a website with localized pages in several languages. I'd like to block the "static" directory, since it's full of unstructured data that would raise duplicate-content flags. The URL structure looks like this:

http://www.example.com/en/static/ (when the browser's language is set to English),
http://www.example.com/fr/static/ (when it's set to French), and so on.

How do I block "static" properly?

Would this work?

Disallow: /*/static/

I have a feeling that it would block everything :) Hehe...

Any help is appreciated!
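[Editor's note] As a rough check on the rule in the question: `*` in robots.txt paths is an extension honored by Google and other major crawlers, not part of the original robots.txt standard, so results vary by bot. A minimal sketch of that wildcard matching (an illustration, not any crawler's actual implementation) suggests `Disallow: /*/static/` would block the localized static directories without blocking everything else:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Sketch of Google-style robots.txt path matching.

    '*' matches any run of characters; a trailing '$' anchors the end.
    This is an illustration only, not a crawler's real matcher.
    """
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    # robots.txt rules match from the start of the URL path
    return re.match(pattern, path) is not None

rule = "/*/static/"

print(rule_matches(rule, "/en/static/"))      # True  - blocked
print(rule_matches(rule, "/fr/static/page"))  # True  - blocked
print(rule_matches(rule, "/en/products/"))    # False - still crawlable
print(rule_matches(rule, "/static/"))         # False - bare path not matched
```

Under this matching, the rule does not "block everything": only paths with a segment before `/static/` match. For crawlers that ignore wildcards, a fallback (assuming a known, fixed set of language codes) would be to list each directory explicitly, e.g. `Disallow: /en/static/` and `Disallow: /fr/static/`.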

 


© Webmaster World 1996-2014 all rights reserved