
Sitemaps, Meta Data, and robots.txt Forum

    
How to block original + localized pages?
robots.txt localized
vanguard2000 · msg:3603148 · 6:19 pm on Mar 17, 2008 (gmt 0)

Hi all,

I have a website with localized pages in several languages. I'd like to block the "static" directory, since it's full of unstructured data that would raise duplicate-content flags. The URL structure looks like this:

http://www.example.com/en/static/ (when your browser's language is set to English)
http://www.example.com/fr/static/ (when it's set to French), etc.

How do I block "static" properly?

Would this work?

Disallow: /*/static/

I have a feeling that it would block everything :) Hehe...
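It shouldn't block everything. Wildcards in Disallow aren't part of the original robots.txt standard, but Googlebot and the other major crawlers honor them: `*` matches any run of characters, `$` anchors the end, and a rule matches any path that starts with the pattern. Assuming a crawler with Google-style matching, a small sketch (`robots_match` is a hypothetical helper written for illustration, not a library function) shows what the proposed rule would and wouldn't catch:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Match a Disallow pattern against a URL path using
    Google-style wildcard rules: '*' matches any run of
    characters, '$' anchors the end, and otherwise the rule
    matches any path that starts with the pattern."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    # re.match anchors at the start only, giving prefix semantics.
    return re.match(regex, path) is not None

# "/*/static/" blocks each localized copy of /static/ ...
print(robots_match("/*/static/", "/en/static/index.html"))  # True
print(robots_match("/*/static/", "/fr/static/"))            # True
# ... but leaves ordinary localized pages crawlable:
print(robots_match("/*/static/", "/en/products/widget"))    # False
```

For crawlers that don't implement the wildcard extension, the safe fallback is to list each locale explicitly (`Disallow: /en/static/`, `Disallow: /fr/static/`, and so on).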

Any help is appreciated!

 
