

Robots.txt & Sitemaps for multi-country website

     

sjcbayona

6:23 am on Nov 11, 2011 (gmt 0)



Hi everyone,

I hope I've posted this question in the right category.

If you are doing international SEO for one site, e.g. www.website.com, targeting multiple markets, and the local sites use a subfolder structure, how would you set up the robots.txt & sitemap.xml files for each of these local sites:

1. www.website.com/fr
2. www.website.com/in
3. www.website.com/sg

?

Thanks in advance for all the help.

Cheers!
Steve

tedster

6:27 pm on Nov 13, 2011 (gmt 0)




Here's how I would approach it. First, be sure to verify each subfolder as "a website" in Google Webmaster Tools.

You only need a single robots.txt, served from the domain root. You can get some economies in your Disallow rules by using pattern-matching wildcards - in particular the "*" character. If the same URL pattern appears in more than one subfolder, a single rule like this would cover all of the prohibited files:

Disallow: /*/[filepath]
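For example, a root robots.txt for the three subfolders above might look like this (the /private/ and /search.php paths are hypothetical stand-ins for whatever you actually need to block):

```
User-agent: *
# Matches /fr/private/, /in/private/, /sg/private/, etc.
Disallow: /*/private/
# Matches the same script in every country subfolder
Disallow: /*/search.php
```

Note that "*" wildcards are an extension supported by Google and Bing, not part of the original robots.txt standard.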

For the xml sitemap, you could do the same thing (just use one sitemap at the domain root), but for ease of maintenance, I would probably use a master sitemap.xml at the root that lists only the filepaths for individual sitemaps - one for each of the country-specific subfolders. Then list the location of the master sitemap.xml file in the robots.txt file.

Sitemap: http://www.example.com/sitemap.xml
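The master file is what the sitemap protocol calls a sitemap index; a sketch for the three country subfolders (assuming each one serves its own sitemap.xml) would be:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/fr/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/in/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sg/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```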

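Rather than maintaining the master sitemap.xml by hand, it can be generated from the list of country subfolders. A minimal sketch in Python's standard library (the base URL and folder names here are assumptions, not part of the thread):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap_index(base_url, subfolders):
    """Return a sitemap index XML string with one <sitemap> per subfolder."""
    # Register the sitemap namespace as the default so no prefix is emitted.
    ET.register_namespace("", SITEMAP_NS)
    index = ET.Element(f"{{{SITEMAP_NS}}}sitemapindex")
    for folder in subfolders:
        sitemap = ET.SubElement(index, f"{{{SITEMAP_NS}}}sitemap")
        loc = ET.SubElement(sitemap, f"{{{SITEMAP_NS}}}loc")
        loc.text = f"{base_url}/{folder}/sitemap.xml"
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(index, encoding="unicode"))


xml_out = build_sitemap_index("http://www.example.com", ["fr", "in", "sg"])
print(xml_out)
```

Regenerating the index whenever a country subfolder is added keeps the robots.txt Sitemap reference stable while the per-country sitemaps change.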

sjcbayona

4:15 pm on Nov 15, 2011 (gmt 0)



Got it. Thanks, tedster, for the step-by-step guide. Will do it now. Cheers!
 
