I have a site with 58 virtual subdomains, all fed content from a single codebase. A permissions system determines which content is visible on which subdomain, and all content is visible on the main domain. I have a sitemap generator that correctly creates an XML sitemap with all URLs listed under the main domain. However, when Google crawls the site, the bots return an error saying that "the url is not allowed for a sitemap at this location", the locations in question being content on the various subdomains. So somehow Googlebot is crawling the subdomains and disqualifying those URLs as belonging to the subdomains.
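From what I can tell from the sitemaps.org protocol, a sitemap may only list URLs on the host it is served from, so a sitemap hosted on the main domain isn't allowed to list subdomain URLs directly. The protocol does seem to allow cross-host submission if each subdomain's robots.txt declares the location of the sitemap on the other host. A rough sketch of what I think that would look like (sub1.example.com, www.example.com, and the sitemap filename are placeholders for my actual hosts):

    # Served at http://sub1.example.com/robots.txt
    # Points crawlers at a sitemap hosted on the main domain.
    # Per the cross-submission rules, this is supposed to authorize
    # that sitemap to list sub1.example.com URLs.
    Sitemap: http://www.example.com/sitemap-sub1.xml

I haven't tried this yet, and I'm not sure whether Google also requires each subdomain to be verified separately in Search Console.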
Can anyone explain the proper way to prevent this from happening and get the sitemap and the content crawled correctly?
Any help would be appreciated.