
Google Sitemaps - Crossover Content in Webmaster Tools

     

backdraft7

1:30 am on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I'm not sure if this is a concern or not...

In Sitemaps, I have one sitemap for my main site and another separate sitemap for my blog.
The blog is located on a subdomain; however, Google is indexing my blog content right along with my main site, according to the main site's reporting area.

Under my main site reports (Your site on the web > Keywords), I can see it is picking up keywords mostly from my blog. I suspect this is diluting my main site content.
Looking at the internal links report, it shows mostly my blog pages.

My blog is NOT included in the sitemap that I submitted for the main site.

Any ideas on how I can better separate the content that Google is indexing for these two same-domain areas?

Google seems to be disobeying the sitemap...

My guess would be adding a robots.txt exclusion in the main site root for the blog directory, but will that prevent the blog's own sitemap from being crawled and indexed?
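For example, something like this in the main site's robots.txt is what I have in mind (just a sketch on my part - it assumes the duplicate blog copy really does live under /blog/ on the www host):

User-agent: *
Disallow: /blog/

As I understand it, robots.txt is read per host, so a rule on www.mydomain.com wouldn't touch blog.mydomain.com at all - the subdomain would need its own robots.txt if I wanted to restrict it too.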

I am also somewhat concerned about the blog install directory...

The main site is: www.mydomain.com (of course)
The blog is: blog.mydomain.com - however, the blog is also reachable at: www.mydomain.com/blog/home/ (that's the resulting URL when you go to blog.mydomain.com)

Any advice is appreciated.

backdraft7

6:00 pm on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I'll try to elaborate a bit on this:
For my main site, in WMT when I go to the Keywords report, it is showing the names of months, January, February, etc as detected keywords. This is just my "archive" list from my wordpress blog. It is also displaying many other keywords that definitely do not appear in my main site, but do appear in the blog.

I'm not sure if this crossover of "detected" keywords is a bug in WMT or bad direction on my part. I have a separate sitemap for each area, so I'm pretty sure it's something I'm doing wrong. I'm thinking I need two sets of robots.txt directives, one for each area, but this is where it gets sticky. I don't want to clobber my current fragile rankings.

(thanks for moving the post, tedster; however, "tolls" should read "tools"... no biggie)

tedster

7:28 pm on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The blog is : blog.mydomain.com - however the blog is also reachable by: www.mydomain.com/blog/home/ (that's the resultant URL when you go to blog.mydomain.com)

You should use a 301 redirect, either one way or the other, and only allow one of those URLs to be indexed. Is that the case?
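On Apache, for example, the redirect could be a simple mod_rewrite rule in the root .htaccess of the www host - just a sketch, and it assumes /blog/home/ there maps one-to-one onto the root of blog.mydomain.com:

RewriteEngine On
# send the directory version of the blog to the subdomain version
RewriteRule ^blog/home/(.*)$ http://blog.mydomain.com/$1 [R=301,L]

Or reverse it and redirect the subdomain to the directory version - the important thing is to pick one canonical URL and let only that one be indexed.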

backdraft7

9:24 pm on Feb 17, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



@tedster...

I prefer blog.mydomain.com, but in doing a spot check, it looks like Google is indexing the longer version. I'm not sure why I went with the subdomain to begin with; I figured it would put some separation between my blog content and my main content.

If I do a site:blog.mydomain.com search, it returns no results. The long version produces 1,510 results.

I already do a redirect for mydomain.com to www.mydomain.com to eliminate the possible canonical issue there.
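(For what it's worth, that one is just the usual Apache rewrite in the root .htaccess - roughly like this, assuming mod_rewrite:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

so presumably I'd need something similar to sort out the blog.mydomain.com vs. /blog/home/ situation.)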

Sorry, I know this is kinda noobie stuff, but I gotta check everything from the ground up. With all the changes afoot, taking a step in any direction can become a virtual minefield.

Thanks!