Forum Moderators: Robert Charlton & goodroi


Any risks in using Sitemap: in robots.txt for rolling out new pages?


latimer

7:56 pm on Apr 16, 2008 (gmt 0)

We have stayed away from Google's sitemap submission via their Webmaster Tools, as we have had no real reason to use it; our listings have been fairly steady in Google.

Now we have created a lot of new pages that Google doesn't yet have, and we are weighing the pros and cons of using the robots.txt Sitemap: directive and/or Google's Webmaster Tools sitemap submission to publish a sitemap of the new URLs.

Would this be a faster way to get the pages into Google rather than listing the pages on a directory page linked off the homepage?

Some of the new pages link to search results pages from searches on our site. Are there any issues with these pages?

We already have several directory-type pages linked from the home page that have served the sitemap function, giving Google easy access to all the pages on the site. Checking today, the largest of these pages can't be found in the Google index, and one of the smaller ones is in Google, but the cached version shows the page cut off about two-thirds of the way through. These pages have been fully spidered and indexed in the past, and we are not sure why there is a problem now.

Although most of our pages are still showing in the index, a few thousand are missing, and we are wondering whether we are better off creating Sitemap: entries in robots.txt for these directories, or riding it out to see if Google picks them up fully next time around.

One concern is that submitting the sitemaps, either via Webmaster Tools or in the robots.txt file, may put the already-indexed pages at risk, or that Google's view of our site will somehow shift with a negative effect.

Would it be less risky to use the robots.txt Sitemap: directive and still keep the other directory pages going? Is there less risk with Google in using the robots.txt Sitemap: directive without submitting via Webmaster Tools, or should we use the tools as well?
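For clarity, the robots.txt option we're considering is just one extra line in the file, something like this (example URL, not our real one):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap-new-pages.xml
```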

1. Go all out: keep the directory pages we have been using, add another directory for the new pages, submit a sitemap via Webmaster Tools, and also use the robots.txt Sitemap: directive.

2. Use neither Google's Webmaster Tools nor the robots.txt Sitemap: directive, and just continue using directory pages linked off the homepage.

3. Use the directory pages and the robots.txt Sitemap: directive; don't use Google's Webmaster Tools.

4. Use only the robots.txt Sitemap: directive.

5. Any variations on the above.

What would you do?

tedster

5:08 am on Apr 17, 2008 (gmt 0)

The only difference I can think of between submitting an XML sitemap directly to Google and using the Sitemap: extension to the robots.txt protocol is that the robots.txt approach points all search engines to the sitemap, not just Google.
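To illustrate why that is: any crawler that fetches robots.txt can pick up the sitemap URL from the Sitemap: line. Here's a rough sketch of that discovery step (my own illustration, not any engine's actual code; the URL is made up):

```python
# Minimal sketch of how a crawler could discover Sitemap: lines while
# reading robots.txt. The directive is case-insensitive and can appear
# anywhere in the file, outside any User-agent group.
def find_sitemaps(robots_txt):
    """Return the URLs listed on Sitemap: lines of a robots.txt string."""
    sitemaps = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("sitemap:"):
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps

robots = """User-agent: *
Disallow: /private/

Sitemap: http://www.example.com/sitemap-new-pages.xml
"""
print(find_sitemaps(robots))  # ['http://www.example.com/sitemap-new-pages.xml']
```

Submitting through Webmaster Tools tells only Google; the robots.txt line tells everyone who looks.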

Google uses the sitemap to discover URLs. They may or may not spider them, according to their own logic. Even if spidered, those URLs may or may not be indexed, again according to the Google team's algorithmic decisions. So the moral of the story is that good indexing for new pages, especially indexing that sticks, requires a decent link architecture within the site.
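For completeness, the XML sitemap itself is just a list of url entries following the sitemaps.org protocol, along these lines (URLs illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-page-1.html</loc>
    <lastmod>2008-04-16</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/new-page-2.html</loc>
  </url>
</urlset>
```

It's a discovery aid, nothing more; listing a URL there is no guarantee of spidering or indexing.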

I can't think of a reason why the discovery of new, legitimate URLs would negatively affect what's already in the index. As long as everything is technically sound, with no duplicate URLs pointing to the same content and all that, all should be well. If you're hoping to get hundreds of thousands of URLs into the index with only a handful of backlinks, that won't happen, but I don't get the sense that this is what you're doing here.

[edited by: tedster at 5:52 pm (utc) on April 17, 2008]

latimer

2:30 pm on Apr 17, 2008 (gmt 0)

thanks Ted