
Forum Moderators: mademetop


Sitemap hurting rankings?

Disallow or allow

     
5:46 am on Jan 12, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 15, 2004
posts:112
votes: 0


Hello, I have a 150+ page site currently indexed by Yahoo.

The pages that were indexed before I added the sitemap all rank high.

However, the pages that were indexed after the sitemap all rank extremely low.

Should I disallow robots from indexing my sitemap?

I added a sitemap to this site so that my pages would be faster and easier to index.

If I set my robots.txt file to disallow my sitemap, can the robots still scan all of its links?

I ask because I understand that duplicate keywords on the same domain result in one of the pages being penalized.

In this case I would rather have the sitemap penalized than the actual page I want customers to click on.

Keep in mind that every page on the site targets different keywords. The only duplicate content is the set of links to all my pages on the sitemap.

6:04 pm on Jan 12, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 15, 2004
posts:112
votes: 0


Okay, I have changed my robots.txt file to disallow robots from indexing my sitemap.

It looks like this:

User-Agent: *
Disallow: /sitemap.htm
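
As an aside, the effect of that rule can be checked mechanically. A minimal sketch using Python's standard-library robots.txt parser (example.com is a stand-in for the real domain):

```python
from urllib import robotparser

# The rules from the robots.txt above; example.com is hypothetical.
rules = """\
User-Agent: *
Disallow: /sitemap.htm
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed page is not fetched at all, so a compliant spider
# never sees the links on it:
print(rp.can_fetch("*", "https://example.com/sitemap.htm"))   # False

# Other pages are unaffected, but must be reachable some other way:
print(rp.can_fetch("*", "https://example.com/products.htm"))  # True
```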

My question:

Can search engines still scan this page for all of my links now that robots.txt disallows it? I still want them to crawl it.

Thank you for any comments...

6:15 pm on Jan 12, 2005 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 31, 2003
posts:9068
votes: 4


If you disallow the site map page in robots.txt, the spiders won't visit the page at all. If you want the spiders to visit the page and follow the links, but not index the page itself, then you need to remove the robots.txt ban and use a robots meta tag in the head of the site map page instead:

<meta name="robots" content="noindex,follow">
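
In context, the tag goes in the head of the site map page itself, not in robots.txt. A sketch, assuming the site map lives at /sitemap.htm:

```html
<!-- /sitemap.htm (hypothetical path) -->
<html>
<head>
  <title>Site Map</title>
  <!-- spiders may follow the links below but won't index this page -->
  <meta name="robots" content="noindex,follow">
</head>
<body>
  <!-- links to every page on the site go here -->
</body>
</html>
```
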

10:03 pm on Jan 12, 2005 (gmt 0)

New User

joined:July 18, 2004
posts:21
votes: 0


Always use a sitemap.

The most reasonable explanation for the low PR on the new pages is that they're new.

Sitemaps are the best way to get all of your pages indexed with the fewest hops from your root URL.

In addition, the PR will probably go up once the rest of your site is re-indexed and the backlinks are recounted for the new pages. The sitemap is most likely one of the only backlinks currently being counted for the new pages, since other internal links may sit deeper in your site structure.

11:08 pm on Jan 12, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 15, 2004
posts:112
votes: 0


Yes, I think I will allow them to follow the sitemap, but not index it.

My sitemap is getting higher rankings than the actual pages.

Does anyone think otherwise?

Has indexing a sitemap caused anyone else problems similar to this?

3:30 am on Jan 13, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 17, 2003
posts:3744
votes: 0


Oy, this is the sort of thing that causes urban myths. I was tempted to post saying, 'yeah, me too,' but the voice on my left shoulder wouldn't let me do it. ;-)

Maybe they've got it indexed but not fully valued yet. Maybe they're just taking their sweet time. Maybe the floods affected their algo. Maybe the pages are not very valuable. Maybe they've instituted G-like aging factors. I dunno.

Whatever it is, site maps are a good thing.

3:36 am on Jan 13, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 13, 2004
posts:1425
votes: 0


<meta name="robots" content="noindex,follow">

I understand the "noindex" if you don't want the spiders to index/cache the sitemap file,
but what is the "nofollow" for?

Isn't "nofollow" a directive NOT to follow and index the links listed in the sitemap?
I would think the guy wants "noindex" alone.

Do I have this all wrong? What precisely is the meaning of "nofollow"? - Larry

6:20 am on Jan 13, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 3, 2005
posts:61
votes: 0


If you read the code, it says "noindex,follow". Nowhere does it say "nofollow". So, in essence, you're right about what "nofollow" means, but it doesn't apply here.
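
For reference, the standard combinations of the robots meta directives are:

```html
<meta name="robots" content="index,follow">     <!-- default: index page, follow links -->
<meta name="robots" content="noindex,follow">   <!-- don't index page, but follow links -->
<meta name="robots" content="index,nofollow">   <!-- index page, don't follow links -->
<meta name="robots" content="noindex,nofollow"> <!-- neither index nor follow -->
```

The second is what encyclo suggested for a site map page.
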

7:09 am on Jan 13, 2005 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member pageoneresults is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 27, 2001
posts:12169
votes: 56


Sitemap = Table of Contents
Sitemap = Index of Products
Sitemap = Index of Services
Sitemap = Index of Locations
Sitemap = Index of ...

No, a site map will not harm your rankings. Site maps and/or index pages are an integral part of any well planned site. They are the glue that holds everything together.

For larger sites, well organized and segmented site maps are an added benefit. Think of your site maps as index pages for each specific area of your site. Break them down to their smallest common denominator, then link to those smaller entities from the main site map. Don't try to stuff everything onto one page; it loses its value. Break them down and focus each site map.
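
A minimal sketch of that segmented structure (all file names hypothetical):

```html
<!-- /sitemap.htm : the main site map links only to focused section maps -->
<ul>
  <li><a href="/sitemap-products.htm">Products index</a></li>
  <li><a href="/sitemap-services.htm">Services index</a></li>
  <li><a href="/sitemap-locations.htm">Locations index</a></li>
</ul>

<!-- /sitemap-products.htm : each section map lists only its own pages -->
<ul>
  <li><a href="/products/widgets.htm">Widgets</a></li>
  <li><a href="/products/gadgets.htm">Gadgets</a></li>
</ul>
```
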

8:28 pm on Jan 13, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 15, 2004
posts:112
votes: 0


So are you saying that telling the robots to follow the sitemap but not index it wouldn't make a difference?

Or should I test it out and see if my rankings for those pages get better?

It is really weird... I have #1 rankings for a lot of key phrases.

But I am finding on some pages (pages where I have spent hours optimizing) that the sitemap gets a better ranking than the page itself. I couldn't even find some of my pages in the first 10 pages of results, but the sitemap is on the first page for some of those searches.

3:26 am on Jan 15, 2005 (gmt 0)

New User

10+ Year Member

joined:Jan 7, 2005
posts:8
votes: 0


I run about 20+ domains and they all have sitemaps. So far 90% of them have high rankings, so I would not block sitemaps altogether.

As a matter of fact, I have higher rankings for my sitemaps in Google than in Yahoo, etc.

So try it out and see if it works for you.

4:26 am on Jan 15, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 19, 2000
posts:2501
votes: 27


But I am finding on some pages (pages where I have spent hours optimizing) that the sitemap gets a better ranking than the page itself. I couldn't even find some of my pages in the first 10 pages of results, but the sitemap is on the first page for some of those searches.

That is because your new pages have not yet migrated into the search results, but your site map is considered an authority page for your site, so it shows up because it contains links to your new pages.

Wait for an update and then check again.

pageoneresults ... excellent post and very good advice! I intend to change my site map tomorrow using your formula. Many thanks! ;)

5:48 pm on Jan 17, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 8, 2004
posts:177
votes: 0


I think with most of this stuff, patience is the best bet. Check a few times over a period of weeks before panicking; some things take time to settle down, and every now and then all SEs have a wobble :)

6:27 pm on Jan 17, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:July 25, 2001
posts:141
votes: 0


I see no reason why a site map page could hurt your site - unless, of course, you're using black hat methods. If your site is being penalized, it's probably for something other than your site map.