WebmasterWorld: Home / Forums Index / Marketing and Biz Dev / General Search Engine Marketing Issues
Moderators: mademetop

General Search Engine Marketing Issues Forum

    
Sitemap hurting rankings?
Disallow or allow
IntegraGsrBalla

10+ Year Member



 
Msg#: 6008 posted 5:46 am on Jan 12, 2005 (gmt 0)

Hello. I have a 150+ page site currently indexed by Yahoo.

- The pages that were indexed before I added the sitemap all rank high.

- However, the pages that were indexed after the sitemap all rank extremely low.

Should I disallow robots from indexing my sitemap?

I added the sitemap so my pages would be indexed faster and more easily.

***If I set my robots.txt file to disallow my sitemap, can robots still scan all of its links?

I ask because I understand that duplicate keywords from the same domain result in one of the pages being penalized.

In that case, I would rather have the sitemap penalized instead of the actual page I want customers to click on.

Keep in mind that every page on the site targets different keywords. The only duplicate content is the set of links to all my pages on the sitemap.

 

IntegraGsrBalla

10+ Year Member



 
Msg#: 6008 posted 6:04 pm on Jan 12, 2005 (gmt 0)

Okay, I have changed my robots.txt file to disallow robots from indexing my sitemap.

It looks like this:

User-Agent: *
Disallow: /sitemap.htm

My question:

Can search engines still scan this file for all my links now that my robots.txt disallows it from being indexed? I still want them to scan it.

Thank you for any comments...
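Whether a compliant crawler will even fetch a disallowed page can be checked locally with Python's standard urllib.robotparser module. This is a sketch: the rule and the /sitemap.htm path are from the post above; /products.htm is a hypothetical second page.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the post above.
rules = [
    "User-Agent: *",
    "Disallow: /sitemap.htm",
]

parser = RobotFileParser()
parser.parse(rules)

# A compliant spider will not fetch the disallowed page at all,
# so it never sees the links listed on it.
print(parser.can_fetch("*", "/sitemap.htm"))   # False

# Every other page stays fetchable.
print(parser.can_fetch("*", "/products.htm"))  # True
```

Note that Disallow blocks the visit itself, not just the indexing.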

encyclo

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6008 posted 6:15 pm on Jan 12, 2005 (gmt 0)

If you disallow the site map page in robots.txt, the spiders won't visit the page at all. If you want the spiders to visit the page and index its links, but not index the page itself, then you need to remove the robots.txt ban and use a robots meta tag in the head of the site map page instead:

<meta name="robots" content="noindex,follow">
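In the site map page itself, that tag sits in the head. A minimal sketch (the filename and the links are illustrative):

```html
<!-- /sitemap.htm -->
<html>
<head>
<title>Site Map</title>
<!-- Keep this page out of the index, but let spiders follow its links -->
<meta name="robots" content="noindex,follow">
</head>
<body>
<a href="/products.htm">Products</a>
<a href="/services.htm">Services</a>
</body>
</html>
```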

Tutorialized



 
Msg#: 6008 posted 10:03 pm on Jan 12, 2005 (gmt 0)

Always use a sitemap.

The most reasonable explanation for the low PR on the new pages is that they're new.

Sitemaps are the best way to get all of your pages indexed with the fewest hops from your root URL.

In addition, the PR will probably go up once the rest of your site is re-indexed and the backlinks are recounted for the new pages. The sitemap is most likely one of the only backlinks being counted for the new pages, since other internal links may be further down in your website structure.
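The fewest-hops point can be illustrated with a toy link graph and a breadth-first search from the root (all URLs here are hypothetical):

```python
from collections import deque

def click_depth(links, root):
    """BFS from the root; returns the minimum click depth of each reachable page."""
    depth = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Without a site map, deep pages are several hops from the home page.
no_map = {
    "/": ["/products.htm"],
    "/products.htm": ["/widgets.htm"],
    "/widgets.htm": ["/widgets/blue.htm"],
}

# With a site map linked from the home page, every page is at most 2 hops away.
with_map = {
    "/": ["/products.htm", "/sitemap.htm"],
    "/products.htm": ["/widgets.htm"],
    "/widgets.htm": ["/widgets/blue.htm"],
    "/sitemap.htm": ["/products.htm", "/widgets.htm", "/widgets/blue.htm"],
}

print(click_depth(no_map, "/")["/widgets/blue.htm"])    # 3
print(click_depth(with_map, "/")["/widgets/blue.htm"])  # 2
```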

IntegraGsrBalla

10+ Year Member



 
Msg#: 6008 posted 11:08 pm on Jan 12, 2005 (gmt 0)

Yes, I think I will allow them to follow the sitemap but not index it.

My sitemap is getting higher rankings than the actual pages.

Does anyone think otherwise?

Has indexing sitemaps caused similar problems for anyone else?

caveman

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6008 posted 3:30 am on Jan 13, 2005 (gmt 0)

Oy, this is the sort of thing that causes urban myths. I was tempted to post saying, 'yeah, me too,' but the voice on my left shoulder wouldn't let me do it. ;-)

Maybe they've got it indexed but not fully valued yet. Maybe they're just taking their sweet time. Maybe the floods affected their algo. Maybe the pages are not very valuable. Maybe they've instituted G-like aging factors. I dunno.

Whatever it is, site maps are a good thing.

larryhatch

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6008 posted 3:36 am on Jan 13, 2005 (gmt 0)

<meta name="robots" content="noindex,follow">

I understand the "noindex" if you don't want the spiders to index / cache etc. the sitemap file,
but what is the "nofollow" for?

Isn't "nofollow" a directive to NOT follow and index the links listed from sitemap?
I would think that the guy wants noindex alone.

Do I have this all wrong? What precisely is the meaning of "nofollow"? - Larry

mwack

5+ Year Member



 
Msg#: 6008 posted 6:20 am on Jan 13, 2005 (gmt 0)

If you read the code, it says "noindex,follow"; nowhere does it say "nofollow". So, in essence, you're right, but it doesn't apply here.
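For reference, the two directives combine in four ways; as commonly documented for the robots meta tag, they behave roughly like this:

```html
<meta name="robots" content="index,follow">     <!-- default: index the page, follow its links -->
<meta name="robots" content="noindex,follow">   <!-- keep the page out of the index, still crawl its links -->
<meta name="robots" content="index,nofollow">   <!-- index the page, ignore its links -->
<meta name="robots" content="noindex,nofollow"> <!-- ignore both the page and its links -->
```

"noindex,follow" is therefore the right choice for a site map you want crawled but not ranked.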

pageoneresults

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 6008 posted 7:09 am on Jan 13, 2005 (gmt 0)

Sitemap = Table of Contents
Sitemap = Index of Products
Sitemap = Index of Services
Sitemap = Index of Locations
Sitemap = Index of ...

No, a site map will not harm your rankings. Site maps and/or index pages are an integral part of any well planned site. They are the glue that holds everything together.

For larger sites, well organized and segmented site maps are an added benefit. Think of your site maps as index pages for each specific area of your site. Break them down to their smallest common denominator, then link to those smaller entities from the main site map. Don't try to stuff everything onto one page; it loses its value. Break them down and focus each site map.
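A hypothetical layout for the segmented approach described above (all filenames and anchor text are illustrative):

```html
<!-- /sitemap.htm: the main site map links only to focused sub-maps -->
<ul>
<li><a href="/sitemap-products.htm">Product index</a></li>
<li><a href="/sitemap-services.htm">Service index</a></li>
<li><a href="/sitemap-locations.htm">Location index</a></li>
</ul>

<!-- /sitemap-products.htm: one focused index per section of the site -->
<ul>
<li><a href="/products/blue-widget.htm">Blue widgets</a></li>
<li><a href="/products/red-widget.htm">Red widgets</a></li>
</ul>
```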

IntegraGsrBalla

10+ Year Member



 
Msg#: 6008 posted 8:28 pm on Jan 13, 2005 (gmt 0)

So are you saying that telling the robots to follow the sitemap but not index it wouldn't make a difference?

Or should I test it out and see if my rankings for those pages get better?

It is really weird... I have #1 rankings for a lot of keyphrases.

But I am finding on some pages (pages where I have spent hours optimizing) that the sitemap gets a better ranking than the page. I couldn't even find some of my pages in the first 10 pages of results, but the sitemap is on the first page for some of them.

Webtoolpros

5+ Year Member



 
Msg#: 6008 posted 3:26 am on Jan 15, 2005 (gmt 0)

I run 20+ domains and they all have sitemaps. So far 90% of them have high rankings, so I would not block sitemaps altogether.

As a matter of fact, I have higher rankings for my sitemaps in Google than in Yahoo, etc.

So try it out and see if it works for you.

Liane

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6008 posted 4:26 am on Jan 15, 2005 (gmt 0)

But I am finding on some pages (pages where I have spent hours optimizing) that the sitemap gets a better ranking than the page. I couldn't even find some of my pages in the first 10 pages of results, but the sitemap is on the first page for some of them.

That is because your new pages have not yet migrated to the search results, but your site map is considered an authority page for your site, so is showing up because it contains links to your new pages.

Wait for an update and then check again.

pageoneresults ... excellent post and very good advice! I intend to change my site map tomorrow using your formula. Many thanks! ;)

chrisgarrett

10+ Year Member



 
Msg#: 6008 posted 5:48 pm on Jan 17, 2005 (gmt 0)

I think with most of this stuff, patience is the best bet. Check a few times over a period of weeks before panicking; some things take time to settle down, and every now and then all SEs have a wobble :)

Michael Weir

10+ Year Member



 
Msg#: 6008 posted 6:27 pm on Jan 17, 2005 (gmt 0)

I see no reason why a site map page would hurt your site - unless, of course, you're using black hat methods. If your site is being penalized, it's probably for something other than your site map.
