|SiteMap? Or Site Troubles?|
One site works great, the other not...
I have been very successful with site maps on one of my sites. It has indexed all the pages that were missing in G. I have updated the sitemap file at least weekly and new crawls keep coming. Very good deal for that site.
I have another site where I am trying the same thing, and I get no results whatsoever. I have tried resubmitting the file, and still nothing is happening. It shows the file as being downloaded. When I do a site:mysite.com, I still get about 20% of the old site and no new pages whatsoever. It worked so well for the first site that I was kind of surprised it doesn't seem to be working at all for this one. I think it might have something to do with the menuing. The second site was totally rebuilt in May with everything new, including all the file names.
On the G admin page, it shows no errors and that the file has been downloaded. When checking the log files, I see the bot coming and grabbing the xml file, but nothing more.
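For anyone not familiar with it, the file the bot grabs is just a plain XML list of URLs. Mine looks roughly like this (placeholder URLs and dates, not the real site; check Google's spec for the exact namespace to use):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products/widget.html</loc>
    <lastmod>2005-07-01</lastmod>
  </url>
</urlset>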
I was wondering if someone would be willing to take a look at the site and see if there are any problems. The site has a PR of 5, and all the results in G seem to be either missing pages (from the old site) or PDF files left over from the old site.
webdude, can sitemaps be used to 'remove' pages? Has anyone tried it?
I have a problem: Google has dropped 70%+ of our pages. For example, half of our product pages have gone; each page was very similar, showing the product in question plus the 6 current top-selling products below it.
I'm not changing the site for Google. Yahoo delivers visitors to our pages and they convert very well, and I don't want to risk losing those when Google can't even locate our server in the world right now, especially when the G SERPs look so dumb in our field.
I've looked at using robots.txt to block Google from parts of the site, but the file would be too long. What I'd like to do is put only some of the pages into Google by giving them a subset sitemap with only some of the pages listed.
If you submit a sitemap, does Google index all the pages it can find, or only the pages in the specified subset?
Anyone know how this works?
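To make it concrete, I would generate the subset file with a small script along these lines; the domain, paths, and output filename are just placeholders for illustration:

# build_subset_sitemap.py - write a sitemap listing only the pages I choose.
# The base URL and page list below are made up; substitute the real product pages.
from datetime import date

BASE = "http://www.example.com"
CHOSEN_PAGES = [
    "/products/widget-a.html",
    "/products/widget-b.html",
    "/contact.html",
]

def build_subset_sitemap(paths):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    today = date.today().isoformat()
    for path in paths:
        lines.append("  <url>")
        lines.append("    <loc>%s%s</loc>" % (BASE, path))
        lines.append("    <lastmod>%s</lastmod>" % today)
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    with open("sitemap.xml", "w") as f:
        f.write(build_subset_sitemap(CHOSEN_PAGES))

The open question is still whether Google treats that file as "index only these" or just as extra hints on top of its normal crawl.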
I thought at first it was removing old pages. I am trying to clean up URL-only links. What has happened, after several weeks, is that the old URL-only links are still in there AND regular links with title and description. I think what is happening is that the SiteMap program has nothing to do with the first bot crawl. I think the SiteMap will bring the second bot in regardless of whether the first bot has crawled yet or not.
Results? A lot of pages indexed twice, one with title and description and one as URL only. It kind of has me worried about dupe content, but my site has actually gained a few spots to #4 for a very competitive phrase and gained #1 for another. Go figure. I really don't think the gain in the SERPs has anything to do with SiteMap. I am watching this closely at the moment.
Meanwhile, I cannot get SiteMap to work at all on another site. It has done nothing. No new pages, old dead links still indexed, yada yada yada. I need someone with a bit more expertise to take a look and see if there is something wrong on this other site. I even tried the URL removal tool on G and got "request denied" on all the links I submitted. The pages are coming up as 404 Object Not Found. I checked with a header checker. This has got me baffled. Like I stated, the whole site was redone and all the files renamed. I am just trying to get some pages in the index.
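If anyone wants to reproduce the header check, a few lines like these will print the status each URL returns; the URLs here are placeholders, not the actual site:

# Print the HTTP status code for each old URL (placeholder URLs below).
import urllib.request
import urllib.error

OLD_URLS = [
    "http://www.example.com/old-page-1.html",
    "http://www.example.com/old-page-2.html",
]

for url in OLD_URLS:
    try:
        resp = urllib.request.urlopen(url)
        print(url, resp.status)           # 200 means the old page is still being served
    except urllib.error.HTTPError as e:
        print(url, e.code)                # 404 means the page is gone after the rename
    except urllib.error.URLError as e:
        print(url, "no response:", e.reason)

That matches what the header checker showed me: the renamed pages come back 404, yet the removal requests were still denied.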
Can the robots.txt file hurt your website?
You betcha it can. You need to pay close attention to how you set this up. I think going to [webmasterworld.com...] would give you all the help you need.
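The classic slip is a single character. This only keeps the bots out of one directory (the directory name is just an example):

User-agent: *
Disallow: /old/

while this keeps them out of the entire site:

User-agent: *
Disallow: /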
A sitemap page is very useful for helping the search engines index your inside pages, because you cannot link every page from your homepage for Google to index. It makes things easier for the search engines if you build a page that carries a complete set of links to the internal pages of your website.
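In other words, a plain page of links the spider can reach from the homepage; the page names here are made up:

<html>
<head><title>Site Map</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="/products/widget-a.html">Widget A</a></li>
    <li><a href="/products/widget-b.html">Widget B</a></li>
    <li><a href="/support/faq.html">Support FAQ</a></li>
  </ul>
</body>
</html>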
Still no go on this. All the old pages. 3 new pages - nothing else.
I've set up Sitemaps on 2 of my sites to see how it performs.
So far, it has been a waste of time. Not a single new page indexed. It does grab the xml regularly; it just does nothing with it.
It is still beta though, so maybe they'll start using it soon.
However, until I see a benefit to having Google suck down yet more of my bandwidth, I won't be putting it on any of my remaining domains.
Mmm... this still doesn't explain why one site worked well and the other did nothing. Gonna have to do some digging to see if there is a problem.