On the G admin page, it shows no errors and that the file has been downloaded. When checking the log files, I see the bot coming in and grabbing the xml file, but nothing more.
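If it helps, here is a rough sketch of the kind of log check I mean, assuming a combined-format access log (the one that includes the user-agent); the log path and sitemap filename are placeholders for your own setup:

```python
# Sketch: scan a combined-format access log for Googlebot fetching the
# sitemap file. LOG_PATH and SITEMAP are assumptions -- adjust to taste.
import re

LOG_PATH = "/var/log/apache/access.log"   # placeholder path
SITEMAP = "sitemap.xml"                   # whatever filename you submitted

# host ident user [time] "method path protocol" status ...
line_re = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3})')

with open(LOG_PATH) as log:
    for line in log:
        m = line_re.match(line)
        if not m:
            continue
        host, when, method, path, status = m.groups()
        # user-agent only appears in combined-format logs, hence the
        # plain substring test on the whole line
        if SITEMAP in path and "Googlebot" in line:
            print(when, method, path, status)
```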
I was wondering if someone would be willing to take a look at the site and see if there are any problems. The site has a PR of 5, and all the results in G seem to be either missing pages (from the old site) or pdf files carried over from the old site.
I have a problem: Google has dropped 70%+ of our pages. For example, half of our product pages have gone. Each page was very similar: the product in question, plus the 6 current top-selling products below it.
I'm not changing the site for Google. Yahoo delivers visitors to our pages and they convert very well; I don't want to risk losing those when Google can't even locate our server in the world right now, especially when the G SERPs look so dumb in our field.
I've looked at using robots.txt to block Google from parts of the site, but the file would be too long. What I'd like to do instead is put only some of the pages in Google by giving them a subset sitemap with only some of the pages listed.
If you submit a sitemap, does Google index all the pages it can find, or only the pages in the specified subset?
Anyone know how this works?
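As far as I know, a sitemap is a hint, not a whitelist: Google will still crawl and index anything it finds through links, so a subset sitemap won't keep the other pages out. Building one is simple enough, though; here is a minimal sketch, with placeholder URLs and the standard sitemap namespace (the beta program used a Google-specific schema URI, so check what yours was generated with):

```python
# Sketch: write a subset sitemap listing only the pages you want crawled.
# The urls list and output filename are placeholders.
from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/products/widget-1.html",
    "http://www.example.com/products/widget-2.html",
]  # only the pages you want in the sitemap

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for u in urls:
        # escape() handles &, < and > in URLs
        f.write("  <url><loc>%s</loc></url>\n" % escape(u))
    f.write("</urlset>\n")
```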
I thought at first it was removing old pages. I am trying to clean up URL-only links. What has happened, after several weeks, is that the old URL-only links are still in there AND regular links with title and description. I think what is happening is that the SiteMap program has nothing to do with the first bot crawl; the SiteMap will bring the second bot in regardless of whether the first bot has crawled yet or not.
The result? A lot of pages indexed twice, one with title and description and one as URL only. That kind of has me worried about dupe content, but my site has actually gained a few spots to #4 for a very competitive phrase and gained #1 for another. Go figure. I really don't think the gain in the SERPs has anything to do with SiteMap. I am watching this closely at the moment.
Meanwhile, I cannot get SiteMap to work at all on another site. It has done nothing: no new pages, old dead links still indexed, ya da ya da ya da. I need someone with a bit more expertise to take a look and see if there is something wrong on this other site. I even tried the URL removal tool on G and got "request denied" on all the links I submitted. The pages are coming up as 404 Object Not Found; I checked with a header checker. This has got me baffled. Like I stated, the whole site was redone and all the files renamed. I am just trying to get some pages into the index.
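One thing worth double-checking: as I understand it, the removal tool wants a real 404 (or 410) status code back, not just an error page, so it's worth verifying by script rather than by eye. A minimal sketch, with placeholder URLs:

```python
# Sketch: send a HEAD request to each renamed URL and print the HTTP
# status code. The urls list is a placeholder for your own dead links.
import urllib.request
import urllib.error

urls = [
    "http://www.example.com/old-page-1.html",
    "http://www.example.com/old-page-2.html",
]

for u in urls:
    req = urllib.request.Request(u, method="HEAD")
    try:
        resp = urllib.request.urlopen(req)
        print(u, resp.status)
    except urllib.error.HTTPError as e:
        print(u, e.code)             # e.g. 404 for a properly removed page
    except urllib.error.URLError as e:
        print(u, "error:", e.reason)  # DNS/connection failures
```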
So far, it has been a waste of time. Not a single new page indexed. It grabs the xml regularly; it just does nothing with it.
It is still beta, though, so maybe they'll put it to use soon.
However, until I see a benefit to having Google suck down yet more of my bandwidth, I won't be putting it on any of my remaining domains.