| 9:27 am on Jan 31, 2006 (gmt 0)|
I noticed another page disappear today. Has anyone had this happen to them?
| 4:14 pm on Jan 31, 2006 (gmt 0)|
I have two sites, both using Google Sitemaps. One site I put up in December, added a Google sitemap, and it did not even hit the sandbox. I have some top 10 SERPs for certain keywords.
Now my other site, which has been around for a year, uses Google Sitemaps as well, and I have been seeing URL-only results for it. So I am kind of mixed on it. Googlebot is visiting it regularly but it is slow to index anything.
Ironic how a new website can be indexed completely in a week, yet my old site is still having issues.
| 4:43 pm on Jan 31, 2006 (gmt 0)|
I don't believe sitemaps make much difference at all, apart from on sites with parameters in URLs.
I have 4 sites in Sitemaps, some old, some new, and it's all same old, same old really.
No better, no worse for me - even crawling looks the same in my experience.
I think your problem lies elsewhere, not with sitemaps - just my 2c
| 6:07 pm on Jan 31, 2006 (gmt 0)|
Have you looked at the BD datacenters to see if the problem is present there? What you describe seemed to affect a lot of sites within the past two weeks, though not related to Sitemaps.
If anything, Google might be trying to force people into Sitemaps - another one of their snoop tools - with poor indexing.
| 7:38 pm on Jan 31, 2006 (gmt 0)|
A lot of my pages that are not indexed are in Google Base and Froogle as well. Just a thought, but maybe since a page is in Froogle it will not show up in Google search... Thoughts, anyone?
| 9:26 pm on Jan 31, 2006 (gmt 0)|
The page disappearing would have nothing to do with Google Sitemaps, as all that does is send the spider to known pages, using the sitemap.xml file as a 'map', if you will, of your site.
The spider still has to pick up the title, content and elements from the page to display. How it gets there really has no bearing on how your information displays, except in the case of hijacking.
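For anyone who hasn't seen one, a minimal sitemap.xml of the kind being discussed looks something like this (the URL and dates are placeholders; the namespace shown is the one Google's original protocol used):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-01-31</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only loc is required; lastmod, changefreq and priority are optional hints.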
| 6:56 pm on Feb 1, 2006 (gmt 0)|
outland88, I had a look at 126.96.36.199/, which I believe is a BD datacentre (is that correct?), and things are a bit different.
There are no URL only results but about 20 sub pages are missing from the index completely.
| 7:20 pm on Feb 1, 2006 (gmt 0)|
I guess you can't delete forum posts... moving my message to the other thread.
| 8:00 pm on Feb 1, 2006 (gmt 0)|
Try 188.8.131.52 and see if results are any different.
| 8:30 pm on Feb 1, 2006 (gmt 0)|
|Try 184.108.40.206 and see if results are any different. |
220.127.116.11 shows the same number of pages as 18.104.22.168
| 10:44 pm on Feb 1, 2006 (gmt 0)|
Do the missing pages have the same template design and similar content?
| 10:52 pm on Feb 1, 2006 (gmt 0)|
Don't blame the sitemap, URLs harvested from sitemaps are treated like URLs discovered elsewhere.
| 12:22 am on Feb 2, 2006 (gmt 0)|
It's doubtful Google would hurt a website because of sitemaps.
| 12:29 am on Feb 2, 2006 (gmt 0)|
|Do the missing pages have the same template design and similar content? |
The content is very different on each page.
Each document starts off in life as an XML document and is transformed using XSL, so each one shares a similar HTML structure, but I must stress that the design is very lightweight.
It's all CSS and DIVs; the pages don't even have a navigation bar (all navigation on these pages is via a breadcrumb trail). The only repetition, barring a three-word footer, is in the placement of the HTML tags (H1, DIV, P etc.). There's no visible content that is repeated.
| 12:36 am on Feb 2, 2006 (gmt 0)|
The reason I asked about the sitemaps is because that was the most major change I could think of. I was wondering if others had problems with the sitemaps or if it was a known problem.
Judging by people's responses, it's clear that it's not the sitemaps that are at fault.
One thing I can think of: I changed the XSL templates slightly recently. Because of this, every document in each class of page (category, subcategory and article) now has exactly the same last-modified time.
I'm wondering if this might set off alarm bells with Google, as automated spam would show a similar characteristic (although each page on my site is hand-written; I don't even have a CMS).
Does anyone agree or disagree that identical last modified dates could cause a problem?
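To illustrate the workaround I'm considering: a sketch (Python, with made-up URLs and dates) of generating the sitemap's lastmod values from per-document content dates tracked separately, rather than from the file modification times that an XSL re-run makes identical:

```python
from datetime import date

# Hypothetical per-document content dates, tracked independently of the
# file mtimes (which all became identical after the XSL templates changed).
content_dates = {
    "http://www.example.com/category/widgets.html": date(2006, 1, 10),
    "http://www.example.com/articles/widget-care.html": date(2006, 1, 28),
}

def sitemap_entries(dates):
    """Yield <url> entries with <lastmod> set to the real content date."""
    for url, last_edit in sorted(dates.items()):
        yield (
            "  <url>\n"
            f"    <loc>{url}</loc>\n"
            f"    <lastmod>{last_edit.isoformat()}</lastmod>\n"
            "  </url>"
        )

xml = "\n".join(sitemap_entries(content_dates))
```

That way each page keeps a distinct, honest last-modified date even when the whole site is regenerated at once.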
| 1:00 am on Feb 2, 2006 (gmt 0)|
Check that the missing pages validate.
Is Googlebot visiting as normal?
Any errors showing in the log files?
Check the server headers.
And try a spider simulator.
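For the log-file check, here's a quick sketch in Python (the sample lines are made up, in the common combined log format - adapt the regex to whatever format your server writes):

```python
import re

# Matches the request path, status code and user agent in a combined-format
# log line (an assumption about the server's log format).
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+.*"(?P<agent>[^"]*)"$'
)

def googlebot_errors(lines):
    """Return (path, status) for Googlebot requests that did not get a 200."""
    hits = []
    for line in lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("agent") and m.group("status") != "200":
            hits.append((m.group("path"), int(m.group("status"))))
    return hits

sample = [
    '66.249.66.1 - - [01/Feb/2006:12:00:00 +0000] "GET /page.html HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/Feb/2006:12:01:00 +0000] "GET /missing.html HTTP/1.1" '
    '404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

errors = googlebot_errors(sample)  # errors == [("/missing.html", 404)]
```

Anything other than 200s in that output is worth investigating before blaming the sitemap.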
| 1:19 am on Feb 2, 2006 (gmt 0)|
Are sitemaps necessary to rank in Google? Please advise.
| 2:04 am on Feb 2, 2006 (gmt 0)|
|are sitemaps necessary to rank in google? Please advise. |
No, they're not. Please don't hijack other people's threads.
| 2:06 am on Feb 2, 2006 (gmt 0)|
Yes, Googlebot is visiting as normal. It visited the pages that disappeared and it got a 200 status code from them. All pages on the site are valid HTML 4.01 Transitional.
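On the spider-simulator suggestion above: a bare-bones one can be sketched with Python's standard-library html.parser, showing roughly what a crawler extracts from a page - the title, and the links it will follow (the page below is a made-up example):

```python
from html.parser import HTMLParser

class SpiderSim(HTMLParser):
    """Crude spider simulator: collects the <title> text and all href values."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

sim = SpiderSim()
sim.feed('<html><head><title>Widget Care</title></head>'
         '<body><h1>Widget Care</h1>'
         '<a href="/category/widgets.html">Up</a></body></html>')
# sim.title == "Widget Care", sim.links == ["/category/widgets.html"]
```

If pages with only breadcrumb navigation yield very few links here, that at least tells you what path the spider has to take to reach them.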
| 2:12 am on Feb 2, 2006 (gmt 0)|
No, sitemaps are not necessary, and if Google is finding your pages properly it's best to leave well alone. If it ain't broke, etc.
MrMister, it sounds like you're already doing pretty well with a small AdSense site. Few, if any, rank well in the areas I frequent, and they are competitive areas. My URL info came back today after a two-week hiatus. It could change tomorrow with an engine that seems to be bursting at the seams. Give it some time. Sometimes you can set off a filter when you haven't touched a site in a while.
| 3:17 am on Feb 2, 2006 (gmt 0)|
|However a week ago, I signed up to Google Sitemaps. I am wondering if this is connected in any way with the URL-only pages that I'm starting to see. |
MrMister: I don't really know what the problem is, but there have been other posts describing the same scenario over the last few months. So maybe it's not the sitemap that did it, but maybe it was. Reading the other threads, I've wondered if it's a canonical issue. Sorry this post is relatively useless, but I thought I'd chip in and add that it may be the sitemap (and it may not be, too).
| 6:44 am on Feb 2, 2006 (gmt 0)|
I believe the sitemap just triggers the changes. More exactly, the sitemap makes the changes more probable, but it is not by itself responsible either for the changes or for their direction.
When Google takes the sitemap into consideration, it probably compares its new understanding of your site structure with the one it had before the sitemap. Two cases are possible.
1. The sitemap adds nothing to Google's old understanding of the site structure. Then nothing happens, the SERPs are not recalculated, etc. This is probably the most typical case.
2. The sitemap adds something new to Google's previous understanding of your site structure. In this case the SERPs are recalculated.
Whether the result is positive or negative depends on your site and Google's algo, but not on the sitemap.
The sitemap just opens the door.