Forum Moderators: Robert Charlton & goodroi
I reported a day or two ago that my index page had reappeared after being lost on June 27. Well, now it's disappeared again.
I use sitemaps and the sitemap.xml file seems to be downloaded every few days. When my page reappeared it was listed under "site:www.mydomain.com" but now it's gone once again.
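For anyone unfamiliar with the sitemaps mentioned above: a sitemap.xml is just an XML list of your URLs that Google fetches periodically. A rough sketch of building one in Python - the domain and page paths here are placeholders, not anyone's real site:

```python
# Sketch of building a minimal sitemap.xml with the standard library.
# Domain and paths are placeholders, not a real site.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(base_url, paths):
    """Return sitemap XML text listing each page under base_url."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for path in paths:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = base_url.rstrip("/") + "/" + path.lstrip("/")
    return ET.tostring(urlset, encoding="unicode")

xml_text = build_sitemap("http://www.mydomain.com", ["/", "/about.html"])
print(xml_text)
```

Real sitemaps usually also carry optional `<lastmod>` and `<changefreq>` tags, but the bare `<loc>` list above is enough to be valid.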
[edited by: tedster at 7:34 am (utc) on July 5, 2006]
Previously, pages from my site all showed the same cache date. Now they are all over the place, the most recent being the index page at 28th June.
Could this be part of what Big Daddy is all about, i.e. the cache rotation is dependent on the frequency of changes? In my case the index page changes more than the inner pages in terms of content and inbound links (minute and infrequent changes, but still more than the inner pages).
Sid
Possible of course.
But the reason some sites have dropped and other sites risen seems to be connected to the position of the homepage in a site:domain.com check (with some exceptions, of course).
Which indicates to me that they are testing/playing with a filter/tool/knob connected to how Google reads/treats the root, and therefore the rest of the site.
Matt Cutts did mention at one stage that the more intuitive site:domain.com results were part of the canonical URL situation/problem/bug..... which we know Google can not fix.
Shortly after the 27th I checked: not only was my index page not shown first in the site: results, but it listed over 12,000 pages from my site when it should be more like 950. To reach 12,000 they would have had to include every file ever created over the last six years, including deleted pages, orphaned pages, and pages carrying noindex meta tags.
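Since part of that inflated count would be pages with a robots noindex meta tag, here is a rough sketch of how one might scan saved HTML for that tag using only the Python standard library (real pages can be messier than this parser handles):

```python
# Rough sketch: detect a robots "noindex" meta tag in page HTML.
# Standard library only; real-world pages may need more robust parsing.
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags pages whose <meta name="robots"> content includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
print(has_noindex('<meta name="robots" content="index,follow">'))    # False
```

Pages flagged this way shouldn't be counted toward what you expect a site: query to return.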
Yesterday I noticed that the index and higher-content pages are still not showing first; rather, it is the piddly ones that get very little traffic. But they are back to showing the expected 950 pages.
At [64.233.189.107...] the site: results are what they should be, with the index and the highest-quality, most popular pages first - but my rankings at this DC are still the lousy ones that kicked in after the 27th rather than the good ones I had before.
By the way, my site has a cache date of July 2
What is the popular opinion on [64.233.189.107...]? Is it showing old data that hasn't been corrupted yet, or new data that has been fixed?
My site ranks fine on Y & MSN; it's G that I'm having problems with.
>At [64.233.189.107...] the site: results are what they should be
I'll agree with you there - sweet results, and I'm 10th for a single word, with the site: command working fine too.
[edited by: tigger at 3:42 pm (utc) on July 6, 2006]
Not something you can share with the forum rather than by Sticky?
ontrack
I think the view is that that DC shows old results - who knows, though.
My site:domain.com check works correctly in that DC sometimes - but not all the time - and I am the same as you: rankings are poor in that DC, like all DCs, i.e. not the pre-27th results.
As for the page count thing, mine has always been wrong by at least a factor of ten. However, I have noticed one odd thing as the days roll by since the 27th... that outrageous page count is slowly dropping. First it was around 109K, then 101K, then 93K, 85K, 76K, 65K, and now 58K. My whole site only has around 6000 pages for the index.
Shrug.
>At [64.233.189.107...] the site: results are what they should be
Our site came back on the 27th as others disappeared, and I'm very relieved to say it's still listed properly on the DC you mention here, with the new improved SERPs still there.
So...that's not the old listing - those results are something new.
It's showing different results from the current site: search results, though. We're showing far fewer pages, but listing more of them without hitting the "similar pages" link.
I'd be happy with that.
So it's agreed - Google, move [64.233.189.107...] over pls :0)
No, please. For us, that still gives the same, crappy listings for all of our sites, INCLUDING the bad site: listings...
Adobe-arse Dreamweaver says the DOCTYPE is good and there are no errors in our HTML, but the W3C validator moans and groans --
[validator.w3.org...]
over things that work in every browser on every machine I've seen in the past six years.
If Google's going to complain about W3C compliance, then 99% of pages they crawl will be wrong.
>If Google's going to complain about W3C compliance, then 99% of pages they crawl will be wrong.
Not an issue.
What a surprise - people are starting to find canonical issues on the affected sites. Remember, even if you think it is fixed, Google might come back with a split-site cache from 25 million years ago.
Same old story.
I feel sitemaps are the cause of the supplementals for me. For about six months I'd been suffering from about 1/3 of my files missing from Google search, so I decided to add sitemaps about a month ago. This helped, because all the missing pages then showed in the index and search volume was the highest ever... but... soon lots of pages started turning supplemental, and I've NEVER had supplementals before. Now that I see half the site is supplemental, I'm going to have to say that sitemaps are to blame.
I'd rather have 2/3 of the pages in the index than have half the site supplemental and, on top of that, 10% of the search volume :(
Plus, it's easy to find top-ranked pages with sloppy or old-school code--mine included. :-)
I hate to appear "stupid" but what is canonical?
Canonical: Google indexes both your www and non-www domain >> duplicate content >> penalty
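To make the www/non-www point concrete, here's a hypothetical sketch of picking one canonical host for your URLs ("www.example.com" is an arbitrary choice; the usual real fix is a server-side 301 redirect so Google only ever sees one version):

```python
# Sketch: normalize www/non-www URLs to a single canonical host.
# "www.example.com" is an arbitrary example host, not a recommendation;
# in practice the fix is a 301 redirect configured on the web server.
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, canonical_host="www.example.com"):
    """Rewrite url's host to canonical_host if it is the same domain."""
    parts = urlsplit(url)
    host = parts.hostname or ""
    # Bare form of the canonical host, e.g. "example.com".
    bare = canonical_host[4:] if canonical_host.startswith("www.") else canonical_host
    if host in (bare, "www." + bare):
        parts = parts._replace(netloc=canonical_host)
    return urlunsplit(parts)

print(canonicalize("http://example.com/page.html"))
# http://www.example.com/page.html
```

The point is that `http://example.com/page.html` and `http://www.example.com/page.html` are the same page to you, but two different URLs to a crawler - hence the duplicate-content risk described above.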
I'm not sure I know what a "supplemental" is either?
Supplemental: a separate index used for arcane search terms, i.e. pages not good enough for the main index, but too good not to index at all.
Am I right?