Good stuff.
If you have only had it since the beginning of Jagger, then that is not a long time - the problems hit in Allegra time.
Must be a nice feeling to get your site back :sigh:
Yep - test DC not showing test results again.
Powerofeyes - I 100% agree - I think the test DC is where it will happen.
For a start, I don't see any of the issues MC has been talking about addressed on the other two DCs that have been mentioned.
But I must be missing something - I can't see anything unusual or worthy of comment on the DCs at all.
>>If you have only had it since the beginning of Jagger, then that is not a long time - the problems hit in Allegra time.<<
Or two months before Allegra, in my case.
Not all of us agree with your mindset: I don't see.. I don't hear... I don't talk... I don't think UNLESS Google tells me to do so :-)
Nope, I don't have that mindset. In fact, if you go way back in the Jagger thread, I was one of the first people to bring this new DC into the picture, so I do like watching DCs.
But I don't tell anyone "keep watching these DCs, keep watching these DCs" unless I am 100% sure. You, however, pretend to be a Google employee and want people to watch the DCs that you like.
P.S. The test DC is offline now.
Google employee or not ... Reseller has kept me sane during this time of uncertainty with his help and advice ... plus his cappuccinos have been of great value to me.
Col :-)
Great WHITE morning. It has been snowing for the last 10 hours or so. Everything is white here. And I need to go out of the house and clear the snow in cold weather, minus 3 degrees C.
Still see much reshuffling on the DCs.
I also see a filter on the test DC. Try running a query with the filter turned off (filter=0) and you will see different results. I like the "natural" results without that filter thing ;-)
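If you want to compare the two result sets side by side, here's a rough Python sketch for building both URLs (the query is just a placeholder, and the filter parameter is assumed to behave as described above):

```python
from urllib.parse import urlencode

query = "example query"  # placeholder search terms

# The same search with and without the duplicate-results filter;
# appending filter=0 turns the filter off, as described above
filtered = "http://www.google.com/search?" + urlencode({"q": query})
unfiltered = "http://www.google.com/search?" + urlencode({"q": query, "filter": "0"})

print(filtered)
print(unfiltered)
```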
Good morning colin_h
>>Google employee or not ... Reseller has kept me sane during this time of uncertainty with his help and advice ... plus his cappuccinos have been of great value to me.<<
Thanks for the kind words my friend. Life is wonderful, though more colourful with a cup of Danish-brand cappuccino :-)
Wish you all a great day.
I have seen this peak in MSN & Yahoo positions also. Ever since I got bumped by Google, my sites have been well looked after by many of the other big engines. Is this a deliberate tactic by competing engines, or just a fluke, do you think?
I hope G will do something, as I don't like their algo at all. Though mine is not a new website, I think it is unfair for them to assume that all new sites should be sandboxed for a while or that all new links need time to age. The problem with this is that some old sites with old info keep taking the top positions at the expense of newer, more up-to-date sites!
Yes it is pure relief.
I am sorry you have had (and continue to have) the canonical problem, but it will hopefully be some encouragement to know that once this was corrected, our SERPs returned immediately, i.e. the two are linked. Good luck to all over the next few days.
Yes, I am encouraged that some of the problems seem to have been sorted.
I don't think a site that has had the problem for a long time will come back that quickly - e.g. a good crawl or two will need to occur to correct the situation.
Anyway - test DC still not showing test results.
Hopefully next time it will show all the data crawled by Mozilla Googlebot - not just a tiny sample.
[I am sorry you have had (and continue to have) the canonical problem]
Anyone got any ideas?
"Canonical essentially means “standard” or “authoritative”, so a canonical URL for search engine marketing purposes is the URL you want people to see. Depending on how your web site was programmed or how your tracking URLs are setup for marketing campaign, there may be more than one URL for a particular web page.
Sometimes if a domain is not setup properly, the domain URL (domain.com) and the www domain URL (www.domain.com) are considered individual web pages. Since both pages maybe indexed by Google - you could get hit for duplicate content and at the very least you would be splitting your link popularity.
The easiest way to protect your site is to redirect all forms of your domain to one “standard” URL - a canonical URL."
We implemented a 301 redirect at the beginning of Jagger because we had a canonical problem, and this has now been fixed. Hope this helps!
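For anyone wondering what that looks like in practice, here's a minimal sketch of the host-canonicalisation idea using only Python's standard library - purely illustrative, since most sites would do this in their web server config, and www.example.com is a placeholder for your one "standard" host:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.example.com"  # placeholder for your one "standard" host

class CanonicalRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Strip any :port suffix before comparing hosts
        host = self.headers.get("Host", "").split(":")[0].lower()
        if host != CANONICAL_HOST:
            # Any non-canonical host (e.g. the bare example.com) gets a
            # permanent 301 redirect to the same path on the canonical host
            self.send_response(301)
            self.send_header("Location", f"http://{CANONICAL_HOST}{self.path}")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"canonical page content")

if __name__ == "__main__":
    # Port 8080 just for local testing
    HTTPServer(("", 8080), CanonicalRedirectHandler).serve_forever()
```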
I have 481,000 pages on my default DC. I thought that WebmasterWorld had roboted Google out.
Going on month #4 with virtually no traffic from Google. After spending 13 months in the sandbox, this is quite discouraging.
"Has anyone the dropped in the late September timeframe made it back to previous results? "
Yes, sort of. My Sept 22 site has its homepage back to its pre-22 position (more or less) for all its important searches.
Pages within the site (in common with my other sites post-Jagger) are not ranking at all.
I have implemented 301 redirects on all my sites, and eliminated (again) duplicate content with other sites that stole my content.
I can also report an improvement with respect to the Supplemental listings I have been mentioning. The index.html that was Supplemental is now not listed (I redirected it to site.com/); instead, site.com/ is listed, and all (indexed) pages are now www.
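For anyone making the same change, a quick way to confirm the first hop really returns a 301 is a few lines of Python (example.com here is a placeholder for your own domain):

```python
import http.client

def first_hop(host, path):
    # One raw request; http.client does not follow redirects,
    # so we see exactly what a spider would get on the first hit
    conn = http.client.HTTPConnection(host)
    conn.request("GET", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

# Both the bare domain and /index.html should 301 to the canonical URL
for host, path in [("example.com", "/"), ("www.example.com", "/index.html")]:
    status, location = first_hop(host, path)
    print(f"{host}{path} -> {status} {location}")
```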
I think I said before that although I have made changes to my sites during and since Jagger, I do not expect that these changes have affected anything.
I maintain that these problems have been Google's, and they are slowly but surely getting to grips with them.
So, in summary, although things look like they are improving for Sept 22/Jagger casualties, I do not want to start jumping up and down just yet.
Furthermore, the problem I have with internal pages not ranking is another, seemingly unrelated (Jagger) problem which I will be looking into as soon as we get any stability.
Thanks and good luck to all in 2006!
In November, quite a few of my pages went URL only in Google. Spidering was also lackluster (1,800 pages in November, 3,700 so far this month).
The pages that are still URL only are blocked by robots.txt. I removed around 150 pages that might be deemed duplicate content (navigational pages).
I'm right on the edge of 1,000 pages (where the site: command is having a problem). Right now, according to my sitemap, I've got 1,020 pages.
In November, Google showed 10,400 pages on the site. Currently it shows a more accurate value of 985.
I'm hoping all the spidering activity is a positive sign, although nearly 1,800 of these visits are from Mozilla Googlebot.
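If you want to double-check exactly which URLs your robots.txt is blocking for Googlebot, Python's standard-library parser makes it easy - a sketch, with the site and paths as placeholders:

```python
from urllib.robotparser import RobotFileParser

SITE = "http://www.example.com"  # placeholder site

rp = RobotFileParser(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Hypothetical paths to test against the rules Googlebot sees
for path in ["/", "/nav/page-a.html", "/private/old-page.html"]:
    url = SITE + path
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```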
If it's left, though, the bug seems more ingrained when you finally do the fix.
Hopefully when the test DC comes back, more bugs will be solved.