Forum Moderators: Robert Charlton & goodroi
[I am sorry you have had (and continue to have) the canonical problem]
Anyone got any ideas?
"Canonical essentially means “standard” or “authoritative”, so a canonical URL for search engine marketing purposes is the URL you want people to see. Depending on how your web site was programmed or how your tracking URLs are setup for marketing campaign, there may be more than one URL for a particular web page.
Sometimes if a domain is not set up properly, the domain URL (domain.com) and the www domain URL (www.domain.com) are treated as separate web pages. Since both pages may be indexed by Google, you could get hit for duplicate content, and at the very least you would be splitting your link popularity.
The easiest way to protect your site is to redirect all forms of your domain to one “standard” URL - a canonical URL."
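A common way to do this at the time was an Apache rewrite rule. This is only a sketch, assuming Apache with mod_rewrite enabled and example.com standing in for your own domain:

```apache
# Permanently (301) redirect the bare domain to the www host.
# Assumes Apache with mod_rewrite; goes in the site's .htaccess or vhost config.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so link popularity should consolidate on the www version rather than being split across both hosts.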
We implemented a 301 redirect at the beginning of Jagger because we had a canonical problem, and this has now been fixed. Hope this helps!
I have 481,000 pages on my default DC. I thought that WebmasterWorld had roboted Google out.
[edited by: colin_h at 11:41 am (utc) on Dec. 29, 2005]
Going on month #4 with virtually no traffic from Google. After spending 13 months in the sandbox, this is quite discouraging.
"Has anyone the dropped in the late September timeframe made it back to previous results? "
Yes, sort of. My Sept 22 site has its homepage back to its pre-22 position (more or less) for all its important searches.
Pages within the site (in common with my other sites post-Jagger) are not ranking at all.
I have implemented 301 on all my sites, and eliminated (again) duplicate content with other sites that stole my content.
I can also report an improvement with respect to the Supplemental listings I have been mentioning. The index.html page, which was supplemental, is no longer listed (I redirected it to site.com/); instead site.com/ is listed, and all indexed pages are now www.
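For anyone wanting to do the same, here is a minimal sketch of that index.html-to-root redirect, assuming Apache with mod_rewrite (example.com is a placeholder). The condition on THE_REQUEST matters: it makes the rule fire only on explicit requests for /index.html, so Apache's own internal DirectoryIndex lookup of index.html does not cause a redirect loop:

```apache
# 301-redirect explicit requests for /index.html to the root URL,
# so only site.com/ gets indexed. Assumes Apache with mod_rewrite.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]
```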
I think I said before that although I have made changes to my sites during and since Jagger, I do not expect that these changes have affected anything.
I maintain that these problems have been Google's, and they are slowly but surely getting to grips with them.
So, in summary, although things look like they are improving for Sept 22/Jagger casualties, I do not want to start jumping up and down just yet.
Furthermore, the problem I have with internal pages not ranking is another, seemingly unrelated (Jagger) problem, which I will look into as soon as we get some stability.
Thanks and good luck to all in 2006!
In November, quite a few of my pages went URL only in Google. Spidering was also lackluster (1,800 pages in November, 3,700 so far this month).
The pages that are still URL only are blocked by robots.txt. I removed around 150 pages that might be deemed duplicate content (navigational pages).
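That behaviour is consistent with how robots.txt blocking works: Google knows the URLs from links but cannot fetch the pages, so they can show up as URL-only listings. A sketch of the kind of rule involved (the paths here are hypothetical, not from the original post):

```
# robots.txt -- hypothetical directories for illustration.
# URLs under these paths can still appear in Google as URL-only
# entries, since the URL is known from links but cannot be crawled.
User-agent: *
Disallow: /print/
Disallow: /nav/
```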
I'm right on the edge of 1,000 pages (where the site: command is having a problem). Right now according to my sitemap, I've got 1,020 pages.
In November, Google showed 10,400 pages on the site. Currently it shows a more accurate value of 985.
I'm hoping all the spidering activity is a positive sign. Although nearly 1,800 of these are from Mozilla Googlebot.
If left unfixed, though, the bug seems to become more ingrained by the time you finally apply the fix.
Hopefully, when the test DC comes back, more bugs will be solved.