| 11:51 pm on Oct 3, 2006 (gmt 0)|
Check which datacentre you are looking at.
If it is any of the datacentres involved, as listed at [webmasterworld.com...] , then you might not actually have a problem at all.
| 4:49 am on Oct 4, 2006 (gmt 0)|
|Supplementals can stay in there for over a year and if more than 50% of my pages in the Google index are supplemental, surely that will simply destroy any chances of ranking new pages well? |
No. Search the recent posts here for g1smd's explanation of what supplemental pages in the index are, and why they exist.
| 5:25 am on Oct 4, 2006 (gmt 0)|
I have a theory that any page on a site that doesn't have a significant backlink(s) is going supplemental. I have 3 sites I am managing now that have gone entirely supplemental except for the index page.
All of the sites have one thing in common: all links point to the .com and not to any interior pages.
One of the sites is a brand-new domain as of 10 months ago and has no dupe content issues; that I am positive of. Another has had a 301 in place since its first week and has all-original content.
Those two sites are merely for very small businesses and haven't had much backlink production.
The primary site has been around for a while and has run the gamut of problems, but most have been straightened out for a while now. The nice thing is that this site has moved into the top ten for most of the keywords pursued for it, keeps moving up, and its traffic is growing. So, the supp issue doesn't seem to be hurting its growth.
| 7:49 am on Oct 4, 2006 (gmt 0)|
Check this : [18.104.22.168...]
or this one :
Same results on 22.214.171.124 , 126.96.36.199 , 188.8.131.52 , 184.108.40.206 , 220.127.116.11 , 18.104.22.168 ...
See this thread : [webmasterworld.com...]
I saw the data refresh on September 30th. Since then, major sites have been buried in supplemental.
Didn't see any flame threads about it, so I guess only some major sites are affected.
PM me if you want to share more info.
| 7:18 pm on Oct 4, 2006 (gmt 0)|
Some datacentres are showing some whacky results for some searches at the moment: [webmasterworld.com...]
| 7:52 pm on Oct 4, 2006 (gmt 0)|
|chances of ranking new pages well? |
My observation is that a page with new, unique content on a new, unique URL will get indexed fast.
It is not the site that gets marked supplemental. It is the URL that gets marked supplemental. If the content is connected to a URL that got marked supplemental, I haven't figured out how to recycle it yet.
| 9:27 pm on Oct 4, 2006 (gmt 0)|
|It is not the site that gets marked supplemental. It is the url that gets marked supplemental. |
Really? Many of the discussions on these boards have dabbled with the theory that Google especially is becoming more "total site" oriented, and that things like hundreds of 404s, 50% duplicate content, and even hundreds of supplementals reduce your site's overall "quality mark", so your rankings drop.
Certainly, rankings and traffic to every single page that is left non-supplemental has dropped to practically nothing since most of the site went supplemental.
| 9:42 pm on Oct 4, 2006 (gmt 0)|
|Certainly, rankings and traffic to every single page that is left non-supplemental has dropped to practically nothing since most of the site went supplemental. |
Nope. One site I observe lost 99% of its pages to supplemental, but kept PageRank on the pages that did not go supplemental, and some brand-new pages gained PageRank.
| 9:52 pm on Oct 4, 2006 (gmt 0)|
URLs are marked as Supplemental on a URL-by-URL basis. [webmasterworld.com...]
| 10:04 pm on Oct 4, 2006 (gmt 0)|
URLs get marked supplemental. Domains with lots of supplementals, or other issues, lose value overall and seem to become less trusted. The two things are related but different phenomena.
| 7:52 am on Oct 5, 2006 (gmt 0)|
|Nope. One site I observe lost 99% of its pages to supplemental, but kept PageRank on the pages that did not go supplemental, and some brand-new pages gained PageRank. |
I was talking about actual rankings, i.e. being listed in the results lower than the positions they have held strongly for two years. Pagerank has not changed on any of the pages still ranked.
|Domains with lots of supplementals, or other issues, lose value overall and seem to become less trusted. |
Yes, that's what I'd heard. Any chance you've seen something solid from Google at least hinting at that? That's my question: if the statement is true, then surely the removal tool is the only way forward? Clean slate.
| 11:58 am on Oct 5, 2006 (gmt 0)|
If you remove your site, it is at your own risk.
Here is what Google's webmaster help says:
|Pages removed using our automatic URL removal system are excluded from our index for at least 6 months regardless of whether they become available to our crawler during that time. Please keep in mind that we're unable to manually add pages to our search results, even those that have been removed through this automatic removal system. |
| 12:30 pm on Oct 5, 2006 (gmt 0)|
You might be interested in investigating this gem from Duplicate Content - Get it right or perish:
>> There is no possible way that google would tank webmasterworld. <<
When Brett put a noindex tag on the site, Google delisted the whole lot within weeks... Once it was removed, back it came. [webmasterworld.com]
That sounds like it might be some kind of magic trick.
| 6:48 pm on Oct 5, 2006 (gmt 0)|
Yeah, but pages that had been Supplemental before, came back as Supplemental again.
When the noindex tag was applied here, Google did not actually remove the data about WebmasterWorld from the index; they merely put a "don't show this to the public" tag on all of the results. Once the noindex tag was removed, the "don't show" tag was removed and everything reappeared in exactly the same state that it had disappeared.
| 8:57 pm on Oct 5, 2006 (gmt 0)|
I wouldn't use the removal tool for anything more than ten pages. It's good for cleaning clutter (like pages that actually no longer exist), not fixing a domain, since all it really does is hide the URLs, not remove them.
| 9:22 pm on Oct 5, 2006 (gmt 0)|
I am now understanding that the noindex tag will not help these situations because it only really means "don't show this."
| 9:25 pm on Oct 5, 2006 (gmt 0)|
Once you tag it as "don't show it" then it no longer counts as duplicate content, so it can be used to very good effect for some "duplicate content" issues.
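To illustrate the tag being discussed (a minimal Python sketch, not anything Google publishes): serving the duplicate variants with a robots noindex tag in the head leaves only the canonical version eligible for indexing. The sample HTML and the helper function here are hypothetical.

```python
# Hypothetical helper: inject a robots noindex tag into a page's <head>
# so that this duplicate URL variant drops out of indexing consideration.

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def add_noindex(page_html: str) -> str:
    """Insert the noindex tag right after <head>, if not already present."""
    if NOINDEX_TAG in page_html:
        return page_html  # already tagged; do nothing
    return page_html.replace("<head>", "<head>" + NOINDEX_TAG, 1)

html = "<html><head><title>Dup</title></head><body>...</body></html>"
print(add_noindex(html))
```

In practice you would emit the tag only on the non-canonical variants, never on the URL you want to keep ranked.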
| 9:29 pm on Oct 5, 2006 (gmt 0)|
|I wouldn't use the removal tool for anything more than ten pages. It's good for cleaning clutter (like pages that actually no longer exist), not fixing a domain, since all it really does is hide the URLs, not remove them. |
Really? I thought the removal tool meant a specific request that Google no longer keep that URL/page in their database. Do you mean that a removal request is the same as a noindex tag? i.e. if I removed my entire site, in six months it would reappear exactly the same as when I asked for it to be removed? That wasn't my understanding of the removal tool ...
| 9:30 pm on Oct 5, 2006 (gmt 0)|
Hmmm... so if I put a noindex tag on the supplemental version of the url, the preferred version might get to become unduplicated and return to the regular index?
| 9:40 pm on Oct 5, 2006 (gmt 0)|
>> I thought the removal tool meant a specific request that Google no longer have that URL/page in their database. <<
No. It just removes the URL from view for six months... and then it shows up again.
>> so if I put a noindex tag on the supplemental version of the url, the preferred version might get to become unduplicated and return to the regular index? <<
There is no "supplemental version of the URL", but I do know what you mean by that.
There is some content available at multiple URLs, some or all of which get tagged as supplemental. There is one URL that you want to be indexed, and all the others are the duplicates.
If the other URL is a www vs. non-www problem or is simply on another domain, then the 301 redirect is the correct thing to apply to all of the alternative URLs. That will get the "main" URL properly indexed very quickly. The alternatives will continue to show as Supplemental for a year - and that is NOT a problem just as long as those other URLs do return a 301 status.
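For the www vs. non-www case, the redirect logic amounts to: any request arriving on the wrong hostname gets a 301 to the same path on the preferred one. A minimal Python sketch (the hostname www.example.com is a placeholder for your own canonical host):

```python
# Sketch of www vs. non-www canonicalization: non-canonical hosts
# receive a 301 pointing at the same path on the canonical host.

CANONICAL_HOST = "www.example.com"  # placeholder canonical hostname

def canonicalize(host: str, path: str):
    """Return (status, location): a 301 to the canonical host, or 200 here."""
    if host != CANONICAL_HOST:
        return 301, f"http://{CANONICAL_HOST}{path}"
    return 200, None

print(canonicalize("example.com", "/widgets.html"))
# → (301, 'http://www.example.com/widgets.html')
```

On a real server this would normally live in the web server configuration rather than application code, but the decision rule is the same.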
If the problem is through URLs with parameters in a different order, or a different number of parameters, then the redirect may be used for some of them, but adding the meta robots noindex tag on some versions may also be a good choice to get the alternative URLs deindexed.
There is no need to use the removal tool at all. Google will quickly pick up the 301 and 404 status of the alternative URLs and do what is necessary internally. The pages will show as Supplemental for a long time, but that is not a problem.
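For the parameter-order case above, one way to define the canonical form is to sort the query parameters and 301 anything that differs from that form. A hedged Python sketch (the URL and parameter names are illustrative):

```python
# Sketch: collapse URLs whose query strings differ only in parameter
# order by computing a canonical form with sorted parameters.

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_query_url(url: str) -> str:
    """Rebuild the URL with its query parameters in sorted order."""
    parts = urlsplit(url)
    params = sorted(parse_qsl(parts.query))
    return urlunsplit(parts._replace(query=urlencode(params)))

def redirect_if_needed(url: str):
    """301 to the canonical form if this URL is not already canonical."""
    canonical = canonical_query_url(url)
    return (301, canonical) if canonical != url else (200, None)

print(canonical_query_url("http://example.com/page?b=2&a=1"))
# → http://example.com/page?a=1&b=2
```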
| 9:51 pm on Oct 5, 2006 (gmt 0)|
It sounds like, say, I moved old_url to new_url and old_url thus went supplemental. If I then 301 redirect old_url to new_url, would that be effectively the same as putting a noindex tag on old_url?
| 9:59 pm on Oct 5, 2006 (gmt 0)|
It would, but the redirect will also pass some PageRank on to that new URL.
I have to ask though, why move the page to a new URL at all?
"Cool URIs never change".
| 10:03 pm on Oct 5, 2006 (gmt 0)|
Now I know. I used to didn't.
| 10:09 pm on Oct 5, 2006 (gmt 0)|
More than six of Google's existing 44 Class-C IP blocks are showing truly whacky results. The site: command is truly broken on those.
For most searches only the first three results are shown as normal results, and then everything after that is shown as Supplemental.
Last night even Matt Cutts' own site was shown as having "1 to 3 of about 770 pages" with NO link to any further results... not even the "click for omitted results" link.
If you are looking at those datacentres, then you will think you have a problem where none actually exists.
On this occasion you do need to check multiple datacentres to get an overall picture. Remember that Google has 44 Class-C blocks of IP addresses currently active.
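One way to spot-check several datacentres is to issue the same site: query against each IP directly. A rough Python sketch of building those per-datacentre query URLs (the IP addresses below are placeholders, and querying a datacentre directly over plain HTTP worked in 2006 but may no longer):

```python
# Sketch: build the same site: query URL against several datacentre
# IPs so results can be compared. IPs are hypothetical examples.

from urllib.parse import quote

DATACENTRES = ["64.233.161.104", "64.233.167.99"]  # placeholder IPs

def site_query_url(dc_ip: str, domain: str) -> str:
    """Build the site: search URL for one datacentre IP."""
    return f"http://{dc_ip}/search?q={quote('site:' + domain)}"

for ip in DATACENTRES:
    print(site_query_url(ip, "example.com"))
```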
| 10:28 pm on Oct 5, 2006 (gmt 0)|
"Once you tag it as 'don't show it' then it no longer counts as duplicate content, so it can be used to very good effect for some 'duplicate content' issues."
It does nothing of the sort. URLs removed by the tool are still there, still duplicates, and basically always drag down current content.
Don't make up myths about the removal tool. It does NOTHING to help with duplicate content, other than it won't show the deleted URL in the results, but that obviously is no benefit if the duplicate would have shown at #47 and the correct one shows at #62.
The URL removal tool is only a convenience for looking at things. It helps not at all in any way for rankings.
| 10:31 pm on Oct 5, 2006 (gmt 0)|
Webmasterworld shows three normal pages, five supplementals, and then nothing else of the 291,000 other pages, nothing omitted, nothing at all.
| 10:34 pm on Oct 5, 2006 (gmt 0)|
>> Don't make up myths about the removal tool. <<
I wasn't talking about the removal tool at all.
I was talking about the meta robots noindex tag. I even used the word "tag" in my post, in response to the post about the "noindex tag".
Please read what I wrote :-)
I realised a long time ago that the removal tool is a pile of cr*p, a sort of placebo; just like the site submit tool, I suppose.
[edited by: g1smd at 10:43 pm (utc) on Oct. 5, 2006]
| 10:42 pm on Oct 5, 2006 (gmt 0)|
I am having the same problem too. Almost all my pages have gone supplemental except for the home page and a couple others. We have been around for over 6 years and have over 2,000 links coming back in to the site.
| 10:45 pm on Oct 5, 2006 (gmt 0)|
>> "supplemental" ... "except the home page and a couple of others" <<
The site command is BROKEN on multiple Google datacentres.
Every site is showing that same effect [webmasterworld.com] on those datacentres... and one of those is my default Google (in UK) for the last 18 hours or more.