Forum Moderators: Robert Charlton & goodroi
Considering using the removal tool and then starting again. Can't think of a better way to do this. Supplementals can stay in there for over a year and if more than 50% of my pages in the Google index are supplemental, surely that will simply destroy any chances of ranking new pages well?
If it is any of these datacentres involved, as listed at [webmasterworld.com...] , then you might not actually have a problem at all.
>> Supplementals can stay in there for over a year and if more than 50% of my pages in the Google index are supplemental, surely that will simply destroy any chances of ranking new pages well? <<
No. Search the posts around here for g1smd's explanation of what supplemental pages are in the index, and why.
Same results on 72.14.207.107, 216.239.37.99, 216.239.37.104, 216.239.39.99, 216.239.39.107, 216.239.51.104 ...
See this thread : [webmasterworld.com...]
I saw the data refresh on September 30th. Since then, major sites have been buried in supplemental.
Didn't see any flame threads about it, so I guess only some major sites are affected.
PM me if you want to share more info.
>> chances of ranking new pages well? <<
My observation is that a page with new, unique content on a new, unique url will get indexed fast.
It is not the site that gets marked supplemental. It is the url that gets marked supplemental. If the content is connected to a url that got marked supplemental, I haven't figured out how to recycle it yet.
>> It is not the site that gets marked supplemental. It is the url that gets marked supplemental. <<
Really? Many of the discussions on these boards have dabbled with the theory that Google, especially, is becoming more "total site" oriented, and that things like hundreds of 404s, 50% duplicate content, and even hundreds of supplementals reduce your site's overall "quality mark" and your rankings drop.
Certainly, rankings and traffic to every single page that is left non-supplemental has dropped to practically nothing since most of the site went supplemental.
>> Certainly, rankings and traffic to every single page that is left non-supplemental has dropped to practically nothing since most of the site went supplemental. <<
Nope. One site I observe lost 99% of its pages to supplemental, but kept page rank on those pages that did not go supplemental, and some brand new pages got a page rank.
>> Nope. One site I observe lost 99% of its pages to supplemental, but kept page rank on those pages that did not go supplemental, and some brand new pages got a page rank. <<
I was talking about actual rankings, i.e. being listed in the results lower than the positions they have held strongly for two years. Pagerank has not changed on any of the pages still ranked.
Domains with lots of supplementals, or other issues, lose value overall and seem to become less trusted.
Yes, that's what I'd heard. Any chance you've seen something solid from Google, at least hinting at that? You see, that's my question: if that statement is true, then surely the removal tool is the only way forward? Clean slate.
Here is what Google Webmaster Help says:
Pages removed using our automatic URL removal system are excluded from our index for at least 6 months regardless of whether they become available to our crawler during that time. Please keep in mind that we're unable to manually add pages to our search results, even those that have been removed through this automatic removal system.
>> There is no possible way that google would tank webmasterworld. <<
When Brett put a noindex tag on the site Google delisted the whole lot within weeks... Once it was removed back it came. [webmasterworld.com]
That sounds like it might be some kind of magic trick.
When the noindex tag was applied here, Google did not actually remove the data about WebmasterWorld from the index; they merely put a "don't show this to the public" tag on all of the results. Once the noindex tag was removed, the "don't show" tag was removed and everything reappeared in exactly the same state that it had disappeared.
I wouldn't use the removal tool for anything more than ten pages. It's good for cleaning clutter (like pages that actually no longer exist), not fixing a domain, since all it really does is hide the URLs, not remove them.
Really? I thought the removal tool meant a specific request that Google no longer have that URL/page in their database. Do you mean that a removal request is the same as a noindex tag? i.e. if I removed my entire site, in six months it would re-appear again exactly the same as when I asked for it to be removed? That wasn't my understanding of the removal tool ...
No. It just removes the URL from view for six months... and then it shows up again.
>> so if I put a noindex tag on the supplemental version of the url, the preferred version might get to become unduplicated and return to the regular index? <<
There is no "supplemental version of the URL", but I do know what you mean by that.
There is some content available at multiple URLs, some or all of which get tagged as supplemental. There is one URL that you want to be indexed, and all the others are the duplicates.
If the other URL is a www vs. non-www problem or is simply on another domain, then the 301 redirect is the correct thing to apply to all of the alternative URLs. That will get the "main" URL properly indexed very quickly. The alternatives will continue to show as Supplemental for a year - and that is NOT a problem just as long as those other URLs do return a 301 status.
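For the www vs. non-www case, a minimal Apache .htaccess sketch (assuming mod_rewrite is available on your server; `example.com` is a placeholder for your own domain, not anything specific from this thread):

```
# Hypothetical example: send every request on the bare domain to the
# www host with a single permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A request for http://example.com/page.html then answers with a 301 pointing at http://www.example.com/page.html, which is exactly the status the alternative URLs need to return.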
If the problem is through URLs with parameters in a different order, or a different number of parameters, then the redirect may be used for some of them, but adding the meta robots noindex tag on some versions may also be a good choice to get the alternative URLs deindexed.
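The tag in question goes in the <head> of each alternative version you want dropped; a sketch (the `follow` value is optional and simply tells the spider it may still follow the links on the page):

```
<head>
  <!-- keep this page out of the index, but still crawl its links -->
  <meta name="robots" content="noindex,follow">
</head>
```

The preferred version of the page must NOT carry this tag, or it will be deindexed too.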
There is no need to use the removal tool at all. Google will quickly pick up the 301 and 404 status of the alternative URLs and do what is necessary internally. The pages will show as Supplemental for a long time, but that is not a problem.
For most searches only the first three results are shown as normal results, and then everything after that is shown as Supplemental.
Last night even Matt Cutts' own site was shown as having "1 to 3 of about 770 pages" with NO link to any further results... not even the "click for omitted results" link.
If you are looking at those datacentres, then you will think you have a problem where none actually exists.
On this occasion you do need to check multiple datacentres to get an overall picture. Remember that Google has 44 Class-C blocks of IP addresses currently active.
It does nothing of the sort. URLs removed by the tool are still there, still duplicates, and still drag down your current content.
Don't make up myths about the removal tool. It does NOTHING to help with duplicate content, other than it won't show the deleted URL in the results, but that obviously is no benefit if the duplicate would have shown at #47 and the correct one shows at #62.
The URL removal tool is only a convenience for looking at things. It helps not at all in any way for rankings.
I wasn't talking about the removal tool at all.
I was talking about the meta robots noindex tag. I even used the word "tag" in my post, in response to the post about the "noindex tag".
Please read what I wrote :-)
I realised a long time ago that the removal tool is a pile of cr*p, a sort of placebo; just like the site submit tool is too, I suppose.
[edited by: g1smd at 10:43 pm (utc) on Oct. 5, 2006]
Saying this is not a problem is wrong. Our orders have tanked today and so has the traffic. Came on here to see what was up. This IS affecting search results. We have no duplicate content except for maybe the privacy policy and canned stuff like that.
The site command is BROKEN on multiple Google datacentres.
Every site is showing that same effect [webmasterworld.com] on those datacentres... and one of those is my default Google (in UK) for the last 18 hours or more.