I still see .co.uk root-domain problems in UK searches, but today a lot of sites are showing extra pages indexed.
Additionally, the recent "1-2 of 120 pages" problem seems to be fixed - but with a twist. With filter=1 selected I now see "1 to 30 of 150" but all results after the first are supplemental. With filter=0, I get 150 results but they are not supplemental.
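For anyone wanting to reproduce that comparison, the two views differ only in the filter URL parameter on the results page (filter=0 includes the otherwise omitted results). A minimal sketch of building the two query URLs - purely illustrative, with example.com as a placeholder:

```python
from urllib.parse import urlencode

def site_query_url(domain: str, filtered: bool = True) -> str:
    """Build a Google site: query URL; filter=0 includes omitted results."""
    params = {"q": f"site:{domain}", "filter": "1" if filtered else "0"}
    return "https://www.google.com/search?" + urlencode(params)

# Compare the filtered and unfiltered result counts by hand:
print(site_query_url("example.com"))                  # filtered view
print(site_query_url("example.com", filtered=False))  # omitted results included
```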
Things are in the works. I'll look again in a few days and see how it settles.
P.S. Matt says "This is not an update". He is standing here as I post this.
I haven't seen a change. Still showing 1-2 of 260
|it still has the 3 of 260, then "similar results" opening up to around 50,000. |
That's weird - I'm also getting that magic 260 number.
Results 1 - 1 of about 260
but I can click through to get all of my indexed pages.
Definitely something broken with the site: command.
anyone seen this? I get the following:
1 - 3 of about 0
While other DCs show a normal 1 - 10 of about 120.
Thanks for the update on the error - g1smd - good to know you discussed it with Matt.
Sometimes I wonder how many other errors there are around the place that cause webmasters to jump up and down, when all that's needed is for the bug to be reported and acknowledged by Google - a kind of "outage" report.
It sure would help put a lot of minds at rest.
|Things are in the works. I'll look again in a few days and see how it settles. |
g1smd - 4 days later and no improvements on our sites yet
I am seeing my new descriptions slowly being picked up in place of the navigation link text,
and I am also tracking the Google cache now.
Hopefully, once all the pages are cached with the new descriptions, I should come out of the omitted results.
Will report to you all soon.
I'm also getting 1-3 of 260.
Clicking to view omitted results gives me around 55 to 60k, which is what it has been for a very long time.
A couple of site: searches have shown 1 of 101, but almost every time it is 1 of 2 or 1 of 3, out of 260 results.
I believe that what we are seeing is context-sensitive duplicate removal. That is - if you look for info about widgets, only that text on a page which is about widgets will be used in the duplicate content calculation. Extra text (which many sites have been using to pad out and reduce duplication) is discounted.
When you use the site: command, my suspicion is that the context is the site itself and that's why you are getting site-related snippets and extra removal of duplicates (i.e. menu sections, etc.)
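For what it's worth, the textbook way to measure near-duplication - word shingles plus Jaccard overlap - gives a feel for how such a filter could score two pages. This is a generic sketch of the technique, not Google's actual algorithm:

```python
def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles of a text, lower-cased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Two pages padded with extra off-topic text score lower on this measure, which is why discounting the padding (as suggested above) would push more pages over a duplicate threshold.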
|My hunch is that Google is turning up the dial on "duplicate content" again. Possibly 1) KW density 2) the % of common content between pages 3) the % of common content between different sites. |
On the site I have that is affected, the home page and one other page optimised for the exact same term are the two non-filtered results - but the second (internal) page is Supplemental.
As NetMeg said, other pages that are filtered are still ranking and non-Supplemental.
Are the site: query experiences and loss of rankings even connected, or just two separate things that people are seeing concurrently?
(edited for clarity)
I am having a similar situation, but it does not seem to have affected my traffic or my rankings. Traffic is actually up today. The pages are not removed from the index, nor have they gone supplemental. They just do not appear immediately with the site: command.
My site: results are very similar, in that they also say only 260 pages are indexed.
|Results 1 - 7 of about 260 |
Click on the "repeat the search with the omitted results included" link and all my pages are there.
When I try the site: command with more specific URLs it's the same behaviour.
I have, for the most part, unique titles and meta descriptions.
I am being crawled as normal, which is usually quite heavy and often.
Plenty of Google referrers.
Interestingly enough, my site has its site links back. These went away last February during the Big Daddy supplemental issue, and then came back this month.
Let me know if there is anything else you would like me to test.
Also, out of curiosity, how many people who are having this "problem" were also affected by the Big Daddy update last Feb?
|Also, out of curiosity, how many people who are having this "problem" were also affected by the Big Daddy update last Feb? |
No, my sites haven't been affected by any of the updates for several years, except for the one in May 2005, when my main site suddenly came out of the sandbox after a couple of years in limbo.
My most popular site says "Results 1 - 1 of about 300 from (website)" when I perform a site: search. When I click on the "repeat the search with the omitted results included" link, it says "Results 1 - 50 of about 1,850 from (website)."
No wonder my site shows up at the bottom of the last page of results. At least it can't get any worse than this, right?
Our site is also displaying 1-2 of 260. When I expand the results it shows the number of pages as 1,560. There are actually 1,800.
I originally thought that, as we've just relaunched the site using a new page and file structure, we were being temporarily penalised (incidentally, our new site was doing great until our hosts messed up the 301 redirect code, and then things seem to have really got worse - maybe a coincidence, maybe not).
It looks like it could be a whole lot worse. Our G traffic has more than halved and our enquiries likewise.
The really annoying thing is that it took three of us 8 months to redesign the site from scratch. We made it a lot easier to navigate, less spammy (it wasn't even spammy in the first place) and rewrote all of our copy so that it was much more informative... makes you wonder why you bother really.
Anyway, whining isn't going to get me anywhere, so if anyone has any ideas please let me know - or is sitting tight the best option?
[edited by: tedster at 7:40 pm (utc) on Feb. 19, 2007]
"I believe that what we are seeing is context-sensitive duplicate removal. That is - if you look for info about widgets, only that text on a page which is about widgets will be used in the duplicate content calculation. Extra text (which many sites have been using to pad out and reduce duplication) is discounted.
When you use the site:command, my suspicion is that the context is the site itself and that's why you are getting site-related snippets and extra removal of duplicates (i.e. menu sections, etc.)"
Can you explain more about this? How could a dup. content filter discern which part of the text on a page is about widgets and which part is not? Are people using entirely off-topic text in the same way that email spam includes garbage that doesn't make sense in order to get past spam filters?
Your thought is that the duplicate filter can then remove more than it appeared to remove in the past because it can throw out a certain amount of what previously appeared to be relevant content?
|When you use the site:command, my suspicion is that the context is the site itself and that's why you are getting site-related snippets and extra removal of duplicates (i.e. menu sections, etc.) |
This might certainly be what triggered the results. But I notice that this is not yet consistent with all sites, and that some sites with even stronger relevance to your remark have not yet been affected.
On the other hand, g1smd standing next to Matt Cutts at SES London (above) suggests that it was considered a temporary error.
I too do not see a consistent pattern to anything anyone has discussed here.
The only site that we have that is affected by this is rising steadily in the ranks, and traffic is increasing by leaps and bounds.
I seriously just think it's a bug with the site: command and all the other coincidences mentioned here are just that, coincidences.
Actually I've seen the same over the last couple of days:
it started with 1 out of 260 results for the site: command with only two results showing up. Now it displays 1 out of 10000s with the site command but switches to 1 out of 270 when I access the second site: serp page and tells me the similar results story again - how strange is that?
I also noticed - and I guess this is more important - that it seems to be a duplicate content filter concerning menus and the like.
All the sites that are real sites, with navigation menu, etc. start to pop up much further down in the site: serps.
That looks a lot like a snippet dc filter (look for U.S. patent application 20060294155) gone awry somehow.
The sites have unique titles, meta descriptions etc. and unique content; only the navigation menu is always the same. And, dear Google, I'm sorry about that, but this menu is for users - if you can't deal with it, why not just set it aside and look at the rest of the page?
Edit: OK, I've now searched for the text of my navigation menu, getting two results, with the rest omitted because of similarity. Which is fine by me, but it seems that this similarity filter is also applied to the site: search function. Which is just not how it should work, as a properly working dc filter should only filter out the duplicated content, not the rest of the page.
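The behaviour argued for here - setting the shared menu aside and comparing only the rest of each page - can be sketched as a pre-processing step. A hypothetical illustration of the idea, not how Google actually works:

```python
def strip_boilerplate(pages: list[str]) -> list[str]:
    """Remove lines that appear on every page (e.g. a shared navigation
    menu) so that any later duplicate comparison only sees the text that
    is unique to each page."""
    line_sets = [set(p.splitlines()) for p in pages]
    common = set.intersection(*line_sets) if line_sets else set()
    return ["\n".join(line for line in p.splitlines() if line not in common)
            for p in pages]
```

With the menu stripped first, two pages that share only their navigation would no longer look similar to a downstream filter.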
[edited by: Spiekerooger at 12:59 pm (utc) on Feb. 22, 2007]
Just for the record .... no changes .... still 1 of 260 etc
site: on the home page is still 1 of 260, with no supplementals on inside pages.
site: on an inside page is no longer 1 of 260; it's changed to ALL Supplemental, with all results showing at the first attempt.
Should we be concerned, or is this the "bug that continues and changes shape"?
Google recognized this as a bug on the Webmaster Central blog and said they are working to fix it.
Jeremy L - Thanks for the heads up. I missed this note from Vanessa Fox at Google.
|Historically, Google has avoided showing pages that appear to be duplicate (e.g., pages with the same title and description) in search results. Our goal is to provide useful results to the searcher. However, with a site: command, searchers are likely looking for a full list of results from that site, so we are making a change to do that. In some cases, a site: search doesn't show a full list of results even when the pages are different, and we are resolving that issue as well. Note that this is a display issue only and doesn't in any way affect search rankings. If you see this behavior, simply click the "repeat the search with omitted results included" link to see the full list. The pages that initially don't display continue to show up for regular queries. The display issue affects only a site: search with no associated query. In addition, this display issue is unrelated to supplemental results. Any pages in supplemental results display "Supplemental Result" beside the URL. |
Because this change to show all results for site: queries doesn't affect search rankings at all, it will probably happen in the normal course of events as we merge this change into the next time that we push a new executable for handling the site: command. As a result, it may be several weeks or so before you start to see this change, but we'll keep monitoring it to make sure the change goes out.
Anyone seeing any changes yet?
I have. Getting more pages back every day.
I wouldn't worry about it. It has nothing to do with rankings. I have a site with this and it's traffic has almost doubled over the last month, and we are talking about a popular site.
seems to be fixed now.
Not for our sites. We're now getting 1 of 1 instead of 1 of 260.
I guess it's not a priority fix for G.
That's strange. It was working a few hours ago, and now it's back to being borked.
I was even using my favourite DC watch tool.
Seems to be working now - but not sure how accurate it is yet
Actually, it's throwing up a lot of supplementals for us, which I wouldn't expect, even though the 1 to 260 has gone.
Does anyone else have a clue if it's working properly?