Forum Moderators: Robert Charlton & goodroi
Results 1 - 1 of about 260 from oursite.com
with only 2 page results and a note:
In order to show you the most relevant results, we have omitted some entries very similar to the 1 already displayed.
If you like, you can "repeat the search with the omitted results included."
When clicking on the "repeat ..." link, it shows most of the pages.
Is anyone else experiencing this?
What does this say about Google's interpretation of the site's pages?
Additionally, the recent "1-2 of 120 pages" problem seems to be fixed - but with a twist. With filter=1 selected I now see "1 to 30 of 150" but all results after the first are supplemental. With filter=0, I get 150 results but they are not supplemental.
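For anyone wanting to reproduce the filter=1 vs. filter=0 comparison above: the toggle is just a query-string parameter on the results URL. A minimal sketch of building such URLs (the parameter name `filter` and value `0` are as reported in this thread; the helper function itself is purely illustrative):

```python
from urllib.parse import urlencode

def site_query_url(domain, filter_omitted=True):
    """Build a Google site: query URL for a domain.

    Passing filter_omitted=False appends filter=0, which (per this
    thread) asks Google to include results it would otherwise omit
    as near-duplicates.
    """
    params = {"q": "site:" + domain}
    if not filter_omitted:
        params["filter"] = "0"
    return "https://www.google.com/search?" + urlencode(params)

print(site_query_url("example.com", filter_omitted=False))
# https://www.google.com/search?q=site%3Aexample.com&filter=0
```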
Things are in the works. I'll look again in a few days and see how it settles.
P.S. Matt says "This is not an update". He is standing here while I am posting this.
Sometimes I wonder how many other errors there are around the place that cause webmasters to jump up and down, when all they really need is to be reported to Google and responded to - like a kind of "outage" report.
It sure would help put a lot of minds at rest.
When you use the site:command, my suspicion is that the context is the site itself and that's why you are getting site-related snippets and extra removal of duplicates (i.e. menu sections, etc.)
My hunch is that Google is turning up the dial on "duplicate content" again. Possibly: 1) keyword density, 2) the % of common content between pages, 3) the % of common content between different sites.
As NetMeg said, other pages that are filtered are still ranking and non-Supplemental.
Are the site: query experiences and loss of rankings even connected, or just two separate things that people are seeing concurrently?
(edited for clarity)
My site: command results are very similar, in that they also say only 260 pages are indexed.
Results 1 - 7 of about 260
Click on the "repeat the search with the omitted results included" link and all my pages are there.
When I try the site: command with more specific URLs, the behaviour is the same.
I have, for the most part, unique titles and meta descriptions.
I am being crawled as normal, which is usually quite heavy and frequent.
Plenty of Google referrers.
Interestingly enough, my site has its sitelinks back. These went away last Feb during the Big Daddy supplemental issue, and then came back this month.
Let me know if there is anything else you would like me to test.
Also, out of curiosity, how many people who are having this "problem" were also affected by the Big Daddy update last Feb?
Also, out of curiosity, how many people who are having this "problem" were also affected by the Big Daddy update last Feb?
No, my sites haven't been affected by any of the updates for several years, except for the one in May 2005, when my main site suddenly came out of the sandbox after a couple of years in limbo.
No wonder my site shows up at the bottom of the last page of results. At least it can't get any worse than this, right?
I thought originally that, as we've just relaunched the site using a new page and file structure, we were being temporarily penalised (incidentally, our new site was doing great until our hosts messed up the 301 redirect code, and then things seem to have really got worse - maybe a coincidence, maybe not).
It looks like it could be a whole lot worse. Our G traffic has more than halved, and our enquiries likewise.
The really annoying thing is that it took three of us 8 months to redesign the site from scratch. We made it a lot easier to navigate, less spammy (it wasn't even spammy in the first place) and rewrote all of our copy so that it was much more informative...makes you wonder why you bother really.
Anyway, whining isn't going to get me anywhere, so if anyone has any ideas please let me know - or is sitting tight the best option?
Marra
[edited by: tedster at 7:40 pm (utc) on Feb. 19, 2007]
Can you explain more about this? How could a dup. content filter discern which part of the text on a page is about widgets and which part is not? Are people using entirely off-topic text in the same way that email spam includes garbage that doesn't make sense in order to get past spam filters?
Your thought is that the duplicate filter can then remove more than it appeared to remove in the past because it can throw out a certain amount of what previously appeared to be relevant content?
When you use the site:command, my suspicion is that the context is the site itself and that's why you are getting site-related snippets and extra removal of duplicates (i.e. menu sections, etc.)
This might certainly be what triggered the results. But I notice that this is not yet consistent across all sites, and that some sites with even stronger relevance to your remark have not yet been affected.
On the other hand, g1smd standing next to Matt Cutts at SES London (above) suggests that it was considered a temporary error.
The only site that we have that is affected by this is rising steadily in the ranks, and traffic is increasing by leaps and bounds.
I seriously just think it's a bug with the site: command and all the other coincidences mentioned here are just that, coincidences.
It started with "1 out of 260" results for the site: command, with only two results showing up. Now it displays "1 out of 10000s" with the site: command, but switches to "1 out of 270" when I access the second site: SERP page and tells me the similar-results story again - how strange is that?
I also witnessed - and I guess this is more important - that it has to be a duplicate content filter concerning menus and the like.
Because if I use the filter=0 option, the first result is the homepage, but then I get a bunch of URLs that are not linked anywhere and are not real pages, e.g. a URL for an AdSense-alternative ad banner or some other backend areas that have no connection with the site and are only linked via JavaScript (e.g. a statistics admin panel, with nothing linking there), plus RSS feeds.
All the pages that are real pages, with navigation menus and so on, only start to pop up much further down in the site: SERPs.
That looks a lot like a snippet dc filter (look for U.S. patent application 20060294155) gone awry somehow.
The sites have unique titles, meta descriptions, etc. and unique content; only the navigation menu is always the same. And, dear Google, I'm sorry about that, but this menu is for users - if you can't deal with it, why not keep it at bay and look at the rest of the page?
Edit: OK, I've now searched for the text of my navigation menu, getting two results, with the rest omitted because of similarity. Which is fine by me, but it seems that this similarity filter is also applied to the site: search function. Which is just not how it should work, as a properly working dc filter should only filter out the duplicated content, not the rest of a page.
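For what it's worth, the kind of near-duplicate filtering being speculated about here is often described in terms of shingle overlap between pages. A toy sketch of that idea (my own illustration, not Google's actual filter; the shingle width w=4 is an arbitrary choice): a shared navigation menu contributes identical shingles, which inflates the measured similarity between pages whose remaining copy is entirely different.

```python
def shingles(text, w=4):
    """Return the set of w-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def resemblance(a, b, w=4):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two "pages" that share only a navigation menu (hypothetical text).
menu = "home products services about contact sitemap"
page_a = menu + " we sell blue widgets in many sizes"
page_b = menu + " our red gadgets ship worldwide fast"

print(resemblance(page_a, page_b))  # nonzero purely because of the menu
```

A filter built on a score like this would need to subtract the boilerplate shingles first, which is essentially the complaint above: the menu is for users, and the rest of the page is unique.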
[edited by: Spiekerooger at 12:59 pm (utc) on Feb. 22, 2007]
[googlewebmastercentral.blogspot.com...]
Historically, Google has avoided showing pages that appear to be duplicate (e.g., pages with the same title and description) in search results. Our goal is to provide useful results to the searcher. However, with a site: command, searchers are likely looking for a full list of results from that site, so we are making a change to do that. In some cases, a site: search doesn't show a full list of results even when the pages are different, and we are resolving that issue as well.

Note that this is a display issue only and doesn't in any way affect search rankings. If you see this behavior, simply click the "repeat the search with omitted results included" link to see the full list. The pages that initially don't display continue to show up for regular queries. The display issue affects only a site: search with no associated query. In addition, this display issue is unrelated to supplemental results. Any pages in supplemental results display "Supplemental Result" beside the URL.

Because this change to show all results for site: queries doesn't affect search rankings at all, it will probably happen in the normal course of events as we merge this change into the next time that we push a new executable for handling the site: command. As a result, it may be several weeks or so before you start to see this change, but we'll keep monitoring it to make sure the change goes out