I had 20,300 pages showing for a site:www.example.com search yesterday and for the past month. Today it dropped to 509, but my traffic is still pretty constant. I normally get around 4,500 to 5,000 visitors to that site per day, and today I've already had 4,000.
So either Google doesn't account for even a small percentage of my traffic (which I doubt), or the way Google stores information about my site has changed. i.e. the 20,300 pages are still there, but Google will only tell me about 509 of them. As far as I can tell, I think the other pages have gone supplemental.
That resonated with something that I was talking about with the crawl/index team. internetheaven, was that post about the site in your profile, or a different site? Your post aligns exactly with one thing I've seen in a couple of ways. It would align even more if you were talking about a different site than the one in your profile. :) If you were talking about a different site, would you mind sending the site name to bostonpubcon2006 [at] gmail.com with the subject line "crawlpages", the name of your site, plus the handle "internetheaven"? I'd like to check the theory.
Just to give folks an update, we've been going through the feedback and noticed one thing. We've been refreshing some (but not all) of the supplemental results. One part of the supplemental indexing system didn't return any results for [site:domain.com] (that is, a site: search with no additional terms). So that would match with fewer results being reported for site: queries but traffic not changing much. The pages are available for queries matching the supplemental results, but just adding a term or stopword to site: wouldn't automatically access those supplemental results.
I'm checking with the crawl/index folks on whether this might factor into what people are seeing, and I should hear back later today or tomorrow. In the meantime, interested folks might want to check if their search traffic has gone up/down by a major amount, and see if there are fewer/more supplemental results for a site: search for their domain. Since folks outside Google couldn't force the supplemental results to return site: results, it needed a crawl/index person to notice that fact based on the feedback that we've gotten.
Anyone that wants to send more info along those lines to bostonpubcon2006 [at] gmail.com with the subject line "crawlpages" is welcome to. So you might send something like "I originally wrote about domain.com. I looked at my logs and haven't seen a major decrease in traffic; my traffic is about the same. I used to have about X% supplemental results, and now I hardly see any supplemental results with a site:domain.com query."
I've still got someone reading the bostonpubcon email alias, and I've worked with the Sitemaps team to exclude that as a factor. The crawl/index folks are reading portions of the feedback too; if there's more that I notice, I'll stop by to let you know.
[edited by: Brett_Tabke at 8:07 pm (utc) on May 8, 2006]
I see them missing from some datacentres now. Is that intentional (I hope it is), or is it a glitch that they have disappeared and you plan on putting them back in (I hope not)?
GoogleGuy, can you point to any specific datacentre IP addresses where the "missing Supplemental Results problem" is most apparent? Google has very different results on some IPs at the moment.
To username Relevancy (last post in previous thread):
I am told that it isn't a problem, but I would make them more diverse than that.
Maybe "big" and "old" sites can get away with it, but newer sites cannot?
[edited by: g1smd at 8:30 pm (utc) on May 8, 2006]
Do you want examples where sites have all pages crawled as www. but have supplemental results for the non-www. version?
These are all 301-redirected pages, now showing as supplemental for the non-www. version, including the homepage.
The site also shows an outdated DMOZ listing as the title, rather than the current title or the current DMOZ title.
A site that had 150 pages fully indexed, has shown "1 to 120 of 150" in a site:domain.com search for very many months. All of the titles and meta descriptions on those 150 pages have been different for a long time, and the on-page content is also unique per page.
There have been no Supplemental Results for this site at the .com location in the last year or more (except for a couple of pages that were deleted a very long time ago). The old site:domain.co.uk search now shows either zero or 50 or so Supplemental Results depending on which datacentre that you look at. The .co.uk page URLs have all had a 301 redirect pointing to the matching .com pages for at least a year.
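For reference, the kind of site-wide 301 described above (non-www to www, or one domain to another) is typically set up in Apache with mod_rewrite. A minimal sketch, with placeholder domain names:

```apache
# Hypothetical example: permanently redirect every non-www request
# (and, by the same pattern, old-domain requests) to the
# www.example.com equivalent, preserving the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The `R=301` flag is what makes this a permanent redirect rather than Apache's default 302.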
Previously, last year, when all those meta descriptions were exactly the same, Google used to show just "1 to 3 of about 120" in a site:domain.com search, and you needed to click on the "repeat this search with omitted results included" link to see any more.
A few days ago, on some DCs, the results were down to "1 to 40 of about 150" for this site, which I found very odd. Clicking the "repeat the search with omitted results included" link then revealed the rest of the pages, but every one of them now has exactly the same snippet --- the snippet the pages used to have six months or more ago, back when the meta descriptions were all identical.
Today, the site: search is down to "1 to 3 of about 150" again, and every result shows exactly the same (old) snippet in a site: search at 184.108.40.206 or at 220.127.116.11 for example.
I think this is a bug, or Google reverting to using old data for the snippet, or something.
Can you take a look?
[edited by: g1smd at 8:56 pm (utc) on May 8, 2006]
"Today, the site: search is down to 1 to 3 of about 150 again, and every result shows the same snippet in a site: search. I think this is a bug, or Google reverting to using old data for the snippet, or something."
Any DC IP?
[edited by: reseller] I see you have just added the DCs IP to your post. Thanks, g1smd [/edit]
We have seen a similar bug affecting the way our site: search pages are displayed.
All 435 pages remain indexed, but instead of a natural ordering with all pages available to view, we now have only 45 available without clicking the "repeat the search with omitted results included" link.
This would appear to be because Google has started to use the "header" text, including header image alt text, as the snippet on about 350 of our pages. That text is obviously the same on all of those pages, resulting in them not being displayed without clicking the omitted-results link.
All have separate content, title, description, keywords, etc. They just use a standard header via a template.
This is new as of yesterday, very odd, and a glitch in my opinion.
Then one person sees a slight glitch on one particular data centre on one particular day, and this completely atypical example is what "resonates"?
My site had 100,000++ pages; as soon as Big Daddy hit, 99% of the pages were gone.
99% of the hits gone too.
I had virtually no supplemental results before Big Daddy.
Still heavily crawled every day (5-15K pages) but very few pages added; it has fluctuated around only 300 pages in the index for the last 1-2 months.
When my pages dropped I saw a 30-40% reduction in traffic. A few days ago I saw a small increase in the site:url count, not much, from 148 to 204, but the odd thing is that traffic has shot up. In fact, yesterday was my best day in a few months, and according to my stats, Google traffic has increased by 50%.
(from a moderator) and then someone else said
I have a solid, established site (no funny business, duplicate content, etc.) that has lost 75% of its pages on a site: search.
Traffic, however, is only fractionally down, and that much is accountable now that the sun is coming out.
That was what I was noticing. I'm not saying that that's 100% of things. When I looked through the crawlpages feedback, I did see a few people with spam penalties, for example. That could also explain why a site would be crawled less.
g1smd, I would certainly say that the days of those older pages are numbered in that I expect a reindex of most of the supplemental results over time (although it could take a while).
optimist, I'd expect that supplemental non-www results would be refreshed then, so I wouldn't report those right now.
Thanks. I appreciate the opportunity you've extended for some specifics from our end.
I've sent an email to bostonpubcon2006 also, with some stats from both log entries and Google Analytics. I also have graphs comparing all organic search referrals oct04-thru-sep05 to oct05-thru-ytd06. Let me know if they'd be helpful.
I sent a reinclusion request via Sitemaps about 10 days ago [sticky me if you want to], advising that we are receiving almost zero results from Google compared to pre-Jul05, when we had strong results. At that time [Jul05] we were hit by a hacker and a 180-day exclusion caused by an illicit robots.txt entry, plus the BD update / supplemental issues on the exit from that exclusion period around [Jan/Feb06].
Since then we have had strange things happening on an ongoing basis. Here are just a few:
-Page counts wildly fluctuating on the DCs
-Supplementals appearing and disappearing
-Meta descriptions being ignored, then restored
-Navigation text replacing the meta description on the index page
-No results in the preceding positions; in fact our pages often appear below supplementals
Our sites have been thoroughly examined by an SEO and are believed to be fully compliant since Jan06.
9-10 months is a long time to be kept in silence and disruption.
I'd very much appreciate someone looking into this for us, if they are prepared to Sticky me.
I suspect that one of the reasons the "missing pages" problem is taking so long to get a handle on is because we Webmasters are blind to the root cause of the problem.
I believe that the problem is rooted in PageRank and/or Backlinks. Neither of which we can see accurately:
1. Immediately after Big Daddy was rolled out, people started to report "whacky" PRs, as reported by the Google Toolbar. I believe those oddities have now disappeared (I never saw any), but we have no way of knowing what has happened to the "real" PRs.
2. Post Big Daddy, a "link:www.mydomain.com" search shows just one backlink to my site (even though there are many more). Again, since "link:" searches were changed a while back so that they no longer show the whole picture, these discrepancies can always be dismissed. I have no way of knowing if the backlinks are truly missing, or if I am simply unable to see them. I suspect that they may well be missing; hence my PR may now be a lot lower than I am seeing on the Toolbar. Certainly, before BD, a "link:" search showed many more backlinks for my site.
3. An incorrectly deflated PR could explain what I see. Only pages that I link directly from my home page get indexed nowadays. As soon as I put a link to a page on my home page, in it goes. Remove the home-page link, and out it goes. Maybe my "real" PR is now so low that it only merits indexing one level deep?
Has anyone at Google looked at the "real" PRs and the "real" backlinks lately? If I am right and these have somehow gone wrong, maybe a lot of backlinks are now truly missing, and therefore the PRs are now inaccurate?
interested folks might want to check if their search traffic has gone up/down by a major amount
I've seen quite a change, what may be a migration to another SE, and quite a drop. I sure hope that Google can claw its way out of the hole it has dug for itself.
Looking at just the uniques from the last year for one of my sites:
Jan 06: 1,960 .. 92 .. 1,275 (first appearance of Supplemental pages)
May 06: 48 .. 178 .. 355 (8 days only)
I'm glad someone from Google is paying attention. Like Clint, my problem is not with supplemental results but with the fact my site has almost totally disappeared from Google.
Basically: New site, submitted to Google in March, with about 50 pages. All original content, no spam, no black hat craziness of any kind. Site was initially indexed up to about 40 pages worth; then in April pages started to disappear from the index. Now I have only the main URL left, in two versions.
Anything I can do to fix this?
To GoogleGuy, I would like to say that it is great to at last have some feedback, but it does sound as though you are barking up the wrong tree and still don't really recognise the scale of the problems being experienced. You appear to be of the opinion that those whose sites have all but disappeared from the index were deserving of such treatment because you can see that they have a penalty.
First of all, did you check to see whether the penalty really was deserved or whether it was awarded in error? Did you actually see the offence yourself? How do you know that penalties are not being awarded erroneously?
I know for sure that I had no duplicate content, no spam techniques, just a simple website selling web hosting and web design. It has about 40 pages (only the homepage is displayed now in site:, though). All page titles and descriptions are different, I even have a sitemap, and I can see that Googlebot has visited them in the past.
Secondly, why can you see whether we have a penalty when we can't? Is there really any harm in putting a red dot on our Sitemaps account summary page with the text "Duplicate content penalty", with a link to what this means and even specifically which content?
Surely that would beat all the emails Google must receive daily asking why their site has been dropped.
Also, there should be a simple link to press if you believe the penalty was awarded in error. A human should then review the pages and correct the penalty if it was an error.
Honestly, I'm exhausted by Google. I don't have trouble with Yahoo or MSN; my pages are always reliably there. Why is Google such hard work? I want to spend my time writing content for my users and designing web sites for clients, not working as a Google slave.
I never considered writing any special code for the search engines until now. There was simply no need, but I'm forced, for the first time, to do it. Since I now have nothing to lose (except the homepage), I have written a separate template which PHP will serve to Googlebot when it comes. It's nothing fancy: it simply removes the fancy navigation, images, etc., which has the effect of putting the content nearer the top of the page. That way, if the problem is a duplicate content penalty because Google only checks the first 200 lines of my template and never sees the real content, it should avoid awarding one.
It will also hopefully increase my keyword density (or at least help to make Google's calculation of it accurate).
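The user-agent switch described above (the poster's actual implementation is in PHP and isn't shown; this is a sketch in Python purely for illustration, with hypothetical template names) boils down to something like:

```python
def template_for(user_agent: str) -> str:
    """Pick the template to render: a stripped-down one for Googlebot
    (no heavy navigation or images, so the unique page content appears
    earlier in the HTML), and the normal one for everyone else."""
    if "googlebot" in user_agent.lower():
        return "simple-template.html"  # hypothetical filename
    return "full-template.html"        # hypothetical filename
```

Worth noting that serving crawler-specific markup is exactly what search engines call cloaking, which carries its own penalty risk.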
GG, please don't think I'm getting at you. It's great that someone at Google is taking our concerns seriously. I hope you're able to kick some butt there and get things sorted out for us real soon!
Big Daddy took out DMOZ clone sites and, I believe, a lot of crap directories (probably more types of sites too). Since newer sites rely on backlinks to get indexed faster (they can't build PR on their own), with little or no link credit from directories our sites are now unable to get indexed beyond the first level until PR increases.
It's all about authority status... Big Daddy killed what little authority newish sites had and therefore killed our indexed pages. I run lots of newer sites as well as older sites. Only the newer ones were hit. The older ones still have their pages and get new pages indexed.
No one knows the reasons behind pages dropping from the index, and that's OK. But how come a site of mine that used to have 150,000 indexed pages now has one, with tons of inner pages retaining their PR?
To keep it short: how can a page that is not indexed in Google have its own PR?