| 11:45 pm on Aug 1, 2006 (gmt 0)|
Yes, I am seeing indexed pages dropping. The pages were previously indexed; now they are gone. At this point I can easily identify about 25 pages that are no longer indexed.
| 11:52 pm on Aug 1, 2006 (gmt 0)|
I'm seeing many supplementals getting much better site: command placement than the fully indexed pages that follow. According to Matt, wait till the end of summer to see the fix.
| 8:40 am on Aug 2, 2006 (gmt 0)|
I'm beginning to see the same problem that I saw at the start of the month. Just noticed that my index page has dropped out again. This has happened at least 3-4 times since April/May.
[edited by: Scurramunga at 8:41 am (utc) on Aug. 2, 2006]
| 9:15 am on Aug 2, 2006 (gmt 0)|
I can confirm this, as well as the supplemental behavior sara described.
I've also seen once-stable links slowly dropping.
| 3:56 pm on Aug 2, 2006 (gmt 0)|
Very frustrating, I wish googleguy or Matt would address why pages keep getting dropped.
| 1:56 pm on Aug 3, 2006 (gmt 0)|
I WAS seeing better site: data, but as mentioned above, it started to reverse again in late July (for me it was closer to the 20th than the 27th, and in some cases, early July). On a lot of sites I monitor, even pages with unique content, a lot of incoming links, and no scraper activity targeting them are getting either dropped or going supplemental.
To say that I'm anxiously awaiting some sort of fix is an understatement; the quality of the returned searches dips when informative, relevant sites are excluded.
| 2:03 pm on Aug 3, 2006 (gmt 0)|
Yes, I have noticed since yesterday (August 2nd) that some indexed sub-directories and pages were dropped.
| 2:38 pm on Aug 3, 2006 (gmt 0)|
I would rewrite the text on the supplemental pages. I had this problem, spent 4 months rewriting text, and the supplemental problem went away. It has been three months now with no supplementals. I think a supplemental page is Google's way of telling you that there is something about that page it does not like. (Maybe your meta tags are duplicated, and you don't need to rewrite the body text, just the meta tags.) Yes, rewriting text is a big job, but it is worth the result.
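In case it helps illustrate the duplicate-meta-tag point: a hypothetical before/after, where page names, titles, and descriptions are all made up.

```html
<!-- Before: the same boilerplate head pasted into every page -->
<head>
  <title>Example Widgets</title>
  <meta name="description" content="Example Widgets - widgets and more.">
</head>

<!-- After: each page gets its own title and description -->
<!-- widget-care.html -->
<head>
  <title>Caring for Blue Widgets - Example Widgets</title>
  <meta name="description" content="Cleaning and storage tips for blue widgets.">
</head>
<!-- widget-history.html -->
<head>
  <title>A Short History of Widgets - Example Widgets</title>
  <meta name="description" content="How widgets evolved from the 1950s to today.">
</head>
```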
| 5:32 pm on Aug 3, 2006 (gmt 0)|
I have a number of supplemental results and have rewritten all the text and added more text. I still have to change the title attributes and meta tags for the pages. Hoping this will get the pages back into the regular index.
However, I am still seeing pages drop out of the index that have been there for a couple of years. They never went supplemental, just gone! I am using the site: command - is this command still not functioning properly?
Is anyone else experiencing this where their pages are just being de-indexed?
If so what are you doing about it?
| 5:48 pm on Aug 3, 2006 (gmt 0)|
It is happening to us and we haven't found any way to stop it yet.
| 6:34 pm on Aug 3, 2006 (gmt 0)|
Any of you using sitemaps?
| 6:37 pm on Aug 3, 2006 (gmt 0)|
Can I ask you all a question about your dropped pages? Do they have a lot of content on them? What size are they on average?
| 6:39 pm on Aug 3, 2006 (gmt 0)|
Same problem here. I had hundreds of pages indexed 3 weeks ago. Then on the 27th pages started moving to supplemental. As of this afternoon, all but 13 pages have gone supplemental.
If the last few months are anything to go by, if I don't do anything, I expect the pages (at least some of them) to re-appear fully indexed. But then this will only last a few days!
Is this familiar?
I really wish I knew whether there is something wrong with the pages, or whether it's Google. I know Matt says wait for the summer to end, but that doesn't help when your site is about UK holidays.
| 6:41 pm on Aug 3, 2006 (gmt 0)|
I'm using Sitemaps.
Many pages are template based, but contain unique content, averaging a couple of hundred words.
Does anybody know whether a fairly large Flash header on each page could be deemed duplicate content?
| 6:41 pm on Aug 3, 2006 (gmt 0)|
same problem here.
I noticed it last night. Then I did a site: query again later and it was back up. I figured - well, worried - that today my pages would drop to what I saw last night.
I was unfortunately right.
Anyone know what's up with that?
| 6:44 pm on Aug 3, 2006 (gmt 0)|
I just looked at a competitors site which I also looked at last night.
They have doubled their number of indexed pages from 120k to 249k while ours has dropped to 5k.
At least in this example, the two sites are nearly exact replicas of each other in terms of layout, type of content, software that powers the site, forums, etc.
The only difference is that the site that went UP has its forums on a subdomain (forums.), whereas my site does not.
This doesn't tell me anything, though, because my main site, which is in another niche, DOES have the subdomains, and it went down as well.
| 7:04 pm on Aug 3, 2006 (gmt 0)|
Ours doubled today... Took the hit on 6/27 and today was the first significant increase. We are at about 25% of what we should be at.
We don't use sitemaps (we did for a while last year and felt it was too beasty for us)... This time around, the only thing we did since the hit was nothing at all. We decided to ride it out (if that helps contribute to the analysis).
[edited by: Yippee at 7:12 pm (utc) on Aug. 3, 2006]
| 7:09 pm on Aug 3, 2006 (gmt 0)|
"Any of you using sitemaps?"
Take a look at this thread regarding sitemaps:
| 7:17 pm on Aug 3, 2006 (gmt 0)|
Just noticed today I've lost tens of thousands of pages from the index.
None of my stuff seems to be in supplemental though. I'm not sure if I'm blessed with never having supplementals ... or I just don't know how to find them.
| 7:25 pm on Aug 3, 2006 (gmt 0)|
So what you're saying is, G takes a deeper and longer look at sites submitted via the sitemap programme?
| 7:29 pm on Aug 3, 2006 (gmt 0)|
Just to throw in:-
I don't know if anybody has spotted any crawl patterns that then result in changes to pages indexed or supplemental.
I normally receive a fairly large crawl (1000+ hits from the bot), then a large change happens to the supplementals 2 or 3 days later. And here we are again.
The last 2 days have been quiet on the crawl front. But today it's picked up again.
| 8:32 pm on Aug 3, 2006 (gmt 0)|
I am using a sitemap on my site - I do not use the Google sitemaps.
The pages that were dropped are between 10K and 20K in size - no images are included in those figures.
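For anyone weighing the two approaches: a minimal Google Sitemaps file under the current schema looks roughly like this (the 0.84 namespace is what I believe Google uses at the moment; URLs and dates below are placeholders, and only the loc element is required per entry).

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal Google Sitemaps file; URLs and dates are placeholders. -->
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-08-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- lastmod, changefreq, and priority are optional -->
    <loc>http://www.example.com/page1.html</loc>
  </url>
</urlset>
```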
| 11:13 pm on Aug 3, 2006 (gmt 0)|
Please don't bite my head off here. I may be way off base, but...
Google are rolling out a new "site:" command that reports much more accurate results for large sites. So, if for example the old "site:" command reported 100,000 pages, the new one will report around 10,000 pages.
I only mention this because in the last few days they've started to roll the new version out across a larger number of DCs - previously it was only running on one or two. Maybe this accounts for some of the lost pages for some of you?
| 12:03 am on Aug 4, 2006 (gmt 0)|
I wish that were it, but the pages were there and now they are not.
| 12:18 am on Aug 4, 2006 (gmt 0)|
CLintFC, Where are you getting this information?
"Google are rolling out a new "site:" command that reports much more accurate results for large sites."
| 1:24 am on Aug 4, 2006 (gmt 0)|
|"Any of you using sitemaps?" |
Yes I am. Although I don't know if I can attribute my problems regarding Google dropping pages and keyword result changes to sitemaps.
I will say one thing, though: the Sitemaps query stats section is way off with regard to the search terms listed in some cases. Yet in other cases within the same query stat lists, the data may be correct or only partially correct.
Query stats are still showing me in the #1, 2, and 3 positions for search words or phrases that no longer work. In those cases, when I try to follow the Sitemaps query stat results using their respective links on the list (which point to the DC), my site is not in that DC's search results. I also found this to be the case in each of the last three months when pages dropped off the index. My sitemap is being downloaded frequently, so it's not a case of outdated data or anything.
[edited by: Scurramunga at 1:25 am (utc) on Aug. 4, 2006]
| 1:09 pm on Aug 4, 2006 (gmt 0)|
My regular bounceback out of the supplementals would have normally been underway by now. Crawling has really dropped off too.
| 2:47 pm on Aug 4, 2006 (gmt 0)|
Same here, not getting anything added back.
| 3:00 pm on Aug 4, 2006 (gmt 0)|
I've noticed something that can be taken a few different ways: on several of the sites I manage, Google is only indexing 1 level deep. What I mean by this is that if most of the IBLs point to the root, then Google is only indexing and caching the pages directly linked from the root.
There are a couple of solutions, none of which I intend to do because it would be poor for the user experience (hopefully the Googlers read that part):
1. Slap a mini-sitemap into the footer of the index page; that would increase the number of pages directly linked to off of the root. Problem with this: ugly and very spammy looking.
2. Send the majority of IBLs to a sitemap page. Problem with this: No one wants a sitemap to rank; we want the proper page to rank.
3. Instead of creating a few hundred articles on individual pages for a domain, create a few hundred sub-domains for the domain, each housing one article page and run each through some sort of RSS parser for syndication. Problem: I've seen this working several times now, but it has to be short-lived as it is technically backwards to give that much weight and power to sub-domains.
So what am I doing with my supplementals?
Well, the content is unique and has incoming links, so not much. This is a flaw in Google that I am not willing to screw up existing site architecture for. Future sites may need to incorporate one of the above architecture changes if their problem (and it is a problem, not a feature) isn't fixed relatively soon.
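For anyone considering option 1 despite the drawbacks, the footer mini-sitemap could be as simple as a handful of text links on the index page, so deeper sections sit one click from the root. The section names and URLs here are invented:

```html
<!-- Hypothetical footer mini-sitemap on the index page -->
<div id="footer-sitemap">
  <a href="/articles/">Articles</a> |
  <a href="/forums/">Forums</a> |
  <a href="/reviews/">Reviews</a> |
  <a href="/about.html">About</a> |
  <a href="/sitemap.html">Full site map</a>
</div>
```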
| This 36 message thread spans 2 pages |