To say that I'm anxiously awaiting some sort of fix is an understatement; the quality of the returned searches dips when informative, relevant sites are excluded.
However, I'm still seeing pages drop out of the index that have been there for a couple of years. They never went supplemental; they're just gone! I'm checking with the site: command - is that command still not functioning properly?
Is anyone else seeing their pages simply de-indexed like this?
If so what are you doing about it?
If the last few months are anything to go by, then even if I do nothing I expect at least some of the pages to re-appear fully indexed. But that will only last a few days!
Is this familiar?
I really wish I knew whether there is something wrong with the pages or whether it's Google. I know Matt says to wait for the summer to end, but that doesn't help when your site is about UK holidays.
The other site has doubled its number of indexed pages from 120k to 249k, while ours has dropped to 5k.
At least in this example, the two sites are nearly exact replicas of each other in terms of layout, type of content, software that powers the site, forums, etc.
The only difference is that the site that went UP has its forums on a forums. subdomain, whereas my site does not.
This doesn't tell me anything, though, because my main site, which is in another niche, DOES have the subdomains, and it went down as well.
We don't use sitemaps (we did for a while last year and felt it was too beasty for us)... This time around, the only thing we have done since the hit is nothing at all. We decided to ride it out (if that helps contribute to the analysis).
I don't know if anybody has spotted any crawl patterns that then result in changes to pages indexed or supplemental.
I normally receive a fairly large crawl (1000+ hits from the bot), then a large change happens to the supplementals 2 or 3 days later. And here we are again.
The last 2 days have been quiet on the crawl front. But today it's picked up again.
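In case anyone wants to compare their own crawl patterns against the supplemental changes, here's a rough sketch that tallies Googlebot hits per day from an Apache access log. It assumes the standard "combined" log format and a made-up log path, so adjust both for your own server:

```python
#!/usr/bin/env python3
"""Tally Googlebot hits per day from an Apache "combined" access log."""
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path - adjust for your server
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [03/Aug/2006:19:12:01 +0000]

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Match on the user-agent string; note this does not verify that the
        # hit really came from Google (no reverse-DNS check).
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                hits_per_day[day] += 1

for day, hits in sorted(hits_per_day.items()):
    print(f"{day}: {hits} Googlebot hits")
```

A spike of 1000+ hits on one day followed by a supplemental shuffle 2 or 3 days later shows up clearly in output like this.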
Google are rolling out a new "site:" command that reports much more accurate results for large sites. So, if for example the old "site:" command reported 100,000 pages, the new one will report around 10,000 pages.
I only mention this because in the last few days they've started to roll the new version out across a larger number of DCs - previously it was only running on one or two. Maybe this accounts for some of the lost pages for some of you?
"Any of you using sitemaps?"
Yes I am. Although I don't know if I can attribute my problems regarding Google dropping pages and keyword result changes to sitemaps.
I will say one thing, though: the Sitemaps query stats section is way off with regard to the search terms listed in some cases, yet in other cases within the same query stat lists the data may be correct or only partially correct.
Query stats are still showing me in #1, 2, 3 positions for search words or phrases that no longer work. In those cases, when I follow the links on the query stats list (which point to a DC), my site is not in that DC's search results. I also found this to be the case in each of the last three months when pages dropped off the index. My sitemap is being downloaded frequently, so it's not a case of outdated data or anything.
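For reference, the sitemap file itself doesn't need to be anything elaborate. Here's a minimal sketch of writing one out, assuming the sitemaps.org schema; the domain and URLs are placeholders, and a real site would pull the list from its own database or CMS:

```python
#!/usr/bin/env python3
"""Minimal sketch of writing a sitemap.xml file."""
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical pages - a real site would pull these from its database or CMS.
urls = [
    "http://www.example.com/",
    "http://www.example.com/articles/uk-holidays.html",
    "http://www.example.com/forums/",
]

entries = []
for url in urls:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```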
There are a couple of solutions, none of which I intend to do because it would be poor for the user experience (hopefully the Googlers read that part):
1. Slap a mini-sitemap into the footer of the index page; that would increase the number of pages linked to directly off the root (see the sketch after this list). Problem with this: ugly and very spammy looking.
2. Send the majority of IBLs to a sitemap page. Problem with this: No one wants a sitemap to rank; we want the proper page to rank.
3. Instead of creating a few hundred articles on individual pages for a domain, create a few hundred sub-domains for the domain, each housing one article page and run each through some sort of RSS parser for syndication. Problem: I've seen this working several times now, but it has to be short-lived as it is technically backwards to give that much weight and power to sub-domains.
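To make option 1 concrete, here's roughly what generating that kind of footer mini-sitemap block might look like. The page URLs and titles are made up; a real site would build the list from its article database:

```python
#!/usr/bin/env python3
"""Sketch of option 1: a mini-sitemap block for the index page footer,
so deep pages get a direct link from the root."""
from xml.sax.saxutils import escape

# Hypothetical deep pages to surface on the home page.
pages = [
    ("/articles/devon-cottages.html", "Devon Cottages"),
    ("/articles/lake-district-breaks.html", "Lake District Breaks"),
    ("/articles/cornwall-camping.html", "Cornwall Camping"),
]

items = "\n".join(
    f'    <li><a href="{escape(href)}">{escape(title)}</a></li>'
    for href, title in pages
)

footer_html = (
    '<div id="mini-sitemap">\n'
    "  <ul>\n"
    f"{items}\n"
    "  </ul>\n"
    "</div>\n"
)

print(footer_html)
```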
So what am I doing with my supplementals?
Well, the content is unique and has incoming links, so not much. This is a flaw in Google that I am not willing to screw up existing site architecture for. Future sites may need to incorporate one of the above architecture changes if their problem (and it is a problem, not a feature) isn't fixed relatively soon.