| 11:38 pm on Mar 4, 2010 (gmt 0)|
Yes, it has been normal since September 2009: thousands of URLs disappear daily from all the sites I own, which means less traffic.
| 11:53 pm on Mar 4, 2010 (gmt 0)|
I have noticed similar activity, and I can also see that G is taking duplicate content very seriously, so I am trying to monitor that a little better.
Did your site have much duplicate content, even if it was just duplicate h1 headers or titles?
| 12:24 am on Mar 5, 2010 (gmt 0)|
|Did your site have much duplicate content, even if it was just duplicate h1 headers or titles?|
Nope, no duplicate pages at all; on the contrary, they are all unique - before someone copies them, that is :)
I would understand if Google decides not to rank 40% of my website at all, but I just fail to understand the rationale behind deleting 40% of my website; is Google running out of room?
| 12:55 am on Mar 5, 2010 (gmt 0)|
These days I am spending a lot of time trying to figure out whether I know anything at all.
There is a lot of seriously weird stuff going on!
| 1:16 am on Mar 5, 2010 (gmt 0)|
"Yes, it has been normal since September 2009: thousands of URLs disappear daily from all the sites I own, which means less traffic."
September 2009 is the magic month for me too. I don't see a "loss" of content, but what I have seen is "new" content published since that time not showing up in the serps at all. It's crawled and indexed (usually quickly--I can find it by searching for the url), but it's totally MIA in the serps. Older content ranks well and high as usual.
What I'm also seeing frequently (as I'm digging through 10 pages of results trying to figure out what's going on) is that one url can be listed multiple times in the results (6th position, 29th position, etc.).
The theory I have atm is that google is held together by bandaids and glue right now while they work on the nuts and bolts of Caffeine. From what I can see, the outage has lasted 6 months so far, and it looks to be a few months longer yet before Google has something that's "working" as it should. That means close to a year of Google=Fail, IMO. Who woulda thunk?
| 3:07 am on Mar 5, 2010 (gmt 0)|
The site: operator isn't very useful for a report on how many pages are indexed - at least not right now. Your server logs can show you which URLs are getting search traffic, but there may be other pages that are in the index and don't get traffic (these might be candidates for future pruning).
The best way I know to see if a page is in the index is to put the full URL in the search box - but even then, I have seen pages getting Google Search traffic that still give "No results" when I search on the URL. That may be a data center difference, I realize. But whatever the reason, it's pretty near impossible to say how many URLs from a site are actually in the Google index if the site is of any significant size.
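For anyone who wants to mine their server logs for this, here is a minimal sketch. It assumes a combined-format (Apache/Nginx) access log; the field layout and file name are assumptions on my part, not something from this thread. It counts which URLs receive Google search referrals - a rough proxy for "indexed and getting traffic", as described above.

```python
import re
from collections import Counter

# Matches the request and referer fields of a combined-format log line
# (an assumed layout - adjust the pattern to your own log format).
LOG_LINE = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<referer>[^"]*)"'
)

def google_referred_paths(lines):
    """Count hits per URL whose referer is a Google domain."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and "google." in m.group("referer"):
            hits[m.group("path")] += 1
    return hits

# Usage (file name is hypothetical):
#   with open("access.log") as f:
#       for path, n in google_referred_paths(f).most_common(20):
#           print(n, path)
```

Any URL that never shows up in this count, despite being crawled, is a candidate for the "indexed but invisible" bucket people are describing here.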
| 3:43 am on Mar 5, 2010 (gmt 0)|
|The site: operator isn't very useful for a report on how many pages are indexed - at least not right now. |
Well, I go with what Google Webmaster Tools is telling me, and right now it agrees with the "site:" operator. I really see no reason for Google to lie to me about how many indexed pages my website has, wouldn't you agree?
For me the problem is not that those pages don't rank - I am fine with that. My website consists of a small core of pages that I do want to see ranked; the rest is just unique informative content, much like a "loss leader". Right now the pages that I want to see ranked do rank, but lower than usual, and I can't help but think that the disappearance of 40% of my content did the damage.
| 3:54 am on Mar 5, 2010 (gmt 0)|
|I really see no reason for Google to lie to me about how many indexed pages my website has, wouldn't you agree?|
Not intentionally lie, no. But they certainly can and do have problems retrieving accurate data. A classic example: take a site with 4 internal directories. If you run a separate site: operator search on each directory:
...you'll often find that the total of those four numbers is significantly higher than the number you get from a basic site:example.com query.
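To make that concrete, a hypothetical illustration (the domain, directory names, and counts are all made up for the example):

```
site:example.com/news       -> ~12,000 results
site:example.com/articles   -> ~31,000 results
site:example.com/forum      -> ~55,000 results
site:example.com/reviews    -> ~10,000 results
site:example.com            -> ~74,000 results  (well short of the ~108,000 the directories sum to)
```

Both numbers come from the same index, so at least one of them has to be an estimate rather than a true count.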
| 1:54 pm on Mar 5, 2010 (gmt 0)|
atlrus, I have been talking about this very thing since January in several ongoing threads.
I have about 12 years under my belt in this and I have never seen anything like it. Not only is it happening to us but several of our competitors, most of which you would know by name.
No pattern can be found. Once googlebot stopped deep-crawling us in mid-January, pages started falling out, and they continue to drop at a predictable rate. We went from a few hundred thousand googlebot hits per week to 5,000 hits per week. Search traffic is now down by 15%.
No major changes to the site, same core design since 1998. I/We have given up trying to figure out what Google wants anymore. Turning our attention elsewhere now.
| 2:47 pm on Mar 5, 2010 (gmt 0)|
It just boggles my mind!
Today I accidentally stumbled on a page of my website which was wiped from Google, and I just don't understand why. The page, a year old, has 100% unique content. When I search for its title in quote marks, I get all the in-site references, as well as all the websites that link to that page - yet even when searching for the URL, the page is not indexed.
Maybe Google experienced a massive data loss?
| 1:59 am on Mar 6, 2010 (gmt 0)|
I follow a site that has tens of thousands of pages of unique content; currently about 4,000 of those pages have been put in the supplemental index. I am sure this is causing a bit of pain. I have sites where half of the unique content is in supplemental, which has cut my traffic down a lot. Definitely, goog is broken.
| 6:41 am on Mar 6, 2010 (gmt 0)|
I've been experiencing major problems with new content since September 2009 (on a strong site with nearly 2,000 pages of content and about 50 new pages a month). It's crawled and indexed quickly, just not ranking for squat.
Lately I've been running competitive analysis checks on some big, well-funded sites in my niche that I frequently bump heads with in the serps. Their (typical) hockey-stick growth flatlined in either Sept or Oct of 2009 and is still holding (a little growth, but nowhere near what it should typically be). No one I'm tracking seems to be "losing" traffic from old content - that's still ranking as usual - it's the new content I'm seeing as MIA. And these guys crank out massive amounts of new content each month. Goog is constipated somewhere, I have no doubt.
If this is being done purposefully, the only thing I can come up with is some new traffic cap put in place - and how to bump out of it, I have no idea. I don't monetize with AdSense or DoubleClick, but the others I track are major players for Google, and if they're getting hit too, the theory that Google is squeezing out non-Google sites over ad dollars doesn't make sense.
What we all do have in common is Google Analytics - that's all I can find so far.
| 6:57 am on Mar 6, 2010 (gmt 0)|
That will list all the URLs that Google has in its index.
[edited by: tedster at 6:21 pm (utc) on Mar 6, 2010]
[edit reason] switch to example.com - it cannot be owned [/edit]
| 7:32 am on Mar 6, 2010 (gmt 0)|
bcc1234, it's a rough estimate - it is never the exact number of URLs they have in their index.
| 10:31 am on Mar 9, 2010 (gmt 0)|
I have the same problem. 60% of my site is not indexed, and about 50% of my images are missing as well. I checked Google Webmaster Tools and it says I have no duplicate content. So why the sudden change?