Forum Moderators: Robert Charlton & goodroi
So, either Google accounts for only a small percentage of my traffic (which I doubt) or the way Google stores information about my site has changed. i.e. the 20,300 pages are still there, but Google will only tell me about 509 of them. As far as I can tell, the other pages have gone supplemental.
>Well one should better tell G/Y/MS etc that
One doesn't tell Bill Gates the truth unless one wants to witness a temper tantrum. Microsoft Uber Alles!
As for Yahoo and Google, I suspect they already know that the war against e-mail spam, just like the wars against terrorism or organized crime or disease or hunger, will never be over. Victories here and there, surely -- but the war goes on. If the evil forces lose badly enough, they just change their name and start over. (How many names has VStore gone through now? Or Caldera?)
When my pages dropped I saw a 30-40% reduction in traffic. A few days ago I saw a small increase with the site:url command (not much, from 148 to 204), but the odd thing is that traffic has shot up. In fact, yesterday was my best day for a few months, and according to my stats Google traffic has increased by 50%.
site 1 had inventory pages without enough text to differentiate them from other inventory pages, plus text-only pages with only one paragraph of text.
site 2 had paintings for sale without enough description to tell Google that each page is different from the others.
site 3 had descriptions and page text too similar to other pages.
i.e., each page needed more text to get past the duplication penalty.
[72.14.203.99...]
[72.14.203.104...]
[72.14.207.99...]
[72.14.207.104...]
Are you sure that both results came from the same datacentre? For one site I am looking at now, I see 3500 pages indexed in BigDaddy datacentres, and just 45 pages in the "experimental" datacentre.
Check the IP address of the datacentre where you see each result to be sure.
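To make that cross-check systematic, here is a minimal sketch that builds a site: query URL for each datacentre IP mentioned in this thread, so you can open each one and compare the counts by hand. The `/search?q=` endpoint format is an assumption on my part and may not match what each datacentre actually serves.

```python
# Hypothetical sketch: build per-datacentre site: query URLs to compare
# indexed-page counts across DCs. The IPs are the ones posted in this
# thread; the /search endpoint and parameter names are assumptions.
from urllib.parse import urlencode

DATACENTRES = [
    "72.14.203.99",
    "72.14.203.104",
    "72.14.207.99",
    "72.14.207.104",
]

def site_query_url(dc_ip: str, domain: str) -> str:
    """Return a site: query URL aimed at one datacentre IP."""
    params = urlencode({"q": f"site:{domain}"})
    return f"http://{dc_ip}/search?{params}"

for ip in DATACENTRES:
    print(site_query_url(ip, "example.com"))
```

Checking each URL in turn (and noting which IP produced which count) is exactly the habit the post above recommends.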
The site I'm monitoring that has the worst effect shows:
87% of pages missing on this DC: [72.14.207.104...]
29% missing from this DC: [72.14.203.104...]
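The "percent missing" figures above are simple arithmetic: baseline count minus a datacentre's site: count, as a share of the baseline. A small sketch of that calculation, using illustrative numbers rather than the exact counts from this thread:

```python
# Minimal sketch of the "percent missing" arithmetic: how far a
# datacentre's site: count falls short of a baseline count.
# The example numbers below are illustrative, not from the thread.
def percent_missing(baseline: int, dc_count: int) -> float:
    """Share of baseline pages absent from a datacentre's index, 0-100."""
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    return max(0.0, (baseline - dc_count) / baseline * 100)

print(round(percent_missing(3500, 455)))  # prints 87
```

So a datacentre showing 455 of 3,500 baseline pages would be "87% missing", matching the worst case reported above.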
That is a very telling post. I wonder how many other people who are seeing low numbers from the site: query are still seeing no drop in traffic.
I was going to post about this the other day.
I have a solid, established site (no funny business or duplication etc.) that has lost 75% of its pages on a site: search.
Traffic, however, is only fractionally down, and that much is accounted for by the sun coming out.
This led me to do random searches looking for internal pages, all of which normally rank top 5.
I did a lot of these, and nearly always found the page.
There are definitely some missing, but I could not hit the mathematical 75% failure rate.
I do wonder whether, as well as other problems in G, their site: count is now also inaccurate for sub-1,000-page sites.
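The spot-check described above can be sketched as a simple sampling exercise: pick random internal pages, record whether each still turns up in a search, and estimate the miss rate from the sample. The `check_ranks` callable here is a hypothetical stand-in for whatever manual search you actually perform.

```python
# Hedged sketch of the spot-check above: sample internal pages and
# estimate what fraction are missing from results. check_ranks is a
# hypothetical stand-in for a manual search per page.
import random

def estimate_miss_rate(pages, check_ranks, sample_size=20, seed=0):
    """Estimate the fraction of pages missing, via random sampling."""
    rng = random.Random(seed)
    sample = rng.sample(pages, min(sample_size, len(pages)))
    missing = sum(1 for page in sample if not check_ranks(page))
    return missing / len(sample)

# Toy data: pretend every fourth page is missing from the index.
pages = [f"/page/{i}" for i in range(100)]
rate = estimate_miss_rate(
    pages,
    check_ranks=lambda p: int(p.rsplit("/", 1)[1]) % 4 != 0,
)
print(rate)
```

With the toy checker flagging every fourth page, the estimate lands near 0.25, not the 0.75 the site: count implied, which is the poster's point: the sampled miss rate and the site: count can disagree badly.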
Here is another DC set which might belong to the "29% missing" class :-)
[64.233.167.99...]
[64.233.167.104...]
Or am I wrong?
I have many selected test query phrases for which I know the returned results for the last 4 years. When I see something different I note the IP address and keep an eye on it. Google's results morph every few months.
Google always has 2 or 3 indexes in play. At the moment they have what I call BigDaddy "A" and BigDaddy "B", which have been the main results since late last year. Additionally, there are the "experimental" results at 72.14.207.99, which last week gave very bad results and completely different handling of Supplemental Results. Those results have now migrated to several other datacentres, but in doing so, some now show a "cleaned up" version of the "experiment", with all Supplemental Results from before June 2005 no longer showing in the results.
I have waited years for that to happen, and hope it sticks and spreads.
>Here is another DC set which might belong to the "29% missing" class :-)
>[64.233.167.99...]
>[64.233.167.104...]
>Or am I wrong?
Yes, both of those show about the same amount, but now it's 28%.
thanks,
Google used to have all of our pages in their database, and for some reason they dropped everything except for 24 pages. Why is that?
once upon a time we had a "link:" command that worked ... just saying that it may be the "site:" command that's been tampered with, not the index...
However:
- it is ranking better than ever in google.co.uk and google.ca
- when I use the old trick of searching for [keyword -adfadfs], the page appears in google.com at around the same position as in google.ca and google.co.uk
This happened to me in the past with other sites and keywords, and eventually the site reappeared with better rankings.
I hope the same happens now.
I have noticed that there are three sets of datacenters. In two of the sets, when I do a site: search, 24 results (pages) come up for our site.
In the third set, which is our default, the same 24 come up as regular results and the rest, approximately 600-1,000, come up as supplemental results.
If I show only 24 pages on all datacenters, does that mean our site has a problem?
For me, google.co.uk was serving stuff from 72.14.207.104 for a while earlier, then from 66.249.93.104 later on and right now -- which means that what I now see at google.co.uk is totally different from what I was seeing earlier.
But what concerns me is: if Google is doing this on purpose for duplicate content (which I don't think we have), I would want to know if there is something I need to do to resolve the issue. Every day that passes is a pity.
I feel it's a crime what Google is doing to us. Google, speak up. You are getting people so annoyed. We are losing our entire trust in you. It's a shame.
>> When I search for terms on pages that used to be in the index
You mean a quoted search (search terms in quotes)?
I guess nothing is ever 100% sure, but it looks to me like the pages are no longer in this particular index. On the bright side, the pages are slowly returning, so I have fewer and fewer pages to check.