| 9:02 pm on May 15, 2006 (gmt 0)|
|Maybe the site: command is broke, not unlike the link: command. Perhaps the pages are actually there. |
Here we go again...
| 9:13 pm on May 15, 2006 (gmt 0)|
Hmm, we had this exact comment less than 2 weeks ago from someone else, and then it got fixed.
Are you saying that they broke it again?
| 9:24 pm on May 15, 2006 (gmt 0)|
There is nothing we can do but what we are already doing... crying about it.
Sit back and let's see what happens.
| 10:28 pm on May 15, 2006 (gmt 0)|
This site that has been deindexed is mainly about a trade show. It does have one page devoted to a private collection of items for sale, and it only adds up to 12 or so pages. No ads or anything. Straight HTML, all pages have their own descriptions and metas, all validate, checked with Xenu, and 100% original content.
Why the heck would Google decide that it is a problem and deindex it? It has NO SEO. Heck, I don't think I even used h1 tags or anything.
Just makes me kind of sick.
On the other hand, MSN has indexed it nicely.
| 4:15 am on May 16, 2006 (gmt 0)|
One of the sites I watch but had not checked for a while recently jumped, and I mean in the last 2 weeks, from about 500k pages to about 1.4m pages indexed. I checked it again today and it was at 2.2m pages in Google. So over the last, say, 4-5 months they have about 4 times as many pages IN.
| 6:08 am on May 16, 2006 (gmt 0)|
Yet again my page count moved upwards from yesterday! It moved from 217 to 219, still on target for a fully indexed site in the year 2008.
What happened to the old Gbot that sucked dry just about anything on a site that could be crawled?
| 6:26 am on May 16, 2006 (gmt 0)|
|What happened to the old Gbot that sucked dry just about anything on a site that could be crawled? |
Sigh... the good old days of the Google dance driven by a full moon ;)
| 8:03 am on May 16, 2006 (gmt 0)|
I don't know whether this board allows "reply with quote", but to the individual who posted about osCommerce (I think it was directed at me) - THANKS - you've reassured me that Google is being equally cruel to all dynamic sites, and that what I have done is good - in the long run!
May as well optimise the site even more, I can hardly lose any more pages!
| 8:11 am on May 16, 2006 (gmt 0)|
Yesterday I had lost 60%; now it's 74% of all indexed pages across all my domains. Traffic and SERPs are still the same.
I do have one new domain which was indexed today: 10 totally unique pages with unique titles, metas, and content, all under omitted results.
| 9:39 am on May 16, 2006 (gmt 0)|
Just an update on my site which had 100,000 pages indexed back in Feb/March and then went down as low as 170. It is now back up to 376. Magic.
I have made some changes to the site to make sure printer-friendly pages are not cached and that I have a 1 URL = 1 page situation. I estimate that if Google were to crawl my site and list every page, it should be around 10,000 pages. Jan 2009 and I should be fully indexed.
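The "printer-friendly pages should not be cached" fix above can be sketched in code. This is a minimal illustration, not anyone's actual setup: it assumes the printer-friendly variant is signalled by a hypothetical `print=1` query parameter, and it emits a robots meta tag so the duplicate never gets indexed or cached, leaving one canonical URL per page.

```python
from urllib.parse import urlparse, parse_qs

def robots_meta_for(url: str) -> str:
    """Return a robots meta tag for a page. Hypothetical printer-friendly
    variants (?print=1) are marked noindex/noarchive so only the canonical
    URL is indexed and cached; everything else stays indexable."""
    query = parse_qs(urlparse(url).query)
    if query.get("print") == ["1"]:
        # Keep the printer-friendly duplicate out of the index and cache.
        return '<meta name="robots" content="noindex, noarchive">'
    return '<meta name="robots" content="index, follow">'
```

The parameter name and the exact meta values are illustrative; the point is that each piece of content resolves to exactly one indexable URL.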
| 10:53 am on May 16, 2006 (gmt 0)|
Although my site has been reindexed I still follow the threads to see if others have recovered and to read of any other problems that may arise out of the current situation.
Last night I visited the URL from earlier in this thread.
I sat and went through a fair few sites that people had mentioned and posted URLs for, just for a look.
I had several reasons for doing this.
One was to see if there was anything obvious or easy to spot that they all had in common.
Out of the random ones I picked, over 80% had something in common: a problem (IMO) my site had until recently.
I posted in another thread that I was set to tinker with several sections of my site and see how it went. I did, and saw improvement.
Now that could be coincidence, I know that. I could have changed parts and struck lucky at the same time by getting reindexed. Certain parts of my site are still the same as they were when those pages went missing, but a lot have changed; the percentage of changed pages is greater than the percentage unchanged.
This was a "look" I took, not a study. I didn't take notes either, but perhaps it's worth taking a look and comparing those mentioned sites with your own.
The one problem with this is that we all optimise our sites differently, and what I saw, others may not.
Another thing I must add is that I'm not suggesting Google doesn't have issues, but at a time when sites and pages are MIA, anything is worth a look, and if Google does have problems, no one knows when they will be resolved.
Sorry I can't offer more or better advice, I wish I could.
| 11:03 am on May 16, 2006 (gmt 0)|
So we can sum up your post like this..
"I noticed what might be causing problems for 80% of the sites"
"But I'm not going to share it with you so bad luck"
| 11:08 am on May 16, 2006 (gmt 0)|
|So we can sum up your post like this.. |
"I noticed what might be causing problems for 80% of the sites"
"But I'm not going to share it with you so bad luck"
Sorry if it seemed that way; it wasn't meant to be. I was just suggesting others could take a look and see if they spot any common traits, as I did.
I was only sharing a thought that may benefit others.
Perhaps in future I should keep my thoughts to myself.
I really didn't mean the post to read like "I'm alright, Jack........."
| 11:12 am on May 16, 2006 (gmt 0)|
I think the problem is that a lot of people, myself included, are getting REALLY hacked off with G messing around, so if someone posts something that "could" be read another way, people tend to bite! I just walk around the garden.
| 12:06 pm on May 16, 2006 (gmt 0)|
Is it just me, or is Googlebot spidering like crazy today? Looking at my site stats, it's taking a page nearly every second.
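Eyeballing stats for crawl rate can be done more precisely from the raw access log. Here is a small sketch, assuming a combined-log-format log and matching on a "Googlebot" user-agent string; the function name and log sample are illustrative only.

```python
import re
from collections import Counter

# Matches the timestamp down to the minute in a combined-log-format line,
# e.g. [16/May/2006:12:05:33 +0000]
TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2})")

def googlebot_hits_per_minute(log_lines):
    """Count requests per minute from log lines whose user-agent
    mentions Googlebot. Input is any iterable of raw log lines."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # skip browsers and other bots
        m = TIMESTAMP.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

A per-minute count near 60 would correspond to the "a page nearly every second" rate mentioned above.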
| 12:43 pm on May 16, 2006 (gmt 0)|
I see the same thing. Googlebot is spidering like crazy today.
| 12:45 pm on May 16, 2006 (gmt 0)|
I have dynamic sites; Google dropped some pages, but then they came back. As far as session IDs go, I have a switch which does not allow spiders to start sessions, and it seems to work fine. Just make sure there are no session IDs in your site map.
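A "spider switch" like the one described can be sketched as follows. This is a minimal illustration, not any particular framework's API: the bot list and function names are assumptions. The idea is that known crawlers never get a session ID appended to internal links, so Google sees one stable URL per page instead of endless `sid=` variants.

```python
# Illustrative list of crawler user-agent substrings.
KNOWN_BOTS = ("googlebot", "msnbot", "slurp")

def is_spider(user_agent: str) -> bool:
    """Crude user-agent check for known crawlers."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def link_for(path: str, user_agent: str, session_id: str) -> str:
    """Append the session ID to internal links only for human visitors;
    crawlers get clean, session-free URLs."""
    if is_spider(user_agent):
        return path
    return f"{path}?sid={session_id}"
```

The same check should gate session creation itself, and, as the post says, the site map must be generated from the session-free form of each URL.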
| 12:48 pm on May 16, 2006 (gmt 0)|
I am also seeing a lot of spidering this week.
From Googlebot 2.1 and the Mozilla bot.
| 12:50 pm on May 16, 2006 (gmt 0)|
Same here, 10K so far.
| 1:11 pm on May 16, 2006 (gmt 0)|
Last night, half of my indexed pages disappeared. The kicker is that for the first time, many of my pages have gone supplemental. I can't explain this... I've worked a purely white hat strategy in a black hat topic and got burned...
Anyone know the Cutts email for reporting supplemental pages?
| 3:32 pm on May 16, 2006 (gmt 0)|
I've been following this topic with interest. I see people have been looking for trends. I have two fansites, one of which is affected, one of which is not. They make good contrasting examples, so I thought I would offer them.
The website that is unaffected has thousands of pages and went online in 2001. However, I have not updated it on a regular basis for several months. Its format is very basic. It is still #1 for our keyword, and when I do the site: search all 10 pages of results are non-supplemental. This has been the case for many months; there was no improvement or change at all regarding its positioning or indexing.
My other website has been drastically affected. It went online in November of 2005. I update it several times a week. I have been experimenting with CSS and am not yet good at it; therefore its coding is not as clean. Pre-Big Daddy, its position for our keyword had reached #12. Now, it changes radically every day, up and down, and completely unpredictably - it doesn't rise every day or fall every day. One day it is #33, one day it is #72, the next day it is #55 - you get the idea. Initially after Big Daddy, 4 pages were non-supplemental out of over 600 total. After a couple of weeks, that had gone up to 12 - all of which were older versions (one of which had actually been deleted). A site: check today reveals that the site now has only 7 pages that are non-supplemental. Again, this is completely unpredictable, and I have done nothing differently on the pages that are indexed than I have on the pages that are supplemental.
I'm afraid I have no answers, just these strikingly different examples to contribute to the pot.
[edited by: jatar_k at 3:40 pm (utc) on May 16, 2006]
| 4:38 pm on May 16, 2006 (gmt 0)|
Does this go to show that newer sites are affected by Big Daddy, while older, untouched sites remain untouched under the Big Daddy infrastructure?
| 4:40 pm on May 16, 2006 (gmt 0)|
It's hitting both; my friend's site is 2 years older than mine and he got hit as well.
| 4:43 pm on May 16, 2006 (gmt 0)|
Our domain name is 6 years old; however, we did a redesign that launched in Jan. '06. I have a feeling that redesigned sites and new sites are hit especially hard by Big Daddy.
Correct me if I am wrong..
| 4:44 pm on May 16, 2006 (gmt 0)|
I've got client sites from many years ago that I still monitor. They haven't been hit. My theory is that if you made major changes to your site due to last year's Google update madness, then you're likely to get hit with supplementals etc. If your site has remained unchanged for many years, no matter how much spam is in it, it will probably stay where it was.
P.S. The older sites that I monitor are still crammed with pointless, repeating keywords... and they are still getting number 1 listings for massive keywords (250 million+).
Oh well, wadyagonndo?
| 4:48 pm on May 16, 2006 (gmt 0)|
I am beginning to think that what Google is doing is intentional. Some recent changes: PR in the toolbar may not be accurate, and backlinks are only a sampling. Could they be changing the site: command so that only a sampling of the pages indexed is shown? Obviously, this would throw off many web site owners attempting to optimise their sites. After all, the less information that comes out of the Googleplex to analyze, the harder it is to gain a higher ranking in the SERPs.
Another thought: if Google is having problems with indexing the pages of a site, wouldn't it affect the SERPs? A changed page count and lost backlinks would thereby cause a SERPs change.
| 4:50 pm on May 16, 2006 (gmt 0)|
Funnily enough, the same thought hit me this afternoon: that this is the shape of the new Google. Fluctuating results, a non-analyzable algorithm; the only question is "Would the searchers notice?" No - why should they?
| 4:58 pm on May 16, 2006 (gmt 0)|
My older sites are holding up well, but newer sites are not. Newer for me means anything less than 4 years old. My sites older than 5 years are well indexed (without Google Sitemaps, thank you). I do find it odd that a site 3-1/2 years old is down to 20 pages.
| 5:05 pm on May 16, 2006 (gmt 0)|
So my theory is correct. Google is mostly giving problems to new and majorly redesigned sites...
| 5:14 pm on May 16, 2006 (gmt 0)|
|My theory is that if you made major changes to your site due to last year's Google update madness, then you're likely to get hit with supplementals etc. If your site has remained unchanged for many years, no matter how much spam is in it, it will probably stay where it was. |
I disagree with this, and have proof. I completely redid the URLs on two sites at the beginning of March using the same format. One of the sites has been re-indexed correctly and is getting more traffic. The second has dropped off, is mostly supplemental, and gets about 20% of the traffic it used to.
| 5:27 pm on May 16, 2006 (gmt 0)|
>So my theory is correct. Google is mostly giving problems to new and majorly redesigned sites
I also disagree. I did a major rebuild of a couple of sites late last year, and since we are assuming BD started life around November, both sites have come through this fine; in fact, one site is sitting at number one for a very competitive keyword across both .com and UK. So rebuilding a site will not trigger this. Don't ask me what does, though, as I have no idea other than the one the bulk here think: G is broke.
>>Another thought is that if Google is having problems with indexing pages of a site - wouldn't it affect the SERP's - number of pages of a site and lost backlinks thereby causing a SERP's change?
Maybe; hence the reason the emails from G are saying to use the sitemap.