Though it seems I've been banned (or had some 20k pages dropped), I saw in my log yesterday that Googlebot came to my website and got 365 pages! That's good, because many of the URLs are not in Google's index today, so it seems some of us who dropped in the last update are going to be "taken" again.
Or not? :)
Regards (and sorry for my English)
Google dropped my site in the garbage like a box of used Kleenex last update, even though I was crawled. My site was just again crawled by Googlebot today. However, I'm not getting my hopes up. :(
I seem to have had a similar experience.
My site was originally PR5-6, went to PR0 on the update (which I couldn't understand), and later went grey. Now I note Googlebot visited 550 pages in the last few days.
Although none of my pages are in the index yet, I hope I may get back in sometime; at least I wasn't banned.
One hopeful Google fan
Please tell us your home page PR and high/low estimate if you see crawling. That data is of interest to a few of us out here.
|brotherhood of LAN|
nice timing, I've just been uploading those fancy absolute divs and light CSS; hopefully Google will find the newer slim code much more tasty food to eat :)
For once, google has crawled at a convenient time.
PR 6. If anyone is interested, you can see the Googlebot evolution on my website (see my profile) at /logs/googlebot_log
If I see some change on www2 or www3, I'll tell you
PR7 being crawled as of late last night.
To those who feel hopeless because they were dropped last update, don't despair. It's very important for my Google referrals to have a lot of pages in the index. Two or so months ago, I lost some 70% of my pages. Then things turned around, and the latest update has made a huge difference.
When you get cut from the index or have a lot of pages dropped, assuming you aren't doing anything nasty, take a good look at your code and your internal navigation. Get a few more inbound links from anywhere you can, and maybe in a month or two you might be singing a happy tune.
Go Googlebot, slurp lots and lots of pages!
Thanks Cyberdark, that was interesting. Nice looking site you have there! It looks like my crawl will be 12-24 hours behind yours as I've just had a few more pages taken. Hey, hey, hey!
I'm hoping the bot will see my newly installed forum and go take some of those pages as well.
On nav structure, I have both a site map (aimed at Googlebot more than users), and a nav bar link to all my main tier 2 pages from every page on the site. That seems to work fairly well for me, and nearly all my pages are indexed regularly. Thanks to WebmasterWorld and all you helpful users for the site map tip - it helped to get far more of my pages indexed.
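In case it helps anyone, here's a rough sketch of the kind of site-map page I mean: a single page that links to every main tier-2 page, so the bot can reach them all in one hop from any page that links to the map. The page titles and URLs below are made up for illustration, not from my actual site:

```python
# Hypothetical sketch: build a plain HTML site-map page linking every
# tier-2 page. The titles and filenames here are invented examples.
pages = [
    ("About Us", "/about.html"),
    ("Products", "/products.html"),
    ("Contact", "/contact.html"),
]

def build_sitemap(pages):
    """Return a simple HTML site-map page with one link per listed page."""
    links = "\n".join(
        '<li><a href="%s">%s</a></li>' % (url, title) for title, url in pages
    )
    return (
        "<html><body><h1>Site Map</h1>\n<ul>\n%s\n</ul>\n</body></html>" % links
    )

html = build_sitemap(pages)
print(html)
```

The point is just that every page becomes two clicks or fewer from the home page, which seems to help the bot find everything.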
What may also help my site to be indexed is that I have quite a few external deep links.
Googlebot grabbed 1,000 pages last night.
PR7, and crawling nonstop across many sites.
One has PR 4,
and another that was hit is PR 5.
I don't really have low-PR sites, so those are some results.
All it's done is grab my robots.txt files...I'm waiting for the crawl now.
And I'm impatient!
Does anyone have any suggestions on the best freeware tracking programs? I'm currently using extreme tracking, but I'm sure there has to be something better. I want to be able to watch for the googlebot too!
PR5 site - Google crawled the whole thing early this morning - about 75 pages.
pr0 site. Many new incoming links.
Googlebot has been crawling all over me since 2002-07-31 05:08:37.000
and last entry on 2002-08-04 03:35:25.000
I have a dynamic page that SQL Server generates here www.hemsell.com/resources/googlebot.htm if anyone is interested
Update: up to 20,000 pages crawled. It hung my SQL server (more than 500 requests/hour) :(
Let's hope the site reappears there in the August update :)
Googlebot grabbed robots.txt on Friday morning, and it looks like it finished spidering the entire site last night...with the help of a new site map...thanks WW.
stuntdubl, dig through the Tracking forum [webmasterworld.com], you'll find a lot of them recommended.
PR5, and it has been crawling for the last several days
If it actually hung your SQL server, you might want to look into a template engine that's capable of caching repetitive SQL queries. The Smarty template engine is extremely easy to use if you have PHP on your system, and it does a whole lot more than caching - but if Google is halting your server, you should get some kind of query caching going ASAP to avoid losing traffic or burning out your CPU ;D