Picking up new pages? So what? Freshbot has always picked up any new page I have put up.
Just for the record, last month I raised the idea here that Deep Crawl might now be taking place with IPs that were previously thought to be Fresh Bots, but no one was willing to contemplate this scenario (see [webmasterworld.com ]). However, after reading this discussion, it sounds like people may be warming up to the idea. That said, my site has been pounded by 64.* for the past ten days. I've not added up the daily totals, but suffice it to say that it's approaching 1 million pages and still shows no sign of slowing down. In parallel with this, normal Fresh Bot behavior has been taking place, with Freshie eating up around 50k pages during the same period.
You may be right and that is good info. My point is that it is not useful to say that the deepcrawl is happening because new pages are being picked up. It certainly is a strong possibility that the 64 range is now deepbot.
I can't think of any reason why GoogleGuy can't confirm or deny this.
It is possible that googlebot is crawling from freshbot's old IP range - but I have not yet seen any reasonable explanation as to why they would do that.
What is a fact, and has been shown by several webmasters' logs (including matching of SIDs where applicable), is that freshbot has crawled the original deepcrawl index from April.
It may be that this has been the case for you also. In doing so, inevitably, the behaviour looks like deepbot because it is, in effect, following deepbot's footsteps.
I think Google's bots must be on a diet or something. They just nibble a bit here and there and come back a few days later... Before, they would eat the whole plate and you didn't see it come back until a few weeks later...
Must be that new algo again
Looks like it. The bot tends to eat the robots.txt, then come back for half the site after about 30 mins, the rest 30 mins after that, and anything it misses about an hour later.
Does the general public ever use 64.* and 216.* addresses? I get Google proxy WAP visits every day, and 64.* hits most days from various places, yet my site is new and has no listing on the web.
Normal surfers can come in on 64 or 216 etc and often do as you have stated.
Following Friday's crawl my pages were live Saturday/Sunday.
Following today's I would expect them to be live tomorrow/Wednesday.
Looks like normal freshbot activity to me :)
It would help G to do an accurate/deep crawl, if for no other reason, just to get rid of the old pages that don't work anymore and have no links pointing to them for months.
While freshbot adds pages, it does not take them away and this badly affects searches.
freshbot was very active the entire last week on my sites, but i did not see any new additions and a lot that was added 10 days ago dropped out. i did not see any freshtags throughout all serps that i have checked. hm..
Funny thing I've noticed,
I've been hit by freshy, and if I view my site's listing the fresh stuff is there, but on the serps our fresh stuff is AWOL, while our competitors' isn't. We just lost another spot ;/
Same thing here - a new site and new pages on existing sites were added by freshbot last week, then dropped a few days ago. Where is the UPDATE?
Re: Freshbot was very active last week but I did not see any new additions.
Fresh data was certainly added, however fresh tags have not always been shown :)
Dayo_UK - i am not sure about the addition of fresh stuff, my site is pr8 and all the fresh pages (which are crawled daily) are not getting listed. the fresh pages that were listed are out now.
i mean it is normal behaviour for fresh pages to be listed and dropped again. but while lately new pages were added (with a fresh tag) almost every day, for the last 4-5 days nothing has been added.
i am probably just spoilt, expecting google to add new pages of mine every day ;)
I had new pages show up today in Google. (site:domain.com -asd). These pages have been up for 4 months but have had minor changes. My URLs were not being visited by google at the time of the April deepcrawl because they were too long. This was fixed about a month ago. It's kind of random which pages were added. Of course, every time I do a site:domain.com -asdf today I get a different number of links. They are doing something.
I think they just haven't been showing the fresh dates in the serps. I haven't seen any lately yet the cache on a page I've updated is now showing the update.
My site, despite fresh content, hasn't been boosted in the SERPS for about 2 weeks. My cached pages are two weeks old. In the past, when I have played with content, I have seen the new results in about 24 hours. Is anyone else seeing sluggish freshbot results? Or less frequent updates to their cached pages?
Also, I have access to my visitor IP Addresses(through my webstat program). Is this the correct place to look for freshbot/deepbot addresses? I do not have access to my raw server logs.
Freshbot normally hits my site every 2-3 days. The last fresh listing I have was May 27-28. I had been expecting it around June 1st but have not seen it.
On the other hand, deepbot is currently busy, since Sunday and up to this morning it has been to my site 678 times already. So we'll see if this is the new deep crawl.
Freshbot is around. Don't be alarmed. My site was not freshbotted for a week or so, then it returned. Sometimes if there's a glitch in getting to your site for some reason (maybe the internet is slow, etc.), then it seems to skip your site that run.
Just one of those things.
It's been to one of my sites almost daily and visits another one every few days though it seems to really like the robots.txt file more than the content as of late. LOL.
Leave out google-bait (fresh content)
I had a bunch of new pages crawled on Friday/Saturday and I've been expecting them to show up in the index, but so far no dice.
I've had freshie update the main page, PR6, recently, but the new pages it's found aren't making it into the datacentres.
If I could stray off topic...
My hosting company had my site down for 19 hours, yesterday to today. I'm in the process of changing companies because it has happened too many times. Does anyone know if this will deter freshie from coming back if the site was down when it visited? (Freshbot has been dropping by regularly the last week or so.)
Stefan, glad that freshbot found you. :)
One of our main sites benefitted greatly from Freshbot. I've got page 1 rankings across the board for very relevant content, and it's held steady for several days now. The odd thing, though, is that these are new static pages on a well-established PR7 site that are still greyed out, yet they're beating out PR5 pages and in some cases even Yahoo.
Freshbot is visiting .. i keep updating my index page every week .. the new updates are indexed and stay in the results for 2 days max, then revert to the mid-March cache .. the rest of the internal pages remain the same throughout.
I am getting various addresses in my logs that start with 216. Are these definitely spiders, or could they be normal traffic?
Could be normal traffic or other spiders. (Scooter starts with 216.)
I believe Freshbot is usually 64.68.82.* . Deepbot is the one that usually starts with 216.239. And, lately, there has been speculation that freshbot is doing the job of deepbot.
216.x.x.x is a big chunk of territory... my dynamic ISP in Canada often assigns me an IP# in that range.
Fresh is/was 64.68.x.x, and there's a new one that starts with 64 (can't remember the rest, saw it in the logs...)
Deep is/was 216.239.x.x
Scooter is 216.39.x.x
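For anyone grepping their logs for these visitors: the ranges quoted above can be matched with a few lines of Python. This is just a sketch - the ranges are the ones posters in this thread have reported, not anything official, and the /24 and /16 network widths are my assumption.

```python
import ipaddress

# Ranges as reported in this thread (assumed widths, not official):
# freshbot 64.68.82.*, deepbot 216.239.*.*, Scooter 216.39.*.*
BOT_RANGES = {
    "freshbot": ipaddress.ip_network("64.68.82.0/24"),
    "deepbot": ipaddress.ip_network("216.239.0.0/16"),
    "scooter": ipaddress.ip_network("216.39.0.0/16"),
}

def classify(ip: str) -> str:
    """Return the name of the bot whose range contains ip, or 'other'."""
    addr = ipaddress.ip_address(ip)
    for name, net in BOT_RANGES.items():
        if addr in net:
            return name
    return "other"
```

So `classify("64.68.82.10")` would come back as freshbot, while a 216.x address outside the two /16s above falls through to "other" - which is the point several posters make: a 216.* hit on its own proves nothing.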
I only saw 2 days when the freshbot refreshed pages during this month's update. Has anyone else noticed this? I remember, when google was updating regularly, you would get a couple of weeks of fresh updates.
Not even the news sites have fresh tags lately.
I added pages and pages of fresh content 3 days ago - it is all showing up in the SERPS now so freshbot is definitely crawling.