Welcome to WebmasterWorld
Forum Moderators: open
I agree with the Scooter comment about not showing up in the index... I have the same problem (I think) with Ink. Ink crawled 3,500-odd pages and, as far as I can see, 13 are in the index -- boneheaded.
Now, I may be wrong because it hasn't been a full month or so since the crawl, but we'll see.
*Superlol* - Scooter calls itself a crawler. OK, yeah, they crawl. However, new pages take ages to make it into the AltaVista index ... at least I hope they will get listed some day ...
Why do you think it's more efficient than Google? To me, efficient means: crawl frequently, index continuously, show fresh results within days or hours. Sorry, that's what Google does -- AFAIK, no other engine currently does the same.
My newest site (est. April 2) was crawled by Google five hours after it went online and was listed in less than 36 hours. THAT is efficient!
Google's DB holds the results of the crawled pages and the URLs still to be crawled, so a server reset should not mess anything up -- as long as you're not relying too much on session variables. Pages heavily driven by session vars are not very friendly to a multi-spider crawl anyway, as far as I understand.
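To see why session variables trip up a multi-spider crawl: each spider gets handed its own session ID in the URL, so one page shows up under many URLs. A minimal sketch of the duplication collapsing once session parameters are stripped (the parameter names here are assumptions, not anything from this thread):

```python
# Sketch: two spiders fetch the same product page but each receives a
# different session ID in the query string, so the crawl sees "two"
# URLs for one page. Canonicalizing by dropping known session
# parameters (names below are assumed) collapses them back to one.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}  # assumed names

def canonicalize(url: str) -> str:
    """Drop known session parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# The same page as seen by two different spiders:
seen = {
    "http://example.com/item.asp?myid=99&sid=AAA111",
    "http://example.com/item.asp?myid=99&sid=BBB222",
}
canonical = {canonicalize(u) for u in seen}
print(canonical)  # collapses to a single canonical URL
```

Without some canonicalization like this, every spider instance inflates the URL frontier with duplicates of pages it has already fetched.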
Can I perhaps blame my PR NULL <grey bar> for why they won't crawl down into my product pages, or should I start retooling my entire URL framework?
That would be my guess. I have a brand new site just indexed last crawl. The ONLY backlink is the $300 Yahoo listing, which gives the site a PR3.
Deepbot grabbed all 100 product pages (*.asp?myid=99) on the first crawl. However, there are only 5 main static pages, so maybe it's good to keep Deepbot from getting bored on the static end of a new dynamic site.
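On the "retooling my entire URL framework" question: the usual move is to map query-string URLs like the *.asp?myid=99 pattern above onto static-looking paths. A minimal sketch, assuming a hypothetical item.asp script and an invented /products/<id>.html scheme (neither comes from this thread):

```python
# Sketch: rewrite dynamic product URLs (item.asp?myid=N) to
# static-looking paths (/products/N.html), the kind of URL early
# crawlers handled more readily. Both the script name and the
# target path scheme are assumptions for illustration.
from urllib.parse import urlsplit, parse_qs

def rewrite(url: str) -> str:
    """Rewrite item.asp?myid=N to /products/N.html; pass others through."""
    parts = urlsplit(url)
    if parts.path.endswith("item.asp"):
        qs = parse_qs(parts.query)
        if "myid" in qs:
            return f"/products/{qs['myid'][0]}.html"
    return parts.path  # non-product URLs keep their original path

print(rewrite("http://example.com/item.asp?myid=99"))  # /products/99.html
```

In practice this mapping would live in the web server's rewrite layer rather than application code, but the idea is the same: one stable, parameter-free URL per product page.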
I think Google may have changed servers or something.
I also agree that it seems to do better with dynamic pages -- 200 dynamic pages were crawled on a brand-new site this crawl.