I have one more question. I have the root site /, and two subsites, /dir1/ and /dir2/, which are all three completely different sites. The root / gets hit by freshbot on a few pages (as mentioned above) every couple of days, /dir2/ doesn't get any hits and isn't indexed, while /dir1/ gets hit by freshbot every day on ALL of its pages.
It's true that /dir1/ has no fewer than 8 links in dmoz, BUT when the directory structure of the site changed, those dmoz links went dead. Other than those, /dir1/ has no backlinks at all. I've submitted the new URLs to dmoz and am waiting for the update (it hasn't happened yet).
Can someone enlighten me? Why does /dir1/ get preferential treatment from freshbot compared to / and /dir2/? Both /dir1/ and /dir2/ were uploaded at the same time (before the start of the December crawl) and are equally accessible from the same page within the root site.
The /dir2/ site even has quality inbound links from pages freshly indexed by Google, and both / and /dir2/ have had fresh pages added on a regular basis, unlike /dir1/.
I just don't get it. Google works in mysterious ways :)
DoU
Have you got several links pointing to the new pages? I add a couple of pages a week and make sure there are at least two links from already-indexed pages to each one (plus an extra link on my sitemap page). Everything is in the root directory for simplicity. Freshbot seems to love it, and I get a good crawl most days.
Now I'm worried that if I slow down the addition of new content, Freshbot will get bored and go away for good. I get twitchy if I haven't added content for 3 days or more...