Forum Moderators: Robert Charlton & goodroi
How do I get it to do a deep crawl? The pages it is crawling aren't exactly the best my site has to offer.
Are they static (looking) URIs or dynamic? For example, here is a static (looking) URI...
www.example.com/product/widget/
Here is an example of dynamic...
www.example.com/product.asp?id=777&cat=widget&id=888...
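One way to see the difference is to look at the query string: a static(-looking) URI has none, while a dynamic URI carries its state as parameters after the `?`, which crawlers of that era were reluctant to follow in long chains. A small sketch (the URLs are just hypothetical stand-ins):

```python
from urllib.parse import urlparse, parse_qsl

# Hypothetical example URLs for illustration only
static_url = "http://www.example.com/product/widget/"
dynamic_url = "http://www.example.com/product.asp?id=777&cat=widget"

# A static(-looking) URL has an empty query string...
static_query = urlparse(static_url).query

# ...while a dynamic URL exposes its parameters in the query string.
dynamic_params = parse_qsl(urlparse(dynamic_url).query)

print(repr(static_query))
print(dynamic_params)
```

The more parameters a URL strings together, the more it looks to a crawler like an endless, machine-generated space of pages.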
Not to sound like a newbie, but what is a deep link? How does it differ from a regular link to my home page?
Um, I don't have a sitemap. Should I get one? The pages that Googlebot has been crawling link to my index page, which in turn links to many of the other sections of my site.
If your site is a bunch of reviews of movies you have seen, you want to get a link from a page that is already in the index to your review of Yellowbeard, and another one from a different page to your review of Traxx.
That will bring Googlebot directly to those pages.
You also want to have good internal navigation back up through your tree, not just back to the home page.
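That "navigation back up through your tree" is what breadcrumb links do: every deep page links to each ancestor section, not just the home page. A quick sketch of the idea (the path and labels here are hypothetical):

```python
# Sketch: build breadcrumb (label, href) pairs from a directory-style path,
# so a deep page links to every ancestor section on the way up the tree.

def breadcrumbs(path):
    """Turn a path like '/reviews/comedy/yellowbeard/' into a list of
    (label, href) pairs, starting from the site root."""
    parts = [p for p in path.strip("/").split("/") if p]
    crumbs = [("Home", "/")]
    href = ""
    for part in parts:
        href += "/" + part
        crumbs.append((part.replace("-", " ").title(), href + "/"))
    return crumbs

print(breadcrumbs("/reviews/comedy/yellowbeard/"))
```

Each crumb gives Googlebot (and visitors) a crawlable route back up through every level of the site, not just a single "Home" link.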
You still haven't told us how old your site is.
Advance warning: the instructions are slightly difficult for a novice webmaster. However, you can search for "create google sitemap" and look through the results for more information.
I got very good results using Google sitemaps.
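For anyone curious what a Google sitemap actually contains, it is just an XML file listing your URLs in the sitemap.org format. Here is a minimal sketch of generating one; the review URLs are hypothetical examples, not anything from a real site:

```python
# Minimal sketch: generate a sitemap.org-format XML sitemap.
# The URLs below are hypothetical stand-ins for your own pages.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(page_urls):
    """Return sitemap XML (as bytes) listing the given page URLs."""
    urlset = Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in page_urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page
    return tostring(urlset, encoding="utf-8")

sitemap_xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/reviews/yellowbeard/",
    "http://www.example.com/reviews/traxx/",
])
print(sitemap_xml.decode("utf-8"))
```

You would save the output as something like sitemap.xml at your site root and submit it to Google, which gives Googlebot a direct list of the deep pages you want crawled.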
It has been online for about 6 months, but I haven't had much content on it until recently.
That may be one of the main issues right there. If you haven't had much content on it until recently, then you are pretty much starting from scratch. And if the site doesn't have a decent level of PageRank, Googlebot's crawl pattern may be erratic and less frequent than it is for a site that already has PR and content.
If the pages that Googlebot has already crawled and indexed link to the other pages on your site, those pages will get crawled eventually once you put content on them. It just takes some time, especially with a new site that has little to no PR.