We went live with a new site in November. Google deep-crawled the site for the first time in December. According to the log files, Googlebot only "viewed" 1394 pages, yet when the December (Jan 1st) update finished we had 4400 pages indexed. I can't find in the logs where they visited all of those pages, but they are definitely indexed?
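For anyone wanting to double-check the same thing, here's roughly how I counted distinct pages Googlebot requested. This is a sketch assuming a combined-format Apache log; the filename `access.log` and the sample entries are made up, so adjust for your own server:

```shell
# Build a tiny sample access log (combined log format) just for illustration;
# point the commands below at your real server log instead.
cat > access.log <<'EOF'
66.249.66.1 - - [01/Dec/2004:10:00:00 +0000] "GET /page1.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [01/Dec/2004:10:00:05 +0000] "GET /page2.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [01/Dec/2004:10:00:09 +0000] "GET /page1.html HTTP/1.1" 200 512 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
10.0.0.5 - - [01/Dec/2004:10:01:00 +0000] "GET /page3.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Count distinct URLs Googlebot actually fetched. In the combined log
# format, field 7 is the request path; sort -u removes repeat visits.
grep -i 'googlebot' access.log | awk '{print $7}' | sort -u | wc -l
```

On the sample data above that prints 2 (two unique pages out of three Googlebot hits), which is the kind of "pages viewed" number I was comparing against the index count.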
How does Google do this? Do they have a cloaked spider? Any ideas? :)