dHex - 12:07 pm on Mar 16, 2013 (gmt 0)
Thank you for the tips guys!
Here is more information: I have a custom logging module that records every page request, and most of the requests coming from Googlebot are for 404 pages (I requested removal of these pages from Google's index, and they were removed around December last year). I know a little about how Google uses two sets of programs to crawl and index pages, and I understand that takes time, but three months is quite a long time for Google not to re-crawl and try to index all my pages. I have made substantial changes to these pages. I only let Google index pages that have a decent amount of content, to avoid Panda.
As for backlinks, I have contacted some colleagues in my niche and managed to get 5 links. But the links are either on PageRank 2 pages or on pages with no PageRank. I have also managed to get one site-wide link on a hobby blog with PageRank 1.
My website has close to 10,000 pages, and I'm blocking more than 9,500 of them via robots.txt and meta robots tags.
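For reference, blocking on that scale is typically done along these lines — a minimal sketch, where the directory names are hypothetical placeholders, not my actual paths:

```
# robots.txt — keeps compliant crawlers out of the listed sections
User-agent: *
Disallow: /thin-pages/
Disallow: /archive/

<!-- meta robots tag in the <head> of an individual page -->
<meta name="robots" content="noindex, follow">
```

One caveat worth checking in a setup like mine: if robots.txt already blocks a URL, Googlebot never fetches the page, so it never sees the noindex meta tag on it — the two mechanisms don't stack on the same URL.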