Not sure if this is a known issue or just my site, but my ranking has plummeted in the last week and, for some reason, Google doesn't seem to notice the sites that are linking to me.
To reproduce:
1. I search for my site in Google. I get my results plus the results of other (established, well-known) sites that link to me.
2. I search for allinurl:<my site> or link:<my site> and get no results.
The link: command intentionally returns only a sampling of links -- that's not an error, it's what Google wants to do. In earlier years the link: operator reported only links from pages above roughly PR4. Then a couple of years ago they changed it to a less predictable sample that includes lower-PR pages.
I don't currently see a problem with allinurl -- your report is the first that I've noticed.
I am seeing something weird with allinurl: too.
When I do an allinurl: search on my site, it shows only 2 listings, with the rest hidden behind the "omitted results" message. It has never done this before and I find it strange; I wonder if it could have something to do with the BD issues, etc.
I have changed nothing on my site recently, though I did set up a 301 redirect from non-www to www a few months back. All the pages shown have enough dissimilar content that there shouldn't be a duplicate-content issue.
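For anyone comparing setups, a non-www to www 301 on Apache typically looks something like the sketch below (example.com is a placeholder, not my actual domain, and this assumes mod_rewrite is enabled):

```apache
# .htaccess sketch: permanently redirect non-www requests to the www host
# (example.com is a placeholder domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```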
When I look at the omitted results, I even see a URL that is blocked by the robots.txt file. I have checked the robots.txt file with several of the different checkers available, and all say it is in the correct format. I don't know how or why the bot picked up that URL: the robots.txt file has not been modified in quite some time, and when I go to the Google Sitemaps page and enter that URL in its robots.txt checker, it confirms the URL is blocked by the file.
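As one more sanity check alongside those online checkers, Python's standard urllib.robotparser applies the same block/allow logic. Here's a minimal sketch that parses a rules snippet directly rather than fetching it; the Disallow path and URLs are made-up examples, not my real ones:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt in the same shape as mine (paths are made-up examples)
rules = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under the wildcard record, so /private/ is blocked
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/public/page.html"))   # True
```

If this prints False for the blocked URL but the bot still picked it up, the page may have been indexed from links alone without being crawled, which robots.txt does not prevent.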
On some DCs, allinurl: is also showing only my home page and one page directly below it. On others it acts as expected. My site is recently out of supplemental hell, so this may be a remaining artifact.