I certainly hope so; why do you think I'm announcing this? This was a great secret I could use to get my pages ranked months before I wrote or released them.
But I'm sick of the Sandbox, and anything I can do to point out to Google that it sucks (like telling everyone ways around it) might get them to actually try fighting spam instead of simply delaying it.
Good point, but obviously I wouldn't be placing the links within users' reach, just Googlebot's. Another example is one of my current sites. It has over 500,000 pages, which were only uploaded two months ago. Googlebot is taking forever to find them, and currently I'm only at 3,000 pages in their index. (BTW, Yahoo is storming away at 30,000!)
Now, six months ago when I started that project, I already knew what all the page names were going to be. I could have created sitemap pages linked to from the main page and left them there until the site was ready to go online. If I had, Google would already have the majority of the site in its index and ranked by now.
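For anyone who wants to try this, here's a rough sketch of the idea: given a list of planned page names, spit out static HTML sitemap pages you can link from the main page, so Googlebot can find the URLs before the real pages exist. I've capped it at 100 links per page (the old "fewer than 100 links per page" guideline). All the file and page names here are made up for illustration.

```python
# Sketch: generate static HTML sitemap pages from a list of planned
# page names, chunked at 100 links per page. Filenames like
# sitemap1.html are hypothetical placeholders.

def build_sitemap_pages(page_names, links_per_page=100):
    """Return a list of (filename, html) pairs for the sitemap pages."""
    pages = []
    for i in range(0, len(page_names), links_per_page):
        chunk = page_names[i:i + links_per_page]
        links = "\n".join(
            '<li><a href="/{0}">{0}</a></li>'.format(name) for name in chunk
        )
        html = "<html><body><ul>\n{0}\n</ul></body></html>".format(links)
        pages.append(("sitemap{0}.html".format(i // links_per_page + 1), html))
    return pages

# Example: 250 planned page names yield three sitemap pages.
planned = ["page{0}.html".format(n) for n in range(250)]
sitemaps = build_sitemap_pages(planned)
print(len(sitemaps))  # 3
```

You'd write each pair out to disk, link sitemap1.html from the homepage, and just leave them sitting there until the real content is ready.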