What I did was password protect my domains as I worked on them online. Robots.txt is only obeyed by good bots, not by humans or bad bots, and IMO the latter two are the bigger problem while you're still putting things together. A friend of mine relied on robots.txt alone, and a human stumbled onto his new site, thought the info was great, and posted a link to it on a busy forum. He was getting a lot of traffic before he was ever ready.
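For anyone wanting to do the same, here's a minimal sketch of the password protection using HTTP basic auth in an Apache .htaccess file. The realm name and file path below are just placeholders, and your host's setup may differ:

```apache
# .htaccess in the dev site's document root.
# Requires a login before anything is served, so bots and
# stray visitors alike get a 401 instead of your pages.
AuthType Basic
AuthName "Under construction"
# Placeholder path; create the file with: htpasswd -c /path/to/.htpasswd yourname
AuthUserFile /path/to/.htpasswd
Require valid-user
```

When you're ready to go live, just remove the file (or the Require line). A disallow-all robots.txt can still sit alongside it as a courtesy signal for the good bots.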
Advantage of pp: I didn't have to worry about checking logs every day to see if some bad bot/surfer had wandered in and was taking the info as I placed it online. I also didn't have to wrestle with the problem of an SE picking up a splash page and not coming back for the rest when it was ready. And I was able to easily identify several bad covert bots and write to their ISPs/hosts to make them go away.
Disadvantage of pp: Any bots that came by while my site was closed eventually went away. Some of the good bots would stop by every 4-6 weeks to check on it, so when the sites went live I had to lure them back. I did that by getting a couple of high-PR sites to deep link.