Yeah, when Googlebot reads your robots.txt, it doesn't just obey the first rule it runs into. It looks for the record whose User-agent line matches it most specifically. So if there's a "User-agent: googlebot" record, Googlebot follows only that record and ignores the generic "User-agent: *" record that disallows all robots, regardless of which order the two appear in. Every other robot doesn't match the googlebot record, so it falls through to the wildcard record and gets blocked.
Here's valid robots.txt code that disallows all other robots while allowing only Googlebot:
User-agent: googlebot
Disallow:

User-agent: *
Disallow: /
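If you want to sanity-check that this does what it says, here's a quick sketch of mine using Python's standard-library robots.txt parser (urllib.robotparser); the URL path is just a made-up example:

from urllib import robotparser

# The rules from the post above.
rules = """\
User-agent: googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches its own record; an empty Disallow means "allow everything".
print(rp.can_fetch("googlebot", "/any/page.html"))     # True

# Any other bot falls through to the wildcard record and is blocked.
print(rp.can_fetch("SomeOtherBot", "/any/page.html"))  # False

Note the parser returns the same answers even if you swap the two records around, which matches the most-specific-match behavior described above.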
Alternatively, you can use this one [searchengineworld.com], which allows only the spiders that are "nice".