You can also specify a meta robots tag inside the page HTML. If you want to block a lot of pages, your best bet is to add it to your master layout or template. You get a similar effect to blocking in robots.txt, but without exposing a list of blocked pages.
The downside is that Google will still crawl the page and use your bandwidth, but the page won't be indexed.
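For example, a noindex meta tag placed in the page's `<head>` might look like this (the `nofollow` directive is optional and additionally tells crawlers not to follow links on the page):

```html
<head>
  <!-- Ask search engines not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```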
I would suggest doing both of these:
Check the user agent for bots, and if it is a bot, send a 404 header and exit before the page has to load.
Also add a meta noindex tag just in case.
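A minimal sketch of the user-agent check in Python (the bot token list here is illustrative, not exhaustive, and the helper names are hypothetical):

```python
import re

# Illustrative set of common crawler tokens; real bot lists are much longer.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|slurp|duckduckbot|baiduspider",
    re.IGNORECASE,
)

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent header looks like a known crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def handle_request(user_agent: str) -> int:
    """Return the HTTP status to send: 404 for bots, 200 for everyone else."""
    if is_bot(user_agent):
        # Send a 404 and stop before rendering the page body,
        # so the bot never consumes bandwidth on the full page.
        return 404
    return 200
```

In a real application you would hook this check into your framework's request handling before the template renders, so bots never trigger the expensive page-building work.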
Robots.txt DOES NOT prevent indexing, only crawling: a URL blocked in robots.txt can still show up in search results if other pages link to it.