Robots.txt files are used to ask search engines (or any web spidering program) not to include parts of your website in their database.
You can tell engines to skip a certain directory, a certain file, or even the whole website, depending on what you put in it. (And if you don't have one, or just put in a blank one, they'll index anything they can find.) People can still get to the pages you've filtered out; they just won't turn up as results on Google.
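For example, a minimal robots.txt (it goes at the root of your site, like example.com/robots.txt) might look like this; the directory and file names are just made-up placeholders:

User-agent: *
Disallow: /private/
Disallow: /drafts/old-page.html

The first line says the rules apply to all crawlers, and each Disallow line names a path they're asked to skip. To ask crawlers to stay out of the whole site, you'd disallow the root:

User-agent: *
Disallow: /

Keep in mind it's only a request: well-behaved crawlers honor it, but nothing enforces it.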
I thought I'd start a new thread rather than hijack the Windows 7 one. I took your suggestion because I'd been wanting to play around with Ubuntu: running it off the live CD was so slow, but I didn't want to mess with dual boot.