Robots.txt files are used to ask search engines (or any web-spidering program) not to include parts of your website in their index.
You can tell engines to skip over a certain directory, a file, or even the whole website, depending on what you put there. (If you don't have one, or just put in a blank one, crawlers will index anything they can find.) People can still get into the pages you've filtered out; they just won't turn up as results on Google.
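As a quick illustration, a robots.txt file placed at the root of the site might look like this (the directory and file names here are just made-up examples):

```
# Rules for all crawlers
User-agent: *
# Skip a whole directory
Disallow: /private/
# Skip a single file
Disallow: /drafts/notes.html

# Block one specific crawler from the entire site
User-agent: BadBot
Disallow: /
```

Note that `Disallow: /` blocks everything, while an empty `Disallow:` (or no robots.txt at all) allows everything.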
I've not run into this one yet, though I think I did see AntiSpyware 2009 a while back. This post on Spybot has detailed file/registry removal instructions. It looks like it calls itself Search Guard or Search Guard Assist in some places.