Please check out http://www.robotstxt.org/wc/norobots.html. It is the standard for writing and consuming robots.txt, and the page also provides several good examples.
The robots.txt file is basically an instruction set that tells web crawlers what they should and shouldn't crawl on your site.
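For illustration, here is a minimal robots.txt showing the common directives (the paths and crawler names are just examples, not from the original post):

```
# Block all crawlers from the admin area
User-agent: *
Disallow: /admin/

# Give Googlebot its own rule set
User-agent: Googlebot
Disallow: /private/

# An empty Disallow means "crawl everything"
User-agent: Slurp
Disallow:
```

The file lives at the root of the site (e.g. http://www.example.com/robots.txt); crawlers match the most specific User-agent block that applies to them.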
There are some sites that provide free online wizards for this task. This is a good one: http://www.mcanerin.com/EN/search-engine/robots-txt.asp
Second that. If you want to see the robots.txt file of a heavily used site, check out Wikipedia's robots.txt file.
As for the Google/Yahoo sitemap generator, try GSiteCrawler; it will crawl a given website and create sitemaps for Google and Yahoo.
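If you'd rather hand-write a sitemap, the XML format that generators like GSiteCrawler produce looks roughly like this (URL and dates are placeholders, per the sitemaps.org protocol):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element -->
    <loc>http://www.example.com/</loc>
    <lastmod>2009-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

One <url> entry per page; you then submit the file through the search engines' webmaster tools or reference it from robots.txt with a Sitemap: line.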