There may be parts of your website, such as admin pages, that you don't want crawled and included in user search results. You can list these pages in your robots.txt file so that crawlers explicitly ignore them.
Robots.txt files follow the Robots Exclusion Protocol. This site will quickly generate the file for you; simply enter the pages you want to exclude.
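As a sketch of what such a generated file might contain, here is a minimal robots.txt that blocks an admin section while allowing everything else (the `/admin/` path is a hypothetical example):

```
# Apply these rules to all crawlers
User-agent: *

# Block the (hypothetical) admin section from crawling
Disallow: /admin/

# Allow everything else
Allow: /
```

The file must be served from the root of your site (e.g. `https://example.com/robots.txt`) for crawlers to find it. Note that the protocol is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.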