This free tool, Robots.txt Generator, is designed to help webmasters, bloggers, and marketers generate robots.txt files without requiring much technical knowledge.
robots.txt is a file placed in the root folder of your website that tells search engines which parts of your site they may crawl. Search engines such as Google use crawlers, or robots, that review the content on your website.
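As a rough illustration, a minimal robots.txt served from the root of your site (the domain and folder name below are placeholders) might look like this:

```
# Applies to every crawler
User-agent: *
# Keep crawlers out of one folder (illustrative path)
Disallow: /private/
# Everything else may be crawled
Allow: /
```

The file must be reachable at the root, e.g. `https://www.example.com/robots.txt`; crawlers will not look for it anywhere else.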
How to use this tool:
1- Allow or disallow all web crawlers from accessing your website. This option lets you decide whether your website should be crawled at all; there may be reasons why you would prefer not to have your site indexed by Google.
2- Add the URL of your XML sitemap file, or leave the field blank if you don't have one.
3- For each search engine bot, choose whether it may crawl your site (Allowed/Refused) and optionally set a Crawl-Delay.
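Putting the steps above together, the generated file might look something like the sketch below (bot names, paths, and the sitemap URL are illustrative):

```
# Default rule for all crawlers
User-agent: *
Disallow: /admin/

# Refuse one specific bot entirely (illustrative bot name)
User-agent: Baiduspider
Disallow: /

# Ask a bot to wait between requests; note that not all
# engines honor Crawl-delay (Google ignores it, Bing supports it)
User-agent: Bingbot
Crawl-delay: 10

# Sitemap URL from step 2 (placeholder domain)
Sitemap: https://www.example.com/sitemap.xml
```

Once generated, upload the file to your site's root folder so crawlers can find it.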