Robots.txt Generator

Robots.txt Generator User Guide

Step 1: Input Your Data

  • User-agent: Specify which web crawlers the rules will apply to. Use * for all crawlers, or specify names like Googlebot.
  • Disallow Paths: List paths on your website that you don't want crawlers to access. Example: /private
  • Allow Paths: Optionally, specify paths that you want to allow crawlers to access. Example: /public
  • Sitemap URL: If your website has a sitemap, provide the URL. Example: https://example.com/sitemap.xml
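Taken together, these four fields map one-to-one onto robots.txt directives. A minimal sketch with hypothetical values (the agent name, paths, and sitemap URL here are placeholders, not recommendations):

```
User-agent: Googlebot
Disallow: /private
Allow: /public
Sitemap: https://example.com/sitemap.xml
```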

Step 2: Generate the robots.txt File

  • Click the Generate robots.txt button to create your file. The result will appear in a preview section.
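Under the hood, generation is just string assembly. The sketch below illustrates the file format, not the tool's actual code; the function name and default arguments are assumptions:

```python
def generate_robots_txt(user_agent, disallow=(), allow=(), sitemap=None):
    """Assemble a robots.txt file from the four guide inputs."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines += [f"Allow: {path}" for path in allow]
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(generate_robots_txt("*",
                          disallow=["/admin"],
                          allow=["/public"],
                          sitemap="https://example.com/sitemap.xml"))
```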

Step 3: Review the Output

  • Check the previewed content to ensure all rules are correct and reflect what you want for your website.
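Beyond eyeballing the preview, you can check rules programmatically with Python's standard-library robots.txt parser, which answers "may this agent fetch this URL?" for a given rule set. The rules below mirror this guide's example:

```python
from urllib.robotparser import RobotFileParser

# Load the generated rules directly as lines, without fetching a URL.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin",
    "Allow: /public",
])

print(rp.can_fetch("*", "https://example.com/admin/users"))  # False: blocked
print(rp.can_fetch("*", "https://example.com/public/page"))  # True: allowed
```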

Step 4: Download the robots.txt File

  • Click the Download robots.txt button to save the file to your computer as robots.txt.

Example Usage

Scenario: You want to block all crawlers from the /admin and /private sections of your site, explicitly allow /public, and include a sitemap.

User-agent: *
Disallow: /admin
Disallow: /private
Allow: /public
Sitemap: https://example.com/sitemap.xml

Tips

  • Disallow everything: To block all crawlers from crawling your site, set Disallow: / for each agent. (Note that robots.txt controls crawling, not indexing.)
  • Allow everything: To let all crawlers crawl your entire site, use Disallow: with no value.
  • One file for all crawlers: Use * as the User-agent to apply the same rules to every crawler.
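For reference, the first two tips correspond to these two complete files (shown back to back, separated by comments):

```
# File 1: block all crawlers from the entire site
User-agent: *
Disallow: /

# File 2: allow all crawlers everywhere
User-agent: *
Disallow:
```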

Troubleshooting

  • If the file doesn’t download, make sure all required fields are filled in.
  • If crawlers aren’t respecting certain paths, double-check your Disallow and Allow rules.
