Robots.txt generator

The robots.txt file tells search engine crawlers which parts of a site may be crawled and which should be skipped.

A well-formed set of directives (User-agent, Allow, Disallow, Sitemap) simplifies crawl management and keeps crawlers away from administrative pages. Note that Disallow blocks crawling rather than indexing, so a blocked URL can still appear in search results if other pages link to it.
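
For example, a minimal file that keeps all crawlers out of an admin area while leaving one public subfolder open might look like this (the paths and sitemap URL are placeholders):

    User-agent: *
    Allow: /admin/public/
    Disallow: /admin/
    Sitemap: https://example.com/sitemap.xml

Listing the more specific Allow rule before the broader Disallow avoids ambiguity with parsers that resolve rules in the order they appear.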

This tool generates a properly structured file tailored for common scenarios, useful when launching a site or restructuring content.

How to use

1. Add directives

   Add User-agent, Allow, and Disallow lines, plus an optional Sitemap line.

2. Check syntax

   Ensure each directive is on its own line with no stray spaces; a quick automated check is sketched after this list.

3. Download file

   Download or copy the result and place it at the site root as robots.txt, since crawlers only request the file from that location.
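
To sanity-check the finished file, Python's standard urllib.robotparser module can parse the rules and report whether a given URL may be fetched. The sketch below hardcodes the example directives from earlier on this page and the placeholder domain example.com; in practice you would point it at the deployed file instead.

    import urllib.robotparser

    # Sample rules matching the example above; to test a live site, use
    # set_url("https://<your-site>/robots.txt") and read() instead.
    rules = [
        "User-agent: *",
        "Allow: /admin/public/",
        "Disallow: /admin/",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules)

    # A generic crawler must skip /admin/ but may fetch /admin/public/.
    print(parser.can_fetch("*", "https://example.com/admin/"))         # False
    print(parser.can_fetch("*", "https://example.com/admin/public/"))  # True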

FAQ